Research Findings

Numerous analyses from across the country have found that ShotSpotter generates a huge proportion of unfounded deployments that turn up no evidence of gun crime.

The Findings in Chicago

An MJC analysis covering July 1, 2019 through April 14, 2021 found:

  • 89% of ShotSpotter deployments did not lead police to find evidence of gun-related crime on arrival.
  • 61 unfounded ShotSpotter-initiated police deployments every day, on average.
  • More than 40,000 unfounded ShotSpotter deployments over 21.5 months.

An MJC analysis covering April 15, 2021 through April 13, 2022 found:

  • 90.4% of ShotSpotter dispatches did not lead police to find evidence of gun-related crime on arrival.
  • 87 unfounded ShotSpotter deployments every day, on average.
  • 31,640 unfounded ShotSpotter deployments over the course of the year.

“ShotSpotter alerts rarely produce documented evidence of a gun-related crime, investigatory stop, or recovery of a firearm.”
–Chicago OIG

A Chicago OIG report covering January 1, 2020 through May 21, 2021 found:

  • 50,176 ShotSpotter-initiated police dispatches.
  • Only 9.1% of ShotSpotter dispatches reported evidence of a gun-related criminal offense.
  • More than 2,400 stop-and-frisks associated with ShotSpotter alerts.
  • Some officers rely on ShotSpotter results in the aggregate as an additional rationale to initiate a stop or conduct a pat-down.

“[T]he introduction of ShotSpotter technology in Chicago has changed the way some CPD members perceive and interact with individuals present in areas where ShotSpotter alerts are frequent.”
–Chicago OIG

Findings in Other Cities

In Atlanta, Georgia, between January and July 2019, an official city analysis found:

  • Only 3% of ShotSpotter alerts led police to find shell casings.
  • ShotSpotter led to only 5 arrests and 5 guns recovered over 6 months, a cost of $56,000 per arrest or gun recovered.

In Dayton, Ohio, between December 11, 2020 and June 30, 2021, an investigative report found:

  • Less than 2% of ShotSpotter alerts resulted in an arrest.
  • Only 5% of ShotSpotter alerts led police to report incidents of any crime.

In Houston, Texas, between December 2020 and September 2021, an official police analysis found:

  • Less than 2% of ShotSpotter alerts resulted in an arrest.
  • Of 2,330 ShotSpotter alerts, 54 resulted in or were linked to an arrest.

ShotSpotter claims its system is “97% accurate.” This claim is misleading and deceptive.

ShotSpotter has never actually tested its system to determine its accuracy. It simply assumes that every alert is an actual gunshot, and counts an alert as an error only if a police department happens to submit a voluntary complaint about that particular alert.

ShotSpotter’s “accuracy” claim is just a tally of customer complaints.

In Chicago, police never complain to ShotSpotter when they chase down a ShotSpotter alert and turn up nothing. So ShotSpotter counts every one of the 90%+ unfounded ShotSpotter alerts as “accurate.”

ShotSpotter is supposed to be able to tell the difference between gunshots and other loud noises like firecrackers, cars backfiring, construction noises, helicopters, and other harmless sounds.

Even though ShotSpotter has been on the market for 20 years, it has never published a scientific study actually testing how reliably it can tell the difference between the sound of gunfire and other loud noises.

Nobody has ever done an empirical study of how readily the system is fooled by noises that are not gunfire (i.e., how often it generates false-positive alerts).

ShotSpotter’s gunshot detection system is a black box.

ShotSpotter refuses to make public the key document that tells its operators how they should decide whether to trigger an alert and dispatch police. ShotSpotter is fighting in court to keep that document secret.

ShotSpotter has not allowed anyone to audit or validate its artificial intelligence algorithms that purport to filter out non-gunfire sounds.

ShotSpotter’s contracts incentivize it to over-report loud noises as gunfire. ShotSpotter’s contracts do not hold ShotSpotter accountable for false alerts to noises other than gunfire.

A small number of academic studies have attempted to determine whether ShotSpotter reduces gun crime. Among the most recent are:

Doucette et al. (2021). In a study of 68 counties across 17 years, “[r]esults suggest that implementing ShotSpotter technology has no significant impact on firearm related homicides or arrest outcomes.”

Mares & Blackburn (2020). In a study of St. Louis’s use of ShotSpotter, “results indicate no reductions in serious violent crimes, yet [ShotSpotter] increases demands on police resources.” The study also found “citizen-initiated calls for service are over seven times more efficient in uncovering and responding to criminal behavior” than ShotSpotter alerts, and that ShotSpotter did “not appear to deliver a consistent improvement in the response time to calls for shots fired.”

Ratcliffe et al. (2018). Another gunshot detection technology (not ShotSpotter) used in Philadelphia “did not significantly affect the number of confirmed shootings, but it did increase the workload of police attending incidents for which no evidence of a shooting was found.”

ShotSpotter is a publicly traded company that provides surveillance technology to law enforcement and other customers. In addition to gunshot detection, ShotSpotter provides so-called “predictive policing” or “patrol management” software, and other products. ShotSpotter has contracts with more than 100 police departments. The City of Chicago is one of ShotSpotter’s two largest customers, accounting for 18% of its annual revenue in 2020.