Daily Beast: Analysts for ShotSpotter, a system that uses microphones to identify the location of gunshots and alert police across America, "frequently modify alerts at the request of police departments—some of which appear to be grasping for evidence that supports their narrative of events," according to court filings reviewed by Motherboard.
In one instance involving a murder in Chicago, ShotSpotter initially classified a suspected gunshot as a firecracker. But a ShotSpotter analyst allegedly overrode the algorithm and “reclassified” it as a gunshot. Later, another ShotSpotter analyst reportedly altered the supposed shot’s coordinates to a point closer to where a suspect’s car had been seen in surveillance footage.
In another case, ShotSpotter analysts overruled the algorithm—which had initially classified a sound picked up by its microphones as helicopter rotors, not gunshots—at the request of Rochester, New York police. The suspect's conviction was later tossed after a judge questioned the reliability of ShotSpotter's technology.
Any claims of accuracy, the company admitted in court, were generated not by its engineers but by its marketing team. ShotSpotter is reportedly used in more than 100 U.S. cities.