Join us for an engaging discussion on how to navigate college as an undocumented, DACAmented, and immigrant student. We'll be joined by current and former students to hear their stories and talk about overcoming the barriers to higher education.

Event Date

Wednesday, September 8, 2021 - 6:00pm to Thursday, September 9, 2021 - 6:45pm

Matt Coles, Former Deputy Legal Director and Director of the Center for Equality

James C. Hormel, the first openly gay U.S. ambassador, died on Friday, August 13. Matt Coles, former director of the ACLU’s LGBTQ & HIV Project, reflected on the long friendship between Hormel and the ACLU.

Jim Hormel was a fine human being. In a lot of circles, Jim was known as a philanthropist identified with LGBTQ causes. That was true enough, but he was so much more than that. Jim could have given a fraction of what he did and he’d still have been a very generous supporter of human rights, and one of the most important supporters of LGBTQ work of all time. He could have concentrated his generosity on LGBTQ issues and still have been an extraordinary supporter of liberty; but he had a much broader vision. He could have just given money and counted it as more than enough; but he did the hands-on work as well.

Take his support of LGBTQ work. At a time when the LGBTQ community had nothing — no federal discrimination protection, protection in one (count it) state, and virtually no relationship protection anywhere — Jim, along with the late Brooks McCormick, gave the ACLU the money to start what is now the LGBTQ & HIV Project. He kept supporting it, year after year — and not just with cash. He held fundraisers at his home. He personally introduced it to other people who might want to help. Look at almost every major LGBTQ group in the United States over the last 40 years and you’ll find Jim Hormel was a major supporter, often one of the first. That early support was crucial to the survival of so many groups that made a difference.

Jim wasn’t narrow in his support of civil and human rights. He wholeheartedly supported all of the ACLU’s work (even the work he disagreed with). He was an early major supporter of the San Francisco AIDS Foundation, the San Francisco Public Library, People for the American Way, and of education broadly, and he gave generously and strategically to many, many progressive political candidates throughout the nation.

Jim didn’t just give; he worked. The first time I met Jim was when we were both getting trained to walk precincts to try to convince the voters of San Francisco not to repeal the city’s newly passed domestic partnership law. We lost that election, but Jim was a force in making sure it was a temporary loss. When President Bill Clinton nominated Jim to be ambassador to Luxembourg, Jim was subjected to a torrent of vicious abuse. He put up with it for over two years. When he shared with me some of the harassment he was enduring, I asked why he didn’t just walk away.

“You’ve done your part,” I said. He looked at me with astonishment. “I can’t stop now,” he said. “That would set us further back than when we started.” Jim didn’t need the ambassadorship for validation; he certainly didn’t need it to get a trip to Europe. Jim did it to show that we are worthy, we are all worthy, to serve our country.

Jim was a smart lawyer with an incisive mind and so much LGBTQ work was and is the better for the way he brought it to bear. I always enjoyed talking with Jim, but I also always knew I had to be on my toes. Sooner or later, even running into him on the 38 bus in San Francisco, I’d get tough questions about just how we were going to get equality for LGBTQ people and ideas about ways we could do it better. I always came away from those conversations with ideas about how to do just that. (Yes, he rode the bus; he flew coach on planes as well. I once ran into him on a plane to JFK; he asked me to join him on the trip into town. “Oh boy,” I thought, “a Town Car.” No such luck; he joined me on the AirTrain and the number 7 into the city.)

Don’t get the wrong impression though; if the questions were tough, they always came in a spirit of collaboration. We were in this together. He was unfailingly kind to everyone he dealt with and his empathy was limitless. But he also had a wicked sense of humor that was truly disarming. You can tell a lot about a person who runs a business by the atmosphere in their workplace. The people who worked for Jim seemed to absolutely adore him.

It’s a great mistake to think those of earlier generations were better than the people of today. Still, in this case I think it’s fair to say we won’t see Jim Hormel’s like again. Goodbye Jim. I am so very glad I knew you.

Date

Tuesday, August 24, 2021 - 12:45pm

Featured image

Black-and-white photo of James Hormel (left) and Matt Coles from the ACLU archives.

Related issues

LGBTQ+ Rights

Teaser subhead

Remembering James Hormel, a pioneering champion and philanthropist of LGBTQ rights

Jay Stanley, Senior Policy Analyst, ACLU Speech, Privacy, and Technology Project

(Updated below)

A critical report on the ShotSpotter gunshot detection system issued today by the City of Chicago’s Inspector General (IG) is the latest indication of deep problems with the gunshot detection company and its technology, including its methodology, effectiveness, impact on communities of color, and relationship with law enforcement. The report questioned the “operational value” of the technology and found that it increases the incidence of stop and frisk tactics by police officers in some neighborhoods.

The IG’s report follows a similarly critical report and legal filing by the Northwestern School of Law’s MacArthur Justice Center and devastating investigative reporting by Vice News and the Associated Press. Last week, the AP profiled Michael Williams, a man who spent a year in jail on murder charges based on evidence from ShotSpotter before having his charges dismissed when prosecutors admitted they had insufficient evidence against him.

ShotSpotter installs 20 to 25 microphones per square mile in the cities where it operates, and uses those microphones to try to identify and locate the sound of gunshots. In the past, we have scrutinized this company and its technology from a privacy perspective. Placing live microphones in public places raises significant privacy concerns. After looking at the details of ShotSpotter’s system, we didn’t think it posed an active threat to privacy, but we were concerned about the precedent it set (and others agreed).

But aural privacy is not the main problem with ShotSpotter, it turns out. There are several other very significant civil liberties problems with the technology.

First, as the MacArthur Justice Center details, ShotSpotter is deployed overwhelmingly in communities of color, which already disproportionately bear the brunt of a heavy police presence. The police say they pick neighborhoods for deployment based on where the most shootings are, but there are several problems with that:

  • ShotSpotter false alarms send police on numerous trips into communities (in Chicago, more than 60 times a day) for no reason, on high alert and expecting to potentially confront a dangerous situation. Given the already tragic number of shootings of Black people by police, that is a recipe for trouble.
  • Indeed, the Chicago IG’s analysis of Chicago police data found that the “perceived aggregate frequency of ShotSpotter alerts” in some neighborhoods leads officers to engage in more stops and pat downs.
  • The placement of sensors in some neighborhoods but not others means that the police will detect more incidents (real or false) in places where the sensors are located. That can distort gunfire statistics and create a circular statistical justification for over-policing in communities of color.

Second, ShotSpotter’s methodology is used to provide evidence against defendants in criminal cases, but isn’t transparent and hasn’t been peer-reviewed or otherwise independently evaluated. That simply isn’t acceptable for data that is used in court.

The company’s sensors automatically send audio files to human analysts when those sensors detect gunshot-like sounds. Those analysts then decide whether the sounds are gunshots or other loud noises such as firecrackers, car backfires, or construction noises. They also use the timing of when sounds reach different microphones to try to triangulate a location for the noise, and if it is believed to be the sound of a gunshot, they make an effort to figure out how many shots were fired and what kind of gun is involved (such as a pistol versus a fully automatic weapon).
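
To make the timing step concrete, here is a minimal sketch in Python of how a sound’s origin can be estimated from the different times it reaches several fixed microphones. The microphone positions, timing data, and least-squares tooling are all assumptions made for illustration; this is not ShotSpotter’s actual code or method.

```python
# A minimal sketch (not ShotSpotter's actual method): locating a sound source
# from its arrival times at several fixed microphones. All positions and times
# below are invented for illustration.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second, approximate

# Hypothetical microphone positions (x, y) in meters.
mics = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 800.0], [800.0, 800.0]])

# Simulate a shot at a known spot so the estimate can be checked.
true_source = np.array([310.0, 540.0])
arrival_times = np.linalg.norm(mics - true_source, axis=1) / SPEED_OF_SOUND

def residuals(params):
    """Observed minus predicted arrival times for a candidate (x, y, t0)."""
    x, y, t0 = params
    predicted = t0 + np.linalg.norm(mics - np.array([x, y]), axis=1) / SPEED_OF_SOUND
    return predicted - arrival_times

# Jointly fit the source position and the (unknown) emission time.
fit = least_squares(residuals, x0=[400.0, 400.0, 0.0])
print("estimated source location:", fit.x[:2])  # close to (310, 540)
```

In a real deployment, noisy timestamps, echoes, wind, and sensor placement all degrade such a fit, which is part of why independent evaluation of the company’s actual methodology matters.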

ShotSpotter portrays all of this as a straightforward and objective process, but it is anything but. Vice News and the AP note examples of the company’s analysts changing their judgments on all of the above types of results (which ShotSpotter disputes). In addition, the company uses AI algorithms to assist in the analysis — and as with all AI algorithms, that raises questions about reliability, transparency, and the reproducibility of results. The company turned down a request by the independent security technology research publication IPVM to carry out independent tests of its methodologies.

Further calling into question the appropriateness of ShotSpotter evidence for use in court is a third problem: the company’s apparent tight relationship with law enforcement. A ShotSpotter expert admitted in a 2016 trial, for example, that the company reclassified sounds from a helicopter to a bullet at the request of a police department customer, saying such changes occur “all the time” because “we trust our law enforcement customers to be really upfront and honest with us.” ShotSpotter also uses reports from police officers as “ground truth” in training its AI algorithm not to make errors. A close relationship between ShotSpotter and police isn’t surprising — police departments are the company’s customers and the company needs to keep them happy. But that isn’t compatible with the use of its tool as “objective data” used to convict people of crimes.

Finally, still up for debate is whether ShotSpotter’s technology is even effective. We can argue over a technology’s civil liberties implications until the end of time, but if it’s not effective there’s no reason to bother. A number of cities have stopped using the technology after deciding that ShotSpotter creates too many false positives (reporting gunshots where there were none) and false negatives (missing gunshots that did take place). The MacArthur Justice Center’s report found that in Chicago, initial police responses to 88.7 percent of ShotSpotter alerts found no incidents involving a gun. The company disputes whether this means its technology is inaccurate, pointing out that someone can shoot a gun but leave no evidence behind. But a review of the accuracy debate by IPVM concluded that “while public data does not enable a definitive estimation of false alerts,” the problem “is likely significantly greater than what ShotSpotter insinuates” because the company “uses misleading assumptions and a misleading accuracy calculation” in its advertised accuracy rates.
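
To see why the way “accuracy” is calculated matters so much, here is a deliberately simplified illustration with invented numbers (not ShotSpotter data): a vendor that counts an alert as correct unless a customer explicitly reports it as an error can advertise a far higher figure than the share of alerts actually confirmed on the ground.

```python
# Invented numbers, for illustration only; not ShotSpotter data.
total_alerts = 1000
confirmed_gunfire = 113      # alerts where responders found evidence of gunfire
reported_as_false = 20       # alerts a customer explicitly flagged as errors
# The remaining 867 alerts are unconfirmed either way.

# Vendor-style rate: every alert not explicitly flagged as an error counts as correct.
vendor_style_accuracy = (total_alerts - reported_as_false) / total_alerts

# Field-confirmation rate: only alerts with corroborating evidence count as hits.
confirmed_rate = confirmed_gunfire / total_alerts

print(f"vendor-style accuracy: {vendor_style_accuracy:.1%}")   # 98.0%
print(f"confirmed-on-scene rate: {confirmed_rate:.1%}")        # 11.3%
```

Which figure better reflects reality depends on what the unconfirmed alerts really were, which is exactly the question that, as IPVM notes, public data cannot yet settle.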

Given all of these problems, communities and the police departments serving them should reject this technology, at least until these problems are addressed, including through full transparency into its operation and efficacy.

Update (10/14/21):

ShotSpotter CEO Ralph Clark reached out to us to vigorously dispute the sources of information that we relied upon for this post, and he also pushed back on the company’s critics in a piece published last month in the Buffalo News. Most recently, his company filed a defamation lawsuit against Vice News; its complaint is a voluminous argument for the company’s technology. Two points in particular seem worth highlighting:

First, pressed on reports that the company has changed its evaluation of the details of gunshots in court, Clark told me that the company provides two kinds of data about gunshots: an initial, real-time alert sent to police shortly after a gunshot is detected, and a much more thorough “detailed forensic report” that is prepared for court cases. Clark said that what has been reported as ShotSpotter “changing its story” reflects the differences between the real-time and detailed forensic reports.

Second, what alarmed me the most in the reporting on ShotSpotter were the references to the fact that the company was using AI as part of its system. The use of evidence in court derived from AI algorithms raises severe issues of transparency, accuracy, and fairness. Clark said the company has algorithms that are used to “do the math” in triangulating the location of gunshots based on the timing of acoustic data from its sensors but, pressed on what that meant, he said they are not opaque deep-learning black boxes, but simply algorithms doing math that could otherwise be done by hand. Clark said a more complex AI algorithm is used to decide which of the “pops, booms, and bangs” picked up by the company’s sensors are believed to actually be gunshots before the audio is sent to human analysts for review. That’s not as much of a concern; inaccuracies in such an algorithm might result in some missed gunshots but aren’t going to lead to unfair evidentiary judgments.

Clark also pushed back on criticisms of ShotSpotter’s efficacy and cost-benefit value. Those involve complex assessments of real-world data as well as value judgments that experts and communities will have to monitor, evaluate, and debate. As always, we don’t think any police technology should be deployed or used unless affected communities clearly want it.

Date

Tuesday, August 24, 2021 - 12:45pm

Featured image

Street camera with building in background

Related issues

Criminal Justice, Police Practices, Privacy

Teaser subhead

A new investigation points to yet another problematic outcome of the technology’s use and the company’s lack of transparency.
