Marissa Gerchick, she/her/hers, Data Scientist and Algorithmic Justice Specialist, ACLU

Tobi Jegede, Data Scientist, ACLU

Tarak Shah, Data Scientist, Human Rights Data Analysis Group

Ana Gutiérrez, Special Assistant for Digital, Tech, and Analytics, ACLU

Sophie Beiers, Data Journalist, ACLU Analytics

Noam Shemtov, Paralegal, ACLU Speech, Privacy, and Technology Project

Kath Xu, Skadden Fellow, ACLU Women's Rights Project

Anjana Samant, Senior Staff Attorney, Women’s Rights Project

Aaron Horowitz, Head of Analytics, ACLU

You hear a knock on your door. Expecting a neighbor or perhaps a delivery, you open it, only to find a child welfare worker demanding entry. It doesn't seem like you can refuse, so you let them in and watch as they search every room, rummaging through closets, drawers, cabinets, the fridge, all without a warrant. They ask questions that make it sound like you're a bad parent and, finally, say they need to do a visual inspection of your kids, undressed, without you in the room, and take pictures.

The agency receives many reports of ordinary neglect, which are distinct from physical abuse or severe neglect allegations, but it doesn't investigate all of them. Instead, the agency has been using an algorithm to help decide who gets the knock on the door and who doesn't. But you can't get any information about what the algorithm said about you or your child, or what role it played in the decision to investigate you.

Though an algorithm may sound neutral, predictive tools are designed by people. And the choices people make when creating a tool aren't just technical decisions about which statistical method works best or what data the tool needs for its calculations. The same family can be flagged as more or less in need of investigation depending on how the tool was designed. One recurring concern is that using these tools in systems already marked by discriminatory treatment and outcomes will simply replicate those outcomes. But this time, if that history repeats itself, the disparate results will be deemed unquestionable truths supported by science and math, not the residue of past or ongoing discrimination, and not the product of the policy choices made when tool designers picked model A over model B.

To better understand whether this concern is warranted, two years ago the ACLU requested data and documents from Allegheny County, Pennsylvania, related to the Allegheny Family Screening Tool (AFST) so that we, together with researchers from the Human Rights Data Analysis Group, could independently evaluate its design and practical impact. We found, among other things, that the AFST could produce inequities in screen-in rates: the percentage of reports (i.e., neglect allegations received by the county child welfare agency) that are forwarded for investigation ("screened in") out of the total number of reports received. Specifically, the tool could create screen-in rate disparities between Black and non-Black families: the percentage of reports about Black families flagged for investigation could be greater than the corresponding percentage for non-Black families. We also found that households where people with disabilities live could be labeled as higher risk than households without a disabled resident. What really stood out, though, was that the AFST's algorithm, and the way its conclusions about a family were conveyed to a screener, could have been built in different ways that may have had a less discriminatory impact. These alternatives didn't change the algorithm's "accuracy" in any meaningful way, even if we accept the tool developers' definition of that term. We asked the county and the tool designers for feedback on a paper describing our analysis, but never received a response. We share our findings for the first time today. But first, a quick overview of how the AFST works.

Allegheny County has been using the AFST to help screening workers decide whether to investigate or dismiss neglect allegations. (The AFST is not used to make screening decisions about physical abuse or severe neglect allegations because state law requires that those be investigated.) The tool calculates a "risk score" from 0 to 20 based on the underlying algorithm's estimate of the likelihood that the county will, within two years, remove a child from the family involved in the report. In other words, the tool predicts the "risk" that the agency will place the child in foster care. The county and the tool designers treat removal as a sign that the child may be harmed, so the higher the likelihood of removal, the higher the score, and the greater the presumed need for child welfare intervention. Call screeners are instructed to consider the AFST's output as one factor among many in deciding whether to forward the report for agency action.
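The county has not published the exact mapping from the model's prediction to the 0-20 score, but fixed-scale risk scores like this are commonly built by binning a predicted probability into quantiles of a historical calibration sample. The sketch below illustrates that construction only; the function name and the quantile-binning assumption are ours, not the county's.

```python
import numpy as np

def afst_style_score(p_removal: float, calibration_probs: np.ndarray) -> int:
    """Map a predicted removal probability to a 0-20 risk score.

    Hypothetical sketch: we assume the score is the quantile bin of the
    prediction within a historical calibration sample, one common way
    risk scores on a fixed scale are constructed. The AFST's actual
    calibration is not public in this form and may differ.
    """
    # 20 interior quantile cut points split the calibration sample
    # into 21 bins, yielding integer scores from 0 through 20.
    cutoffs = np.quantile(calibration_probs, np.linspace(0, 1, 22)[1:-1])
    return int(np.searchsorted(cutoffs, p_removal))
```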

However, in a child welfare system already plagued by inequities based on race, gender, income, and disability, using historical data to predict future action by the agency only serves to reinforce those disparities. And when those disparities are reborn through an algorithm, people are liable to interpret them as hard truths because, well, a mathematical equation told us so.

In this way, the AFST's creators are doing more than math when building a tool. They can become shadow policymakers: unless the practical impact of their design decisions is evaluated and made public, this power can be wielded with little transparency or accountability, even though transparency and accountability are two of the reasons the county adopted the tool in the first place.

Here is a summary of the design decisions and resulting policies and value judgments that we shared with the county as cause for concern:

“Risky” by Association

When Allegheny County receives a report alleging child neglect, the AFST generates an individualized "risk score" for every child in the household. However, call screeners don't see individual-level scores. Instead, the AFST shows an output based only on the highest score among all the children in the household. For referrals where the maximum score falls between 11 and 18, the AFST displays the score's numeric value. For maximum scores of 18 and up, the AFST displays a "High Risk Protocol" label as long as at least one child in the household is under 17. This subset of referrals is subject to mandatory investigation, which only a supervisor can override. For referrals with a maximum score less than 11 and no children under 7, screening workers see a "Low Risk Protocol" label.
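Read as code, the display policy described above is a small set of threshold rules. This is a minimal sketch of that logic as we understand it; the function and variable names are ours, and we assume referrals outside the labeled bands simply show the numeric score.

```python
def afst_display(child_scores: list[int], child_ages: list[int]) -> str:
    """What a call screener sees for one referral, per the policy above.

    Sketch only: names are ours, and referrals outside the labeled bands
    (e.g., a score of 19 with no child under 17) are assumed to show the
    numeric score.
    """
    max_score = max(child_scores)     # only the household maximum is used
    youngest = min(child_ages)
    if max_score >= 18 and youngest < 17:
        return "High Risk Protocol"   # mandatory screen-in; supervisor override only
    if max_score < 11 and youngest >= 7:
        return "Low Risk Protocol"
    return str(max_score)             # middle band: show the number itself
```

Note how the household maximum drives everything: afst_display([4, 19], [3, 15]) returns "High Risk Protocol" even though one child scored only 4, which is the "risky by association" dynamic this section describes.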

We found that the decision to communicate only the prediction for the highest-scoring child could have created inequitable outcomes. We say "could have" because we could not run our analysis on the actual scores assigned to Black and non-Black families. Instead, following common practice (used by the county and its tool developers as well), we used Allegheny County data collected before the AFST was deployed to model what the risk scores would have been.

Compared to other ways of conveying the AFST's scores, the method in use could have resulted in the AFST classifying Black families as having a greater need for agency scrutiny than non-Black families. In our analysis of data from 2010-2014, the AFST's method of showing just one score would have labeled roughly 33% of Black households "high risk," triggering the mandatory screen-in protocol, compared to only 20% of non-Black households.
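The disparity itself is a straightforward calculation once the modeled scores exist. A sketch of the comparison, with hypothetical column names standing in for the report's actual data:

```python
import pandas as pd

def high_risk_rates(referrals: pd.DataFrame) -> pd.Series:
    """Share of households labeled "high risk" (score >= 18), by group.

    `max_score` and `black_household` are hypothetical column names; the
    underlying data here would be the modeled 2010-2014 scores.
    """
    high_risk = referrals["max_score"] >= 18
    return high_risk.groupby(referrals["black_household"]).mean()

# The pattern described above corresponds to roughly
# high_risk_rates(df)[True] ~ 0.33 and high_risk_rates(df)[False] ~ 0.20.
```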


Fig. 1. Distribution of AFST Scores by Race Under Different Scoring Policies, using testing data from 2010-2014. Under policies that assign a single score or screening recommendation to the entire household, AFST scores generally increase for all families, and Black households receive the highest scores more often than non-Black households. Under the current “Single Household Score” policy, nearly 35% of Black households are labeled as “high risk” for future separation while only 20% of non-Black households are labeled as “high risk.”

The More Data, the Better?

To build the tool's algorithm, its designers needed historical records to identify the circumstances and individual characteristics most associated with child removals, since the tool bases its risk scores on whether and how those factors are present in an incoming report. To further one of the county's stated goals in adopting the AFST, to "make decisions [about whether to screen in a report] based on as much information as possible," the county gave the AFST designers access to government databases beyond the county's child welfare records, such as juvenile probation and behavioral health records. The problem is that these databases do not reflect a random sample or cross-section of the county's population. Rather, they reflect the lives of people who have more contact with government agencies than others. Using such databases to identify the characteristics of households more likely to have a child removed therefore means selecting from a pool of factors that over-represents some groups of people and under-represents others. That makes it more likely that the tool will classify the same overrepresented populations as higher risk, not because they are more likely to be harmed or to cause harm, but because the government has data about them and little or no data about others.
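A toy simulation makes the selection problem concrete: even when the underlying likelihood of harm is identical across groups, a feature recording contact with a government system will fire far more often for the group that system over-represents. All numbers below are illustrative, not county data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
overrepresented = rng.random(n) < 0.3   # group membership

# Assume the true underlying likelihood of harm is identical (5%)...
harm = rng.random(n) < 0.05

# ...but contact with the recorded system is four times higher for the
# overrepresented group (e.g., public rather than private services).
has_record = rng.random(n) < np.where(overrepresented, 0.40, 0.10)

# A model given `has_record` as a predictor will attach elevated "risk"
# to the overrepresented group despite identical rates of harm.
print(has_record[overrepresented].mean())    # ~0.40
print(has_record[~overrepresented].mean())   # ~0.10
print(harm[overrepresented].mean(), harm[~overrepresented].mean())  # both ~0.05
```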

Take, for instance, the county's juvenile probation database, which was used to construct the AFST. A recent study found that Black girls in the county were 10 times more likely, and Black boys seven times more likely, than their white counterparts to end up in the juvenile justice system. In using the juvenile probation database to build the tool, the developers were therefore mining records that overrepresent Black children compared to white children.

The behavioral health databases the county used to create the AFST are similarly problematic. Because they expressly include information about people seeking disability-related care, these databases will inevitably contain information about people with disabilities, but not necessarily others. These databases are also skewed along another axis: Because the county doesn’t record information about privately accessed health care, data about individuals with higher incomes is far less likely to be reflected.

Marked Forever

By partly basing the AFST's removal prediction on factors that families can never change, such as whether someone has ever been held in the Allegheny County Jail, for any reason, the AFST effectively offers families no way to escape their pasts, compounding the impacts of systemic bias in the criminal legal system. We found that households with more children are more likely to include somebody with a record in the county jail system or in HealthChoices, Allegheny County's managed care program for behavioral health services. And we found that by including flags for whether someone has ever been associated with these systems, the AFST could have produced greater disparities in Black-white family screen-in rates than an alternative model design that left those factors out.
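The household-size pattern in the figure below follows from basic probability: if each household member independently has some chance of appearing in one of these systems, the chance that at least one member appears grows quickly with household size. A quick illustration with made-up base rates:

```python
# P(at least one of k members has a record) = 1 - (1 - p)**k,
# which increases with household size k. The p values are illustrative;
# actual base rates differ by system and, as the figure shows, by race.
for p in (0.05, 0.15):
    print(p, [round(1 - (1 - p) ** k, 2) for k in range(1, 7)])
```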


Fig. 3. As household sizes increase, the likelihood that at least one member of the household will have some history with the Allegheny County Jail, the HealthChoices program, or juvenile probation increases as well. Black households are disproportionately likely to have involvement with these systems.

Previous reporting has shown that many families do not even know the county is using the AFST, much less how it functions or how to raise concerns about it. Furthermore, government databases, from public benefits records to criminal justice records, are rife with errors. And what happens if the tool itself "glitches," as has already happened with the AFST?

These challenges demonstrate the urgent need for transparency, independent oversight, and meaningful recourse when algorithms are deployed in high-stakes decision-making contexts like child welfare. Families have no knowledge of the policies embedded in the tools imposed upon them, no way to learn how a tool was used to make life-altering decisions about them, and, ultimately, limited ability to fight for their civil liberties, an arrangement that cements long-standing patterns in how family regulation agencies operate.

Read the full report, The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool.

 


Rose Mackenzie, She/Her, Campaign Strategist, ACLU

It was one of the first things anti-abortion politicians bellowed in the immediate aftermath of Roe v. Wade being overturned: Women won't face prosecution for seeking an abortion. They said it because they know that punishing people for making the best decisions for themselves is deeply unpopular. And yet, less than a year later, they are introducing bills that would do exactly that: bills that would allow prosecutors to send women to jail for ending their pregnancies. We know this is a strategy they're using to push the envelope and test how big the backlash will be.

If we’ve learned anything over the last several decades in the fight for reproductive freedom, it’s that we should watch what politicians do, not what they say. Unfortunately, their actions reveal the cruel reality they want to force on all of us. It’s clear as day that these politicians want to create a world where making decisions about your own body is a crime.


Since long before Roe was overturned, overzealous prosecutors have distorted laws to pursue an anti-abortion agenda by targeting pregnant people for making the deeply personal medical decision to end their pregnancy, disproportionately policing the bodies of people of color. Prosecutions of people who ended their pregnancy or helped someone access abortion occurred even with the protections of Roe in place. If/When/How: Lawyering for Reproductive Justice, which has worked for decades to help people facing criminal charges for accessing abortion, catalogued 61 such cases between 2000 and 2020. And in the last few weeks, authorities arrested a South Carolina woman for allegedly self-managing her abortion.

Right now, prosecutors twist unrelated laws in order to prosecute people for ending their pregnancies, because only a handful of states give them the legal authority to prosecute someone for their own abortion. But anti-abortion legislators want to make it far easier for prosecutors to send women to jail for getting the health care they need. They are pushing new laws that would make it a felony for a woman to have an abortion. Some would even allow prosecutors to seek the death penalty.

[Video: ACLU Abortion Criminal Defense Initiative]

State Rep. Ernie Yarbrough introduced such a bill in Alabama, saying, "If you look at Alabama law, you will see there is an exemption that says that abortion is not murder in our state. It's time we change that." And this is just one example of the trend we're seeing across the country. Arkansas, Georgia, Kentucky, South Carolina, and Oklahoma also have bills that would let prosecutors charge people who have had an abortion and try to put them in prison for doing so. Facing criminal charges for obtaining essential health care is horrifying, especially in states like Kentucky and South Carolina, where the bills would allow the death penalty as a potential punishment.

Ashley Lidow, director at our colleague organization WREN, put it perfectly: “Pregnancy outcomes shouldn’t be criminalized.” We agree, and if anti-abortion politicians think they can push these laws without us fighting back, they are sorely mistaken.

With your help, we will do everything we can to stop these bills and other bans on abortion from passing. At the same time, we are determined to be prepared for any prosecutions that may come, whether as a result of abortion bans, which put doctors at risk of felony prosecution, or of new laws that threaten people needing care themselves. That's why the ACLU just launched the Abortion Criminal Defense Initiative, to ensure that those facing prosecution for providing, supporting, or seeking abortion care will not have to face these unjust attacks alone. We are working in collaboration with If/When/How and the Repro Legal Hotline, which has long supported people who are investigated or prosecuted for self-managing their abortions, as well as those who help them get the care they need.

We cannot allow these extremist politicians to succeed. We must fight for the world we want to see: one where everyone has the right and the ability to control their own bodies and shape their own futures. The ACLU is working in the courts and state legislatures across the country to stop bills like these (and many more), and to build a world where everyone can get the care they need. But we need you to join us. Whether or not you live in a state where politicians are pushing laws to further criminalize abortion care, you can make a difference. Join the ACLU in fighting back against attacks on abortion access and other civil liberties across the country by signing up for information and action.
