AI tool used to spot child abuse allegedly targets parents with disabilities
Policy
Jan 2023
Since 2016, social workers in a Pennsylvania county have relied on an algorithm to help them determine which child welfare calls warrant further investigation. Now, the Justice Department is reportedly scrutinizing the controversial family-screening tool over concerns that the algorithm may violate the Americans with Disabilities Act by allegedly discriminating against families with disabilities, including families with mental health issues, the Associated Press reported.
The county describes its predictive risk modeling tool as a preferred resource for reducing human error, with social workers benefiting from the algorithm's rapid analysis of "hundreds of data elements for each person involved in an allegation of child maltreatment." Those elements include "data points tied to disabilities in children, parents, and other members of local households," Allegheny County told AP. The data points contribute to an overall risk score that helps determine whether a child should be removed from their home.
Although the county told AP that social workers can override the tool's recommendations and that the algorithm has been updated "several times" to remove disabilities-related data points, critics worry that the screening tool may still be automating discrimination. This is particularly concerning because the Pennsylvania algorithm has inspired similar tools used in California and Colorado, AP reported. Oregon stopped using its family-screening tool over similar concerns that its algorithm may be exacerbating racial biases in its child welfare data.
The Justice Department has not yet commented on its reported interest in the tool, but AP reported that the department's scrutiny could turn a moral argument against using child welfare algorithms into a legal one.
A University of Minnesota expert on child welfare and disabilities, Traci LaLiberte, told AP that it's unusual for the Justice Department to get involved with child welfare issues. "It really has to rise to the level of pretty significant concern to dedicate time and get involved," LaLiberte told AP.
Ars could not immediately reach developers of the algorithm or the Allegheny County Department of Human Services for comment, but a county spokesperson, Mark Bertolet, told AP that the agency was unaware of the Justice Department's interest in its screening tool.
Problems with predicting child maltreatment
On its website, Allegheny County said that the family-screening tool was developed in 2016 to "enhance our child welfare call screening decision making process with the singular goal of improving child safety." That year, the county reported that prior to using the algorithm, human error had led child protective services to investigate 48 percent of the lowest-risk cases while overlooking 27 percent of the highest-risk cases. A 2016 external ethical analysis endorsed the county's use of the algorithm as an "inevitably imperfect" but comparatively more accurate and transparent method for assessing risk than relying on clinical judgment alone.
"We reasoned that by using technology to gather and weigh all available pertinent information we could improve the basis for these critical decisions and reduce variability in staff decision-making," the county said on its website, promising to continue to refine the model as more analysis of the tool was conducted.
Although the county told AP that risk scores alone never trigger investigations, the county website still says that "when the score is at the highest levels, meeting the threshold for 'mandatory screen in,' the allegations in a call must be investigated." Because disability-related data points contribute to that score, critics suggest that families with disabilities are more likely to be targeted for investigations.
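To illustrate the mechanism critics are describing, here is a minimal, purely hypothetical sketch of a threshold-based screening score. The feature names, weights, and cutoff below are invented for illustration and are not drawn from Allegheny County's actual model; the point is only that when disability-related data points carry weight, otherwise-similar cases involving disabilities can cross a mandatory screen-in threshold.

```python
# Hypothetical sketch of a threshold-based screening score. Feature names,
# weights, and the threshold are invented for illustration and do not
# reflect Allegheny County's actual model.

MANDATORY_SCREEN_IN_THRESHOLD = 18  # assumed cutoff on an arbitrary scale

# Invented weights: each data point nudges the overall risk score upward.
FEATURE_WEIGHTS = {
    "prior_referrals": 3.0,
    "parent_mental_health_record": 2.5,  # disability-related data point
    "child_disability_record": 2.0,      # disability-related data point
    "household_size": 0.5,
}

def risk_score(case: dict[str, float]) -> float:
    """Weighted sum of whatever data points are present for a case."""
    return sum(FEATURE_WEIGHTS[k] * v for k, v in case.items() if k in FEATURE_WEIGHTS)

def must_screen_in(case: dict[str, float]) -> bool:
    """A score at or above the threshold forces an investigation."""
    return risk_score(case) >= MANDATORY_SCREEN_IN_THRESHOLD

# Two otherwise-identical cases: the one carrying disability-related
# records crosses the mandatory threshold while the other does not.
base = {"prior_referrals": 4.0, "household_size": 4.0}
with_disability = {**base, "parent_mental_health_record": 2.0, "child_disability_record": 1.0}

print(risk_score(base), must_screen_in(base))                        # 14.0 False
print(risk_score(with_disability), must_screen_in(with_disability))  # 21.0 True
```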
The same year that the family-screening tool was introduced, the Christopher & Dana Reeve Foundation and the National Council on Disability released a toolkit to help parents with disabilities know their rights when fighting in the courts over child welfare concerns.
"For many of the 4.1 million parents with disabilities in the United States, courts have decided they aren't good parents just because they have disabilities," the organization wrote in the toolkit's introduction. "In fact, as of 2016, 35 states still said that if you had a disability, you could lose your right to be a parent, even if you didn't hurt or ignore your child."
Allegheny County told AP that "it should come as no surprise that parents with disabilities... may also have a need for additional supports and services." Neither the county's ethical analysis nor its FAQ directly addresses how the tool could disadvantage these families, though.
Ars could not reach LaLiberte for additional comment, but she told AP that her research has shown that parents with disabilities are already disproportionately targeted by the child welfare system. She suggested that incorporating disability-related data points into the algorithm is inappropriate because it directs social workers to consider "characteristics people can't change" rather than exclusively assessing problematic behavior.