New algorithm could help child protection agencies
Jan 23, 2018
A USC professor incorporated data about a family's history with the child welfare and criminal justice systems to create a risk score for children.
Los Angeles County social workers display a banner calling for children's safety in Los Angeles on October 28, 2013, where members of United Voices for Children and Families, a coalition of labor and child welfare advocates, testified before the county's Blue Ribbon Commission on Child Protection.
(FREDERIC J. BROWN/AFP/Getty Images)


For workers at child protection agencies, deciding whether or not to check on a family that's been reported for possible abuse is complicated.

Emily Putnam-Hornstein, an associate professor at the University of Southern California's school of social work, is one of the creators of a new tool to help in these situations. It's called the Allegheny Family Screening Tool.

The tool is an algorithm made for the office of Children, Youth and Families (CYF) in Allegheny County, Pennsylvania.

The algorithm uses data from the county, like a family's history with the child welfare or criminal justice systems, to create a risk score for a child. This score provides an additional piece of information to help social workers decide which families might need support services or intervention.
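The article does not publish the model behind the Allegheny Family Screening Tool, but the general idea of turning a family's administrative history into a single risk score can be sketched roughly as follows. This is a minimal, hypothetical illustration in Python: the feature names, the toy training data, the logistic-regression model, and the 1-to-20 score scale are all assumptions for the sketch, not details taken from the county's actual tool.

```python
# Illustrative sketch only -- not the actual Allegheny Family Screening Tool.
# Feature names, training data, model, and score scale are hypothetical.
from dataclasses import dataclass

import numpy as np
from sklearn.linear_model import LogisticRegression


@dataclass
class ReferralRecord:
    """Hypothetical county-data features attached to a screening call."""
    prior_cyf_referrals: int      # past child-welfare referrals for the family
    prior_substantiations: int    # past substantiated findings
    parent_criminal_history: int  # 1 if any criminal-justice involvement, else 0
    child_age_years: float

    def as_row(self) -> list:
        return [
            self.prior_cyf_referrals,
            self.prior_substantiations,
            self.parent_criminal_history,
            self.child_age_years,
        ]


# Toy historical data: rows are past referrals, labels mark whether the
# case was later re-referred (a stand-in outcome for this sketch).
X_train = np.array([
    [0, 0, 0, 10.0],
    [3, 1, 1, 2.0],
    [1, 0, 0, 7.0],
    [5, 2, 1, 1.0],
])
y_train = np.array([0, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)


def risk_score(record: ReferralRecord) -> int:
    """Map the model's probability onto a 1-20 scale (the scale is an assumption)."""
    p = model.predict_proba([record.as_row()])[0, 1]
    return max(1, int(np.ceil(p * 20)))


# A call screener would see a score like this alongside the details of the call itself.
print(risk_score(ReferralRecord(2, 1, 1, 3.0)))
```

The point of the sketch is the workflow, not the model: historical records the screener cannot easily review by hand are summarized into one number that sits next to the report they are hearing.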



"The algorithm in no way should replace that clinical intuition and the learned experience of workers," Putnam-Hornstein.

The tool comes into play when Children, Youth and Families receives calls reporting children in potentially dangerous homes, Putnam-Hornstein explained. She pointed out that in those moments, call screeners have to respond to the specific report they are hearing, and it is hard for them to also look back at historical data.

So far, the effectiveness of the tool, which was implemented in August 2017, is still being studied, but early results are positive, Putnam-Hornstein said.



"It is reducing some of the unwarranted variation on the part of workers. What we don't want is a decision around screening in or screening out [a family] for an investigation to be subject to the randomness of which worker or which supervisor that is assigned to, and we're starting to see there is more consistency in the response."

Other child welfare agencies have used algorithms for similar purposes, but Putnam-Hornstein said this one is different because of the transparency involved in its creation, due in part to the actions of Allegheny County.



"The county assumed a tremendous leadership role in making sure that the algorithm belonged to them. They've commissioned an ethical review [and] an independent evaluation."

Similar tools have also been criticized for racial bias, and Putnam-Hornstein said this was something they were careful to monitor in the Allegheny results.
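The article does not say how that monitoring is done. One common way to check a score for disparate impact is to compare screen-in rates across groups at whatever score cutoff is used in practice; the sketch below is a generic illustration of that kind of check, not the county's evaluation method, and the data and threshold are made up.

```python
# Generic disparity check -- an illustration only, not the county's method.
import pandas as pd

# Hypothetical scored referrals: a risk score (1-20) and a group label.
referrals = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B", "B"],
    "score": [4, 15, 9, 17, 6, 18, 12],
})

SCREEN_IN_THRESHOLD = 14  # made-up cutoff for illustration

# Share of each group's referrals that would be screened in at this cutoff.
screen_in_rates = (
    referrals.assign(screened_in=referrals["score"] >= SCREEN_IN_THRESHOLD)
    .groupby("group")["screened_in"]
    .mean()
)
print(screen_in_rates)

# A large ratio between groups' screen-in rates is one signal worth reviewing.
print("rate ratio:", screen_in_rates.max() / screen_in_rates.min())
```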

Right now, the tool is used only in Allegheny County and was tailored to that county's specific data, but Putnam-Hornstein said research is underway to adapt it to child welfare offices in other areas.