California uses algorithms to predict whether incarcerated people will commit crimes again. It has used predictive technology to deny 600,000 people unemployment benefits. Nonetheless, state administrators have concluded that not a single agency uses high-risk forms of automated decisionmaking technology.
That’s according to a report the California Department of Technology provided to CalMatters after surveying nearly 200 state entities. Under legislation signed into law in 2023, the agencies are required to report annually whether they use high-risk automated systems that can make decisions about people’s lives. “High-risk” means any system that can assist or replace human decisionmakers in determining people’s encounters with the criminal justice system, or their access to housing, education, employment, credit and health care.