CBA Record January-February 2026
THE LEGAL PROFESSION AND THE JUDICIARY IN THE AGE OF ARTIFICIAL INTELLIGENCE
unintentionally influence plea negotiations, although pretrial assessments do not predict fair case outcomes. AI may not reliably distinguish between a criminalized survivor and an abuser, nor capture complexities such as self-defense, financial control, isolation, strangulation, stalking, immigration concerns, or custody issues. Additionally, because DV is underreported, AI models trained on official data may undervalue real danger. Ultimately, no algorithm can replace trauma-informed, empathetic judgment. DV courts must ensure that AI serves as a support tool, and never a substitute, for judicial discretion.

Biometric Privacy and Surveillance

The Illinois Biometric Information Privacy Act (BIPA) regulates the collection and use of biometric data such as fingerprints, facial scans, or voice recognition. BIPA requires informed consent and disclosure of data storage policies. For DV survivors, BIPA's safeguards are essential in preventing abusers from exploiting AI-driven tracking or facial recognition to locate victims. Any AI use of DV data must follow BIPA and survivor-safety principles, using biometrics only when necessary and with informed consent.

AI and Pretrial Practice

Since 2023, pretrial decisions in Illinois courts focus on risk-based factors. Illinois courts use the Public Safety Assessment (PSA) to help judges determine detention or release conditions. The PSA measures data such as age, charges, criminal history, and prior failures to appear. It also generates scores for "Failure to Appear" and "New Criminal Activity" and flags the risk of "New Violent Activity."

DV courts review PSA results alongside a Domestic Violence Screening Instrument (DVSI), which assesses DV-specific risks such as prior abuse, previous orders of protection, substance abuse, whether children were present, weapon use, and any recent separation. The DVSI calculates a standardized risk score, although judges must exercise discretion in determining
release and supervision conditions.

Pretrial conditions in DV cases often include orders of protection and GPS monitoring. AI can evaluate GPS data for risky patterns, such as repeated proximity to protected locations; predict potential violations; and issue alerts. Other benefits of AI can include its ability to distinguish accidental proximity from stalking; adjust for context (time of day, travel routes, prior conduct); and integrate GPS data with DVSI or PSA scores for adaptive supervision. Other uses of AI include analyzing police reports or other materials for signs of abuse, which can potentially improve consistency and case management.

Nonetheless, AI use is coupled with risks such as algorithmic bias, privacy breaches, and overreliance on technology in decision-making. AI-driven monitoring raises privacy, accuracy, and due process concerns. Location data can expose survivors or produce false alerts, while constant surveillance may infringe on defendants' rights. Key questions remain about who controls this data and who decides when to alert law enforcement. Used carefully, AI and GPS can strengthen pretrial enforcement and enhance victim safety, but without strong privacy protections and human oversight, these tools risk creating new harms and excessive surveillance.

AI and Firearms Considerations

Statistics surrounding firearms in DV situations are staggering. When a domestic