CBA Record January-February 2026

abuser has access to a firearm, the risk of homicide for women increases by 500%. In Illinois, between 2019 and 2023, gun-related DV deaths increased by 63%. The U.S. Supreme Court's decision in United States v. Rahimi, 602 U.S. 680 (2024), and Karina's Law in Illinois (Pub. Act 103-1065) aim to address the deadly link between DV cases and firearms. In Rahimi, the Court upheld 18 U.S.C. § 922(g)(8), confirming that individuals who are subject to DV restraining orders and pose credible safety risks may be temporarily disarmed. Karina's Law requires respondents under orders of protection to surrender weapons and authorizes search warrants for firearm seizure within 96 hours.

AI could reinforce firearm-removal efforts by cross-referencing FOID revocations, orders of protection, and law-enforcement databases to identify noncompliance. AI can flag risk patterns, track firearm-surrender deadlines, and alert authorities to attempts to reacquire firearms, helping prioritize high-risk cases for prompt intervention. Still, due process remains paramount. Respondents must receive notice and an opportunity to be heard. Because firearm data is highly sensitive, AI platforms must ensure data security and survivor confidentiality in all enforcement actions.

AI as Evidence

AI can gather evidence, reconstruct scenes, analyze digital footprints, audit transcripts, or monitor stalkers. However, misuse through deepfakes or digital forgeries can complicate authentication, retraumatize survivors, compromise privacy, or distort truth-finding. Under Illinois Rules of Evidence 901 and 1002, AI-generated evidence must be authenticated. Attorneys may challenge AI-enhanced evidence under Rule 403 if it risks prejudice or confusion. The Digital Forgeries Act (HB 2123, 2024) provides civil remedies for survivors whose likenesses are exploited in AI-altered intimate images.

AI use can also raise significant constitutional concerns. Due process requires that defendants be able to examine and contest AI evidence. Proprietary systems with hidden error rates can violate Frye's standards of reliability, and AI raises ongoing discovery questions, including whether AI prompts, search history, or data sources qualify as work product or Brady material. The Confrontation Clause may be compromised if AI outputs cannot be cross-examined. Further, AI-enhanced surveillance may implicate Fourth Amendment protections against unreasonable searches. In DV cases, these issues intensify: AI may misinterpret trauma responses as deception, expose sensitive survivor data, or be weaponized through fabricated evidence. AI can assist justice, but it cannot replace constitutional protections, human judgment, or trauma-informed insight.

AI in Sentencing

Used properly and with human review, AI can enhance consistency across courts, protect confidentiality, and uphold due process in DV sentencing. AI can identify high-risk offenders and expand access to justice. AI can inform sentencing decisions and guide the issuance of conditions such as GPS monitoring or counseling. However, because DV cases involve trauma and human complexities, AI use in these circumstances must remain strictly advisory and subject to judicial discretion. Judges and practitioners must remember that AI models may reflect biased policing or charging patterns and can misclassify survivors acting in self-defense. Judges must interpret AI-generated results through a trauma-informed perspective, treat them as advisory, and vigilantly assess reliability.

Future Use of AI in DV Courts

AI can identify high-risk cases and expand access to justice, but careless use can amplify bias, enable synthetic evidence, and undermine due process. Moving forward, DV-aware AI use must remain advisory, trauma-informed, and supported by judicial training, authentication of evidence, independent oversight, and strict privacy protections. As AI use expands, Illinois may consider requiring disclosure of AI tools and certification of accuracy (with protective orders and judicial review to balance trade secrets with the right to challenge AI evidence, if necessary). Courts may need stronger authentication and chain-of-custody rules for AI-generated evidence. Ongoing education is essential to ensure that AI serves fairness and safety in Illinois courts. In DV courtrooms: Let AI assist justice—but never replace it.

Judge Meghan Goldish sits in the Domestic Violence Division of the Circuit Court of Cook County and is Co-Chair of the Executive Committee for AI 2035: The Legal Profession and the Judiciary in the Age of Artificial Intelligence.

