The ACLU is sounding the alarm over the nation’s police forces’ increasing reliance on artificial intelligence.
In a six-page white paper released Dec. 10, the nation’s largest civil rights group warns that police adoption of popular generative AI tools, such as chatbots and report-drafting software, is a technological overreach that threatens American civil liberties.
The organization specifically singles out Draft One, a controversial generative AI tool built on OpenAI’s GPT-4 model that helps police officers draft reports from body camera audio. Several police departments around the nation have gradually tested AI tools, including Draft One, over the last year, and that number is rising as cities across the U.S. pitch generative AI features as a solution to budget and staffing constraints.
Experts have criticized the adoption of this technology, citing the essential role police reports play in judicial decision-making, from investigation and discovery to sentencing.
“The forcing function of writing out a justification, and then swearing to its truth, and publicizing that record to other legal professionals (prosecutors/judges) is a check on police power,” wrote law expert Andrew Guthrie Ferguson in one of the first law review articles to examine AI-drafted police reports. The ACLU cites his analysis in its white paper.
The ACLU outlines four major areas of concern in its white paper, including the loss of the accountability function of human-written reports that Ferguson describes. The organization emphasizes the uncertainty created by the biases and hallucinations inherent in the technology, questions how transparent these processes are to the public at large, and raises data privacy concerns. According to the organization, relying on a generative AI tool’s interpretation rather than an officer’s own memory and subjective observations undermines a fair judicial process.
If AI is the next frontier for police officers, the ACLU urges, it should be used only after human memory has been recorded. AI tools could transcribe audio-recorded verbal narratives created by the officers involved, the organization explains, which could then be submitted alongside those recordings for review.
But, contrary to the recommendations of civil rights groups, leading AI companies are steadily investing in police and military applications of their technology.
“Police reports play a crucial role in our justice system. They are central to the criminal proceedings that determine people’s innocence, guilt, and punishment, and are often the only official account of what took place during a particular incident,” the ACLU writes. “AI report-writing technology removes important human elements from police procedures and is too new, too untested, too unreliable, too opaque, and too biased to be inserted into our criminal justice system.”