Judge’s footnote on immigration agents using AI raises accuracy and privacy concerns

by DigestWire member
November 26, 2025
in Breaking News, World

Tucked into a two-sentence footnote in a voluminous court opinion, a federal judge recently called out immigration agents for using artificial intelligence to write use-of-force reports, raising concerns that the practice could introduce inaccuracies and further erode public confidence in how police have handled the immigration crackdown in the Chicago area and the ensuing protests.

U.S. District Judge Sara Ellis wrote the footnote in a 223-page opinion issued last week, noting that the practice of using ChatGPT to write use-of-force reports undermines the agents’ credibility and “may explain the inaccuracy of these reports.” Describing what she saw in at least one body camera video, she wrote that an agent asked ChatGPT to compile a narrative for a report after giving the program a brief sentence of description and several images.

The judge noted factual discrepancies between the official narrative of those law enforcement responses and what body camera footage showed. But experts say using AI to write a report that is supposed to reflect an officer’s specific perspective, without drawing on that officer’s actual experience, is the worst possible use of the technology and raises serious concerns about accuracy and privacy.

An officer’s needed perspective

Law enforcement agencies across the country have been grappling with how to create guardrails that allow officers to use the increasingly available AI technology while maintaining accuracy, privacy and professionalism. Experts said the example recounted in the opinion didn’t meet that challenge.

“What this guy did is the worst of all worlds. Giving it a single sentence and a few pictures — if that’s true, if that’s what happened here — that goes against every bit of advice we have out there. It’s a nightmare scenario,” said Ian Adams, an assistant professor of criminology at the University of South Carolina who serves on a task force on artificial intelligence through the Council on Criminal Justice, a nonpartisan think tank.

The Department of Homeland Security did not respond to requests for comment, and it was unclear if the agency had guidelines or policies on the use of AI by agents. The body camera footage cited in the order has not yet been released.

Adams said few departments have put policies in place, but those that have often prohibit the use of predictive AI when writing reports justifying law enforcement decisions, especially use-of-force reports. Courts have established a standard referred to as objective reasonableness when considering whether a use of force was justified, relying heavily on the perspective of the specific officer in that specific scenario.

“We need the specific articulated events of that event and the specific thoughts of that specific officer to let us know if this was a justified use of force,” Adams said. “That is the worst case scenario, other than explicitly telling it to make up facts, because you’re begging it to make up facts in this high-stakes situation.”

Private information and evidence

Beyond the risk that an AI-generated report will inaccurately characterize what happened, the use of AI also raises potential privacy concerns.

Katie Kinsey, chief of staff and tech policy counsel at the Policing Project at NYU School of Law, said that if the agent in the order was using a public version of ChatGPT, he probably didn’t understand that he lost control of the images the moment he uploaded them, allowing them to enter the public domain and potentially be used by bad actors.

Kinsey said that, from a technology standpoint, most departments are building the plane as it’s being flown when it comes to AI. Law enforcement often waits until a new technology is already in use, and in some cases until mistakes have been made, before talking about putting guidelines or policies in place, she said.

“You would rather do things the other way around, where you understand the risks and develop guardrails around the risks,” Kinsey said. “Even if they aren’t studying best practices, there’s some lower hanging fruit that could help. We can start from transparency.”

Kinsey said that while federal law enforcement considers how the technology should or should not be used, it could adopt a policy like those recently put in place in Utah and California, where police reports or communications written with AI have to be labeled.

Careful use of new tools

The photographs the officer used to generate a narrative also raised accuracy concerns for some experts.

Well-known tech companies such as Axon have begun offering AI components with their body cameras to assist in writing incident reports. The AI programs marketed to police operate on closed systems and largely limit themselves to using audio from body cameras to produce narratives, because the companies have said programs that attempt to use visuals are not yet effective enough to rely on.

“There are many different ways to describe a color, or a facial expression or any visual component. You could ask any AI expert and they would tell you prompts return very different results between different AI applications, and that gets complicated with a visual component,” said Andrew Guthrie Ferguson, a law professor at George Washington University Law School.

“There’s also a professionalism question. Are we OK with police officers using predictive analytics?” he added. “It’s about what the model thinks should have happened, but might not be what actually happened. You don’t want it to be what ends up in court, to justify your actions.”

Tags: Bangor Daily News, Breaking News, World