Utah police trust AI that identified a human as a frog
The future of policing apparently includes enchanted forests.
Buried in a local Fox News story about police using AI to keep streets safer is the part that actually matters: the software once decided a person had turned into a frog. Not metaphorically. Not as a joke. The system flagged a human being as an amphibian, logged it, and moved on, because that is what happens when you give pattern recognition software authority without insisting it understand reality.
“The body cam software and the AI report writing software picked up on the movie that was playing in the background, which happened to be ‘The Princess and the Frog,’” Sgt. Keel told FOX 13 News. “That’s when we learned the importance of correcting these AI-generated reports.”
Earlier this month, the department began testing two pieces of AI software, Draft One and Code Four. Code Four, which launched earlier this year, was created by George Cheng and Dylan Nguyen, two 19-year-old MIT dropouts. The software generates police reports from body camera footage, with the goal of reducing paperwork and keeping officers in the field longer.
According to FOX 13, Utah law enforcement agencies are deploying AI tools that analyze body-cam video to identify people, vehicles, and behavior. In one example intended to demonstrate how advanced the technology is, the system confidently classified a person as a frog based on the content playing in the background. This happened because the AI does not know what a person is. It knows pixels, shapes, and probability buckets, and when those fail, it does not stop. It guesses. In this case, it guessed an amphibian.
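The guessing behavior described above can be sketched as a classifier that always commits to its top label, even when its probabilities are nearly flat. This is a hypothetical illustration, not the actual Draft One or Code Four code; the labels and scores are invented:

```python
# Minimal sketch of a classifier head with no "abstain" option:
# it always returns its highest-scoring label, however weak the
# evidence. All labels and logits here are hypothetical.
import math

LABELS = ["person", "vehicle", "frog"]

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the top label and its probability -- never 'uncertain'."""
    probs = softmax(logits)
    top = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[top], probs[top]

# An ambiguous frame: the scores are nearly flat, yet the system
# still commits to an answer instead of flagging its uncertainty.
label, confidence = classify([0.9, 0.8, 1.0])
print(label, round(confidence, 2))  # "frog" wins with ~37% confidence
```

A system built this way will happily log "frog" at barely better than chance, because nothing in the pipeline requires a minimum confidence before committing an answer to a report.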
If your algorithm cannot reliably tell a human from a frog, maybe it should not be helping decide who gets stopped, searched, or questioned.
The software is not just helping decide who gets stopped; it is writing the after-the-fact justifications that a court will accept as testimony in any subsequent legal action. If the AI says that wallet is a gun, too bad.