AI Mistakes Chips for Gun in High School

Oct 27, 2025 | AI News

An artificial intelligence system apparently mistook a high school student’s bag of Doritos for a firearm and alerted local police to the perceived threat. Taki Allen was relaxing with friends outside Kenwood High School in Baltimore County, Maryland, on Monday evening when police officers approached him with guns drawn and ordered him to get on the ground.

“At first, I didn’t know where they were going until they started walking toward me with guns, talking about, ‘Get on the ground,’ and I was like, ‘What?’” Allen said in an interview with WBAL-TV 11 News. Following the officers’ commands, Allen knelt and was handcuffed and searched; officers later showed him the image that had triggered the alert. “I was just holding a Doritos bag – it was two hands and one finger out, and they said it looked like a gun,” he recounted.

The stop stemmed from a gun detection system deployed at Baltimore County high schools, which uses the schools’ existing cameras and AI technology to identify potential weapons. When the system flags something it deems suspicious, it notifies school administrators and law enforcement.
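Neither the district nor the vendor has described how the system works internally. Purely as an illustration, the sketch below shows, in Python, the kind of camera-to-alert flow the district’s description implies: a detector scores each frame, a threshold decides what counts as suspicious, and notifications go to administrators and police. Every name, the threshold, and the stubbed detector here are assumptions for illustration, not the actual product.

```python
from dataclasses import dataclass


# Hypothetical detection result; the real vendor's data model is not public.
@dataclass
class Detection:
    label: str
    confidence: float


def detect_objects(frame) -> list[Detection]:
    """Stand-in for the vendor's weapon-detection model.

    A real system would run a neural object detector on each camera frame;
    this stub returns a canned result purely to illustrate the alert flow.
    """
    return [Detection(label="handgun", confidence=0.62)]


ALERT_THRESHOLD = 0.60  # assumed value; the actual threshold is unknown


def notify(recipient: str, det: Detection) -> None:
    # Placeholder for an email/SMS/dispatch integration.
    print(f"ALERT to {recipient}: possible {det.label} "
          f"(confidence {det.confidence:.0%}) - human review recommended")


def review_and_escalate(frame) -> None:
    """Illustrative pipeline: detect, apply a threshold, then notify humans.

    The incident suggests alerts can reach police without a verification
    step that would catch an obvious false positive such as a snack bag.
    """
    for det in detect_objects(frame):
        if det.label == "handgun" and det.confidence >= ALERT_THRESHOLD:
            notify("school_administration", det)
            notify("local_police", det)


if __name__ == "__main__":
    review_and_escalate(frame=None)  # no real camera frame in this sketch
```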

In a letter to school families obtained by WBAL-TV 11 News, school officials acknowledged the distress the event caused both the student involved and those who witnessed it. The letter read, “We understand how upsetting this was for the individual that was searched as well as the other students who witnessed the incident. Our counselors will provide direct support to the students who were involved in this incident and are also available to speak with any student who may need support.”

According to Baltimore County police, officers from Precinct 11-Essex responded to a report of a suspicious person with a weapon. After arriving and searching the student, they confirmed he was not armed.

Lamont Davis, Allen’s grandfather, told the station, “Nobody wants this to happen to their child. No one wants this to happen.” The incident raises questions about the accuracy of AI detection systems in safety-critical settings and about how much human review occurs before an alert is escalated to armed police. Without proper calibration and oversight, a misclassification can have serious consequences, as this encounter shows.

This article was amended on 24 October 2025 to clarify earlier reporting regarding police jurisdiction.