AI in Schools: Overinvestment in Technology and Underinvestment in Students

By: Suzy Champlin, Development and Communications Intern

As AI has proliferated, some school districts are turning to AI-powered “weapons detection” systems in an attempt to curb violence at schools. Because these systems are highly visible, they offer the illusion of increased safety, but research shows a lack of proven results. These technologies, which resemble ordinary metal detectors, are advertised and sold as being able to distinguish between ordinary school items, such as Chromebooks and binders, and weapons, such as guns and knives. In practice, the software has proven unreliable: multiple school districts have removed it after it failed to identify weapons and generated high levels of “false positives.” These failures contribute to the proliferation of “security theater” approaches and to educational environments that can feel unwelcoming and unsafe to students. Factoring in the significant costs of implementing and maintaining this technology, some districts have experienced “buyer’s remorse” over their investment.

Evolv, the company behind many of these new AI-powered “weapons detection” systems, has marketed the technology as a way to increase school safety by preventing weapons from entering schools. Despite this promise, the systems have proven to be grossly ineffective. Some research has shown that they have a 60% false alarm rate¹. Everyday school items, such as Chromebooks, umbrellas, binders, water bottles, and lunch boxes, are flagged by these systems as weapons. This ineffectiveness poses several issues for schools. First, high false alarm rates create alarm fatigue: staff monitoring the detectors are inundated with false alarms and become less responsive to them. As a result, real alerts can be missed, making it more likely that weapons make their way into schools². Second, while these systems can sometimes detect weapons like guns and bombs, they are most likely to let knives pass undetected. Given that knives are the most common weapon found in schools, the systems create the illusion of safety while, in reality, not making schools any safer. This concept is known as security theater³. Third, devoting school time and attention to ineffective “weapons detection” systems distracts from other safety measures that should be taken. Students may bypass these systems altogether by entering the school through side doors; it is word of mouth and person-to-person interaction, not weapons detection, that prevents violence in these situations¹. In short, ineffective “weapons detection” systems do not keep weapons out of schools, and they strain the time and attention of school staff.

Despite their ineffectiveness, these systems come at a steep cost to counties and school districts. Jefferson County, Kentucky, spent $12 million to put these detectors in schools. Similarly, Utica, New York, spent $3.7 million on “weapons detection” systems, only to remove them after they failed to detect a student’s knife that was subsequently used to attack another student¹. Given the high price tag of these systems, one would expect data showing an improvement in overall school safety. This is not the case. “Weapons detection” systems are costly, ineffective, and therefore a gross misallocation of financial resources.

AI systems also exacerbate existing ethnic and racial disparities in the education system. It is a grave error to believe that AI systems and algorithms are race-neutral. Rather, these systems have racial and ethnic biases baked into them and can perpetuate disparities that harm students of color⁴. AI is increasingly being used in schools for disciplinary purposes rather than to support student learning. For example, AI technologies implemented to identify disciplinary infractions are far more likely to target Black students than their white peers because of biases in the algorithms⁴. Educators and policymakers must recognize the egregious shortcomings of AI systems in schools and seek human-centered solutions rather than quick technological fixes.

These systems also contribute to an atmosphere of surveillance and criminalization in schools. Research has shown that when security scanners are introduced in schools, their presence fosters feelings of fear and of being surveilled among students, particularly marginalized students¹. Marginalized students, especially Black and Latino students, already face a heightened risk of criminalization at school. Weapons detection systems increase that risk, because these students are more likely to be scrutinized when an everyday school item is wrongly flagged as a weapon. Increased scrutiny causes these students to miss class time, contributing to learning loss and further widening opportunity gaps between students of color and their white peers. This scrutiny also invites criminalization, making students more susceptible to falling into the school-to-prison pipeline. Students of color are more likely to encounter this kind of scrutiny in the first place because they are more likely to attend schools that use metal detectors and “weapons detection” systems¹. “Weapons detection” systems are thus not only ineffective but actively exacerbate harm done to students of color.

Using “weapons detection” systems in schools is a misallocation of financial and staffing resources. Fostering safer school environments requires investing in and supporting students, not aggressively surveilling and monitoring their every move. The money spent acquiring and implementing these detectors could be better used to improve school programs, invest in at-risk students, and train school staff to effectively recognize and defuse threats to school safety. AI systems are presented as a quick fix for violence in schools but are ultimately ineffective, and implementing them obscures proven human-based solutions, such as investing in school districts and supporting at-risk youth. The desire to circumvent proven solutions to school violence with quick-fix AI technology showcases the unwillingness of policymakers to devote the necessary time and resources to improving school safety. “Weapons detection” systems provide the illusion of safety while draining school district resources. Tangible school safety must be built through other avenues: by investing in students and communities.

  1. “Evolving Backward: The Company Behind JCPS’ New Weapon Detectors Accused of Unreliability,” Manual RedEye.

  2. “The Latest School ‘Weapons Detection’ Tech Can Miss Serious Threats, Experts Say,” The 74.

  3. “Metal Detectors: Security Theater, Not Safer Schools,” CfJJ.

  4. “AI Technology Threatens Educational Equity for Marginalized Students,” The Progressive.