Unproven Vape Detection Tech Expands Surveillance in Schools, Threatening Privacy
Todd Whitney, Yung-Hsuan Wu, Greta Byrum, Eloïse Gabadou, Clarence Okoh, Marika Pfefferkorn / Oct 16, 2025

Concerns about student safety are front of mind this fall across the United States. Faced with managing an expanding pool of school-related risks from gun violence to disease outbreaks — and dealing with political cross-currents including existential threats to public education — many superintendents and administrators are turning to AI-powered technologies, even in the absence of coherent safety standards and regulations.
Meanwhile, the White House and the tech industry are touting AI as the future of the education system. Backed by venture capital and valued at $146 billion globally in 2023, with an annual growth rate of 14%, the educational technology (“edtech”) market offers plug-and-play answers not only to learning challenges but also to complex school safety and health dilemmas. Increasingly, the role of AI in education includes not only learning platforms but also rights-impacting surveillance devices, applications, and systems, leading to cameras, sensors, and other data collection and processing systems littering school hallways, grounds, classrooms, and yes, even bathrooms.
The use of AI-powered surveillance technologies for school safety and student discipline means installing high-tech spyware in hallways and classrooms — and also in school bathrooms, locker rooms, and other sensitive areas. New research reveals how schools that choose AI surveillance may be buying into systems that sacrifice students’ privacy and civil rights in exchange for technocratic “solutions” that in reality fail to deliver the outcomes promised to anxious parents and school officials.
Vape Detection Technology (VDT) systems comprise arrays of sensing devices installed in “private” spaces in schools. While VDTs do not collect visual data, manufacturers claim the devices can monitor for aggression, bullying, and gunshots, in addition to air quality and disease transmission. The systems also purport to allow school officials and law enforcement to cross-reference device alerts against hallway CCTV footage, electronic hall pass records, and other data sources to circumstantially identify students allegedly involved in vaping incidents.
For instance, HALO, a division of Motorola, sells VDTs as all-in-one solutions. Its products include not only hardware sensors (called “snitch pucks” in WIRED’s recent reporting on VDTs, which found that hackers could easily break into these systems), but also a cloud-hosted, software-as-a-service (SaaS) monitoring platform that digests data and spits out the environmental, behavioral, and security risk trend analyses school officials use to facilitate student discipline. With these purported solutions, HALO invites schools to outsource and streamline disciplinary systems via the company’s proprietary infrastructure, ingesting student data into its cloud servers to generate trend analyses based on continuous monitoring of private spaces.
A new report by our group of independent journalists, researchers, and legal experts, along with the NOTICE Coalition and the Twin Cities Innovation Alliance, investigates HALO’s marketing claims as a case study of AI-enabled edtech. Despite the company’s claims about its products’ benefits, we were unable to locate any publicly available evidence of a direct link between the use of HALO’s VDT systems and consistent, ongoing reductions in the rate of student vaping. HALO provides testimonials from its customers, but we found no replicable studies or data showing change over time in places where its systems are in use.
As with many kinds of AI-powered systems, VDTs depend on ingesting data from different sources with varying levels of reliability and sophistication, then digesting that data through black-box, machine-driven calculations that make the results appear rational, streamlined, and neat. Without clear external testing and evidence of accuracy and reliability, there is no way to tell whether these systems — which carry an outsized influence on the lives of students — generate false positives, misidentify students, or attribute characteristics or behaviors that are not grounded in empirical reality.
To conduct our analysis, we obtained one of the first independent datasets from a public school district related to the performance of VDT systems actively in use. The data we received from the Minneapolis Public School District emerged from a three-year pilot program in four local schools to test the efficacy of VDTs as a deterrent to student e-cigarette use. Our analysis shows that the HALO VDTs generated over 20,000 alerts in the 2024-2025 school year alone.
The data suggest that HALO VDTs produce a high and widely fluctuating volume of notifications, potentially creating an inflated estimate of vape use among students, as well as a false sense of certainty for administrators, while offering little actual insight. In the most extreme case, the data show that in one high school, officials received more than 600 alerts in a single day — since even a seven-hour school day contains only 420 minutes, that translates to more than one alert per minute demanding school administrators’ attention.
Moreover, the marketing literature makes what appear to be misleading claims about VDT capabilities, often using seemingly scientific language to make unrealistic promises about the ability of these devices to monitor indoor air quality, reduce disease transmission, identify the source of gunshots, and detect bullying incidents. Absent independent, research-based evidence, the many open questions about the validity and accuracy of HALO’s devices, systems, and capabilities raise serious concerns about the use of these technologies, especially without regulation or safeguards.
In fact, we found that VDTs appear to reduce students’ sense of privacy and safety, creating an environment of surveillance and suspicion and eroding trust between students and teachers. In previous research and interviews, students describe the systems as “reactive, punitive, and ineffective.” Rather than stopping vaping, this research suggests that VDTs may shift the behavior to less-monitored areas.
Earlier this year, the co-authors of this article from the NOTICE Coalition and Twin Cities Innovation Alliance (TCIA) engaged in community consultation with key stakeholders on the impact of VDTs in public schools in Minnesota. Early feedback from those conversations indicates that educators face increasing pressure to report on student activity, and are held responsible for doing so, in order to confirm or supplement data generated by VDTs. This pressure creates a cycle in which the very adults responsible for students’ social and emotional learning are also asked to report on young people in ways that send them into a web of outsourced disciplinary action, breaking trust and limiting the ability of responsible adults to exercise independent judgment.
Policy takeaways
1. Transparency and governance policies are essential, but missing.
The marketing of AI-driven solutions to school systems is outpacing administrators’ and school boards’ capacity to establish reasonable and clear policies to ensure that new technology is used safely. These devices generate large volumes of sensitive data — and yet many schools do not obtain consent from parents when the tech is deployed, nor do they publish (or, in many cases, even have) policies and plans for protecting student data or evaluating system performance.
Schools need reliable guidelines and rules to effectively manage the purchasing, application, and ongoing monitoring and evaluation of AI technologies procured for public use; and communities should be notified and made aware when experimental technologies are being deployed with tax dollars on vulnerable populations such as children. State consumer protection agencies should also consider compliance and enforcement actions to hold edtech surveillance companies accountable for potential unfair, deceptive, or abusive marketing and business practices.
2. Public health should not be a marketing gimmick.
Framing VDT surveillance hardware as a disease control or environmental health tool exploits community concerns about health and well-being without necessarily delivering medically useful results. Schools and students deserve better than misleading metrics repackaged as health protections or gun violence prevention, especially when basic monitoring systems and ventilation strategies are still lacking in many schools, despite widespread concerns raised during the COVID-19 pandemic.
The same applies to the exploitation of community concerns regarding school shootings. Based on our current analysis, VDTs do not provide meaningful services related to student safety from gun incidents, and any marketing claims to that effect are speculative.
3. VDT systems place student data at risk.
Information is scarce regarding who has access to VDT-collected data, how it is stored, and how it is used across school systems and law enforcement partners, as well as by the VDT vendor itself and its third-party cloud, hardware, and data partners. Even beyond questions about how schools may use or manage student data, we do not know how or to what extent student data is anonymized in predictive models, which are built from the ingestion of data across the vendor’s customer base. In the absence of regulation, publicly available policies, and community as well as parental consent, these systems risk becoming part of a broader trend toward ambient surveillance in schools, normalizing the idea that students must be surveilled to be safe. Moreover, officials must take more proactive steps to protect student data privacy from malicious attacks.
4. To truly address vaping, provide holistic prevention and cessation support.
Data from the Minnesota Department of Education indicates that while the rate of student vaping has increased by only 3% since school districts began adopting vape detection technology in 2016, the number of Minnesota students disciplined for tobacco use has risen by 121%. Research has consistently demonstrated that the consequences of exclusionary discipline negatively shape student learning outcomes and increase the likelihood of contact with the criminal legal system. In other communities, we have seen how the use of AI surveillance technologies in schools has led to children’s involuntary psychiatric detention, arrest, and incarceration.
Currently, the most effective and empirically proven method to stop student vaping is to invest in evidence-informed prevention and cessation interventions such as substance abuse counseling, peer support programs, and school-based mental and behavioral health services.
- - -
Based on our findings, we recommend that policymakers and school officials apply rigorous review to the claims of edtech vendors, examine contracting and data management terms closely, and work with parents proactively to co-create responsible and accountable use policies. Responsible adults involved in working with young people would do well to expand investments in practices of social innovation within systems of care rather than invent new methods of surveillance, punishment, and social control.