Parkland school turns to experimental surveillance software that can flag students as threats
By Drew Harwell
Kimberly Krawczyk says she would do anything to keep her students safe. A year ago Thursday, the Parkland, Fla., high school math teacher barricaded students behind her classroom door during one of the deadliest mass shootings in U.S. history.
But one of the unconventional responses that Broward County Public Schools said could stop another tragedy has left her deeply unnerved: an experimental artificial-intelligence system that would surveil her students more closely than ever before.
The school system in South Florida, one of the largest in the country, said last month it would install a camera-software combination called Avigilon that would allow security officials to track students based on their appearance. With one click, a guard could pull up video of everywhere else a student has been recorded on campus.
The 145-camera system, which administrators said will be installed around the perimeters of the schools deemed “at highest risk,” will also automatically alert a school-monitoring officer when it senses events “that seem out of the ordinary” and people “in places they are not supposed to be.”
The supercharged surveillance network has raised major questions for some students, parents and teachers, such as Krawczyk, who voiced concerns about its accuracy, invasiveness and effectiveness. Her biggest doubt: that the technology could ever understand a school campus like a human can.
“How is this computer going to make a decision on what’s the right and wrong thing in a school with over 3,000 kids?” said Krawczyk, a 15-year teacher who was on the third floor of what’s known as the freshman building at Marjory Stoneman Douglas High School when the shooting began. “We have cameras now every two feet, but you can’t get a machine to do everything a human can do. You can’t automate the school. What are we turning these schools into?”
The specter of student violence is pushing school leaders across the country to turn their campuses into surveillance testing grounds in the hope that it will help them detect dangerous people they might otherwise miss. The supporters and designers of Avigilon, the AI service bought for $1 billion last year by tech giant Motorola Solutions, say its security algorithms could spot risky behavior with superhuman speed and precision, potentially preventing another attack.
But the advanced monitoring technologies ensure that the daily lives of American schoolchildren are subjected to close scrutiny from systems that will automatically flag certain students as suspicious, potentially spurring a response from security or police forces, based on the work of algorithms that are hidden from public view.
The camera software has no proven track record for preventing school violence, some technology and civil liberties experts argue. And the testing of their algorithms for bias and accuracy — how confident the systems are in identifying possible threats — has largely been conducted by the companies themselves.
If the Avigilon contract wins final approval from county leaders in the coming weeks, the school district will spend more than $600,000 in federal and local funds activating the AI-powered system around the high school campuses “with the highest security incidents,” contracting records show. The camera system will run independently alongside another 10,000 cameras already recording across the county’s schools.
Many aspects of the program, however, remain a mystery, and it’s unclear how exactly the surveillance system’s data and performance will be regulated, measured or tested for potential flaws. The school district rejected a Washington Post request to see records relating to the project, including officials’ communications with the company, citing a broad Florida statute exempting any information related to surveillance systems from public-records law.
Avigilon’s technology is no panacea. Its “appearance search,” a two-year-old feature that would allow a school official to find or highlight people based on what they’re wearing, has an accuracy rate that varies widely depending on factors like lighting and time of year, said Mahesh Saptharishi, the chief technology officer at Motorola Solutions. The system would be less accurate, for instance, in wintertime, when students are going to school in heavy coats.
Its “unusual motion detection” feature is advertised by the company as a way to automatically sense when students are running toward a brawl or away from an attack. But some students wondered just how much the computer could comprehend about the chaos of a typical high school, where frenzied movements and sudden gatherings are an everyday event. One teacher asked whether the system would know the difference between a boyfriend and girlfriend kissing each other and two people about to start a fight.
Saptharishi said the technology is a tool for security staff, not the final decision-maker itself, and that its performance in Broward schools and other early-adopting districts could help further refine the results.
“Today, I don’t know of any quantitative results that clearly show these tools are bar-none effective or bar-none ineffective,” Saptharishi said. But he said the company has researched the systems closely and continues to train them, including with data taken from some participating schools. “We believe they have a net positive human impact,” he said.
No school-security measure has grown more than the use of surveillance cameras, according to survey data from the National Center for Education Statistics, expanding from nearly 20 percent of all public schools in 1999, the year of the Columbine High School shooting in Colorado, to more than 80 percent in 2015.
But it’s unclear what effect the cameras have had on mass violence. The number of school shootings every year has remained flat or grown slightly over that period; there were 25 shootings last year, in what was the worst year for mass school violence in at least two decades, a Post analysis found.
Avigilon’s technology does not use facial-recognition software, which can directly match a person’s identity to images in a database. Schools and community centers across the country are installing that kind of software in hopes of flagging or blocking entry by unauthorized visitors.