It’s almost impossible to go a day without encountering a video camera. After all, there are almost 50 million surveillance cameras in the United States alone. However, people usually don’t pay much attention to cameras because the average one isn’t being continuously monitored.
Now, a report by the American Civil Liberties Union has found that new technological developments may lead to a system of continuous, mass surveillance. It raises new questions not only about privacy but about the business of turning everyday surveillance into a product.
The report — “The Dawn of Robot Surveillance” — focused on the $3.2 billion industry that’s known as “video analytics.” According to the report, these systems allow computers to not only record, but to analyze footage in real time.
These machines will be able to watch people for “suspicious” behavior, and their deployment is already underway. Real-time face surveillance is currently used in both Chicago and Detroit.
Seeing the possible consequences of this technology isn’t hard for Jay Stanley, senior policy analyst with the ACLU, who said:
“It doesn’t take a big stretch of the imagination to think of this technology’s darker possibilities. Life insurance companies could monitor jogging speeds to determine which plans to offer particular individuals. Political campaigns could track and monitor attendees at rallies, assessing their facial expressions and emotion levels to tailor messages and distinguish between supporters. A corrupt politician could instruct their staff to find all instances of political enemies jaywalking.”
There are problems rooted in the algorithms themselves, too. Many video analytics products claim to read information such as gender, race, and age, the report found. These claims are concerning for several reasons.
To start, the idea that someone’s gender can be read by looking at them or recording them is rooted in transphobia. The notion that gender is something that can be physically tracked leaves trans and non-binary people at unique risk.
In addition, video systems that claim to read race are contentious, given that race is not fixed. As Camilla Hawthorne, assistant professor of sociology at the University of California, Santa Cruz, has noted, race itself is a sociotechnical system: an “arrangement of humans, technologies, spaces, and policy regimes” that encompasses the “biological, the sociological, the political, and the technical.” Hawthorne draws on a 2015 essay by Laura Forlano, director of the Digital Futures Lab, and Kat Jungnickel, a sociology lecturer at Goldsmiths, University of London.
Race itself can be thought of as a technology of oppression, one designed and given value by humans. As previously noted by AfroTech, the idea that video surveillance can “read” somebody’s race lends weight to the false notion that race is a biological, measurable trait.
“Video surveillance powered by artificial intelligence raises some of the same issues that AI and algorithms raise in many other contexts, such as a lack of transparency and due process and the potential to worsen existing racial disparities and biases,” Stanley said.
“For example, these cameras do not determine what is ‘suspicious’ by themselves. Often, Black people are regarded as suspicious for simply existing. We’ve seen this with more and more instances of white people calling the police on Black people who are doing mundane things, like barbecuing. As NPR noted, it’s a way of excluding Black people from public space.
“AI surveillance also introduces new concerns, including the possibility of widespread chilling effects and the reshaping of our behavior. We will quickly become aware that our actions are being scrutinized and evaluated on a second-by-second basis by AI watchers, with consequences that can include being flagged as suspicious, questioned by the police, or worse,” Stanley added.
In a time when technology is rapidly changing, U.S. policymakers are often far too late to the conversation. The ACLU has called on policymakers to take action by prohibiting this technology’s use for mass surveillance.
The image of people constantly being watched — by their government or by each other — forms the backbone of many dystopian tales. However, vulnerable communities in the United States have always been under surveillance. This technology threatens to exacerbate what many have already been dealing with.