Community News

Under Surveillance. A story of our times

BY SIMONE J. SMITH

“The rollout and normalization of AI surveillance in these public spaces, without much consultation and conversation, is quite a concerning step.” (Jake Hurfurt, The Head of Research and Investigations at Big Brother Watch)

Imagine living every moment of your life under the unblinking gaze of a camera. Every smile, every frown, every tear – all analyzed, categorized, and stored. You might start to wonder: What does this camera see? What does it know about me? Does it know when I’m stressed? When I’m happy? When I’m scared? Every private moment, every intimate thought, reduced to data points for analysis. The ethical considerations are numerous.

Who owns this data? How is it used? And what about our right to privacy? Are we ever truly alone?

We live in a world where surveillance is ubiquitous. It’s in our homes, our workplaces, and our public spaces, and over the past two years, Network Rail has been trialing the use of AI-enabled “smart” CCTV cameras across eight major train stations in the United Kingdom. I came across a new set of documents, obtained by Big Brother Watch and first reported by Wired, that detail the full scope of the trial.

This was first reported in February by James O’Malley. What is being reported is that this technology was broadly used to increase safety and security at train stations across the country. The software applied to cameras throughout each test station was designed to recognize objects and activities: someone falling onto the tracks, entering a restricted area, or getting into a fight with another passenger. At that point, station employees would receive an instant notification so they could step in. The idea was to leverage AI to run a safer, more efficient train station.

In one document, which outlines some of the use cases explored by the trial, Network Rail said that it expects no impact on individuals’ right to privacy: “The station is a public place where there is no expectation of privacy. Cameras will not be placed in sensitive areas.”

Another document details feedback on the trial. In one section, the document explores a “passenger demographics” use case, which produced a statistical analysis of age range and male/female demographics. It also analyzed passengers’ emotions (e.g., happy, sad, and angry).

According to the document, there are two possible benefits to this:

  • It can be used to measure customer satisfaction.
  • “This data could be utilized to maximize advertising and retail revenue.”

This use case, according to the document, will “probably” be used sparingly and “is viewed with more caution.”

Big Brother Watch reached out to Network Rail, which did not respond to its requests for comment. Network Rail did tell Wired in a statement that it takes security “extremely seriously” and works closely with the police “to ensure that we’re taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.”

The report did not share whether these systems are currently in use, or at what scale.

The ethical considerations of using this kind of tech are numerous. Though the U.K. is no longer part of the European Union, the EU’s recently enacted AI Act bans emotion recognition in the workplace and in schools. Several groups, including Access Now and European Digital Rights (EDRi), pushed for an all-out ban on emotion recognition, saying that it is based on “pseudoscience” and is incredibly invasive. EDRi said that emotion recognition endangers “our rights to privacy, freedom of expression and the right against self-incrimination,” adding that this is of particular concern for neurodiverse people.

The balance between privacy and security is delicate. Too much surveillance, and we lose our sense of freedom. Too little, and we risk our safety. Is this the price we have to pay? It’s a question we all must consider. In a world where cameras are always watching, how do we protect our humanity?

Only time will tell.
