There’s a quotation by American historian Timothy Snyder currently circulating on social media. It comes from the opening of his pamphlet, On Tyranny, and it reads: “Do not obey in advance. Most of the power of authoritarianism is freely given. In times like these, individuals think ahead about what a more repressive government will want, and then offer themselves without being asked. A citizen who adapts in this way is teaching power what it can do.”
If you’ve walked down the street today, chances are you’ve been caught on camera. There are an estimated 7.5m CCTV cameras in the UK, equivalent to one for every eleven people. And if you’re reading this in London, you’re living in one of the most surveilled places in the world, up there with cities in China, Russia, India and Singapore. So why, in a country that prides itself on its police being a civilian force and not a military one, are we being constantly watched?
Mass surveillance is nothing new: from historical headcounts in the form of a census to Julius Caesar’s networks of spies, it would be a foolish government that didn’t keep an eye on society. But from the mid-twentieth century onwards, digital technology meant the tools could reach further and gather more. Nowadays, most of what we do is subject to scrutiny, be it the paths we walk in public or the words we type on screens. Sometimes it’s the government watching; sometimes it’s the vast multinational corporations funnelling our words through their algorithms and serving adverts back to us. If we don’t notice it happening, we might not really care. But maybe we should.
We go through our lives assuming that surveillance is to catch the bad people and that we, of course, are not the bad people. But, as whistleblower Edward Snowden remarks, “this is no different than saying you don’t care about free speech because you have nothing to say”. Wanting privacy is not the same as hiding something. But if you think you have nothing to hide, where is the line drawn? Unlimited access to your phone? Camera in the bedroom? And really, how much can you trust the government – current, recent or future – to have your best interests at heart?
Surveillance in the UK did not begin as a product of artificial intelligence, but it’s certainly aided by it. Few of the CCTV cameras on the streets are joined up in any meaningful way. If you’re mugged or your bike is stolen it would be handy to have footage, but anyone who has been in that position soon finds out that there’s no single centralised system. What AI is good at is aggregating data: pulling it all together and sifting through it, making connections and finding patterns. As AI capabilities grow, those fragmented traces of our individual lives can be stitched together into ever more complete profiles.
Surveillance is spun as safety. That’s the line police forces are using as they roll out live facial recognition technology – algorithms that scan your face, extract its key features into numerical form, and compare this mathematical version of you to a database of suspects. Sounds convenient; shame it’s fundamentally flawed. It didn’t take long to find that human bias is amplified by the machine. From the early days of photography onwards, cameras have been optimised for white skin. If you aren’t white, you have a higher chance of being misidentified by facial recognition technology. If you’re also female and young, the chances of misidentification increase further. If there’s a case to be made in the interests of safety, the bar is currently low. Right now, when police forces want to use live facial recognition, they have to post notices warning that it’s in use in the area. They aren’t catching criminal masterminds this way, but they have been stopping, without just cause, innocent people who prefer to cover their faces as they pass through.
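The comparison step those systems perform can be sketched in a few lines: reduce each face to a list of numbers (a “faceprint”), then measure how close a passerby’s numbers are to each entry on a watchlist. This is only an illustration – the names, four-number faceprints and threshold below are invented; real systems use neural-network embeddings with hundreds of values.

```python
import math

# Invented "faceprints" for illustration; real systems derive much longer
# vectors from a neural network, not four hand-picked numbers.
suspect_db = {
    "suspect_A": [0.9, 0.1, 0.4, 0.7],
    "suspect_B": [0.2, 0.8, 0.5, 0.1],
}

def cosine_similarity(a, b):
    # 1.0 means the two faceprints point in exactly the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match(faceprint, database, threshold=0.95):
    # Return the best-scoring watchlist entry above the threshold, or None.
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = cosine_similarity(faceprint, stored)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

passerby = [0.88, 0.12, 0.41, 0.69]  # numerically close to suspect_A
print(match(passerby, suspect_db))   # flags the passerby as suspect_A
```

The threshold is where the trouble described above creeps in: set it low and innocent passersby get flagged; set it high and the system misses matches – and because the underlying faceprints are less reliable for some faces than others, no single threshold treats everyone equally.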
Meanwhile, over on social media, the price of freedom is eternal vigilantes – particularly those posting video doorbell footage on neighbourhood group chats, flagging anyone who walks past their house, often on suspicion of being foreign in a built-up area. It’s a low-effort way to become part of the surveillance system. All this doorbell footage is owned by the company that makes the devices – in the case of Ring, one of the most popular systems, that’s Amazon. You, the doorbell owner, are spied on in turn: Amazon now holds your info – your name, address, bank details, Wi-Fi connection and the timestamps of you opening your front door.
Every social media platform we use, every website we visit, every loyalty card we scan, a profile is built of us, bit by binary bit. It’s the tyranny of the algorithm and it’s the trade-off we accept for the switched-on life we enjoy. We have laws to protect our data, but they only go so far. We can integrate the technology while choosing privacy. We can decide that we have nothing to hide but something to protect. We can choose to reject surveillance. Snyder’s words warn us that we should.
Kate Devlin is an academic and author of “Turned On: Science, Sex and Robots” (Bloomsbury, 2018)