CCTV

This article is a preview from the Autumn 2016 edition of New Humanist.

If you’ve ever shoplifted, it’s possible that your face is on a database that more than 10,000 UK retailers can see. Very few people have heard of the face-profiling software Facewatch, but its implications are serious. Launched in 2010, Facewatch is a searchable, shared database of CCTV images of people considered to be suspicious or criminal, which businesses add to directly. Facewatch’s website markets the database as a “watch list showing images of suspects of interest” that helps retailers to target “theft, fraud, and anti-social behaviour”. It is also used directly by the police.

What this means is that if a shop, restaurant or other public outlet using Facewatch considers you suspicious, whether because of an alleged crime or the much more nebulous “anti-social behaviour”, video footage of you can be stored on a database and used to profile you as a “suspect” in other places. Businesses can also fill out an online report which gets sent to the police. You don’t need to have been convicted of anything in order to be put on the database – whether you are reported is entirely up to whoever is using the software.

Is this “precrime”? It’s a form of surveillance profiling based on reports from unaccountable private businesses: all that’s needed is for someone, at some point, to think you look suspicious. If you have committed a crime once, you go on the watchlist: a criminal “type”, likely to reoffend. There is no online information about how to have your image removed from the database, or what to do if you are profiled without having done anything wrong. The aim is to prevent crime before it happens: it is dystopian technology in action.

Philip K. Dick’s classic short story “The Minority Report”, published in 1956, made the idea of precrime famous. Dick imagines a society in which crime has been all but abolished by the introduction of precrime intelligence. In the story, crimes are predicted before they are committed by a group of three “pre-cog mutants”, and the future perpetrators are arrested and sent to a detention camp by the Precrime Division. As the story’s protagonist John Anderton states: “In our society, we have no major crimes . . . but we do have a detention camp full of would-be criminals.” Along with Orwellian concepts like the “thought police” and “Big Brother”, the idea of precrime is compelling because it is so frightening: that you may be penalised by the state for something you haven’t yet done.

Facewatch’s creators are currently working with facial recognition software companies Imagus, Herta Security and NEC to automatically compare stored images of “suspects” to people entering businesses. The website boasts that upgrading Facewatch to use facial recognition will “enable alerts to be provided to businesses the instant that someone on their watch list enters their premise”. The consequences of this are obvious: a database full of images of people profiled as suspicious, run by private companies in collaboration with the police, in which simply entering a shop can trigger a series of automatic alarms.
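
To make the mechanism concrete, here is a minimal sketch of how a watchlist alert of this kind typically works – not Facewatch’s actual system, and every name, threshold and value below is an illustrative assumption. Faces captured at the door are reduced to numerical “embeddings” and compared against stored embeddings of flagged “suspects”; anything above a similarity threshold chosen by the operator raises an alert.

```python
# Illustrative sketch of a facial-recognition watchlist check.
# Not Facewatch's real code: names, thresholds and data are assumptions.
import numpy as np

MATCH_THRESHOLD = 0.8  # hypothetical cosine-similarity cut-off


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(live_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> list[str]:
    """Return the IDs of watchlist entries that match the face at the door.

    `watchlist` maps a record ID (added by any participating business)
    to the stored face embedding for that "suspect".
    """
    return [
        record_id
        for record_id, stored_embedding in watchlist.items()
        if cosine_similarity(live_embedding, stored_embedding) >= MATCH_THRESHOLD
    ]


# Example: a shop's camera produces an embedding for someone walking in;
# any match raises an instant alert, regardless of whether that person
# was ever charged with, let alone convicted of, a crime.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    watchlist = {"report-0042": rng.normal(size=128)}
    visitor = watchlist["report-0042"] + rng.normal(scale=0.05, size=128)
    matches = check_against_watchlist(visitor, watchlist)
    if matches:
        print(f"ALERT: possible watchlist match(es): {matches}")
```

The point of the sketch is that everything hinges on a threshold set by whoever runs the system: set it loosely and innocent shoppers trigger alarms, and there is no court or conviction anywhere in the loop.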

Precrime is a classic concept in dystopian art – and when a dystopian writer appears to have predicted the future, it hits home. A dystopia also says something about the politics and priorities of the world in which it was written. Dystopian societies are nightmarish and bleak, and the fact that we interpret them as such makes clear where our values lie.

Anti-fascist speculative writers have explored variations on the rise of fascism at length: Donald Trump’s current campaign slogan “Make America Great Again!” was used by the Afrofuturist writer Octavia Butler in her 1998 novel Parable of the Talents, which imagined the takeover of the US by religious extremists. The slogan is used by the fictional US president Reverend Andrew Steele Jarret, a Christian fundamentalist whose followers go around murdering “heathens”. Trump’s use of the same approach is telling: he is a speculative fiction writer’s dream of a neofascist politician. He would seem unbelievable if only he were made up.

As pictures of a dystopian future go, Margaret Atwood’s MaddAddam trilogy is one of the best. Her post-apocalyptic vision is also full of prescient material. The books describe how a human-made viral epidemic wipes out almost all of humanity, leaving the main character, Snowman, seemingly the last man alive – before, poignantly, going back in time to show how we got there. Though the first volume appeared in the early 2000s, Atwood’s society is not that dissimilar to our own now: the rich live in gated communities controlled by biotech companies with names like HelthWyzer, OrganInc and RejoovenEssence. Food is genetically modified for corporate profit: everyone eats artificially grown chicken breasts called ChickieNobs and drinks Happicuppachinos, brewed from modified coffee beans that put smaller growers out of business. Atwood’s portrayal is staunchly anti-capitalist: corporations rule the MaddAddam world – even the police are named the CorpSeCorps – and eventually bring about its destruction. Oryx and Crake, the first volume in the trilogy, was published the year before the Googleplex was built. Now, parts of its world have become our reality.

Dystopian stories as a body of work are surprisingly accurate and rich in plot that later becomes prediction. Nalo Hopkinson’s Brown Girl in the Ring presents a view of the near future that hints at inner-city gentrification in reverse: set in Toronto after an economic collapse, it depicts a world in which the rich have left the city for the suburbs, leaving violence, homelessness and gang control in their wake. Lizzie Borden’s 1983 cult film Born in Flames imagines a lesbian feminist terrorist group led by a woman of colour, rising up against a violent, white supremacist, patriarchal state: the world they’re fighting against isn’t the world we live in today, but it isn’t far off.

Dystopia acts as a powerful critique of the society we live in. But there are also key points of departure that make reading it much more enjoyable. So far, and thankfully, we do not have semi-conscious pre-cog mutants chained to a crime prediction machine. Global conflict and epidemic diseases have not yet killed everyone. Artificial intelligence has not entirely taken over.

But when reality starts to catch up with past dystopias, as with the spread of precrime-esque software, there is good reason for concern. Other instances of real-life precrime make it clear how, and against whom, profiling software might be used. The disproportionate targeting of young men of colour by police using stop and search laws is well documented. Reading back over Anderton’s description of a “detention camp full of would-be criminals” is even more haunting as detention camps proliferate around the UK, increasingly full of people who are effectively prejudged as criminal because they are migrants or refugees. It is also now possible for any two police officers to give someone they suspect of being a sex worker a “prostitute’s caution”, meaning that police can target not just those who actually are street-based sex workers but anyone they perceive as “undesirable”, including homeless women. For the caution to be given there doesn’t need to be any evidence of a crime having been committed, and it stays on your record with no right of appeal.

In certain ways, our present resembles the dystopian futures imagined by novelists – the move towards total surveillance is a case in point, as is the growth of far-right politics in Europe and in the US. Being able to delve into speculative fiction and contrast what hasn’t happened with what has can be frightening. But to do so is important. Using art to contextualise horror makes it easier to push back against the spread of the false pragmatism that says, “That’s just how things are.” It shows us how quickly things can get worse. And at its best, speculative fiction could point towards a better future: not one in which people are wiped out by conflict, but one in which resistance wins out.