‘You don’t get to 500 million friends without making a few enemies,’ as Facebook movie The Social Network put it. Everyone the social media giant has ever crossed has joined the pile-on this week, as Facebook CEO Mark Zuckerberg faces calls to come to parliament and explain himself. (Keep your pies at the ready.)
But ‘Zuck’ is just the smirking face of a much wider issue: the way the web has been captured by corporate profiteers who make their money from selling a simple product: you – or, more precisely, your data. The biggest technology firms, responsible for an ever-growing share of the world’s billionaires, follow the Silicon Valley mantra that ‘data is the new oil’ – and the apps and websites you use every day are the extraction method. They are not monitoring you for state-style social control, but for profit: surveillance capitalism.
Surveillance capitalism, as academic Shoshana Zuboff has defined it, is ‘constituted by unexpected and often illegible mechanisms of extraction, commodification and control that effectively exile persons from their own behaviour while producing new markets of behavioural prediction and modification’. What does data extraction mean in practice? Let’s return to Facebook. Behind the photo slideshows, cartoon smiles and birthday wishes, downloading your Facebook data reveals that its app has been slurping up every bit of data it can get its hands on, from your phone’s location to who you’ve been calling and for how long. After all, what’s a little data between friends?
But it’s not just about Facebook – and not just about targeted advertising, either. Google knows everything you search for, and when you stay signed in to Gmail, it knows who made the searches. That’s not so unexpected, but I was surprised to find recently that Google had been keeping track of everywhere I’ve been for the past five years. I must have said yes once when prompted by its Maps app with some explanation about ‘improving the experience’, and that was enough for it to graciously keep track of my every footstep from then on. (See your own location history.)
Twitter, meanwhile, decided to make a list of every app I have installed on my phone, to show me ‘more relevant content’. Netflix builds preference profiles based on what you watch, and then uses this data in aggregate to create entire new original TV shows, right down to the cover images, micro-targeted at sections of its audience. Amazon keeps track not only of what you buy, but everything you search for and look at – it’s all grist to its marketing mill. Every company can use what it learns from millions or even billions of people not just to target ads but to make decisions that will let it grow faster than the competition – so over time, the most data-driven firms come to dominate.
As computer security expert Bruce Schneier writes, the smartphone is ‘probably the most intimate surveillance device ever invented. It tracks our location continuously, so it knows where we live, where we work, and where we spend our time. It’s the first and last thing we check in a day, so it knows when we wake up and when we go to sleep. We all have one, so it knows who we sleep with.’
These ubiquitous ‘devices’ – the telescreens in our trouser pockets – arrived not as a state-enforced requirement, barking orders at us in the manner of an outwardly oppressive apparatus, but as our ever-present assistants, always keen to help us, and to help themselves to a little more of our data so that they might give us ‘better recommendations’. In the future rapidly approaching, when you have an automated home, a self-driving car and a city full of internet-connected sensors, their makers will be watching you too, unless we can change the path we’re on.
People have heard of ‘algorithms’, those annoying things that mean your social media news feed does its best never to appear in chronological order. But algorithm is a soft term for what is really going on: machine learning – cutting-edge artificial intelligence – is being trained on all that data being extracted from massive populations every day. Every time you scroll on Facebook, hit the heart button on Instagram or watch a video on YouTube, you are taking part in the latest round of a never-ending worldwide experiment.
You are like a rat in a maze, with a machine showing you a stimulus, noting your response (and everyone else’s response), and then showing you another. Oh, looks like that one made you feel angry! But the notification you got afterwards clearly stroked your ego. Interesting…
The machine is not attempting to make you happy – though, to be fair, it is not attempting to make you sad either. Its aim is what is called ‘engagement’: in other words, to keep you running around inside the maze for as long as possible each day, every day, for the rest of your life. Why bamboozle billions of people like so many rodents? Because every minute you spend ‘engaged’ racks up another few fractions of a cent in corporate profit.
Tristan Harris, a former Google employee and founder of the Center for Humane Technology, argues that apps’ feeds hook into the same parts of human psychology as gambling does: ‘When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got… If you want to maximise addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward.’ Such apps are essentially Skinner boxes.
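To make the mechanism concrete, here is a deliberately simplified sketch – a toy model, not any company’s actual code – of how an engagement-maximising feed can work. It uses a standard ‘epsilon-greedy bandit’: mostly show whichever kind of content has historically kept people scrolling longest, occasionally experiment with something else. The content categories and the numbers are invented for illustration.

```python
import random

random.seed(0)  # deterministic, purely for illustration

# Hypothetical content categories a feed might choose between.
ARMS = ["outrage_news", "friend_photos", "celebrity_gossip"]

class EngagementBandit:
    """Epsilon-greedy bandit: a toy model of an engagement-maximising feed."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.shows = {arm: 0 for arm in ARMS}         # times each category was shown
        self.engagement = {arm: 0.0 for arm in ARMS}  # total seconds of attention earned

    def choose(self):
        # Mostly exploit the best-performing category; occasionally explore.
        if random.random() < self.epsilon or all(n == 0 for n in self.shows.values()):
            return random.choice(ARMS)
        return max(ARMS, key=lambda a: self.engagement[a] / max(self.shows[a], 1))

    def record(self, arm, seconds_watched):
        # The 'experiment': note the response and fold it into future choices.
        self.shows[arm] += 1
        self.engagement[arm] += seconds_watched

bandit = EngagementBandit()
for _ in range(1000):
    arm = bandit.choose()
    # Pretend users linger longer on outrage (an invented stand-in for real behaviour).
    reward = random.gauss(30, 5) if arm == "outrage_news" else random.gauss(10, 5)
    bandit.record(arm, reward)

# After enough rounds, the feed converges on whatever held attention best.
best = max(ARMS, key=lambda a: bandit.shows[a])
print(best)
```

Nothing in the loop asks whether the winning content is true, healthy or fair – only whether it held attention. That is the ‘variable reward’ logic in a dozen lines.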
The attention rat-race is what causes all of the presumably unintended consequences that are more visible on the surface, from Facebook fake news farms to the creepy, often auto-generated kids’ videos clogging up YouTube. The tech giants’ money-bots spray out audience traffic and ad cash in the direction of anyone who produces content that captures attention – what that content might be is at best a secondary concern.
What ‘your feeds’ show you, and in what order, is no longer under your control. That doesn’t make them some kind of mind-control machine, but it does mean that, over time, the decisions of a large population could be influenced in subtle ways – as Facebook found in a 2014 study where it was able to influence users’ emotions.
Cambridge Analytica, according to whistleblower Chris Wylie, used psychological profiling to ‘know what kinds of messaging you would be susceptible to’ – so, for example, ‘a person who’s more prone towards conspiratorial thinking’ might be shown ads that play to that mindset, helping an idea to spread by starting with a ‘group of people who are more prone to adopting that idea’. Without boarding the bandwagon of blaming Facebook for all political ills, it doesn’t seem so inconceivable that a large enough advertising campaign with that kind of targeting could influence an election – or, say, a referendum – by a crucial few percent. As the whistleblowers’ evidence highlights, such methods have been quietly in use in the global South for several years.
But while Brexit and Trump – and the Cambridge Analytica affair’s contribution to the ever-growing web of connections between the two – have catalysed new interest in how our data is being collected and sifted, any potential solution has to start much further back, before the data was originally gathered. These data-mongers are no geniuses: they stumbled across Facebook’s data goldmine and filled their boots. The point is that such a motherlode should never have existed in the first place.
At its best, social media has provided an important platform for alternatives to the mainstream media, allowing people to spread the word about protests and grassroots events, giving a voice to people previously marginalised and ignored. This is surely one of the factors behind the emergence of Corbynism. The question is: how can we decommodify our everyday interactions – and even our resistance?
For the moment, while deleting your Facebook account might feel like a start, individuals opting out does little to change the overall societal problem. (Facebook is known to keep ‘shadow profiles’ of non-members, so it might not even make much difference to your own privacy.) Collectively, we need to demand regulation – the EU’s new General Data Protection Regulation (GDPR) is a good start – and, just as importantly, build alternatives that explicitly reject the data-extraction model.
There is nothing inherent about social media that requires it to plunder our personal data, other than the companies’ surveillance-capitalist business models. Open source, non-profit tools, in the original spirit of the web, could let us communicate freely and easily, giving us the positive aspects of social media without taking the mercenary-spies along for the ride. Mastodon is one existing open source effort that has been able to start reaching out beyond just techies.
Going one step further, can we imagine machine learning put to social use? It’s possible in principle – though to my knowledge it hasn’t yet been tried – to flip the switch on those mass experiments so that they don’t aim to produce engagement, profit or propaganda, but happiness instead. A ‘people’s algorithm’ could help us reject the rule of commerce and promote ideas and actions that challenge corporate power.
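In code terms, ‘flipping the switch’ is less fanciful than it sounds: the machinery stays the same and only the reward signal changes. The sketch below is entirely hypothetical – the content items, scores and function names are invented – but it shows how the same selection logic picks different winners depending on what it is told to optimise.

```python
import random

random.seed(1)  # deterministic, purely for illustration

def engagement_reward(seconds_watched, wellbeing_score):
    # Status quo objective: only attention counts.
    return seconds_watched

def wellbeing_reward(seconds_watched, wellbeing_score):
    # A hypothetical 'people's algorithm': optimise a self-reported
    # wellbeing score instead of raw attention.
    return wellbeing_score

def pick_content(history, reward_fn, epsilon=0.1):
    """Choose the item whose past observations score highest under reward_fn."""
    if random.random() < epsilon:
        return random.choice(list(history))
    return max(history, key=lambda item:
               sum(reward_fn(*obs) for obs in history[item]) / len(history[item]))

# Each item maps to invented (seconds_watched, wellbeing_score) observations.
history = {
    "rage_bait": [(40, 2), (35, 1)],    # grips attention, leaves people miserable
    "local_event": [(12, 9), (15, 8)],  # less 'sticky', but people feel better
}

attention_pick = pick_content(history, engagement_reward)
wellbeing_pick = pick_content(history, wellbeing_reward)
print(attention_pick, wellbeing_pick)
```

Same data, same selection machinery, opposite outcomes – which is the point: the harm is not in the maths but in the objective it is handed.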
This week’s spotlight on Facebook will soon fade, but every data scandal – and there will be many more to come – increases the relevance and urgency of technological alternatives that let us take back our online lives from the corporations’ clutches.