The behaviour change industry
IN 2017, behavioural economist Richard Thaler won the Nobel Prize in Economics for his groundbreaking work on how people make irrational choices and how their behaviour can be modified to help them make better decisions. Thaler and Cass Sunstein’s book, Nudge, opens with an illustrative example of a woman who experiments with the layout of food in school cafeterias to help children make healthy choices — for example, placing carrots rather than French fries at eye level.
Thaler’s work was instrumental in exploring how public policy can be leveraged to manipulate human behaviour for better outcomes. Governments around the world formed ‘nudge units’ to prod their citizens towards making better decisions — from quitting cigarettes to reducing energy consumption. However, detractors, fearing the encroachment of the state into their personal lives, criticised nudges as a policy instrument that promotes government paternalism, infantilises people and diminishes autonomy.
Thaler’s applications of nudging seem well-intentioned — notwithstanding the psychological manipulation aspect — when compared to the greater threat we now confront: the nefarious and covert behaviour modification we are constantly subject to as we interact with digital technology.
At some point, most of us have wondered whether the microphones on our devices are spying on us: shortly after you discuss a particular clothing store with a friend, for example, an advertisement for that very store pops up on your Facebook newsfeed.
This is not so much proof that our devices are recording our conversations as proof that Big Tech has collected such vast and comprehensive data on us — the websites we visit, what we like and comment on, our physical locations, how long we spend inside a particular store or restaurant, what we search for on Google, the apps we use; the list is endless — that it can now predict our thoughts and behaviour with terrifying accuracy.
In the last few years, Facebook and Google have become notorious for this practice. Because their business models centre on advertising revenue, they collect user data and use it to target personalised advertisements. There have been countless protests, investigations, lawsuits and US Senate hearings highlighting these issues, yet little real progress has been made in protecting user privacy and autonomy.
So what is the endgame for the technology sector, and why is the data it has collected so precious? Harvard Business School’s Shoshana Zuboff quotes a senior software engineer as saying, “The real power is that now you can modify real-time actions in the real world.” The goal is not just to use data to find patterns and predict behaviour, but ultimately to modify it: to increase advertising revenue by getting more clicks, to drive more sales with targeted advertisements, to sway public opinion on policy issues or to manipulate electoral outcomes.
Facebook first demonstrated this power in an experiment that came to light in 2014. Without telling them, Facebook altered the emotional content of users’ newsfeeds and observed their responses — a phenomenon the researchers called “emotional contagion”. The results showed that the social media giant was effectively able to manipulate the thoughts and moods of its users. This underscored the immense power technology platforms hold, and their ability to conduct psychological and social experiments at any time without their users’ knowledge.
Fast forward to 2018. Cambridge Analytica, a self-described “behaviour change company”, had secretly collected data on millions of American Facebook users to build in-depth personality profiles and identify their psychological triggers. This allowed it to modify behaviour, manipulate votes and sway the 2016 US presidential election. As whistleblower Christopher Wylie put it, Cambridge Analytica was “playing with the psychology of an entire country without their consent”.
When the scandal was exposed, there was public outrage; people were shocked to discover how their data could be misused. In truth, such practices have been standard in the technology sector for years; entire business models are built around them.
All of us who interact with digital technologies have virtual identities, or psychographic profiles, that shadow us online. These are created from the vast amounts of data collected on us: what we click on, what we search for on Google, our GPS coordinates, what we eat, buy, read and share, our demographic details, and so on. Based on these profiles, personalised information, advertisements and behavioural nudges are targeted at us. Hence, we often see an advertisement so on the mark that we feel our devices must be watching and listening.
Many lay users assume these issues do not affect them — ‘I have nothing to hide’ or ‘I am not important enough for data to be collected on me’. The excitement around new technology, the research that goes into making digital interfaces more and more addictive, and the secrecy of the technology sector have lulled users into complacency.
While the idea of nudges as a public policy tool created fears of the nanny state, the power of the technology sector to modify user behaviour is far more frightening in its scale and efficacy. Technology platforms have greater reach than any national government — and their singular motive is profit maximisation.
What Big Tech companies are doing — manipulating thoughts and actions ranging from consumer behaviour to voting patterns — has proven successful not just in the digitally advanced Western world, but across the developing world as well. As the Pakistani state expands its use of technology, social media and digital surveillance tools, citizens must be aware of the risks they face online. For anyone who believes in the basic human rights of privacy, autonomy and self-determination, this should be a wake-up call.
The writer is a development practitioner.
Published in Dawn, September 29th, 2019