By Apurva Purohit
Those of you familiar with the woes Facebook is facing today would agree that the Cambridge Analytica data scandal was the proverbial straw that broke the camel's back. It was one of the first times a tech corporation's wrongdoings had been so clearly laid bare.
Author and scholar Shoshana Zuboff calls this harvesting of personal user data for business and other ulterior motives 'surveillance capitalism'. Corporations provide free services to billions of people and, in return, monitor the behaviour of their users in astonishing detail - often without their explicit consent. While some of the data gathered is relevant for service improvement, the rest is classified as 'proprietary behavioural surplus'. This data is then processed via 'machine intelligence' to predict user behaviour, and those predictions are traded in what Zuboff calls 'behavioural futures markets'.
The term may be recent, but the seeds of surveillance capitalism were sown back in 2001 by none other than Google, during the dotcom bust. With investors threatening to pull out, Google turned to previously discarded and ignored data logs and re-purposed them as 'behavioural surplus'. Instead of being used for product improvement, this behavioural data was directed toward an entirely new goal: predicting user behaviour. Since then, corporations have cashed in on our increasing dependence on technology and the internet.
Surveillance capitalism has spread across a range of products and services, encompassing virtually every economic sector. Nearly every 'smart' or 'personalised' product or service, every internet-enabled device, every 'digital assistant' is an enabler, a 'supply-chain interface' for the unhindered flow of behavioural data, all geared up to predict our futures in a surveillance economy, helping corporations rake in profits.
American non-profit organisation ProPublica reported late last year that breathing machines purchased by people with sleep apnoea are secretly sending usage data to health insurers, where the information can be used to justify reduced insurance payments. The Google-incubated Pokémon Go game is a brilliant example of covert surveillance. When launched, it was seen as a harmless foray into the world of augmented reality, but what it really did was collect vast amounts of data from millions of people.
Down the rabbit hole
Surveillance capitalism seeks to make society a place to be modified and controlled by undermining individual self-determination, autonomy and decision rights. We will believe we are making independent judgments, not realising we have been nudged to change our opinion of a politician or our breakfast cereal, based on the data gathered about our preferences.
For a moment, think about what this could mean for future generations who will have grown up with these new forms of technology. They might become incapable of choosing their own behaviour and beliefs, turning into pawns in the hands of corporations and governments that feed them selective information. It is high time we understood the worth of the data we are so carelessly leaving behind. It is imperative that we as consumers control what we share and how much, and understand the purpose for which that data will be used.
(The author is president, Jagran Prakashan)