Don’t Buy What Surveillance Capitalism Is Selling
If capitalism is a book of many chapters, Shoshana Zuboff believes we’re turning the page on a new one.
As human civilization transitions from a society driven by technology to one that is utterly reliant on it, Zuboff, a Harvard Business School professor, questions whether this new civilization will be one we can call home.
Zuboff is author of In the Age of the Smart Machine, a seminal study of the social, economic, and emotional consequences of computer technology in the workplace. She recently spoke at Queen’s University about her new work, The Age of Surveillance Capitalism, in a presentation sponsored by the Surveillance Studies Centre, Smith School of Business, and other faculties.
“Home is always the same thing,” Zuboff says. “It's where we’re known and where we know. Where we love and where we are beloved.”
Yet, as changes accelerate and “home” becomes increasingly unrecognizable, we’re told that the brand of technological change that is upon us is inevitable – embrace it or be left behind. The message of tech companies today echoes the motto of the Chicago World’s Fair in 1933: “Science Finds, Industry Applies, Man Adapts.”
“The word ‘inevitable’ is programmed to target our sense of human agency to delete resistance, to delete creativity from the text of human possibility,” she says. “Inevitability rhetoric is designed to render us helpless and passive in the face of implacable forces that supposedly are, and must be, indifferent to the merely human.”
Fight for Self-Determination
The power struggles of the Industrial Age were between capital and labour, Zuboff says. Today, the struggle we face is between information capital and the privacy, freedom, and self-determination of all individuals.
Zuboff chronicles the story of Google’s growth, the discovery of a new source of capital that enabled its transformative success, and the transition of an entire industry to the model of surveillance capitalism.
Google’s success derives from its ability to predict the future — specifically the future of human behaviour. At the start, Google collected data on users’ search-related behaviour to improve its search engine and serve customers with better results. But as it discovered the appetite of advertisers that would pay for improvements in ad targeting, Google in turn aggressively sought new sources of customer data.
“According to its own scientists’ accounts,” Zuboff says, “the methods that they found to seek new sources of data were prized for their ability to find data that users intentionally opted to keep private, and to infer extensive personal information that users would not, or could not, provide.”
These methods could extract behavioural data regardless of users’ intentions, even when users explicitly sought privacy. Data from hundreds of millions of users would then be analyzed to define predictive patterns that could match a specific ad with a specific user.
Hiding Behind Terms of Service
The nature of this mass collection and sale of data is intentionally secret and opaque, says Zuboff. It’s intended to produce user ignorance and designed to circumvent user decision rights. For those who believe the service they receive is a fair exchange for their data, Zuboff argues that the obscure process results in information asymmetry, leading to consequences that users do not and cannot know and ramifications that few will understand.
Terms of service and end-user license agreements are designed as convoluted legal documents, many pages long, to discourage user understanding. The interconnected nature of technology means that many devices, services, and websites share data with third-party companies, each with its own user agreement. Legal scholars have calculated that to buy a Nest Thermostat, a user must read 1,000 pages of contracts.
Furthermore, the ramifications of data resale eclipse the understanding of nearly all users. Zuboff envisions a future in which our driving behaviour is analyzed by security cameras and our emotional stability is analyzed by our social media comments. In turn, these data sources can then be sold to insurance companies to precisely price an individual’s level of risk.
Lastly, Zuboff argues that companies are moving from mere prediction to active behavioural influence. Companies will naturally seek not only to understand and predict the actions of customers but to steer them toward desired outcomes, producing a cycle of self-fulfilling predictions.
Changing Behaviour at Scale
From interviews with more than 50 data scientists, Zuboff found a common theme: the ultimate pursuit of changing people’s behaviour at scale. Both conscious and subconscious behaviours can be tracked, good and bad behaviours identified, and methods developed to reward the good and punish the bad. In most cases, “good” and “bad” simply mean profitable and unprofitable.
Zuboff regards this new form of surveillance capitalism as a danger to our social fabric and a profoundly anti-democratic threat. She believes it reaches far beyond the conventional realm of the private firm: companies are accumulating not only informational assets and capital, but also decision rights that once belonged to all of us.
Despite this, Zuboff takes a hopeful view on our ability to shape the future.
“We are at the very beginning of this journey,” she says. “Life in 2050 will depend upon institutional developments like the kinds that have existed at other times in our history, which have tethered capitalism to the social, which have enforced the demand that capitalism operates and reciprocates with its populations.
“What we do now, between now and then, is ultimately what will determine the outcome of this story.”
— Henry Tian