How do we advance social capital in a world increasingly governed by surveillance capitalism?
About a year ago, I read Atlas of AI by Kate Crawford, a brilliant analysis of the extractive processes that underpin the field of machine learning, from the allocation of environmental resources to the erosion of our political fabric to data privacy infringements. This article reflects on the lessons of Crawford’s book. I wanted to hit the ⏸ button and reflect on the implications of a trillion-dollar industry, as well as the responsibilities that data practitioners carry in an algorithmically governed world. How can digital users be active participants in defining the new digital world? What kind of social conscience do data-powered companies need to develop? What responsibility do they have to advance social capital even if their mandate doesn’t require them to care? I hope this piece is a conversation starter for the data community to develop a new framework that embeds social engagement at the core of data practices.
What is Data Capitalism?
Sarah Myers West of the AI Now Institute explains in a 2017 article that data capitalism is a system in which the commoditization of big data by the actors who profit from it leads to an “asymmetric redistribution of power.” In other words, big data makes money but does not create wealth equally. It’s a closed system in which for-profit entities have successfully managed to extract, transform and capitalize on the data, our data. A more recent article, published by Milner and Traub at Demos in 2021, uses the same definition but stresses the social woes caused by this redistribution of wealth, which places an especially heavy burden on protected classes.
Surveillance Capitalism and Algorithmic Governance
While data capitalism and surveillance capitalism share characteristics, they aren’t exactly the same. Surveillance capitalism rests on the foundations of data capitalism: without the extraction of data, there is no surveillance system to build; without the monetization of that data, there is no capital to be generated from building such a system. Surveillance capitalism is a term originally coined by Shoshana Zuboff, Harvard professor and author of The Age of Surveillance Capitalism (2018). She describes surveillance capitalism as a process: human experience is used as raw material and translated into behavioral data, which is used to build prediction products (forecasts of our future behavior), in turn traded by companies that apply machine learning (ML) algorithms to generate profitable returns.
Algorithmic governance is close to surveillance capitalism in that both describe a system of surveillance in which our behavior is tracked and modeled by ML algorithms. The setting, however, is different: ML is used to carry out public policy objectives at scale with the aim of maintaining social order (compliance with regulations, enforcement of the law, etc.). The goal isn’t to make money, though money can be a byproduct, but to govern and, to a certain extent, control societies. Algorithmic governance is a funambulist’s walk between governance and control, and that tension has led many scholars and public policy experts to question the future of democracy in such a system.
Digital Feudalism
A 2021 talk organized by the University College London (UCL) Institute for Innovation and Public Purpose brought together Shoshana Zuboff and Tim O’Reilly to discuss the future of data capitalism, ironically captured by the term “digital feudalism.” Feudalism is a term borrowed from medieval times: a political, cultural and economic system that dominated between the 9th and 15th centuries. In such a system, there are lords (i.e. landowners) and vassals (i.e. subjects who occupy the land in exchange for the production of goods and services). In this system of exchange, subjects are granted the right to a home and the lord’s protection only if they can pay up. Digital feudalism, by analogy, is the user’s right to use social media platforms “for free” only in exchange for their personal information. Our individual data becomes the “rent” due to social media platform giants, the 21st-century medieval landowners.
How has Data Capitalism Evolved?
The dot-com bubble burst 📉
The dot-com bust of the early 2000s was a pivotal moment in creating the “monster of data capitalism.” Before then, the focus in tech was innovation. The goal was to design new, creative, futuristic products that would make the prior decade look like it was set in the Mesozoic era. When the bust happened, investors threatened to withdraw funding from tech startups in Silicon Valley, and tech companies had to find new ways of making money. Innovation in service of the individual was no longer enough to attract investment; individuals became the new commodity, the new product that could be bet on and against. And it made tech companies a lot of money.
Data can make money 💰
Advertising is everywhere— Tech companies realized that advertising was the key to making money (and keeping investors around). Advertising was not a new concept; whereas before it appeared on billboards, in newspapers or in TV commercials, it now made its way to social media platforms. Thanks to massive data collection practices, online advertising became increasingly customized and targeted. Ads could even anticipate a person’s needs. One example is Target’s pregnancy-prediction model, which exposed a teenager’s pregnancy through targeted coupons before she had announced it to her family.
The “Surveillance Dividend” — Shoshana Zuboff describes the profitability of major corporations using our data as the “surveillance dividend.” After the dot-com bust, people at Google realized that there was a “digital exhaust,” a treasure trove of data not collected for any specific purpose, that could be mined for signals predicting consumer behavior. Between 2001 and 2004, ad revenue alone jumped by 4,486% thanks to the exploitation of this digital exhaust. The surplus of data collected up to that point allowed Google to grow exponentially, and the company continued to rely on our data as a means to grow after its 2004 IPO.
From passive consumption to social conscience
The rents are illegitimate
One important point highlighted in the 2021 talk with Zuboff and O’Reilly is that the rents collected by for-profit corporations (i.e. our data) are illegitimate. There is no justification for much of the data extracted, collected, transformed and modeled. Some of the data collected is even problematic and highly discriminatory (e.g. ZIP codes used as a proxy measure in insurance premium calculations). Companies’ data collection practices for predicting consumer behavior are therefore called into question. However, how can a single individual, even with the backing of major data privacy laws, fight a corporation to ensure fair and ethical use of the data? How can we be truly certain that the data we provide companies will be used for service and/or product improvements only? How can the digital user play an active part in shaping the digital world?
Active participation
The idea that the future of data capitalism is digital feudalism is not a foregone conclusion. It’s a choice, one that requires the active participation of digital users to prevent the laissez-faire behavior of data-powered corporations. While data capitalism is here to stay, a shift needs to happen in which users become active participants in shaping the new money-making data machine. It starts with acknowledging that we can’t change the system unless we’re willing to invest in it.
Today, we enjoy free social media platforms, but they’re only free because we agree to be shown targeted ads. If we don’t want our behavioral data to be analyzed and manipulated, we need to accept that a membership-based model should replace the ad-based one. If we’re not ready for that, the alternative is to strengthen the data privacy regulations in place. To be active participants, digital users can share ideas to improve data privacy and security in open forums and referendums, where they can be debated.
Social conscience
While the user has a role to play in defining the rules of the game, companies have an even greater responsibility to use data fairly and ethically. In the end, companies are the vaults holding our data. There is currently no comprehensive U.S. federal legislation requiring companies to disclose how user information is collected, analyzed, and used. A rule proposed by the Securities and Exchange Commission on March 9, 2022 would require publicly traded companies to disclose data security breaches; however, it addresses only data security, not data privacy. And while the SEC’s mandate is to regulate publicly traded companies, which the proposed rule does, it doesn’t place the same burden on private companies, which also use, model, and share our data.
In addition to developing strong data privacy programs, companies need to develop a social conscience by reprioritizing social goals alongside their economic ones. In the 21st century, I see four major themes: climate change, wealth inequality, political polarization and discrimination. Companies have fed, and continue to feed, these social problems. Think of Facebook’s (now Meta’s) role in aggravating the Rohingya genocide, Robinhood’s murky role in protecting the hedge funds that were shorting GameStop stock, or Amazon’s destruction of millions of unsold products. If companies caused the problem, they should feel a responsibility to fix it. Companies have a part to play in protecting our social fabric, encouraging discourse, providing financial tools that level the playing field, committing to renewable energy sources, and more.
Conclusion
Data capitalism is here to stay, but that doesn’t mean digital users should be passive consumers in a system that controls their data. They should be empowered to shape the changing digital world we live in. The burden doesn’t fall on the user alone, however. Companies have an even greater responsibility to use our data ethically and fairly. It’s increasingly important that they develop a “social conscience” too: many social, political, economic and environmental problems have surfaced as a result of actions taken by large corporations. Now is the time to infuse compassion and care so we can increase the dividends of our social capital too.
This article was originally published on Towards Data Science and re-published to TOPBOTS with permission from the author.