A couple of months ago I sent a message that started as follows: “For the past 5 years, I have been doing PhD research on large social platforms. I want to share a few thoughts on why I am quitting WhatsApp before February 8th 2021.” The short memo was written after WhatsApp’s unilateral announcement that they were changing their T&C to accommodate WhatsApp For Business. Since then, after the announcement caused an uproar, Facebook postponed the changes to their T&C to May 15th.
Since I sent the message in January, I have received many questions and comments: some people quit WhatsApp altogether, some opened accounts on other messaging platforms (mostly Telegram and Signal), and some disagreed completely with my analysis and kept using WhatsApp as usual. Based on the rich discussions I have had with friends and acquaintances in the past few months, I want to share a few more thoughts before the change takes effect.
Why Did FB Postpone the Change to May 15th?
Delaying the change gave Facebook time to achieve two things.
1. First, to rewrite the narrative through a process of habituation.
Habituation is a tactic that has been widely utilised by large social tech corporations in the past 15 years to gradually get us used to increasing encroachments on our privacy.
The tactic works as follows:
- Unilaterally announce the change. Ignore signs of disgruntlement. If the announcement causes a major uproar, issue a statement saying that you have been misunderstood and postpone. Stay low for a little while until some other news takes front stage (that shouldn’t take too long).
- This gives you time to design a communication campaign that heavily emphasises specific (reassuring) aspects of the issue (e.g., privacy is built into our DNA, we do not have access to the content of your messages) and completely obfuscate the real truth (we do not care a bit about the content of your messages, we want to track your behaviour not your words).
- If opposition is too strong – for example as in the case of the Facebook Beacon feature implemented in 2007 and abandoned in 2009 after a class action lawsuit – find other technical ways to achieve the same goal, which is what FB did with Facebook Connect and with the Like button, which embeds a piece of code that sends information back to FB when you visit any unrelated site anywhere on the web.
2. Second, it allowed Facebook to think out and implement a campaign of misinformation (I am using the word with purpose). Disinformation is the phenomenon of spreading lies. Misinformation is the phenomenon of spreading half-truths to create confusion or take control of the main narrative. WhatsApp is emphasising that the CONTENT of our messages is and will always remain private. What WhatsApp is not saying is that they could not care less about that content, because the data they hunger for is metadata.
METADATA & PRIVACY
What’s Metadata?
It is data about data, or data about your behaviour (not your words) when you are online. A few (non-exhaustive) examples are: who you know (your contacts), who you message, when and how often, who they know, who they message etc. But also, your device and user ID, your name, your location, your email, and all sorts of user and device content. In short, any activity that can be tracked and linked to you.
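To make the distinction between content and metadata concrete, here is a minimal, purely illustrative sketch of what a messaging service could log about a single message without ever reading it. The field names are hypothetical, not any real platform’s schema:

```python
# Illustrative only: hypothetical field names, not WhatsApp's actual schema.
# Note that the content of the message appears nowhere below.
message_metadata = {
    "sender_id": "user_8421",             # who you are
    "recipient_id": "user_1937",          # who you message
    "timestamp": "2021-03-27T23:14:05Z",  # when (late at night?)
    "device_model": "Pixel 4",            # what hardware you own
    "os_version": "Android 11",
    "ip_address": "203.0.113.42",         # roughly where you are
    "message_length": 512,                # how much you wrote
    "contact_count": 284,                 # how many people you know
}

# Not a single word you typed is stored here, yet together
# these fields describe your behaviour in considerable detail.
for field in sorted(message_metadata):
    print(field)
```

Every one of these fields falls under “data about your behaviour, not your words”.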
The Gold of the Internet
In tech parlance, metadata is referred to by the euphemism “breadcrumbs” (a noun that deceptively emphasises the innocuous and insignificant nature of this type of data). Google discovered early on that users left collateral behavioural data (the breadcrumbs) behind when they searched, and that the breadcrumbs could be aggregated and fed into machine learning to get really deep insights into who we are. Those insights go well beyond what we think we reveal when we live our life online. This collateral behavioural data is the “gold” of the internet.
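As a toy illustration of why aggregated breadcrumbs are so revealing, consider what simple counting over message timestamps alone can suggest. The data below is invented and the inference is deliberately crude, but it shows the principle: patterns, not words, give you away.

```python
from collections import Counter
from datetime import datetime

# Hypothetical metadata log: (contact, ISO timestamp) pairs.
# No message content is recorded anywhere.
events = [
    ("alice", "2021-03-01T23:45:00"), ("alice", "2021-03-02T23:50:00"),
    ("alice", "2021-03-03T23:40:00"), ("bob",   "2021-03-02T09:15:00"),
    ("carol", "2021-03-01T12:00:00"),
]

# Whom do you message most often? Likely your closest tie.
closest = Counter(contact for contact, _ in events).most_common(1)[0][0]

# Whom do you message late at night? A consistent pattern
# suggests intimacy, insomnia, or a different time zone.
late_night = {
    contact for contact, ts in events
    if datetime.fromisoformat(ts).hour >= 22
}

print(closest)     # prints: alice
print(late_night)  # prints: {'alice'}
```

Real profiling systems aggregate far more signals with far more sophistication, but the mechanism is the same: individually innocuous crumbs, combined at scale, become deep insight.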
The discovery above marked the birth of “targeted advertising”, another euphemism for the enterprise of surveillance that has grown exponentially and largely unchecked since the early 2000s. It’s not what you say online (content), it’s the traces you leave that count (metadata). For more, see Shoshana Zuboff under “resources” below (if you read only one book on this topic, let it be hers).
A Narrow Definition of Privacy
When we, users, think of privacy, we think about the content of the information we share online. But a quick look at WhatsApp’s Privacy Policy (see link below) makes two things clear:
- metadata (not content) is the core commodity being harvested.
- we have no say in what and how our data is collected, how it is treated and what it is being used for.
Under the “Information We Collect” heading, there are 3 paragraphs. Interestingly, the first one is called “Information You Provide”, hinting that the other two are information that you do NOT (and may not want to) provide. Indeed, they are: “Automatically Collected Information” and “Third Party Information”.
Here is an excerpt: “Device and Connection Information. We collect device-specific information when you install, access, or use our Services. This includes information such as hardware model, operating system information, browser information, IP address, mobile network information including phone number, and device identifiers.” It is really interesting to read through https://www.whatsapp.com/legal/privacy-policy if you have not already done so.
NB: WhatsApp’s Terms of Service and Privacy Policy change over time. I posted the pdf and some excerpts of the document as of 27 March 2021 in a post above.
REAL IMPLICATIONS OF WHATSAPP FOR BUSINESS
What’s Really Behind WhatsApp For Business?
As you probably know, WhatsApp has been sharing metadata from your chats with its parent company Facebook for some time already. The changes in T&C are meant to allow WhatsApp For Business (WFB) to get off the ground.
What does it mean? On the face of it, WFB is merely a tool to help businesses “better communicate” with us, their customers. Officially, Facebook tells us that it just means that businesses will be able to chat with you, give you information on their products, follow up on your purchases and give you “better” customer service. But if you think about it, most of that could already be achieved before WFB. By opening the possibility for transactions on WhatsApp, WFB will allow Facebook to get extraordinarily granular collateral behavioural data on aspects of our lives it had only indirect, limited or no access to before.
What Business?
One thing you may want to contemplate: what does “business” mean in “WhatsApp For Business”? Retail brands certainly, and this seems innocuous enough. But not only (and by the way, what seems innocuous to you is not so innocuous once it goes through the analytical capacities of Big Data). “Business” may also cover health-related communications and transactions. You may not mind if Facebook peeks into your interactions with a fashion brand, but how would you like it to have granular access to your exchanges with businesses selling health devices, with health practitioners (which may include how much and how often you spend on treatments), insurers, money lenders, financial institutions and more? Do you really think that Facebook ought to have access to details from your credit card and bank account statements? A statement shows whom you paid, when, and how much, so essentially, FB would have access to your statements.
IMPLICATIONS
Commodification of users
I sometimes hear people say: “if you do not pay for a service online, you are the product”. It is true in spirit, but not completely. The massive enterprise of data extraction shows that we (users) have become, not the product, but the cheap commodity. From us, FB and Google extract the raw material they use to create the valuable products they sell, not only to advertisers but, frankly, to anyone willing to pay for them, whatever their intentions. In other words, we are the pigs, not the Iberico ham.
An Enterprise of Territorialisation
Another aspect of the enterprise of digital surveillance can be summed up in two words: “never enough”. The internal competitive logic of the targeted advertising model drives those corporations to ceaselessly expand in order to acquire ever-more predictive collateral behavioural data. Since we still live (slivers of) our life offline, it is not enough to surveil our lives online, so the tactics used online are seeping through to the physical world.
Our bodies are the next frontier (in January 2021, Google completed its $2.1 billion acquisition of Fitbit, a company that makes bracelets that record health and fitness data). Sensors, wearables: our bodies have become territories to conquer. How do you claim that territory? By creating new needs and new habits.
Wearables and sensors in the physical world (think “smart” cities) can detect changes that happen in your body below the dermal layer. How is that for an invasion of privacy? Who gets all that data about your heart rate and the number of steps you take and how many times you wake up at night? Where does that information go? What happens to that data after it is collected? Who decides what to do with it? Who benefits in the end? Do you know? And more importantly do you mind?
AN ECOLOGICAL CRISIS
Why It Is Urgent to Take Action
We got where we are today because the unprecedented nature of these developments has obfuscated what was really going on (see Zuboff for more on this). In 2021 though, it is hard to ignore that we have reached a point where we can’t afford to be complacent.
In 1962, environmentalist Rachel Carson wrote Silent Spring, a compelling call for the world to wake up to the large-scale destruction of our environment through the commodification of nature by large corporations with unmatched lobbying power. Today, we are doing to human experience what we started doing to nature 60 years ago.
Reconfigurations of Power
First, it is an ecological crisis in the distribution of power. The digital developments of the past 20 years have created massive asymmetries of knowledge. In other words, Google and Facebook know heaps about us but we know very little about them. The algorithms that orchestrate the online life of several billion people across the planet are considered proprietary trade secrets. These massive asymmetries of knowledge lead to massive asymmetries of power.
This is compounded by the void in the legal framework surrounding these issues. An army of in-house lawyers unilaterally controls our contractual relationship with some platforms which in many ways have become public spaces. The only alternative to full acceptance is not to use the service. And a second army of “communication experts” and lobbyists carefully craft the official narrative (the definition of privacy is only one example).
Crisis of the Ecology of Knowledge
But there is another more insidious and less talked about implication. The commodification of users is leading to a major epistemological catastrophe. The reductionist approach to human experience (i.e., to quantify purely qualitative aspects of our lives, and discard what cannot be quantified) is corrupting what we know, and how we know what we know. It is an ecological crisis of knowledge of the sort only seen once in a millennium (more about that in another post).
MISCONCEPTIONS
Everyone on the Web Collects Data, so Why Bother?
Now, coming back to where we started, I hope I have given some avenues of reflection for why quitting WhatsApp may not be such a bad idea after all. However, I want to address one concern that I heard many times in the past couple of months.
Some people told me: “if I move to Telegram or Signal, they will also collect my data, so what’s the point?” That’s a very valid question. While it is true that data collection is widespread online (we have all heard about the cooking apps collecting your device ID and geolocation), not everything is created equal, so we need to use our powers of discrimination.
First, Telegram and Signal collect much less metadata than WhatsApp (or Facebook Messenger). In fact, Signal only collects your phone number. You can easily check this. Indeed, just like you brush your teeth in the morning, it is a good habit to check what metadata the apps you use collect from you. But that’s not all.
Exercise Our “WE” Muscle
Second, we need to exercise our “we” muscle instead of solely relying on our “I” muscle. As I mentioned above, this is an ecological issue. Self-centred thinking is not a luxury we can afford anymore. If social media have taught us anything, it is that we are intricately interconnected and that the decisions we make individually have global consequences. This is true for the environment; this is also true for the digital environment.
Be selective in whose business you patronise, because as you do, you are participating in creating the world that we will leave to the next generations. In other words, do not underestimate the positive ripple effects from an individual decision to use an equivalent app outside of the Facebook domain (and that applies to Google as well by the way). As I said in my previous post, if you leave, some people in your network will also leave. The network effect works in favour of platform monopoly but it can also work against it.
Final Word: Personal Convenience Is Not a Judicious Ground for Action
One last word. Some will decide to remain on WhatsApp not because they trust Facebook but simply for convenience. This is one option. Social platforms are built for convenience (in design parlance it is called “user experience”, or UX for short), because their makers know that convenience breeds users’ lethargy. By the same argument, it is also more convenient to throw garbage out of your car window, to release harmful chemicals into rivers rather than send them to purifying facilities, and to throw plastic bottles into the ocean rather than take them to recycling plants. Personal convenience is not always the most judicious ground for action.
RESOURCES
I am starting a Datafication & Technology blog to keep writing about these issues (www.datafication.space). Post, comment or send me questions (datafication.space AT protonmail.com) on topics you would like to see treated.
GENERAL RESOURCES
- If you have not done so already, you may want to watch the Netflix documentary The Social Dilemma (trailer here: https://www.youtube.com/watch?v=uaaC57tcci0)
- The Markup (https://themarkup.org), a non-profit newsroom that investigates the impact of technology on society and regularly publishes research and investigations into algorithmic black boxes.
- Julia Angwin (https://juliaangwin.com) is an award-winning investigative journalist and editor-in-chief of The Markup.
- NYTimes “On Tech” newsletter by Shira Ovide, a guide to how technology is reshaping our lives and world (https://www.nytimes.com/by/shira-ovide)
- Wired https://www.wired.com
SPECIFIC TOPICS
- To read more on the historical developments, the mechanisms and the ideological premises of surveillance capitalism, check “The Age of Surveillance Capitalism” by Shoshana Zuboff, Professor Emerita at Harvard Business School, who has researched technology since the 1970s. The book has made such a splash since it came out that it now has a Wikipedia entry (https://en.wikipedia.org/wiki/The_Age_of_Surveillance_Capitalism). Zuboff has given a string of interviews, which you can also easily find online.
- Wired https://www.wired.com/story/ways-facebook-tracks-you-limit-it
- BuzzFeed News, “Here’s How Facebook Tracks You When You’re Not On Facebook” (2018): https://www.buzzfeednews.com/article/alexkantrowitz/heres-how-facebook-tracks-you-when-youre-not-on-facebook
- A 2015 academic article by Media Studies researcher Anne Helmond on the platformisation of the web: https://journals.sagepub.com/doi/abs/10.1177/2056305115603080
- Privacy: https://www.nytimes.com/2018/04/11/technology/facebook-privacy-hearings.html