Datafication, Phantasmagoria of the 21st Century


Datafied. A Critical Exploration of the Production of Knowledge in the Age of Datafication

This is the abstract of my PhD submitted in August 2022

As qualitative aspects of life become increasingly subjected to the extractive processes of datafication, this theoretical research offers an in-depth analysis of how these technologies skew the relationship between tacit and datafied ways of knowing. Given the role tacit knowledge plays in the design process, this research seeks to illuminate how technologies of datafication are impacting designerly ways of knowing and what design can do to recalibrate this imbalance. In particular, this thesis is predicated on four interrelated objectives: (1) to understand how the shift toward the technologies of datafication has created an overreliance on datafied (i.e., explicit) knowledge; (2) to comprehend how tacit knowledge (i.e., designerly ways of knowing) is impacted by this increased reliance; (3) to critically explore technologies of datafication through the lens of Walter Benjamin’s work on the phantasmagoria of modernity; and (4) to discover what design can do to safeguard, protect and revive the production of tacit knowledge in a world increasingly dominated by datafication.

To bring greater awareness to what counts as valid knowledge today, this research begins by identifying the principles that define tacit knowledge and datafied ways of knowing. By differentiating these two processes of knowledge creation, this thesis offers a foundation for understanding how datafication not only augments how we know things but also actively directs and dominates what we know. The research then examines how this unchecked faith in datafication has produced a kind of 21st-century phantasmagoria, reinforcing the wholesale belief that technology can solve some of the most perplexing problems we face today. As a result, more tacit processes of knowledge creation are increasingly being overlooked and side-lined. The discussion concludes by offering insights into how the discipline of design is uniquely situated to create a more regenerative relationship with technology, one that supports and honours the unique contributions of designerly ways of knowing.

Fundamental principles framing Grounded Theory are used as a methodological guide for structuring this theoretical research. Given the unprecedented and rapid rate at which technology is being integrated into modern life, this methodological framework provided the flexibility needed to accommodate the evolving contours of this study while also providing the systematic rigour necessary to sustain the integrity of this PhD.

Keywords: datafication, tacit knowledge, phantasmagoria, regeneration, ecology of knowledge

Chris Jones – Designing Designing

A few words from John Thackara (who wrote the afterword to Chris Jones’ “Designing Designing”) on Chris Jones’ mission and philosophy (the full post can be found on Thackara’s blog).

As a kind of industrial gamekeeper turned poacher, Jones went on to warn about the potential dangers of the digital revolution unleashed by Claude Shannon

Computers were so damned good at the manipulation of symbols, he cautioned, that there would be immense pressure on scientists to reduce all human knowledge and experience to abstract form

Technology-driven innovation, Jones foresaw, would undervalue the knowledge and experience that human beings have by virtue of having bodies, interacting with the physical world, and being trained into a culture. 

Jones coined the word ‘softecnica’ to describe ‘a coming of live objects, a new presence in the world’. He was among the first to anticipate that software, and so-called intelligent objects, were not just neutral tools. They would compel us to adapt continuously to fit new ways of living. 

In time Jones turned away from the search for systematic design methods. He realized that academic attempts to systematize design led, in practice, to the separation of reason from intuition and failed to embody experience in the design process.

All of the above ring very true today. The reductionist approach to knowledge, the general disdain for the richness of human knowledge and experience, the widespread contempt for embodied knowledge, the radical separation of reason and intuition, the hidden shaping of a new belief system around the superiority of rational machines, the invisible but violent bending of human friendly ways of living to fit machine dominated new ways of living.

Regulation & Regeneration

In the context of an economic environment deficient in self-regulation (also called wisdom), is there a space for outer regulation in Regenerative spaces?

This question was triggered by Musk’s purchase of Twitter. Since regenerative communities like to use metaphors from nature, consider this one: a global platform that carries a massive chunk of the global public debate, whose algorithms are opaque, and which is known to influence the results of elections, is taken over by a self-professed libertarian billionaire who has clearly indicated that he wants to restore “free speech” (whatever that means) on the platform and who is known to use it for self-serving purposes. It is a bit like a human-made toxic algae bloom spreading over living aquatic habitats and killing everything in them. “Never enough” seems to qualify the initiative appropriately.

So, in this context, I was wondering about regulation. Living systems, when left to their own devices, self-regulate. This is what I would call “inner regulation”, or in human terms, “wisdom”. I don’t think it would be overly pessimistic to say that inner regulation is found in (very) limited quantity right now in our social and economic environments.

So what about outer regulation? Outer regulation functions in many ways, from traditional prescriptive approaches to softer ones that involve sway and incentives. Design as a discipline employs the latter all the time. I think this is an important discussion to have in the context of a community focused on Regenerative Economics, because many projects start with the best of intentions and fall prey to unintended consequences.

And I am also interested to hear from those of us who have direct experience designing regulation frameworks in the complex systems that are purpose-driven online communities. Do we combine incentives for inner regulation with outer regulation, and if so, how? Do we leave it to the invisible hand? I would love to hear different voices chime in on this topic.

Web3 Analysis by Moxie Marlinspike

A must-read blog post by Moxie Marlinspike, founder of Signal, sharing his thoughts on Web3.

The basic argument is that although the Web3 concept promises decentralisation of the internet away from platforms, in practice it has reverted back to Web2 (the centralised internet) with only superficial trappings of decentralisation.

His points:
1) Blockchain and “crypto” (as it’s now commonly used to mean blockchain/cryptocurrency rather than the original meaning, “cryptography”, i.e. encryption) are discussed in terms of “distributed”, “trustless” and “leaderless”. One might think this means that every USER involved is a peer in the chain. But in practice it’s not about USERS, it’s about SERVERS. The distributed nature is based on SERVERS, not on what Moxie calls “clients” (aka YOUR computer, YOUR phone, YOUR device). So the blockchain concept follows distributed, trustless and leaderless methods between SERVERS. The problem is that your phone is not a server. Your computer is not a server. Your devices are not servers. All of your devices are END-USER devices. Very few people will actually set up, run and maintain their own server. It is difficult, requires technical knowledge, is time-consuming, and costs money to maintain.

So what actually ends up happening is that the whole interface of Web3 becomes: Blockchain <-> Servers <-> End-user client devices. And the problem with Web3 so far is that all end-user interaction with the blockchain has consolidated onto very few servers, i.e. a return to the phenomenon of platformisation (which describes how, in the 2010s, Web2 platforms decentralised their APIs throughout the entire web in order to centralise data back to their servers). As of now, most Web3 “decentralised apps” interact with the blockchain through two companies, Infura and Alchemy, which run the servers between the blockchain and end-user client devices. So if you do something with your cryptocurrency wallet in MetaMask, MetaMask basically communicates with Infura or Alchemy, who then communicate with the actual blockchain.

His two sub-complaints to this are:
A) Nobody verifies the authenticity of the information that comes from Infura / Alchemy. There is currently no system in place on the client side (i.e., MetaMask on the user’s device) to ensure that the information Infura / Alchemy return to the end-user is actually what is on the blockchain. Theoretically, if you have 5 ETH in your wallet on the blockchain and you load up MetaMask to query the balance, MetaMask might contact Infura / Alchemy requesting your balance, and Infura / Alchemy could respond that you have 0.1 ETH. MetaMask won’t verify whether that’s actually true; the reply is simply taken at its word.
B) Privacy concerns with routing all requests via Infura / Alchemy. Moxie’s example is: imagine every single web request you make is first routed through Google before being routed to your actual intended destination.
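To make the architecture (and complaint A) concrete, here is a minimal Python sketch of what a wallet-style client does. The provider URL is an invented placeholder, and no real network call is made; `eth_getBalance` is the standard Ethereum JSON-RPC method. The point is that the client builds a request, sends it to a hosted server, and simply believes whatever number comes back:

```python
import json

# Hypothetical provider endpoint (a stand-in for an Infura/Alchemy URL).
RPC_ENDPOINT = "https://mainnet.example-provider.io/v3/your-project-id"

def balance_request(address: str) -> str:
    """Build the JSON-RPC body a wallet would POST to the provider."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": "eth_getBalance",   # standard Ethereum JSON-RPC method
        "params": [address, "latest"],
        "id": 1,
    })

def parse_balance(raw_reply: str) -> float:
    """Turn the provider's hex-encoded wei balance into ether.

    Nothing here verifies the reply against the chain: the client
    trusts the server's word, which is exactly complaint A above.
    """
    wei = int(json.loads(raw_reply)["result"], 16)
    return wei / 10**18

# Whatever the server answers is what the wallet displays:
reply = '{"jsonrpc": "2.0", "id": 1, "result": "0x16345785d8a0000"}'
print(parse_balance(reply))  # prints 0.1 (ether), verified by no one
```

A real wallet would POST this body over HTTPS to the provider’s endpoint; the trust (and privacy) problem is the same either way, since every query flows through that one server.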

2) He gives the example of how NFTs are in fact just URLs stored on the blockchain. These URLs point to servers hosting the actual content. So when you buy an NFT, you only own the URL on the blockchain that DIRECTS to the artwork, NOT the “artwork” itself. The server hosting the image (to which the URL on the blockchain points) remains in control of the artwork. He demonstrated this with an exercise: he made an NFT that looks like a picture when viewed through OpenSea, but looks like a poo emoji when accessed via someone’s crypto wallet.
Even worse, his NFT ended up being deleted by OpenSea. But somehow the NFT ALSO stopped appearing in his wallet. How is this possible? Even if OpenSea deletes the NFT from their website, it should still be on the blockchain, right? Why doesn’t it still show up in his wallet? Because of the centralisation of supposedly “decentralised” apps, his wallet was in fact communicating not with the blockchain directly but through a few centralised platforms (one of which is OpenSea). So when OpenSea deleted his NFT, his wallet also stopped showing it. It doesn’t matter that the NFT still belongs to him on the blockchain if the whole end-user system is divorced from the blockchain and reliant on the middle servers instead.
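The mechanics he describes can be caricatured in a few lines of Python (all names, addresses and file names below are invented for illustration): the chain immutably stores little more than a URL, while the server behind that URL decides what each viewer actually sees:

```python
# What is actually recorded on the blockchain for an NFT: essentially a
# token id, an owner, and a URL pointing at off-chain metadata.
token_record = {
    "token_id": 42,
    "owner": "0x1234...abcd",                                # illustrative address
    "token_uri": "https://api.example-gallery.com/meta/42",  # hypothetical host
}

# The server behind token_uri is NOT on the chain; it can serve different
# content depending on who is asking (as Moxie's experiment demonstrated),
# or stop serving the token entirely.
def serve_metadata(token_id: int, requester: str) -> dict:
    if requester == "opensea":
        return {"name": f"Artwork #{token_id}", "image": "art.png"}
    # ...while a wallet app asking for the same token gets something else.
    return {"name": f"Artwork #{token_id}", "image": "poo-emoji.png"}

print(serve_metadata(42, "opensea")["image"])  # prints art.png
print(serve_metadata(42, "wallet")["image"])   # prints poo-emoji.png
```

The `token_record` never changes, which is the property NFT buyers are sold on; everything a human actually sees, though, is served by `serve_metadata`, which is under the host’s control.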

3) Finally, he says that Web3 as we know it today is really just Web2 with some fancy “Web3” window dressing. And the window dressing actually makes the whole system run worse than if it had just stuck to pure Web2. So why force the window dressing? Simply to sell the whole thing as a next-generation “Web3” package, as part of what he calls a gold-rush frenzy.

The Dark Side of AI

I came across this article in the Financial Times yesterday (March 19 2022) on the Dark Sides of Using AI to Design Drugs by Anjana Ahuja.

Scientists at Collaborations Pharmaceuticals, a North Carolina company using AI to create drugs for rare diseases, experimented with how easy it would be to create rogue molecules for chemical warfare.

As it happens, the answer is VERY EASY! The model took only 6 hours to spit out a set of 40,000 destructive molecules.

And it’s not surprising. As French cultural theorist and philosopher Paul Virilio once said, “when you invent the ship, you invent the shipwreck”. Just like social platforms can be used both to connect with long lost friends AND to spread fake news, AI can be used both to save lives AND to destroy them.

This is a chilling reminder about the destructive potential of increasingly sophisticated technologies that our civilisation has developed but may not have the wisdom to use well.

TEDx Open Mic Follow Up: What Can we Do?

At the end of the open mic yesterday, you asked me a really important question and I do not think that my on-the-spot answer was explicit enough. The question was: “so what can we do?” It is a complex question. I thought about it last night and today and I would like to add a few more words which I hope will shed a clearer light on this complex matter. Here are some avenues for possible answers.

One way to think about an answer is to look at two possible levels of action: the systemic (policy) level and the individual level. At the policy level, regulation is coming. The EU has been the most aggressive so far (GDPR), but it will take time, because this phenomenon is complex and unprecedented, which means that at the moment there are no suitable laws to frame it and we do not even really understand how it works. Regulation will also likely be watered down by powerful networks of influence. In the US, the fact that Facebook and Google WANT the Federal government to come up with regulation clearly shows that they are confident they have the power to lobby and influence the end result.

However, an increasing number of smart voices are putting forward creative propositions that could easily and quickly be put into action. One of them is Paul Romer (co-recipient of the Nobel Memorial Prize in Economic Sciences in 2018), who advocates a Digital Tax (Digital Ad Tax) as an incentive for good behaviour. Compelling initiatives are coming from the arts world as well. Adam Harvey has done great work revealing the hidden datasets that feed the rise of AI-driven facial recognition. Manuel Beltrán’s Cartographies of Dispossession discloses the forms of systematic data dispossession. Taken individually, none of these propositions will make things right, but they all contribute to creating a more sustainable system.

The other level of action is individual; here we ask the question: “what can I do?”

As I said yesterday, I think that what is most urgently required right now is for us to become aware of what this all really means. The different debates around digital platform technology at the moment (privacy, fake news, misinformation, anti-trust, etc.) are all parts of the same whole. The datafication of human experience is not only a technological issue; it is not only a social, economic or political issue; it is an ecological issue. Which means that we are dealing with a complex system. Complex systems present dilemmas rather than problems. Because they do not lend themselves readily to linear solutions, they ask for a change of mindset. They need to be tackled from different angles at the same time; they need time, flexibility and vision; and they demand from us something that humans usually find most challenging: changing our existing patterns of behaviour.

We need to change our behaviours. How? To be honest, as users, we do not have much leverage at the moment. The digital universe we live in emerged from a legal void and has largely been shaped by the major actors of the digital economy to serve their interests. We can’t opt out of the terms and conditions of the social platforms we use every day and keep using their services. Behavioural Economics has revealed what psychologists have long known about human nature: as emotional beings, we are easily manipulated. Behavioural economist Richard Thaler (the 2017 Nobel Prize recipient) and Cass Sunstein call this “nudging” and wrote a book on the topic. For the past 30 years or so, BJ Fogg, of the Behavior Design Lab at Stanford University, has been teaching students how to use technology to persuade people. He calls this discipline by a really interesting name: “captology”. Today, captology helps keep users captive.

However, we are not helpless. We do not have a wide array of choices, but it does not mean we have none. We do have one power. The power of voting with our feet. This means we need to change our behaviours. To say “I can’t leave this platform because everyone is there” is the digital equivalent of saying I will start recycling when the planet is clean. Google is not the only search engine (try DuckDuckGo), Chrome not the only browser (try Firefox), Gmail not the only email provider (try ProtonMail), WhatsApp not the only messaging app (try Signal or Telegram).

We also need to seriously (SERIOUSLY) reassess the personal values that underlie our consumption of digital technologies.

It’s convenient. We are creatures of habit, so convenience has been baked into the design of social tech to make us complacent and lazy. But convenience is not a value that yields the greatest results in terms of ecological sustainability. Today, we understand that our patterns of consumption (food, clothing, etc.) affect our environment. It is also more convenient to throw garbage out of the window; yet, despite the effort required, we recycle. As informed and conscious consumers, we take great pains to consume consciously. And in doing so, we influence the companies that create the products we consume. Why don’t we adopt the same behaviours when it comes to digital?

It’s free. Would you really expect to go to the supermarket, pile food up in a trolley and leave without paying a cent? Would you find it completely natural to enter a [Prada] shop (fill in the name of your preferred brand), pick up a few handbags, a jacket or two and some small leather goods, and leave with a dashing smile on your face and your credit card safely in your bag? Last time I checked, those behaviours were called “stealing” and were punished by law. As a rule of thumb, we need to remember the most important theorem of the digital age: “when it’s free, it’s not free”. Plus, to go back to the environment analogy, we also once considered nature a free resource to be pilfered for our own profit. See how well we did with that? Just to put things in perspective, a paid account with ProtonMail, one of the most secure email services in the world, costs US$50 a year. This is what you would spend on eight mocha Frappuccinos at Starbucks (and ProtonMail is much better for your health). So don’t be shy, pay for sustainable, clean technology! This requires a major change of mindset, but we will all be better off in the end.

In his book “WTF? What’s the Future and Why It’s Up to Us”, Tim O’Reilly says that the master algorithm encoded by the targeted advertising business is optimised to be hostile to humanity. So, one last thought. Today we are still in the social media era, but what about tomorrow? The technologies in the making carry with them a level of intensity, a potential for behaviour modification and control, and a possibility for destruction unequalled in the history of humanity (see Jaron Lanier). It took us 60 years to wake up to the slaughtering of our natural environment; we won’t be given as much time to react to the slaughtering of human experience.

Why I Am Quitting WhatsApp – Part II

A couple of months ago I sent a message that started as follows: “For the past 5 years, I have been doing PhD research on large social platforms. I want to share a few thoughts on why I am quitting WhatsApp before February 8th 2021.” That short memo was written after WhatsApp’s unilateral announcement that they were changing their T&C to accommodate WhatsApp For Business. Since then, after the announcement caused an uproar, Facebook postponed the changes to the T&C to May 15th.

Since I sent the message in January, I have received many questions and comments; some people quit WhatsApp altogether, some opened accounts on other messaging platforms (mostly Telegram and Signal), and some disagreed completely with my analysis and kept using WhatsApp as usual. Based on the rich discussions I have had with friends and acquaintances over the past few months, I want to share a few more thoughts before the change takes effect.

Why Did FB Postpone the Change to May 15th?

Delaying the change gave Facebook time to achieve two things. 

1. First, to rewrite the narrative through a process of habituation.

Habituation is a tactic that has been widely utilised by large social tech corporations in the past 15 years to gradually get us used to increasing encroachments on our privacy. 

The tactic works as follows:

  • Unilaterally announce the change. Ignore signs of disgruntlement. If the announcement causes a major uproar, issue a statement saying that you have been misunderstood and postpone. Stay low for a little while until some other news takes front stage (that shouldn’t take too long). 
  • This gives you time to design a communication campaign that heavily emphasises specific (reassuring) aspects of the issue (e.g., “privacy is built into our DNA”, “we do not have access to the content of your messages”) and completely obfuscates the real truth (we do not care a bit about the content of your messages; we want to track your behaviour, not your words).
  • If opposition is too strong – as in the case of the Facebook Beacon feature, implemented in 2007 and abandoned in 2009 after a class action lawsuit – find other technical ways to achieve the same goal. This is what FB did with Facebook Connect, spreading a piece of code in its Like button that sends information back to FB when you visit any unrelated site anywhere on the web.

2. Second, it allowed Facebook to think through and implement a campaign of misinformation (I am using the word with purpose). Disinformation is the phenomenon of spreading lies. Misinformation is the phenomenon of spreading half-truths to create confusion or take control of the main narrative. WhatsApp is emphasising that the CONTENT of our messages is and will always remain private. What WhatsApp is not saying is that it could not care less about that content, because the data it hungers for is metadata.

METADATA & PRIVACY

What’s Metadata? 

It is data about data, or data about your behaviour (not your words) when you are online. A few (non-exhaustive) examples are: who you know (your contacts), who you message, when and how often, who they know, who they message etc. But also, your device and user ID, your name, your location, your email, and all sorts of user and device content. In short, any activity that can be tracked and linked to you.
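As a small illustration (the sample log below is invented), even if every message body is end-to-end encrypted and unreadable, the metadata alone is enough to reconstruct who talks to whom, how often, and when:

```python
from collections import Counter

# Invented sample data: an observer sees only routing metadata; the
# content field is opaque ciphertext.
message_log = [
    {"from": "alice", "to": "bob",   "time": "2021-03-01T08:02", "content": "<encrypted>"},
    {"from": "alice", "to": "bob",   "time": "2021-03-01T23:40", "content": "<encrypted>"},
    {"from": "alice", "to": "carol", "time": "2021-03-02T12:15", "content": "<encrypted>"},
]

# Without reading a single message, the social graph and its intensity
# fall out of the metadata:
edges = Counter((m["from"], m["to"]) for m in message_log)
print(edges.most_common(1))  # [(('alice', 'bob'), 2)]
```

Add timestamps, locations and device identifiers, and the picture of a life sharpens considerably; that is what “it’s only metadata” actually means.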

The Gold of the Internet

In tech parlance, metadata is referred to by the euphemism “breadcrumbs” (a noun that deceptively emphasises the innocuous and insignificant nature of this type of data). Google discovered early on that users left collateral behavioural data (the breadcrumbs) behind when they searched, and that the breadcrumbs could be aggregated and fed into machine learning to get really deep insights into who we are. Those insights go well beyond what we think we reveal when we live our life online. This collateral behavioural data is the “gold” of the internet. 

The discovery above marked the birth of “targeted advertising”, another euphemism for the enterprise of surveillance that has grown exponentially and largely unchecked since the early 2000s. It’s not what you say online (content), it’s the traces you leave that count (metadata). For more, see Shoshana Zuboff under “resources” below (if you read only one book on this topic, let it be hers).

A Narrow Definition of Privacy 

When we, users, think of privacy, we think about the content of the information we share online. But a quick look at WhatsApp’s Privacy Policy (see link below) makes two things clear:

  1. metadata (not content) is the core commodity being harvested. 
  2. we have no say in what and how our data is collected, how it is treated and what it is being used for.

Under the “Information We Collect” heading, there are 3 paragraphs. Interestingly, the first one is called “Information You Provide”, hinting that the other two are information that you do NOT (and may not want to) provide. Indeed, they are: “Automatically Collected Information” and “Third Party Information”. 

Here is an excerpt: “Device and Connection Information. We collect device-specific information when you install, access, or use our Services. This includes information such as hardware model, operating system information, browser information, IP address, mobile network information including phone number, and device identifiers.” It is really interesting to read through https://www.whatsapp.com/legal/privacy-policy if you have not already done so.

NB: WhatsApp’s Terms of Service and Privacy Policy change over time. I posted the pdf and some excerpts from the document as of 27 March 2021 in a post above.

REAL IMPLICATIONS OF WHATSAPP FOR BUSINESS

What’s Really Behind WhatsApp For Business?

As you probably know, WhatsApp has already been sharing metadata from your chats with its parent company Facebook for some time. The changes in the T&C are meant to allow WhatsApp For Business (WFB) to get off the ground.

What does it mean? On the face of it, WFB is merely a tool to help businesses “better communicate” with us, their customers. Officially, Facebook tells us that businesses will be able to chat with you, give you information on their products, follow up on your purchases and give you “better” customer service. But if you think about it, most of that could already be achieved before WFB. By opening the possibility of transactions on WhatsApp, WFB will allow Facebook to collect extraordinarily granular collateral behavioural data on aspects of our lives it previously had only indirect, limited or no access to.

What Business?

One thing you may want to contemplate: what does “business” mean in “WhatsApp For Business”? Retail brands certainly, and this seems innocuous enough (although what seems innocuous to you is not so innocuous once it goes through the analytical capacities of Big Data). But “business” may also cover health-related communications and transactions. You may not mind Facebook peeking into your interactions with a fashion brand, but how would you like it to have granular access to your exchanges with businesses selling health devices, with health practitioners (which may include how much and how often you spend on treatments), insurers, money lenders, financial institutions and more? Do you really think that Facebook ought to have access to the details of your credit card and bank account statements? A statement records who you paid, when, and how much, so essentially FB would have access to the equivalent of your statements.

IMPLICATIONS

Commodification of users 

I sometimes hear people say: “if you do not pay for a service online, you are the product”. It is true in spirit, but not completely. The massive enterprise of data extraction shows that we (users) have become not the product but the cheap commodity. From us, FB and Google extract the raw material they use to create the valuable products they sell, not only to advertisers but, to be honest, to anyone willing to pay for them, whatever their intention. In other words, we are the pigs, not the Iberico ham.

An Enterprise of Territorialisation

Another aspect of the enterprise of digital surveillance can be summed up in two words: “never enough”. The internal competitive logic of the targeted advertising model drives these corporations to expand ceaselessly in order to acquire ever more predictive collateral behavioural data. Since we still live (slivers of) our life offline, it is not enough to surveil our lives online, so the tactics used online are seeping into the physical world.

Our bodies are the next frontier (in January 2021, Google paid $2.1 billion for Fitbit, a company that makes bracelets that record health and fitness data). Sensors, wearables, our bodies have become territories to conquer. How do you claim that territory? By creating new needs and new habits. 

Wearables and sensors in the physical world (think “smart” cities) can detect changes that happen in your body below the dermal layer. How is that for an invasion of privacy? Who gets all that data about your heart rate and the number of steps you take and how many times you wake up at night? Where does that information go? What happens to that data after it is collected? Who decides what to do with it? Who benefits in the end? Do you know? And more importantly do you mind? 

AN ECOLOGICAL CRISIS

Why It Is Urgent to Take Action 

We got where we are today because the unprecedented nature of these developments obfuscated what was really going on (see Zuboff for more on this). In 2021, though, it is hard to ignore that we have reached a point where we can’t afford to be complacent.

In 1962, environmentalist Rachel Carson wrote Silent Spring, a compelling call for the world to wake up to the large-scale slaughtering of our environment through the commodification of nature by large corporations with unmatched lobbying power. Today, we are doing to human experience what we started doing to nature 60 years ago.

Reconfigurations of Power

First, it is an ecological crisis in the distribution of power. The digital developments of the past 20 years have created massive asymmetries of knowledge. In other words, Google and Facebook know heaps about us, but we know very little about them. The algorithms that orchestrate the online life of several billion people across the planet are considered proprietary trade secrets. These massive asymmetries of knowledge lead to massive asymmetries of power.

This is compounded by the void in the legal framework surrounding these issues. An army of in-house lawyers unilaterally controls our contractual relationship with platforms which, in many ways, have become public spaces. The only alternative to full acceptance is not to use the service. And a second army of “communication experts” and lobbyists carefully crafts the official narrative (the definition of privacy is only one example).

Crisis of the Ecology of Knowledge

But there is another more insidious and less talked about implication. The commodification of users is leading to a major epistemological catastrophe. The reductionist approach to human experience (i.e., to quantify purely qualitative aspects of our lives, and discard what cannot be quantified) is corrupting what we know, and how we know what we know. It is an ecological crisis of knowledge of the sort only seen once in a millennium (more about that in another post). 

MISCONCEPTIONS

Everyone on the Web Collects Data, so Why Bother?

Now, coming back to where we started, I hope I have offered some avenues of reflection on why quitting WhatsApp may not be such a bad idea after all. However, I want to address one concern that I have heard many times in the past couple of months.

Some people told me: “if I move to Telegram or Signal, they will also collect my data, so what’s the point?” That’s a very valid question. While it is true that data collection is widespread online (we have all heard about cooking apps collecting your device ID and geolocation), not everything is created equal, so we need to use our power of discrimination.

First, Telegram and Signal collect much less metadata than WhatsApp (or Facebook Messenger). In fact, Signal only collects your phone number. You can easily check this. Just as you brush your teeth in the morning, it is a good habit to check what metadata the apps you use collect from you. But that’s not all.

Exercise Our “WE” Muscle

Second, we need to exercise our “we” muscle instead of solely relying on our “I” muscle. As I mentioned above, this is an ecological issue. Self-centred thinking is not a luxury we can afford anymore. If social media have taught us anything, it is that we are intricately interconnected and that the decisions we make individually have global consequences. This is true for the environment; this is also true for the digital environment. 

Be selective in whose business you patronise, because as you do, you are participating in creating the world that we will leave to the next generations. In other words, do not underestimate the positive ripple effects from an individual decision to use an equivalent app outside of the Facebook domain (and that applies to Google as well by the way). As I said in my previous post, if you leave, some people in your network will also leave. The network effect works in favour of platform monopoly but it can also work against it. 

Final Word: Personal Convenience Is Not a Judicious Ground for Action

One last word. Some will decide to remain on WhatsApp not because they trust Facebook but simply for convenience. This is one option. Social platforms are built for convenience (in design parlance it is called "user experience", or UX for short), because they know that convenience drives users' lethargy. By the same argument, it is also more convenient to throw garbage out of your car window, release harmful chemicals into rivers rather than into treatment facilities, and throw plastic bottles into the ocean rather than into recycling plants. Personal convenience is not always the most judicious ground for action.

RESOURCES

I am starting a Datafication & Technology blog to keep writing about these issues (www.datafication.space). Post, comment or send me questions (datafication.space AT protonmail.com) on topics you would like to see treated.

GENERAL RESOURCES

SPECIFIC TOPICS

  • To read more on the historical developments, the mechanisms and ideological premises of Surveillance Capitalism, check "The Age of Surveillance Capitalism" by Shoshana Zuboff, Professor Emerita at Harvard Business School, who has researched technology since the 1970s. The book has made such a splash since it came out that it now has a Wikipedia entry (https://en.wikipedia.org/wiki/The_Age_of_Surveillance_Capitalism). Zuboff gave a string of interviews, and you can also easily find those online.

Why I am quitting WhatsApp – Part I (13 January 2021)

For the past 5 years, I have been doing PhD research on large social platforms. I want to share a few thoughts on why I am quitting WhatsApp before February 8th 2021.

As you have probably heard by now, WhatsApp is changing its Terms & Conditions as of 8th of February, 2021. As is customary with Facebook companies, the move is unilateral: if users do not accept the new terms, they will lose access to their account. There are two things you want to be very clear about when you decide whether to remain on the platform: what "data" and "privacy" mean, and why WhatsApp is making the change.

WHAT DO DATA & PRIVACY MEAN?

This is not the first assault on our privacy in the race to control the use of our data. WhatsApp has been sharing information with Facebook, its parent company, for a while. However, this change opens the door to major, more intrusive and more unilateral changes that violate the privacy and the dignity of its users. We all know about the privacy issues that come with social platforms. We all know that FB collects our data. But the real questions are: what is this data we are talking about? FB claims that "privacy" is built into their DNA, but how do they define privacy?

Over the years, FB has been very careful to maintain ambiguity about this and has adopted a very narrow definition. They equate data with CONTENT, i.e. WHAT we say in the messages we exchange, and equate privacy with not having access to that content. They say that WhatsApp supports end-to-end encryption, so even WhatsApp itself does not have access to the content of what we write, and therefore they respect our privacy.

This is a FALLACY. The data that FB and WhatsApp want is not the content (what we write), but data on our behaviour when we are on their platform: where we are, who we contact, how long we stay on the platform, whom we message most, at what time, who is in our network, etc. This data is called "breadcrumbs", because it is a kind of side effect of using their main service. But it is the most valuable data because, when it is aggregated and analysed at scale (what is called Big Data), it gives Facebook very deep insights into you: who you are, your personality, what presses your buttons, your body, your health, your mental state, your wishes, and many things that you would not want other people to know about you, and rightfully so.

I often hear people say that "Facebook sells our data", but Facebook says they don't, and it is true! They do not sell our data; they sell something much more valuable than our raw data. They use our data as raw material to create computer-ready products that they sell on their platform not only to advertisers but to whoever wants to buy them (governments, extremists, third parties trying to influence elections… you name it). In other words, we, the users, have become not the product, but the raw material, the cheap commodity. We are called "users", but a better name would be the "USED". (If you want to know more about this, read Shoshana Zuboff's "The Age of Surveillance Capitalism", a powerful eye opener!)

WHY THE CHANGE?

So why is WhatsApp making the changes now? 

First, because Facebook wants to monetise WhatsApp. Think of it as an expensive piece of real estate with just a small house on it. Even if the house is quite nice, that piece of land is not optimised. Facebook wants to create a universe so all-encompassing that users never need to leave it. It wants to be the "WeChat of the West", a place where, ultimately, you will be able to socialise, communicate with your friends, share moments, work out, buy stuff, send money, follow your favourite brands or celebrities, fall in love, manage your health and bank accounts or credit cards, etc. In other words, Facebook wants you to live your life through Facebook. Why? Well, data of course!

Second, because Facebook wants to integrate WhatsApp so deeply into the Facebook family that it will soon be impossible (or very difficult) for regulators to break them up. Think of it as baking a cake: once the flour and the eggs and the sugar have become the cake, you can't get the eggs back. Facebook is under scrutiny from governments in several countries. The EU already adopted the GDPR, a regulation on data protection and privacy, in 2016. In the US, there is strong bipartisan concern, and Facebook is being investigated for abuse of dominance and anti-competitive conduct, with the idea of possibly breaking it up. But as I said, once the cake is baked, it is really tricky to get the eggs back…

SO WHAT SHALL I DO?

In the past week or so, many people have downloaded Telegram and Signal, two WhatsApp-like apps that provide instant messaging services. But I also hear a lot of people say they will stay on WhatsApp even though they downloaded the new apps. 

“I will stay on WhatsApp for now but I will quit later.” This is exactly what Facebook is counting on, because they know one fact about human psychology: we are essentially creatures of habit, and once we are used to doing something in a certain way, it takes a lot of effort to change it (ask smokers how easy they find it to stop smoking). Right now, there is a knee-jerk reaction: we download Signal or Telegram and it makes us feel better and safer. But in one month's time, when another breaking story occupies our (limited) mental space, everything will go back to the way we have always done it, i.e. "I'll WhatsApp you!"

“I can’t quit WhatsApp, all my friends/clients are there.” True, with over 2 billion users, your friends, family, clients and their cats and dogs are probably all on WhatsApp. But this argument is the snake biting its own tail: if people leave WhatsApp, people will leave WhatsApp. I am quite certain you would be surprised if you checked how many people around you already have an account on another major messaging app, or are ready to download one and make the switch! (I certainly was.)

So for all those reasons, it’s Farewell WhatsApp for me. And I am looking forward to life without Facebook with curiosity and anticipation! 
