At the end of the open mic yesterday, you asked me a really important question and I do not think that my on-the-spot answer was explicit enough. The question was: “so what can we do?” It is a complex question. I thought about it last night and today and I would like to add a few more words which I hope will shed a clearer light on this complex matter. Here are some avenues for possible answers.
One way to think about an answer is to distinguish two levels of action: the systemic (policy) level and the individual level. At the policy level, regulation is coming. The EU has been the most aggressive so far (GDPR), but progress will take time because this phenomenon is complex and unprecedented, which means that at the moment there are no suitable laws to frame it and we do not even really understand how it works. Regulation will also likely be watered down by powerful networks of influence. In the US, the fact that Facebook and Google WANT the Federal government to come up with regulation clearly shows that they are confident they have the power to lobby and influence the end result.
However, an increasing number of smart voices are putting forward creative propositions that could easily and quickly be put into action. One of them, for example, is Paul Romer (co-recipient of the Nobel Memorial Prize in Economic Sciences in 2018), who advocates a digital ad tax as an incentive for good behaviour. Compelling initiatives are coming from the arts world as well. Adam Harvey has done great work revealing the hidden datasets that feed the rise of AI-driven facial recognition. Manuel Beltrán’s Cartographies of Dispossession discloses the forms of systematic data dispossession. Taken individually, none of these propositions will make things right, but they all contribute to creating a more sustainable system.
The other level of action is individual; here we ask the question: “what can I do?”
As I said yesterday, I think that what is most urgently required right now is for us to become aware of what all this really means. The different debates around digital platform technology at the moment (privacy, fake news, misinformation, antitrust, etc.) are all parts of the same whole. The datafication of human experience is not only a technological issue, nor only a social, economic or political one: it is an ecological issue. Which means that we are dealing with a complex system. Complex systems present dilemmas rather than problems. Because they do not lend themselves readily to linear solutions, they ask for a change of mindset. They need to be tackled from different angles at the same time; they need time, flexibility and vision; and they demand something that humans usually find most challenging: changing our existing patterns of behaviour.
We need to change our behaviours. How? To be honest, as users, at the moment, we do not have much leverage. The digital universe we live in has emerged from a legal void and has largely been shaped by the major actors of the digital economy to serve their interests. We cannot opt out of the terms and conditions of the social platforms we use every day and still keep using their services. Behavioural economics has revealed what psychologists have known about human nature for a long time: as emotional beings, we are easily manipulated. Behavioural economist Richard Thaler, the 2017 Nobel laureate, and legal scholar Cass Sunstein call this “nudging” and wrote a book on the topic. For over two decades, BJ Fogg, of the Behavior Design Lab at Stanford University, has been teaching students how to use technology to persuade people. He gave this discipline a really interesting name: “captology”. Today, captology helps keep users captive.
However, we are not helpless. We do not have a wide array of choices, but that does not mean we have none. We do have one power: the power of voting with our feet. This means changing our behaviours. To say “I can’t leave this platform because everyone is there” is the digital equivalent of saying “I will start recycling when the planet is clean.” Google is not the only search engine (try DuckDuckGo), Chrome is not the only browser (try Firefox), Gmail is not the only email provider (try ProtonMail), and WhatsApp is not the only messaging app (try Signal or Telegram).
We also need to seriously (SERIOUSLY) reassess the personal values that underlie our consumption of digital technologies.
It’s convenient. We are creatures of habit, so convenience has been baked into the design of social tech to make us complacent and lazy. But convenience is not a value that yields the greatest results in terms of ecological sustainability. Today, we understand that our patterns of consumption (food, clothing, etc.) affect our environment. Throwing garbage out of the window is more convenient too, yet, despite the effort required, we recycle. As informed consumers, we take great pains to consume consciously. And in doing so, we influence the companies that create the products we consume. Why don’t we adopt the same behaviours when it comes to digital?
It’s free. Would you really expect to go to the supermarket, pile food up in a trolley and leave without paying a cent? Would you find it completely natural to enter a [Prada] shop (fill in the name of your preferred brand), pick up a few handbags, a jacket or two and some small leather goods, and leave with a dashing smile on your face and your credit card safely in your bag? Last time I checked, those behaviours were called “stealing” and were punished by law. As a rule of thumb, we need to remember the most important theorem of the digital age: “when it’s free, it’s not free”. Plus, to go back to the environmental analogy, we also long treated nature as a free resource to be plundered for our own profit. See how well we did with that? Just to put things in perspective, a paid account with one of the most secure email providers available, ProtonMail, costs US$50 a year. That is what you would spend on eight mocha Frappuccinos at Starbucks (and ProtonMail is much better for your health). So don’t be shy, pay for sustainable, clean technology! This requires a major change of mindset, but we will all be better off in the end.
In his book “WTF? What’s the Future and Why It’s Up to Us”, Tim O’Reilly says that the master algorithm encoded by the targeted advertising business is optimised to be hostile to humanity. So, one last thought. Today, we are still in the social media era, but what about tomorrow? The technologies in the making carry with them an intensity, a potential for behaviour modification and control, and a possibility for destruction unequaled in the history of humanity (see Jaron Lanier). It took us 60 years to wake up to the slaughtering of our natural environment; we will not be given as much time to react to the slaughtering of human experience.