Datafication, Phantasmagoria of the 21st Century


Datafied. A Critical Exploration of the Production of Knowledge in the Age of Datafication

This is the abstract of my PhD thesis, submitted in August 2022.

As qualitative aspects of life become increasingly subjected to the extractive processes of datafication, this theoretical research offers an in-depth analysis of how these technologies skew the relationship between tacit and datafied ways of knowing. Given the role tacit knowledge plays in the design process, this research seeks to illuminate how technologies of datafication are impacting designerly ways of knowing and what design can do to recalibrate this imbalance. In particular, this thesis is built around four interrelated objectives: (1) to understand how the shift toward technologies of datafication has created an overreliance on datafied (i.e., explicit) knowledge; (2) to comprehend how tacit knowledge (i.e., designerly ways of knowing) is impacted by this increased reliance; (3) to critically explore technologies of datafication through the lens of Walter Benjamin’s work on the phantasmagoria of modernity; and (4) to discover what design can do to safeguard, protect and revive the production of tacit knowledge in a world increasingly dominated by datafication.

To bring greater awareness to what counts as valid knowledge today, this research begins by identifying the principles that define tacit and datafied ways of knowing. By differentiating these two processes of knowledge creation, this thesis offers a foundation for understanding how datafication not only augments how we know things but also actively directs and dominates what we know. The research then examines how this unchecked faith in datafication has led to a kind of 21st-century phantasmagoria, reinforcing the wholesale belief that technology can be used to solve some of the most perplexing problems we face today. As a result, more tacit processes of knowledge creation are increasingly overlooked and side-lined. The discussion concludes by offering insights into how the discipline of design is uniquely situated to create a more regenerative relationship with technology, one that supports and honours the unique contributions of designerly ways of knowing.

Fundamental principles framing Grounded Theory are used as a methodological guide for structuring this theoretical research. Given the unprecedented and rapid rate at which technology is being integrated into modern life, this methodological framework provided the flexibility needed to accommodate the evolving contours of the study while also ensuring the systematic rigour required to sustain the integrity of this PhD.

Keywords: datafication, tacit knowledge, phantasmagoria, regeneration, ecology of knowledge

Web3 Analysis by Moxie Marlinspike

A must-read blog post by Moxie Marlinspike, founder of Signal, sharing his thoughts on Web3.

The basic argument is that although the Web3 concept is about decentralising the internet away from platforms, in practice it has simply reverted back to Web2 (the centralised internet) with only superficial trappings of decentralisation.

His points:
1) Blockchain and “crypto” (as the term is now commonly used to mean blockchain/cryptocurrency rather than its original meaning, “cryptography”, i.e. encryption) are discussed in terms of being “distributed”, “trustless” and “leaderless”. One might think this means that every USER involved is a peer in the chain. But in practice it’s not about USERS, it’s about SERVERS. The distributed nature is based on SERVERS, not what Moxie calls “clients” (aka YOUR computer, YOUR phone, YOUR device). So the blockchain is supposed to be distributed, trustless and leaderless between SERVERS. The problem is that your phone is not a server. Your computer is not a server. Your devices are not servers. All of your devices are END-USER devices. Very few people will actually set up, run and maintain their own server: it’s difficult, requires technical knowledge, is time-consuming, and costs money.

So what actually ends up happening is that the whole Web3 stack turns into: Blockchain <-> Servers <-> End-user client devices. And the problem with Web3 so far is that all end-user interaction with the blockchain has consolidated onto very few servers, i.e. a return to the phenomenon of platformisation (which describes how, in the 2010s, Web2 platforms decentralised their APIs across the entire web in order to centralise data back onto their own servers). As of now, most Web3 “decentralised apps” interact with the blockchain through two companies, Infura and Alchemy, which run the servers sitting between the blockchain and end-user client devices. So if you do something with your cryptocurrency wallet in MetaMask, MetaMask basically talks to Infura or Alchemy, which then talk to the actual blockchain.
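To make this concrete, here is a minimal sketch of how a wallet-style app typically reads the chain: a plain HTTPS JSON-RPC request to a hosted provider, not to a node of its own. This is not MetaMask’s actual code; the endpoint URL, project id and address are illustrative placeholders.

```typescript
// Sketch only: a wallet-style balance lookup that goes through a hosted
// RPC provider (Infura-style URL) rather than a node the user runs.
const RPC_URL = "https://mainnet.infura.io/v3/<your-project-id>"; // placeholder endpoint

async function getBalance(address: string): Promise<bigint> {
  const response = await fetch(RPC_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "eth_getBalance",   // standard Ethereum JSON-RPC method
      params: [address, "latest"],
    }),
  });
  const { result } = await response.json();
  // Whatever hex string the provider returns is accepted as-is:
  // nothing here proves it matches the actual state of the chain.
  return BigInt(result); // balance in wei
}

// Illustrative call with the zero address as a stand-in.
getBalance("0x0000000000000000000000000000000000000000").then(console.log);
```

The point of the sketch is simply that the app’s entire view of the “decentralised” blockchain is whatever this one HTTP endpoint chooses to return.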

His two sub-complaints about this are:
A) Nobody is verifying the authenticity of the information that comes back from Infura / Alchemy. There is currently no mechanism on the client side (i.e. in MetaMask on the user’s device) to ensure that what Infura / Alchemy returns to the end-user is actually what is on the blockchain. Theoretically, if you have 5 ETH in your wallet on the blockchain and you load up MetaMask to query your balance, MetaMask might ask Infura / Alchemy for the balance and they could respond that you have 0.1 ETH. MetaMask won’t verify whether that’s true; the response is simply taken at its word.
B) Privacy concerns with routing all requests via Infura / Alchemy. Moxie’s example: imagine every single web request you make being routed through Google first, before being passed on to its actual intended destination.
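To illustrate point B, here is a toy sketch of what a middleman RPC relay is in a position to observe on every request that passes through it. The upstream URL and port are assumptions for illustration; this is not Infura’s or Alchemy’s real infrastructure.

```typescript
// Toy JSON-RPC relay: logs what a middleman can see, then forwards the
// request unchanged. Purely illustrative.
import { createServer } from "node:http";

const UPSTREAM_RPC = "https://example-upstream-node.invalid"; // placeholder upstream node URL

createServer(async (req, res) => {
  // Read the raw JSON-RPC request body.
  let body = "";
  for await (const chunk of req) body += chunk;
  const rpc = JSON.parse(body);

  // The relay sees the caller's IP address, the RPC method, and the wallet
  // address inside the params -- i.e. who is asking about which account, and when.
  console.log(req.socket.remoteAddress, rpc.method, rpc.params);

  // Forward the request upstream and echo the answer back unmodified.
  const upstream = await fetch(UPSTREAM_RPC, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  res.setHeader("Content-Type", "application/json");
  res.end(await upstream.text());
}).listen(8545);
```

Every wallet action flows through a relay like this, which is why Moxie compares it to routing all web traffic through a single company first.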

2) He gives the example of how NFTs are in fact just URLs stored on the blockchain, and these URLs point to servers hosting the actual content. So when you buy an NFT, you only own the URL on the blockchain that POINTS to the artwork, NOT the “artwork” itself. He did an exercise where he made an NFT that looks like a picture when viewed through OpenSea, but looks like a poo emoji when accessed via someone’s crypto wallet. This is possible because the server hosting the image (the one the URL on the blockchain points to) is ultimately in control of the artwork.
Even worse, his NFT ended up being deleted by OpenSea. But somehow the NFT ALSO stopped appearing in his wallet. How is this possible? Even if OpenSea deletes the NFT from their website, it should still exist on the blockchain, right? Why doesn’t it still show up in his wallet? Because, due to this centralisation of supposedly “decentralised” apps, his wallet is in fact communicating not with the blockchain directly but through a few centralised platforms (one of which is OpenSea). So when OpenSea deleted his NFT, his wallet also stopped showing it. It doesn’t matter that the NFT still belongs to him on the blockchain if the whole end-user system is divorced from the blockchain and reliant on the servers in the middle.
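A rough sketch of why the switcheroo in point 2 works: the NFT’s on-chain record is just a URL, and the ordinary web server behind that URL decides what each viewer gets back. All names, URLs and the header check below are illustrative assumptions, not Moxie’s actual implementation (he reportedly varied the response by requester, which is approximated here with a referer check).

```typescript
// Toy metadata server for an NFT: serves different content depending on
// who appears to be asking. Demonstrates that the server, not the chain,
// controls what the "artwork" is.
import { createServer } from "node:http";

createServer((req, res) => {
  // Crude, illustrative way to guess whether the marketplace is asking.
  const viaMarketplace = (req.headers.referer ?? "").includes("opensea.io");

  res.setHeader("Content-Type", "application/json");
  if (viaMarketplace) {
    // What the listing page sees: the "real" artwork.
    res.end(JSON.stringify({ name: "My NFT", image: "https://example.com/artwork.png" }));
  } else {
    // What a wallet (or anyone else) sees: something entirely different.
    res.end(JSON.stringify({ name: "My NFT", image: "https://example.com/poop-emoji.png" }));
  }
}).listen(8080);
```

Nothing on the blockchain changes between the two responses; only the URL is on-chain, and the server behind it can answer however it likes, or stop answering altogether.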

3) Finally, he argues that Web3 as we know it now is really just Web2 with some fancy “Web3” window dressing. And the window dressing actually makes the whole system run worse than if it had just stuck to pure Web2. So why force the window dressing? Simply to sell the whole thing as a next-generation package amid what he calls a gold-rush frenzy over Web3.

Datafication as Phantasmagoria

My main argument is that datafication is the phantasmagoria of the 21st century, in the same way that mass consumerism was the phantasmagoria, or the dream, of the 20th century. My inspiration is Walter Benjamin’s work The Arcades Project.

I am defining datafication as the quantification of qualitative aspects of life, i.e. human experience generally.

I am arguing that this phantasmagoria is creating a massive epistemological shift towards a more impoverished type of knowledge, because in this massive enterprise of quantification, whatever cannot be turned into computer data, in other words whatever cannot be quantified, is simply abandoned. And now that algorithms are making decisions in most areas of life, such as education, finance, justice and so on, this quantification has a direct impact on the systems we live in.

More on this later…