Datafication, Phantasmagoria of the 21st Century

Category: Reductionism

Predictive AI Fails At… Predicting

You would think that, with all the hype around “AI” (in quotation marks because the word has become a catch-all, covering a whole range of poorly defined realities) and our civilisation’s enduring blind faith in the omniscience of digital technologies, the technology would at least perform its function remarkably well.

I mean wouldn’t you?

Well, it seems not.

The Markup is “a nonprofit newsroom that challenges technology to serve the public good.” (Check here if you want to know more; I have been following them for years, and they do remarkable work.)

This is what they found out (see below).

A software company sold a New Jersey police department an algorithm that was right less than 1% of the time. Read the whole article here.

It is NOT a blip. It is NOT an exception, an anomaly, a special case. It is another day in the office for predictive AI. And those issues will NOT go away with the next model iteration.

They are here to stay because they are an intrinsic feature of the technology. As a technology of quantification, AI (or whatever name we want to give the Digital) does NOT and, in fact, can NOT reliably handle qualitative aspects of life.

This is why the likes of Facebook employ human content moderators to detect and remove gore, violence and generally harmful content from the platform. (By the way, those moderators are often sub-contracted, so they do not appear in the main companies’ annual reports, and their contracts contain a clause stating they will not sue if they get PTSD on the job, which they often do. Read here about what happened when they did.)

So, despite all the hype, “a rose by any other name would smell as sweet.” When it comes to the social, predictive AI mostly fails at predicting.

The Nature of (Digital) Reality

Bruce Schneier’s blog “Schneier on Security” often presents thought-provoking pieces about the digital. This one directly relates to the core question of my PhD about the shifting nature of reality in the digital age.

A piece worth reading. You can also browse through the comments on his blog.

Schneier’s self intro on his blog: “I am a public-interest technologist, working at the intersection of security, technology, and people. I’ve been writing about security issues on my blog since 2004, and in my monthly newsletter since 1998. I’m a fellow and lecturer at Harvard’s Kennedy School, a board member of EFF, and the Chief of Security Architecture at Inrupt, Inc.”

Chris Jones – Designing Designing

A few words from John Thackara (who wrote the afterword to Chris Jones’ “Designing Designing”) on Jones’ mission and philosophy (the full post can be found on Thackara’s blog):

“As a kind of industrial gamekeeper turned poacher, Jones went on to warn about the potential dangers of the digital revolution unleashed by Claude Shannon.

Computers were so damned good at the manipulation of symbols, he cautioned, that there would be immense pressure on scientists to reduce all human knowledge and experience to abstract form.

Technology-driven innovation, Jones foresaw, would undervalue the knowledge and experience that human beings have by virtue of having bodies, interacting with the physical world, and being trained into a culture.

Jones coined the word ‘softecnica’ to describe ‘a coming of live objects, a new presence in the world’. He was among the first to anticipate that software, and so-called intelligent objects, were not just neutral tools. They would compel us to adapt continuously to fit new ways of living.

In time Jones turned away from the search for systematic design methods. He realized that academic attempts to systematize design led, in practice, to the separation of reason from intuition and failed to embody experience in the design process.”

All of the above rings very true today: the reductionist approach to knowledge, the general disdain for the richness of human knowledge and experience, the widespread contempt for embodied knowledge, the radical separation of reason and intuition, the hidden shaping of a new belief system around the superiority of rational machines, the invisible but violent bending of human-friendly ways of living to fit machine-dominated ones.

Musings on Reductionism

A musing on reductionism, the type of thinking at the root of datafication, after an exchange with a friend on the topic.

He mentioned, rightly I believe, that there is a place for reductionist thinking: it is useful and even essential for many tasks. The problem starts when we think of it as the path to truth.

I agree.

My issue with reductionism is not that it is useless or “bad” (for lack of a better word) in and of itself, but that, in the datafied society of the early 21st century, where algorithms have taken over decision-making in many areas of life, it has become (or is fast becoming) the only valid source of knowledge. What cannot be reduced to computer data is for the most part abandoned. In other words: be subjugated or be forgotten.

As a society, we bask in the warmth of the belief in the progress inherent to the digital revolution, and we congratulate ourselves for having left a boring 20th century behind. But the type of thinking underpinning the digital “revolution” comes straight from the 19th century, so where is the revolution? It is the pinnacle of the logico-linear, engineering type of thinking. I have nothing against engineers; they have an important place and role to play in our societies. But when this type of thinking colonises all areas of life and all dimensions of humaneness, and suppresses other ways of seeing and being in the world, I say: Houston, we have a problem.

Neil Postman (one of my favourite authors in the field of media studies, and in many ways a visionary) touches upon this idea in his book “Technopoly: The Surrender of Culture to Technology”, a must-read! In a technopoly, the ideology underlying the technological tools becomes self-justifying, and it is the technology that provides guidance to society instead of the other way round.

“Technopoly: The Surrender of Culture to Technology is a book by Neil Postman published in 1992 that describes the development and characteristics of a ‘technopoly’. He defines a technopoly as a society in which technology is deified, meaning ‘the culture seeks its authorisation in technology, finds its satisfactions in technology, and takes its orders from technology’. It is characterised by a surplus of information generated by technology, which technological tools are in turn employed to cope with, in order to provide direction and purpose for society and individuals.” [Wikipedia]