Making sense of the information age

Print edition: November 08, 2019

The Auschwitz Memorial’s official Twitter account responding to the product on Amazon.


The book attempts to meticulously decode the paradox of increasing discord and dissonance despite unfettered access to information and ease of connectivity.

We live in paradoxical times, particularly from the perspective of information and communication technology. The current age of peak information, both in terms of access and abundance, has also seemingly produced a fractured global society. A noticeable increase in discord and dissonance despite, or perhaps because of, unfettered access to information and ease of connectivity is a paradox that requires careful consideration from diverse points of view.

James Bridle’s New Dark Age is a meticulous and impassioned book that attempts to decode this difficult paradox. Early on, the author says, “We know more and more about the world, while being less and less able to do anything about it”, a helplessness most of us can recognise. It is the helplessness we grapple with while scrolling through streams of social media comments or Twitter threads populated by an increasingly prevalent breed of Internet personalities (sometimes called trolls); when realising the futility of presenting facts and debating within the invisible, isolated echo chambers of the Internet; or when faced with product placements that seem either read off our thoughts or obnoxiously disconnected and distasteful.

In a broader context, the book is also about how the complex technologies that enable the Internet-driven networked world are obfuscating our understanding, hampering our ability to think, and altering social interactions, the economy, the polity and the very fabric of societies. The author probes how technology is exacerbating imminent existential crises such as climate change and global economic disparity, pointing out the effects of collapsing consensus, failing sciences and post-factual politics. The nature of the questions investigated does warrant a certain degree of alarm, and the probe is conducted without falling into the trap of Luddite argumentation.

With a loaded title, the book does seem, at the outset, like a pessimistic and cautionary tale of technology and the end of the future, as exaggerated in the subtitle. However, quite early on, the author calms the anxious reader and clarifies that the darkness dealt with in the book is different from what one might presume: “It (the darkness) refers to both the nature and the opportunity of the present crisis: an apparent inability to see clearly what is in front of us, and to act meaningfully, with agency and justice.”

The nature of the darkness, it is explained, is different from the violent ignorance of the European dark ages. Today, the author argues, our ignorance is because of an excess of information rather than a dearth of it. The overwhelming availability of information, emanating from complex technologies that are beyond our comprehension, puts us in a unique position of inaction and aversion to thinking. At one point, he writes: “And so we find ourselves today connected to vast repositories of knowledge, and yet we have not learned to think.” One strong criticism in the book, which is a risky generalisation, is that we have given up thinking, leaving it to be done by computational tools. 

The opportunity presented in this new dark age, according to the book, is to acknowledge our limitations and embrace the uncertainty we are neck deep in. The book formulates a hope out of this darkness: “We have much to learn about unknowing. Uncertainty can be productive, even sublime.” Once we acknowledge this uncertainty and know the limitations of our knowledge and understanding, we can use computational tools not to seek perfect answers but to ask better questions. For instance, the author points out how Garry Kasparov returned with a symbiotic, computer-assisted format of chess after he was defeated by IBM’s dedicated chess-playing machine, Deep Blue. The hope professed here is that by reclaiming thinking in new ways, using the computational tools at our disposal, we can overcome defeatism and helplessness to act with a renewed form of agency.

Grappling with hyperobjects

An interesting framework presented in the book, for examining our experience of the networked world, is the concept of hyperobjects. Drawing upon the philosopher Timothy Morton’s description of a hyperobject as “a thing that surrounds us, envelops us and entangles us, but that is literally too big to see in its entirety”, the author extends and repurposes it to reflect what the Internet has become today.

To some degree, each one of us experiences the Internet intimately but cannot fathom its scale, span or physicality. This dissonance, he points out, is pivotal in the new dark age: unable to wrap our heads around such hyperobjects, we are discouraged from any meaningful engagement with the pros and cons seeping out of the networked world. One of the more fascinating connections made in the book, assisted by a dual commentary across a couple of chapters, is to loop in climate change as another hyperobject. This allows for interesting comparisons between our experiences of the network and of climate change. For instance, the cognitive dissonance and resulting inaction that have come to define the resistance to acknowledging climate change are somewhat similar to the discord bred on the Internet that is driving wedges between communities.

The author also points to the positive feedback between the two hyperobjects. The Internet, with its massive paraphernalia of infrastructure that includes servers, routers and underwater cables, consumes an enormous amount of energy behind the scenes, adding substantially to global energy consumption.

More recently, owing to advancements in artificial intelligence (AI), thanks to deep learning, there has been a further surge in resource consumption. These AI algorithms, which are trained and run on power-hungry infrastructure, will soon be adding to the carbon footprint of the networked world as an increasing number of tasks is handed over to them.

A recent publication at a natural language processing (NLP) conference (NLP research aims at helping algorithms understand human language better; Google Translate is a product based on such research) reported that training one of the community’s most successful models, Transformer networks, produced emissions equivalent to the annual carbon footprint of about 60 humans (Strubell et al. 2019, Table 1).
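The arithmetic behind that figure is easy to check. The numbers below are the CO2-equivalent emissions, in pounds, reported in Table 1 of Strubell et al. (2019); the comparison is illustrative rather than exact.

```python
# Back-of-envelope check of the comparison cited from Strubell et al. (2019), Table 1.
# Figures are reported CO2-equivalent emissions, in pounds.
transformer_with_search = 626_155  # training Transformer (big) with neural architecture search
avg_human_one_year = 11_023        # average per-person footprint for one year

ratio = transformer_with_search / avg_human_one_year
print(round(ratio))  # 57, i.e. of the order of 60 human-years of emissions
```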

In her seminal book, Weapons of Math Destruction, Cathy O’Neil used the phrase “to codify the past” in describing algorithms. A couple of chapters in New Dark Age draw on and extend this perspective when examining AI, which is essentially glorified algorithms learning patterns from big data using powerful hardware. The majority of these algorithms codify the past, and along with it the historical biases and prejudices present in the data. Extending this concept, the author evokes several popular (notorious, in this context) examples where algorithms failed miserably simply because they were good at perpetuating the patterns they recognised in the historical data they were exposed to.
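The mechanism can be made concrete with a minimal sketch. The data and the frequency-based “model” below are entirely hypothetical, not drawn from the book: a system that learns only from past decisions reproduces whatever bias those decisions contain.

```python
# Hypothetical illustration: a frequency-based "model" trained on
# historically biased decisions simply reproduces the bias it was shown.
from collections import defaultdict

# Synthetic historical records: (group, qualified, hired).
# Both groups are equally qualified, but group "B" was rarely hired.
history = ([("A", True, True)] * 80 + [("A", True, False)] * 20
           + [("B", True, True)] * 20 + [("B", True, False)] * 80)

def train(records):
    """Learn P(hired | group) by counting past outcomes."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, _, outcome in records:
        total[group] += 1
        hired[group] += outcome
    return {g: hired[g] / total[g] for g in total}

model = train(history)
# Despite identical qualifications, the model recommends group A four
# times as often as group B: it has codified the past.
print(model)  # {'A': 0.8, 'B': 0.2}
```

Real systems use far more elaborate models, but the failure mode is the same: the patterns learned are only as fair as the history they summarise.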

These algorithmic misdemeanours range from the extremely upsetting, such as a case bordering on digital phrenology in which an algorithm trained on pictures of people to detect whether or not they were criminals returned features based on skull shape as a useful indicator, to the obnoxious case of algorithm-generated products sold on Amazon, which have had rape threats and pictures of Holocaust victims printed on them.

While these are extreme instances of algorithmic misdemeanours that can be accounted for by our intervention, there are perhaps innumerable instances where the decisions made by algorithms are steering the course of our societies without our conscious approval. This is another facet of the new dark age discussed in the book. Intervening in these cases is harder because of the technical expertise that is necessary. Aggravating this imbalance further is the fact that these algorithms are increasingly controlled by a handful of corporations with little or no oversight by governments and/or the public. The author accurately sums up the new dark age as a state where “power (is) in few hands and understanding in fewer heads”.

Although the book presents a diverse and comprehensive look at how the networked world driven by algorithms has altered our relation to technology and to ourselves, it does not present any concrete solutions, which is perhaps the point. New Dark Age is more a wake-up call, urging us to first acknowledge our unknowing, accept the uncertainty and reclaim our thinking.

Raghavendra Selvan is a post-doctoral researcher at the Machine Learning Group and Data Science Lab at the University of Copenhagen, Denmark.

References

Emma Strubell, Ananya Ganesh and Andrew McCallum. “Energy and Policy Considerations for Deep Learning in NLP”, Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019.
