Real war in virtual space

The objective of cyber capitalism is to colonise cyberspace and generate and use big data for further enslavement of people. The battle between freedom and enslavement has now shifted to cyberspace.

Published: May 09, 2018 12:30 IST

To exploit the ever-expanding cyberspace, all information, in zeros and ones, had to be transformed into controlled, usable and sellable products. Thus came the cyber platforms operated by corporate giants.

IN 1978, we were the first batch of students at the Indian Institute of Technology (IIT) Delhi to study the microprocessor along with Boolean and switching algebra. Little did we know then that we were learning about a chip that would soon transform the entire world and, that too, at an ever-increasing pace. This rectangular, octopus-shaped chip, which converted every bit of information into zeros and ones, was soon to change its shape and size and invade every space of our lives.

My generation is a product of gigantic mainframe computers such as the DECsystem-10, manufactured by Digital Equipment Corporation. To house this gigantic computer, a new building was constructed on the IIT campus. All those who had access to the mainframe, especially students who joined “computer science”, moved around the campus with an acute sense of self-importance and a chip on their shoulders. The sub-discipline of “information technology (IT)” was yet to be born. Even the brightest among us could not foresee the revolution that was in the offing.

One of the major problems that computer and electronic engineers faced at that time was called the “tyranny of numbers”. To improve the performance of electronic machines and computers, the number of components had to be increased, and these components had to be physically connected through wires. The laborious task of soldering these wires was prohibitively expensive and time-consuming. It was clear that the old technology was reaching a saturation point. Jack Kilby’s invention offered a novel solution. In 1958, he demonstrated that several electronic components could be created and integrated on a single semiconductor chip, which was later called the integrated circuit (I.C.).

The 1970s proved to be the most vital years in the history of computers. This is when the single-chip CPU (central processing unit) was invented: in 1971, a microchip design team led by Federico Faggin, with Ted Hoff and Stan Mazor, successfully integrated 2,300 transistors in an area of 3 mm × 4 mm using 10 µm technology to produce the first commercial microprocessor, the Intel 4004.

Vigorous, cut-throat competition between the two giants, Intel and Motorola, drove an ever-increasing density of transistors per unit area of a semiconductor chip. Gordon Moore, one of the founders of Intel, had predicted in 1965 the “doubling of components per chip every year”. By the mid-1970s, the prediction had assumed the status of a law, and from the invention of the first I.C. to this day, the electronics industry has followed Moore’s Law.
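The arithmetic behind such a doubling can be sketched in a few lines (a hypothetical illustration; the 2,300-transistor starting point is taken from the passage above, and the one-year doubling period is Moore's original 1965 estimate, which he later revised to roughly two years):

```python
# Toy projection of Moore's Law as stated in the text: the transistor
# count doubles every `doubling_period` years from a given starting count.

def projected_transistors(start_count: int, years: int, doubling_period: int = 1) -> int:
    """Return the projected transistor count after `years` years,
    assuming the count doubles every `doubling_period` years."""
    return start_count * 2 ** (years // doubling_period)

if __name__ == "__main__":
    # Starting from the 2,300 transistors of 1971:
    for year in (1971, 1975, 1981):
        count = projected_transistors(2300, year - 1971)
        print(f"{year}: about {count:,} transistors")
```

Even this crude sketch shows why the trend reshaped the industry: a yearly doubling multiplies the count a thousandfold in a decade.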

Three factors governed the industry with respect to this law. First, speed and the capacity to perform multiple operations simultaneously are directly proportional to the density of transistors per unit area. Secondly, power consumption per transistor is inversely proportional to that density. Thirdly, for any corporation to withstand the pressure of an ever-enlarging market, it was crucial to optimise costs, which could not be achieved without the timely bulk production of higher-density I.Cs. On the one hand, the proliferation of technology led to competition among producers of microchips; on the other, mass production forced user firms to search for new markets, new uses, new inventions and new products.

Jose van Dijck, a professor of Comparative Media Studies at the University of Amsterdam, in her foreword to a recently published book, The Datafied Society: Studying Culture Through Data, wrote: “Networked connectivity runs on data—the new oil of the information economy. Just as electricity changed industrial processes and domestic practices in the nineteenth century, a data-driven paradigm will constitute the core of twenty-first-century processes and practices.”

The digital revolution driven by I.Cs impacted all spaces of human activity. Every operation performed by man or machine could now be reduced to zeros and ones. But something else was also happening simultaneously: virtually unlimited digital storage space became available to humankind. This space could consume unlimited data and store them eternally without any loss of information. With powerful computers at one’s command, the capability to collect, collate, transport over huge distances, and analyse large datasets did not just double every year; it increased exponentially. It is interesting to note that the Human Genome Project, the largest data collection project of the 20th century, was conceived in 1984, launched in 1990, and declared complete in 2003, precisely the period during which the digital revolution was taking place.

The changing world caught the imagination of both scholars and novelists. Sidney Sheldon, in his detective novel Bloodline (1977), observed: “Nothing was sacred, nothing was safe. Privacy in today’s civilisation was a delusion, a myth. Every citizen was exposed, his deepest secrets laid bare, waiting to be read. People were on record if they had a social security number, an insurance policy, a driving licence, or a bank account.” Obviously, he was referring to the ease with which government surveillance agencies could continually observe a citizen’s activities. In 1977, Sheldon could not have imagined that the datafied society would bring about a major transformation in the economic, political and cultural spheres, that government agencies would be left far behind, and that corporate giants would keep an eye on all the activities of individual citizens and manipulate them, online.

Big data chaos

Every technology, revolutionary or otherwise, brings its own culture and aesthetics. Until the turn of the century, the collection, analysis and storage of data was a cumbersome, time-consuming and expensive exercise. Therefore, libraries, government departments, museums, art galleries and archives were the repositories of large visual, textual and numeric datasets. They occupied huge spaces and their maintenance was expensive. Cyberspace had no such barriers; its creation brought about multidimensional cultural changes in almost every society. But the biggest change it brought about was in the methodology of gathering data. All endeavours to collect data in the pre-cyber era were objective-driven, focussed and directed, so as to save time and money. Writing a book, administering a survey, creating records, or, for that matter, even taking a photograph was a planned activity driven by its potential use.

Digital technology changed our attitude towards data completely. It placed the cart before the horse. As we entered the 21st century, to collect, collate and store data and finally construct a large database became the objective of many international projects. The possible application and uses could be thought of later. During the late 1990s and early 2000s, in order to make books and documents available to anyone across the globe, individuals and small groups started large-scale scanning and digitising.

The idea of a free online encyclopaedia proposed by Richard Stallman in December 2000 became a reality in January 2001 when Jimmy Wales and Larry Sanger launched Wikipedia. Its novelty was embedded in the decision that there would be no centralised editing team; in effect, every person became a possible contributor. The idea took about five years to catch the imagination of people; however, the number of articles posted on Wikipedia rose from 7,50,000 in 2005 to more than 45 million, in various languages, over the next 10 years. Intellectuals and scholars rejected it as an authoritative source, which indeed it is not, and some still remain quite cynical about it. But I do not know of any scholar who has not consulted it as a ready reference.

In the past 20 years, mind-boggling technological innovations have made every citizen a potential writer, reporter, language editor, journalist, critic, photographer, musician, film-maker, artist and singer. The democratic effect of the new technologies, which ensured unprecedented connectivity, is profound. Those who could never think of making their creative or mundane acts or opinions public could now address the whole of humanity with unhindered freedom. Apparently, this new-found liberty has created what could be termed a big data chaos, and has invited a lot of criticism. But let us not forget that it has also impacted the power balance across the globe. Those who are placed at the periphery of the economic and socio-cultural pyramid, and have been kept away from cultural participation, knowledge generation and even knowledge consumption, can now claim spaces with equal ease.

The big resistance movements witnessed in the Americas, Europe and West Asia, seemingly spontaneous, were in fact highly organised, in a short period, thanks to high-speed connectivity. The working class was never this close to realising the Marxist dream, “workers of the world, unite”, articulated in 1848.

Colonisation of cyberspaces

No technology is value-free. In a corporate-led capitalist world, despite its democratic characteristics, cyberspace provided the same opportunity that the discovery of America a few centuries ago had provided. But there is a big difference between the discovery of America and the creation of cyberspace. The colonisation of the globe by imperialism was propelled by the greed to loot material riches and occupy physical spaces. Cyberspace is not physical; virtual zeros and ones construct it.

The products that generate these zeros and ones, for example a computer, a cell phone or an ATM card, are physical products and, therefore, offer a market that is likely to saturate at some point. Like the cosmos, cyberspace is ever expanding. To exploit this space, all information, in zeros and ones, had to be transformed into controllable, usable and sellable products.

Thus, during the past two decades we have also witnessed the emergence of cyber platforms operated by corporate giants such as Google, Amazon, Facebook, Twitter, WhatsApp, Paytm, and Uber. In other words, our generation is witnessing the birth of cyber capitalism from the womb of traditional imperialism. The bigger objective of this brand of capitalism is to colonise cyberspace and generate and use big data for further enslavement of people, instead of liberating them. The centuries-old tug of war between freedom and enslavement is transferred to cyberspace as well. The space is virtual but the war is real.

Once colonised, cyberspace also offers the capability to centralise, control and manipulate individuals, often without their knowledge. For example, the seemingly harmless and helpful Google Maps, which has brought about a major cultural change in our attitude during travel (today we seldom ask fellow travellers for directions; we use Google Maps), also keeps a record of the destinations we travelled to and how much time we spent at a particular location. These records, now a sellable commodity, are kept forever. Even if we keep our cell phones switched off, we can be tracked anywhere in the world. Our conversation on a cell phone can be tapped, recorded and analysed by an agency located thousands of miles away. Chats or visuals uploaded on Facebook or WhatsApp add to the big data, which can be used to scrutinise even our feelings, choices and hobbies, for the ultimate purpose of controlling our collective or individual behaviour.

It is not that all scholars subscribe to this scary picture of the fast-changing world. Nick Couldry, a professor at the London School of Economics, rejects claims of “ultimate control” by those who handle big data. He forcefully argues in his essay “The Myth of Big Data” in The Datafied Society that “already in the latently metaphorical term big data, there is a story being told about human beings’ changing relation to the domain we have called social that is highly contestable. Big data, it is implied, is the source of a different order of knowledge, a step change in human self-understanding that precisely bypasses humans’ meagre attempts at self-understanding through interpreting the local details of what they think, say and do.” Big data, however expanded in scope and however fine-tuned in its workings, must always be a selection from the actual world of action and interaction.

The recent debate on Aadhaar in India, which is being enforced despite a stay by the Supreme Court, and the controversy around Cambridge Analytica and Facebook show that as cyber capitalism becomes stronger, the clash between two dreams, the dream to control and manipulate every citizen on the planet and the dream to achieve individual freedom and liberty, is becoming more vigorous. Cyberspace is being used by resistance movements and individuals to mobilise people on various issues; a recent example is the call, “delete Facebook”.

It must be recalled that Mahatma Gandhi started his political career in South Africa by burning identity cards. Only time will tell if people will come out on the streets and reject the big data hegemony.

Gauhar Raza, formerly Chief Scientist of the CSIR, is at present Honorary Scholar, Human Sciences Research Council, South Africa.
