Compendium: The Internet, the Y2K Bug, and the Work-Standard’s Mechanization Rate

Information technologies grew up alongside nuclear technologies in the wake of the Second World War. The most obvious example of their pervasive importance is of course the World Wide Web (WWW). The WWW is the digital medium that everyone has in mind whenever they refer to the Internet. Its emergence in the 1990s came as a result of decades of research into Cybernetics and the role of information technologies during the height of the Cold War.

The Cold War was a time of rapid development for these technologies, as the transdisciplinary field of Cybernetics was increasingly explored on both sides of the Iron Curtain. The realm of information technologies reached its infancy as a viable model of economic organization around the death of Bretton Woods. This was between the late 1960s and early 1970s, at a time when the Cold War was starting to become normalized as the latter half of the conflict dragged on towards the end of the 20th century.

The computers in use during the latter half of the Cold War were unlike the ones that are so commonplace in the early 21st century. They were massive machines that took up an entire room, their storage capacities too small and their overall Prices indicative of their impracticality for everyday uses outside of government bureaucracies. In fact, it was because of those circumstances that the infamous “Y2K Bug” was allowed to be created.

For those born after the 20th century, the Y2K Bug was a computer error that came from a computer’s dating system relying on two-digit years rather than four-digit years. A computer prior to Y2K rarely composed the date as “MM/DD/YYYY” to generate an output of 12/31/1999 for “December 31, 1999.” The memory capacities of computers, particularly those that needed to store large amounts of data, were too small and too expensive to make four-digit years practical. Thus, computer scientists in the late 20th century relied on two-digit years as a stopgap measure. Instead of “MM/DD/YYYY,” computers were programmed to use “MM/DD/YY,” their output being 12/31/99 for that same date.

As one can imagine, the rollover after ‘99’ raises all kinds of questions. Yes, average people can figure out that the day after December 31, 1999 was January 1, 2000, but what about the computer? Would the computer register the rollover year as “1900” or perhaps even “19100” (the hard-coded century prefix “19” followed by the three-digit “100”)?
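The two failure modes just described can be sketched in a few lines of Python. This is a hypothetical illustration of the general two-digit-year pattern, not the actual code of any historical system:

```python
# A minimal sketch of the Y2K ambiguity: the program stores only the last
# two digits of the year and hard-codes "19" as the century when printing.

def next_year(two_digit_year: int) -> int:
    """Advance the stored two-digit year at New Year's Eve."""
    return two_digit_year + 1  # no wrap-around logic: 99 becomes 100


def format_new_years_day(two_digit_year: int) -> str:
    """Naive display logic that prefixes a hard-coded '19' century."""
    return f"01/01/19{two_digit_year:02d}"


rolled = next_year(99)
print(format_new_years_day(rolled))        # "01/01/19100" -- the garbled rollover
print(format_new_years_day(rolled % 100))  # "01/01/1900"  -- the century reset
```

Depending on whether the counter simply grows or wraps back to zero, the same stored value yields either the nonsensical “19100” or a silent jump back to 1900, which is exactly why date arithmetic in affected systems could not be trusted.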

The reasoning behind the use of two-digit years at the time was that, given the rapid advancement of computer and information technologies since the death of Bretton Woods, the affected computers would be replaced by newer ones with enough memory capacity to render the Y2K Bug ‘obsolete.’ Unfortunately, this sort of reasoning is just more of the same one-sided Liberal Capitalist thinking characteristic of having a linear perception of Life itself. The flaws of that logic became discernible by the 1990s, which was when the Y2K Bug began to be taken seriously. Anyone who was alive at that time remembers the panic across the Western world, where there was a push by the United States to make computer systems “Y2K compliant.”

The WWW was already in its early years when the Y2K Bug began to receive the attention that it deserved. In a time before the rise of social media and the growing dependence on the WWW from the 2000s onward, the transmission of information on the Internet in the 1990s was scant. Fewer people were on the Internet in those days, which accounts for why the historical record on the Internet from that period is so scarce. Of course, one can go through old news archives from the period, but there really is no Internet-only website or blog from those days. A lot of what has been written about the Y2K Bug by average people on the Internet was done in hindsight, in the years and decades after 2000.

The purpose of discussing the WWW in relation to the Y2K Bug is to illustrate a point. Practically everyone knows by now that information technologies have grown increasingly prominent within everyday life. Anyone who is reading this entry in the Compendium on their PC, smartphone or any other device will be accessing the WWW through an Internet connection. The greater availability and accessibility of the WWW came as a result of subsequent advancements in information technologies like Wi-Fi and Cloud computing, allowing greater coverage for wireless devices. From gaming consoles to smartphones, they have enabled people around the world to become more interconnected than they were in the past.

It has become commonplace in the daily lives of billions of people around the world. The very notion of humanity somehow ‘losing’ access to the Internet seems unfathomable, especially for people born after 2000, since they grew up without memories of a time before the WWW. Yet even if the premonition seems so outrageous that it could not possibly happen, the possibility has been explored by governments and telecommunications firms since the 1990s.

The phenomenon in question has been termed “Internet balkanization,” which refers to a possible breakup of the World Wide Web into a series of national intranets called the ‘Splinternet’. Unlike the Internet, the Splinternet is limited in terms of where and when information is allowed to circulate across networks. Something similar is already being done among the internal networks of universities, institutions, governments, and organizations; what the Splinternet describes is that same splintering carried out at the national level, hence the term.

Recent examples of trends toward the Splinternet have been the “Great Firewall” of Mainland China and the more recent “Sovereign Internet Law” of Russia. It is possible that Cybersyn would have thrived better under the circumstances of the Splinternet rather than the Internet, as the issues of malware, cyberwarfare, cyberespionage and cyberterrorism become increasingly dangerous in this century. It may seem preposterous, but the possibility of malware and cyberattacks is not the only condition for the rise of the Splinternet. Another problem has been the widespread proliferation of trivial information such as memes, including the inability of institutions, technology firms, and governments to filter the flow of information. Information on the WWW has grown so exponentially that it is impossible for anyone to keep track of everything.

This provided the impetus for the development of data mining and analytics to gather information, sort the reliable from the unreliable, and draw conclusions based on their implications, not to mention the common use of search engines like Google to navigate and sift through the information.

If it has been stated that the Work-Standard is suitable for use alongside Cybernetics technologies, what would its interactions be with the emerging importance of the digital realm? Is the Work-Standard designed to function with the Splinternet or the Internet? And assuming it is the former instead of the latter, why? Is there anything within the literature about the history of the World Wide Web that lends credence to such a conclusion?

After two days of searching, I am confident that the Work-Standard is ideal for both the Splinternet and the Internet. It can be argued that, in a Socialistic world order, nation-states would maintain their own national Intranets while leaving a central area within the digital space where the Internet exists. The World Wide Web in its current form would be ill-suited for such an arrangement because of how open, vast and uncontrolled it has lately become. While there have been attempts in recent years by American technology firms, particularly those operating social media platforms, to control a growing portion of the WWW, those attempts are in the final analysis Liberal Capitalist endeavors. They are negative consequences of a free market ‘being cornered’ by the growing monopolization of businesses with the ability to become states-within-the-State.

The Work-Standard is also compatible with the emerging realm of eCommerce and the technology platforms that have grown up since the time of the Y2K Bug. But rather than eCommerce replacing actual economic activities offline outright, the Work-Standard will instead promote the integration of online activities with offline ones. Again, this goes back to the issue of Liberal Capitalist economics never bothering to maintain a proactive approach in guiding technological developments toward less self-destructive and more meaningful outcomes where everyone benefits. If anything, the pervading influence of information and computer technologies has given rise to the potential for the Work-Standard’s “Mechanization Rate” to become a replacement for the Interest Rate.

The Mechanization Rate is capable of achieving what the Interest Rate was intended for, which was to control the rate of Currency Depreciation/Appreciation, but without the negative effects that high Interest or Negative Interest brings. High Interest means the Borrower has to pay more Kapital to the Lender simply for borrowing, whereas an equally high Negative Interest results in the Lender paying more Kapital to the Borrower simply for lending. This dynamic has been the basis of Usury as a concept within the Western world, its impact diminished by the prevalence of low Interest Rates and of countries running into Negative Interest Rates.
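The borrower/lender dynamic described above reduces to simple arithmetic. The figures below are hypothetical, chosen only to illustrate how the sign of the rate reverses the direction of payment:

```python
# Hypothetical illustration: with a positive rate the Borrower pays the
# Lender; with a negative rate the flow reverses and the Lender, in
# effect, pays the Borrower for the privilege of lending.

def annual_interest_payment(principal: float, rate: float) -> float:
    """Interest due after one year on a simple-interest loan.

    A positive result is Kapital flowing from Borrower to Lender;
    a negative result is Kapital flowing from Lender to Borrower.
    """
    return principal * rate


print(annual_interest_payment(1000.0, 0.05))   # positive: Borrower owes 50
print(annual_interest_payment(1000.0, -0.05))  # negative: Lender credits 50
```

The point of the comparison is that both signs of the Interest Rate impose a cost on one party for the mere act of lending or borrowing, which is the dynamic the Mechanization Rate is meant to do without.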

In essence, the old dynamic of Usury fades, and in its place emerges a different dynamic over the question of humanity and its technologies, which Ernst Jünger anticipated in Der Arbeiter. The significance of the Work-Standard’s Mechanization Rate comes at a period in Western history when there are looming questions over the future of Arbeit (Work). A Financial Regime sets the Mechanization Rate not by the rate at which Geld is readily available, but by the rate at which Arbeit is being done by humans.

The idea here is for the nation-state to determine what should be taken over by automation and what should continue to be done by humans. A suitable application of the Work-Standard would maintain a decent Mechanization Rate, one where humans remain integral to economic activities but their overall effectiveness is enhanced by technologies. That effectiveness becomes dependent on the people operating the technologies rather than on the technologies themselves. And while further research on the implications of the Mechanization Rate remains forthcoming, this entry in the Compendium remains a viable foray into what is becoming an increasingly relevant topic at this point in the research of the Work-Standard.



Categories: Compendium, Economic History, Philosophy, Technology
