Dubai - Emirates Voice
Happy New Year! Or is it?
A “new” year, I mean, not whether it’s a happy one. For that, we need only check the Bitcoin exchanges, Donald Trump’s Twitter account, or the balmy temperatures at the ice caps.
Declaring it a “new” year, on the other hand, depends on which calendar you follow. Most important in this region is Islamic New Year, which begins on September 11. Aside from this, there’s still the Chinese New Year to come (February 16), the Persian New Year (March 21), and the Hebrew New Year, which this year will be on September 9.
Calendars are one of our oldest technologies, and it’s hard to think of a part of our lives not shaped by them. Yet, as we acquaint ourselves with 2018, it’s perhaps time to pause and reflect on how we are already slaves to even newer technologies.
The year 2017 was one in which we were visibly altered by the tech in our lives. The world at the end of last year was so different to that of a year earlier: America is a changed nation, shaped, so the argument goes, by Russian interference in the elections of 2016.
The mastery of data technology by one country left another nation with real-life consequences in every area of its domestic and foreign policy. What is perhaps most striking is that these are all the result of technologies originally considered benign. “Facebook” combines two innocuous words, whilst “Twitter” implies something trivial and childlike.
How could something as innocuous as a “tweet” change the world or, indeed, radically alter us as people? The surprise isn’t that change occurs, or even how quickly, but that we are shocked when it happens. It’s not as if there haven’t been precedents. It was the American sociologist Robert K Merton who coined the term “the law of unintended consequences”, based on his observation that deliberate actions meant to help us often have surprising results.
That is the legacy of Thomas Midgley, the American chemist who infamously solved the problem of “knocking” in combustion engines by adding lead to petrol. He then helped develop chlorofluorocarbons for refrigeration, thereby ensuring his name is forever associated with two of the greatest pollutants in human history.
Social media and mobile phones might yet warrant a place alongside those two toxins: psychiatrists have deemed the obsession with taking selfies a mental disorder, and schools in some parts of the world are banning mobile phones, framing the move as a public health message to families.
But the point might be applied more broadly to so much of the tech designed to fit seamlessly into our lives. Social media would not be so ubiquitous if it were still accessible only through desktop PCs. The early social networks of the 1990s were relatively small in scope and use; it was the arrival of smartphones, specifically Apple’s first iPhone in 2007, that enabled them to reach huge audiences.
The problems that ensued were compounded because, as Sean Parker, Facebook’s founding president, recently admitted, it was designed around “a vulnerability in human psychology”. This is the critical point to understand.
The dangers of technology are unlikely to be the aggressive forms of artificial intelligence (AI) we have been taught to fear by the Terminator films. There will probably never be an attack by some Skynet of the future. The danger will come from our need for, and passive acceptance of, technology. Twitter is already the equivalent of Aldous Huxley’s soma from Brave New World: “delicious soma, half a gram for a half-holiday, a gram for a weekend, two grams for a trip to the gorgeous East, three for a dark eternity on the moon”.
Technology of the future will be small and delicious, and will offer easy solutions to life’s ills. Yet the cost to us, both individually and as a society, might well be like a dark eternity on the moon. And if that sounds unbelievable, consider how some of this is already happening.
In 2009, a little-known Swedish programmer called Markus Persson made something new and quite different: a clever piece of Java code that allowed him to create worlds from maths. This in itself was nothing revolutionary. The technique, known as “procedural generation”, has been used by programmers for decades in a variety of contexts: rather than storing a world, the computer builds it on demand from a formula and a seed.
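To make that concrete, here is a minimal sketch of procedural generation, written in Java (the language Persson used) but in no way his actual code. The constants and shapes are invented for illustration; the point is that an effectively endless landscape can be conjured from nothing more than a seed and a formula.

```java
import java.util.Random;

// A toy terrain generator: each column's height is a pure function of a
// world seed and its (x, z) coordinates, so any patch of the map can be
// recomputed on demand rather than stored.
public class TinyTerrain {

    static int heightAt(long seed, int x, int z) {
        // Mix the coordinates into the seed so each column gets its own
        // repeatable "random" wobble.
        Random r = new Random(seed ^ (x * 73428767L) ^ (z * 912931L));
        double hills = 6 * Math.sin(x * 0.18) + 6 * Math.cos(z * 0.11);
        return 32 + (int) hills + r.nextInt(3); // 32 acts as "sea level"
    }

    public static void main(String[] args) {
        long seed = 2018L; // the same seed always rebuilds the same world
        for (int z = 0; z < 10; z++) {
            StringBuilder row = new StringBuilder();
            for (int x = 0; x < 60; x++) {
                row.append(heightAt(seed, x, z) > 35 ? '#' : '.');
            }
            System.out.println(row); // '#' marks high ground, '.' lowland
        }
    }
}
```

Real engines use far smoother noise functions than these sine waves, but the principle is the same: the map is an equation, and exploring it simply means evaluating that equation somewhere new.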
What made Persson’s code different was that it allowed users to manipulate these landscapes, building structures from blocks they could literally “dig” out of the terrain. Five years later, Persson, known to the world by the more memorable sobriquet “Notch”, sold his creation to Microsoft for $2.5 billion (Dh9.1bn). By then, it wasn’t just code, but a company called Mojang and a deeply compelling game the world had come to know as Minecraft. The genius of Notch’s idea lay not in the programming, but in the concept of a game in which players could roam and gather resources. Even so, the “game” of Minecraft remains fairly limited to this day.
Despite Microsoft’s huge investment, little has been done to change the underlying gameplay and there has been no sequel. The essential mechanics of the game have remained unaltered. The fear, perhaps, is that the mechanism is so perfect that any change might break it. Yet, in this, Minecraft is really an allegory for the world itself.
The reasons for Minecraft’s success are the same reasons we are all vulnerable to technology. Minecraft is addictive not because it does something new, but because it does something old: it returns us to our hunter-gatherer roots, exploiting instincts dormant for so long yet somehow still programmed into our nature. Its virtues, such as encouraging creativity and experimentation, are there to see alongside its flaws.