The technology industry works in waves. Famously these include the first Internet boom of the mid-to-late nineties, but also the artificial intelligence explosion of the 1970s. That explosion preceded the even more notorious “AI winter”, during which the AI label became positively toxic and was replaced with euphemisms such as “Knowledge Based Systems” to cover essentially the same ideas, so that funding could continue to flow, though with the hype gone it was greatly reduced.
Revisiting some of the fear and hype around AI in the 80s is amusing today, especially given that it was often combined with a fear of being outdone by the Japanese, who decided to spend absolutely vast sums of money through their mildly sinister Ministry of International Trade and Industry, an organisation which struck fear into the hearts of many policy makers in the West. Luckily some observers in the West were astute enough to notice that even though AI had shown some highly impressive specific use cases, such as blood disease diagnosis, a working general AI, that is, a far more human type of intelligence that adapts readily to different domains, remained more elusive than initially anticipated.
What we now know is that many of the working assumptions of those early AI days were fundamentally flawed, and many currently held ideas are probably also wrong. Most of the more impressive apparently intelligent systems around today, for example those used by Google, are really the brute-force application of relatively simple statistical models to large amounts of data. Getting the machine to explicitly understand the rules of a domain is generally not even attempted, whereas previously tracking down what those rules actually needed to be was the bulk of the work of building a successful system. This realisation has led to the recent explosions in “big data” and “machine learning”, many decades after the mainstream establishment essentially gave up on AI, though in many ways they are the distant descendants of that movement.
The same thing has happened with the Internet, several times. We have been through the first dotcom bubble, the Web 2.0 boom, and now the mobile bubble. During each, the potential of the technology for world change is greatly exaggerated, yet at the end we seem to be left with just a better way to exchange cat photos. While this is important, it isn’t life changing, and people slowly lose interest and move on to other things.
From the point of view of someone working on the underlying technology this is deeply irritating, as it is hard to convey to those outside just how much more advanced our devices and networks are today than they were just a decade ago. My career in mobile games spanned fitting games into 64kB through to full-on 3D accelerated online racing monsters weighing several gigabytes, but the truth is that a lot of the smaller games were actually better. That isn’t pure nostalgia; it is a side effect of the way our systems evolved. In particular, the touch-only input of most modern cellphones dramatically altered the interfaces and added enormous unremovable input latency, which makes many of the more arcade-style titles impossible. Cellphones of the mid-2000s could manage very good facsimiles of old Nintendo Game and Watch titles, but this is no longer the case, even though the modern device is far more powerful.
The consequence of all of this is that the infrastructure we now have has run a long way ahead of anyone knowing what it is actually for, and we have lost some of what we used to have along the way. It is assumed that at some point someone will work all this out, but as the AI example shows, that can take decades and the emergence of a seemingly unrelated technology before a renaissance can occur, often under a new name. This is why things like Google Glass are so disappointing: they are very clever, but there doesn’t seem to be any point to them at all. Using one just gives you the impression that it’s a fun camera, not much else. Gigabit internet is conceivably a bigger change; however, it is mainly going to disrupt the cable television industry, and is unlikely to have as large an impact as the move to always-on broadband did in the first place.
While this view can sometimes be put down to excessive cynicism, it seems the whole industry is entering a sort of lull after a few years of hyperactivity: the revolution has been postponed. Disappointingly, when it arrives it will still be televised too.
Nigel Birkenshaw is head of Atomirex.