With hotels run by robots popping up in Japan, and robots becoming more and more useful in industry, Sam Bartlett discusses what this means for us mere humans.
As early as 1930, John Maynard Keynes coined the term ‘technological unemployment’ (Keynes, 1930) to describe the point at which our means of economising the use of labour outrun our ability to find new uses for labour. Advances in technology have been the overarching narrative driving human progress, but with any advance there have always been losers: looms superseded weavers, foundries surpassed blacksmiths, and your average Joe can now manipulate a keyboard faster than a traditional typist of the eighties. In their place, though, there were always fresh new jobs for labour to move into. But with the pace of modern technological revolutions accelerating exponentially, there is a real prospect that the computers we rely on for so much may make a range of jobs obsolete, leaving no room for a chunk of society.
Industry and manufacturing have always been at the forefront of adopting new technology to improve efficiency. Automation and robotics have been used in large factories since the 1980s, and as the technology improves and falls in cost, smaller factories and firms can now afford to automate manufacturing. This has inevitably led to the loss of jobs; however, the jobs lost have tended to be repetitive, arduous tasks. In a scathing indictment of our species, humans are not that good at repetitive tasks that require consistent precision for hours on end without a lapse of concentration, a cigarette break or a piss; problems not faced by robots. It therefore seems almost elegant that a robot should take a job that is in itself robotic.
The issue, though, is that technology is now encroaching into white-collar jobs once considered protected from the tide of advancement. There were always industries and skills at which we humans were considered to excel over computers: the service industry, data analysis and pattern recognition, to name a few. But jobs such as financial reporting have already been affected by automation. The Associated Press already automates 3,000 financial reports a quarter on corporate earnings in America (White, 2015). Software can now analyse huge reams of legal data in large lawsuits to process, compile and decide which documents are relevant to the dispute, typically the job of paralegals and junior lawyers; an algorithm designed by a Michigan State law professor can even predict 71% of Supreme Court decisions (Michigan State University, 2014). These examples can be attributed to the rise of ‘Big Data’. Through data mining and programmed pattern recognition, trends that were once impossible for a human to recognize in such huge swathes of data become apparent. But physical tasks, complex communication tasks such as translation, and advanced pattern recognition remain the preserve of human endeavour; for now.
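As a toy illustration of the pattern recognition described above (the data and thresholds here are invented for this sketch, not taken from any real system), a few lines of code can scan a year of simulated transaction counts and flag anomalous days that no human could reliably spot by eye at that scale:

```python
import random
import statistics

random.seed(42)

# Simulated 'big data': a year of daily transaction counts with three
# anomalous spikes buried in the noise. Trivial for code, tedious by eye.
counts = [random.gauss(1000, 30) for _ in range(365)]
for day in (45, 180, 300):
    counts[day] += 300  # inject hidden spikes

mean = statistics.mean(counts)
stdev = statistics.stdev(counts)

# Flag any day more than four standard deviations above the mean.
anomalies = [day for day, c in enumerate(counts) if c > mean + 4 * stdev]
print(anomalies)  # the three injected spike days
```

The same idea, scaled up to millions of records and many variables at once, is what makes machine-driven trend detection so much more effective than human inspection.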
In 2011, IBM built Watson, a supercomputer with the power to compete in and win a game of Jeopardy!. A cheap parlour trick, you might say; a computer simply programmed to do a single task, no better than a chess computer. But in reality, the complexity behind Watson’s machine-learning code has far-reaching ramifications. Already it achieves a far higher rate of successful lung cancer diagnosis: a staggering 90%, compared with 50% for human doctors (Steadman, 2013). Watson’s learning ability is so new that the applications for such a machine have yet to catch up. If a computer such as Watson can recognize, analyse and solve a problem without human intervention, the potential for computers to supersede humans in a host of industries is unlimited.
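Watson’s system is vastly more sophisticated, but the core idea behind such systems, learning from labelled examples rather than following hand-written rules, can be sketched with a toy nearest-neighbour classifier. Everything below (the features, labels and values) is invented purely for illustration:

```python
import math

# Toy training data: (feature vector, label). The features stand in for
# made-up measurements; this is not real medical data.
training = [
    ((0.9, 0.8), "malignant"),
    ((0.8, 0.9), "malignant"),
    ((0.2, 0.1), "benign"),
    ((0.1, 0.3), "benign"),
]

def classify(sample):
    """1-nearest-neighbour: label a sample by its closest training example."""
    nearest = min(training, key=lambda ex: math.dist(sample, ex[0]))
    return nearest[1]

print(classify((0.85, 0.75)))  # near the first cluster of examples
print(classify((0.15, 0.20)))  # near the second cluster
```

The program is never told what makes a case ‘malignant’; it infers the answer from examples. Replace four hand-typed points with millions of patient records and far richer features, and the gap between this sketch and a system like Watson is one of scale and engineering, not of kind.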
So what’s the effect of it all? Clearly, the potential for value creation using the spectrum of technological advancement currently available is huge, both in terms of societal change and economic good. There is the potential to improve efficiency and productivity across a host of industries. In an ideal world, some might argue, letting computers and robots work for us means less need for humans to do laborious jobs, freeing us to focus on other, perhaps more creative pursuits. However, there is the biting reality that increasing levels of technology are having an effect on employment and income. In the United States, data since the end of World War II suggests that as productivity increases, driven by technological advances, so do overall employment and wealth. Since 2000, however, whilst productivity has continued to rise, job growth has stagnated; furthermore, the median income in the US has dropped even though GDP has continued to grow (Rotman, 2013). This is creating rising inequality for those ill-equipped to keep pace. In ‘Race Against the Machine’, a book on this topic, Brynjolfsson and McAfee, rather than blaming advancing technology, suggest that the problem is that business and skills are not keeping up with the rate of technological advancement.
It is apparent that the unstoppable juggernaut of technological advancement is here to stay. The question remains as to what effect it will have on jobs and society as a whole. Anyway, I’m off to start that programming course.
words by Sam Bartlett