
In the robot job war, imagination is the weapon

Will Smith might have had an easier time if he’d worked with the robots instead of against them.

A few weeks back, a story about how humans were losing the labour race against robots was making the media rounds. The mini-furor was kicked off by comments from MIT professor Erik Brynjolfsson on 60 Minutes.

“Technology is always creating jobs,” he said. “It’s always destroying jobs. But right now the pace is accelerating. It’s faster we think than ever before in history. So as a consequence, we are not creating jobs at the same pace that we need to.”

From reading most of the news stories on the subject, it was easy to figure Brynjolfsson for a techno-pessimist – someone who, despite his job at MIT, feared the steady encroachment of technology.

The thing is, Brynjolfsson is also the co-author of a 2011 book called Race Against The Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy. If you’ve read it, you’ll know he’s anything but a cynic.

The book spells out how the recent Great Recession ended with a jobless recovery. One of the surest signs that a country is actually in recovery (besides positive economic growth) is when businesses start to spend on things like equipment, computers and software again, which did occur in the United States. Another is that they also start to hire, but that didn’t happen.

U.S. unemployment continues to hover above 7 per cent, well above the 4.6 per cent logged in 2007, before the recession. It gets worse than that, according to Brynjolfsson and his co-author Andrew McAfee: a general lack of hiring has resulted in zero net jobs being created over the past decade.

Why haven’t businesses been hiring, in general over the long run and especially in the past few years? Automation is the answer, which is where the robots-outpacing-humans part comes in. With machines, software and algorithms getting smarter and better, humans are needed less and less. The proof is in the pudding, the authors argue, since U.S. productivity – a business’s output divided by the number of people it employs to produce it – has been a non-stop steamroller since pretty much World War II. Jobs down plus productivity up clearly means more and more is being produced by machines and automation.
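
To make that arithmetic concrete, here is a minimal sketch (the figures are invented for illustration, not numbers from the book): productivity is simply output divided by headcount, so flat hiring plus rising output means each worker, aided by automation, is producing more.

    # Illustration only: productivity = output / workers.
    # If output keeps rising while headcount stays flat,
    # the extra output is coming from automation, not new hires.
    def productivity(output, workers):
        return output / workers

    print(productivity(1_000_000, 100))  # 10,000.0 units per worker
    print(productivity(1_400_000, 100))  # 14,000.0 -- 40% more output, zero new jobs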

That’s where most press reports stopped. But Race Against the Machine (a great title for a book, by the way) suggests this is just a temporary blip, and its authors are actually quite optimistic about the future. We’re clearly in the midst of an upheaval akin to 19th-century industrialization – everything, especially our job situation, is in flux. It’s a problem we’ve dealt with before; we just need to figure it out again, the authors argue.

Those who try to fight the machine will lose, since the machine will only get smarter, faster and better. We humans can do that too, but we’re much slower at it. As the book suggests, the jobs humans need to do more of – because machines are not likely to do them anytime soon – involve creativity, team building and leadership. Conversely, if your current job involves someone else telling you what to do, you’d better upgrade your skills, because that job will eventually be done by a machine, which excels at following instructions.

A recent Wired story estimated that about 70 per cent of today’s jobs will be replaced in this way within this century. That number may even be on the low side, considering that 200 years ago a similar percentage of Americans worked on a farm, versus just 1 per cent today.

The key to winning the race against the machine, the book’s authors write, is to actually race with them. One particularly good anecdote describes how chess masters have stopped trying to beat computers and are instead using them to assist in their own decision making. It’s a classic case of, if you can’t beat ’em, join ’em.

In this sense, the potential for new human jobs is actually infinite. With technology expanding exponentially, so too are the possibilities for mash-ups, which covers everything from Facebook to cellphones. Each is a combination of a host of rapidly advancing technologies. Humans have proven very adept at creating these mash-ups. Machines, not so much. As the authors put it, “Combinatorial explosion is one of the few mathematical functions that outgrows an exponential trend. And that means that combinatorial innovation is the best way for human ingenuity to stay in the race with Moore’s Law.”
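
A rough way to see that claim, with purely illustrative numbers: count the possible orderings, or "recipes," that can be built from n components. That count grows roughly like n factorial, which eventually outruns a steady exponential such as the doubling behind Moore’s Law.

    from math import factorial

    # Compare exponential growth (repeated doubling, a la Moore's Law)
    # with combinatorial growth (n!, the orderings of n building blocks).
    for n in range(5, 25, 5):
        print(n, 2 ** n, factorial(n))
    # By n = 20, 2**20 is about a million, while 20! is about 2.4 quintillion.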

More humans therefore need to stop doing rote jobs and start thinking up new stuff. That’s a surefire way to counter the negative employment trends of recent years.

Perhaps the best quote on the topic comes from Stanford University economics professor Paul Romer, who attributes the current angst to one very particular human failing – a frequent lack of imagination:

Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new recipes or ideas were discovered. And every generation has underestimated the potential for finding new recipes and ideas. We consistently fail to grasp how many ideas remain to be discovered. The difficulty is the same one we have with compounding. Possibilities do not add up. They multiply.


Posted by Peter Nowak on February 28, 2013 in robots

 

5 responses to “In the robot job war, imagination is the weapon”

  1. Infostack

    February 28, 2013 at 8:32 pm

    Peter, the techno-pessimists should factor in that “digitization” of communication/info-media networks in the 1980s and 1990s brought tremendous economic and productivity gains. In fact, since we started remonopolizing the sectors over the past 12-15 years, bandwidth pricing has disconnected from both Moore’s and Metcalfe’s laws. So the lack of labor growth may in fact be directly tied to the lack of innovation and economic growth brought about by hitting the relative bandwidth brick wall.

     
    • Peter Nowak

      February 28, 2013 at 8:36 pm

      That’s an intriguing point. I’m not sure I’d go that far but there probably is something to that.

       
  2. Infostack

    February 28, 2013 at 9:19 pm

    When you add Moore’s law (50% improvement in performance every 18 months) to Metcalfe’s law (costs increase linearly, while pathways (potential value as defined by transactions) increase geometrically) you should get ~50% improvement in bandwidth performance/price annually. However, since 2003 bandwidth performance/pricing has only been improving 5-10%. That’s why $70/gig in Google’s Kansas City, or Wi-Fi that costs 1/20th of LTE prices, appears so stark. Such a spread between retail price per bit and underlying economic cost per bit serves as a huge brake on economic growth and transactions.
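
    To see what that gap compounds to over a decade, here is a back-of-the-envelope sketch of the figures above (nothing more than straight compounding):

        # Rough sketch: compound a ~50% annual improvement in bandwidth
        # price/performance versus the 5-10% observed since 2003.
        def improvement(rate_per_year, years=10):
            return (1 + rate_per_year) ** years

        print(improvement(0.50))  # ~57.7x better over 10 years
        print(improvement(0.10))  # ~2.6x
        print(improvement(0.05))  # ~1.6x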

    If we had competitive markets today, broadband would be much faster, much of it would be free and/or heavily subsidized, and there would be a dramatic restructuring of inefficient institutions based on analog-based information and pricing models. Not sure why you don’t buy into this, as it is exactly what happened when we digitized the WAN/IXC, which directly led to the digitization of datanets (aka the internet), which then helped scale digital wireless. You of all people should appreciate that the digital wireless revolution began in Canada with Fido’s 10 cent pricing back in 1996. While we digitized all 3 of these sectors and pricing dropped 99% in 10 years, revenues still grew 6-20% on average for each sector, well above GDP growth.

    The demand that offset that tremendous per/bit price compression was created 3 ways:
    1) standard price elasticity;
    2) private back to public consumption models as pricing reflected marginal cost and brought high volume customers back on-net thereby further creating additional scale economies (the reverse by the way happens with average pricing of vertically integrated monopolies….which is what we are seeing today, again); and
    3) application elasticity. http://bit.ly/IZmTfE

    All 3 demand elasticities, but in particular the latter, are the huge economic growth drivers that I am referring to in my first comment and that create entirely “new economies” that none of these techno-pessimists can predict, yet fuel the demand for human and intellectual capital. Unfortunately the creative destruction of the competitive era has been diminished by the ossification of the vertically integrated monopolies. http://bit.ly/Sfmiw1

     
    • Peter Nowak

      February 28, 2013 at 9:33 pm

      Oh yeah, I don’t disagree – broadband providers have artificially disconnected the service from Moore’s Law and innovation is suffering for it. It’s just the extent I’m not sure of. It’d be great if someone could quantify it.

       
  3. Infostack

    March 1, 2013 at 11:00 am

    Peter, sorry about the length, but here’s an answer to quantifying it, as well as the following note on telecom oligopolies and a response to the Business Insider article you reference. http://read.bi/Wwf2JY Here goes:

    History can be revised and data distorted to fit any reality. What we need to distinguish is what portion of today’s absolute and relative broadband position can be attributed to the competition of the 1980s and 1990s, and what portion to the consolidation and remonopolization of the sector over the last 15 years brought about by the farcical Telecom Act of 1996, special access “de”regulation of 2002 and equal access abrogation for broadband in 2004. All of these created price control bottlenecks that killed effective competition. The proportion, as I outline below, is 95% to 5%. And the result is bandwidth that is 20-150x more expensive than it should be, and a price-to-cost arbitrage between the retail price per bit and underlying economic cost per bit that is the widest it has been since the early 1980s.

    The best way to understand where we are is to look at bandwidth prices in the “middle mile”. Private lines are one of the essential costs in any service provider’s model and can represent anywhere between 20-80% of its cost of revenue depending on the market and scale (for startups and rural carriers it is closer to 80%). Ten years ago, 100 megabits per second (Mbps) cost $5,000 per month for an average private line circuit.

    Let’s assume we have 5 different states of competition: monopoly, duopoly, oligopoly, competitive (private) and competitive (public). The latter two are different, in that a private user can take advantage of Moore’s law, while public service providers benefit as well from the scale economies of the network effect (Metcalfe’s law). The concept of private vs public is easily understood and recognized in both the changes in the voice markets of the 1970s to 1990s (PBXs going to VPNs) and the data markets of the 1990s to 2000s (LANs/client-server going to the internet/cloud). The increase in addressable demand and/or range of applications results in significant scale economies driving enormous cost savings.
    Now let’s put our model comparing the 5 states in motion. Assume the monopoly is quasi regulated and society as a whole is aware of decreasing technology costs. As a result let’s say they are generous (or are forced to by the regulator) and drop pricing 6% per annum. Over 10 years our 100 mbs circuit costs $2,693, a reduction of 46%! Wow, looks good.

    Now let’s say we have a duopoly, which the article states is the case with multi-modal competition. In that case 10% price drops have been the norm, and in fact we end up with pricing 65% below where we started, or $1,743! Wow, wow! Everything looks fantastic!! Just like the article says.

    But wait, there is a catch. That’s really not how it happened, because bandwidth is not bandwidth is not bandwidth, as Gertrude Stein once said. Bandwidth is made up of the layer 1 (physical, say coax, fiber or wireless mediums) and layer 2 costs (transport protocols or electronic costs), but also can be impacted by layer 3 switching (tradeoff for both physical and transport layer costs depending on volumes and type of traffic and market segments). In many instances, there is monopoly in one geographic area or layer and it can create monopoly like price/cost constraints everywhere depending on the nature of the application and market segment in question. But I digress. Let’s keep it simple.
    So let’s just say we have an oligopoly (we really don’t on a national basis even in wireless, which is a duopoly as far as the upper 50% of the market of users is concerned) then we could reasonably expect to see 15% declines. Then that same circuit would cost $984 today, 80% below prices 10 years ago, but more importantly, 63% more gross margin for a competitor or value added reseller or application developer to work with to put into innovation or leasing or building out additional capacity to serve more demand. Uh oh! Now the conclusions in the article don’t look so rosy.

    But wait. It gets worse. Very large users still get to take advantage of Moore’s law because they have buying power and, well, because they can; 95% of us can’t. If you peek under the covers of our telecom infrastructure, private networks are alive and well and getting cheaper and cheaper and cheaper. (And in the process creating diseconomies for the rest of us left on public networks.) In urban markets, where most of these large customers are, price declines (but only for these customers) have followed Moore’s law, on the order of 35% per year. So that $5,000 circuit is in fact approaching or at $67. A whopping 98.65% cost savings! Absolutely f—king insane! Most people don’t realize or understand this, but it’s the primary reason Google (a huge “private” user) can experiment with and “market” a 1 gig connection for $70/month to the end user (the other 95%) in Kansas City.

    Finally, let’s not forget about Metcalfe, our 5th and final state of competition. (By the way, he’s the generally accepted originator of Ethernet, which powers all our local area data networks (LANs), increasingly our metro-area (MANs) and wide area networks (WANs), and most importantly all the Wi-Fi connections in our mobile devices.) Metcalfe’s law (or virtue, as I call it) is that the cost of a network goes up linearly while the potential pathways, or traffic possibilities, or revenue opportunities go up geometrically. In other words, costs go up minimally while value grows exponentially the more people use it. The internet and all its applications and application ecosystems (Facebook, Google, Amazon, etc.) are one big example of Metcalfe’s law. And the implication of Metcalfe’s law is to scale Moore’s law by a factor of 2. Trust me on this, it’s slightly more complicated math but it’s true. So let’s be conservative and say the price drops per bit should be around 60% (10 percentage points less than 2 times 35%). That means the $5,000 circuit of 10 years ago should cost… $0.52 today. Huh???? 52 cents?!?! Say it ain’t so!
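
    The compounding behind those figures is easy to reproduce (a sketch of the arithmetic above, assuming a $5,000 starting price and a constant annual percentage decline for each state of competition):

        # Reproduce the 10-year price paths for a circuit starting at
        # $5,000/month under the annual declines assumed above.
        START_PRICE = 5000
        YEARS = 10

        scenarios = {
            "monopoly (6%/yr)": 0.06,
            "duopoly (10%/yr)": 0.10,
            "oligopoly (15%/yr)": 0.15,
            "competitive private (35%/yr)": 0.35,
            "competitive public (60%/yr)": 0.60,
        }

        for name, annual_decline in scenarios.items():
            price = START_PRICE * (1 - annual_decline) ** YEARS
            print(f"{name:30s} ${price:,.2f}")
        # Prints roughly $2,693, $1,743, $984, $67 and $0.52 -- the figures cited above.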

    That, my friends, is what could be. What it should be. And unfortunately, it’s why the application world is looking like it’s 1999 all over again, why our economic growth has declined, and why countries that don’t have our information-economy scale have raced ahead of us. It’s not about density or regulatory edict/policy. It’s simply about the lack of competition. Actually, it’s about the right type of competition, which requires another article. Competition of the 1980s and 1990s brought about digitization and price declines of 99% in voice, data and wireless. But revenues grew anywhere from 6%-25% annually in all those segments because 3 forms of demand elasticity occurred: 1) normal demand/price elasticity; 2) shift from private back to public; and 3) application-driven elasticities. Basically there was a lot of known/perceived, latent/stored and potential demand that was unleashed simply by price declines. And that is precisely the point we are at today, one that few if any appreciate, and one contrary to the assumptions and conclusions of the article.

     
 