Andrew McAfee offers a fascinating report on, and response to, a speech given last week by Northwestern economist Robert Gordon. Gordon, whose 2000 paper “Does the New Economy Measure Up to the Great Inventions of the Past?” is a classic of the genre, was, as McAfee notes, “one of the last serious scholars to accept the conclusion that IT was having a positive and significant impact on productivity in the US, but he came around when the data showed him clearly that this was in fact the case.”
Now, though, Gordon is having fresh doubts about IT’s long-term potential for boosting productivity. The doubts are spurred by the apparently sharp decline in productivity growth which began in mid-2004 and has continued to the present (following nearly a decade of robust growth). The decline, which, McAfee says, “threatens to take us back to the anemic 1972-1995 productivity growth levels,” has led Gordon to join
a growing chorus of voices who are arguing that the strong relationship between IT investment and productivity growth has broken down recently. If this is accurate, it’s quite bad news. Productivity growth is a primary engine of economic growth and, ultimately, of increases in standard of living. If the wonderful, unprecedented, and unanticipated productivity increases we’ve been enjoying since 1995 are in fact coming to an end despite our continued investment in computing, and despite the fact that computers continue to get much more powerful over time, then we have a problem.
McAfee does a nice job of summarizing Gordon’s pessimistic view – and of providing an optimistic counter to it. He believes that “like previous general purpose technologies IT is having a deeply transformative effect, which will take many years to play out completely.” But he admits that if the productivity numbers remain depressed, he will, like Gordon before him, have to modify his view.
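To make the stakes concrete, here is a quick back-of-the-envelope compounding calculation, sketched in Python. The 1.4% and 2.5% annual rates are illustrative stand-ins for the anemic and robust periods, not figures drawn from Gordon or McAfee:

# Illustrative compounding: why the gap between anemic and robust
# productivity growth matters so much over a working lifetime.
YEARS = 30
anemic_rate, robust_rate = 0.014, 0.025  # assumed rates, for illustration only

anemic = (1 + anemic_rate) ** YEARS  # output per hour, indexed to 1.0 today
robust = (1 + robust_rate) ** YEARS

print(f"After {YEARS} years at 1.4%: {anemic:.2f}x")       # ~1.52x
print(f"After {YEARS} years at 2.5%: {robust:.2f}x")       # ~2.10x
print(f"Living standards gap: {robust / anemic - 1:.0%}")  # ~38%

Small differences in annual rates compound into very large differences in material living standards, which is why a one-point slowdown is such a serious matter.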
G’Day from Australia,
I am a long-time visitor to your blog, though this is my first comment.
I really enjoyed the Gordon paper you linked to in this post. I agree with what you have written in the past: too little attention is given to historical perspectives on technology and management.
Do you have any other interesting references to share? Even if you have no direct links, they are not too difficult to access from any university network.
Thanks for the great blog. You should visit Australia some day :-)
Luciano
You cannot make the information worker more efficient with more computing power, but you can eliminate the information worker, which leads to higher productivity gains for the organization (and a higher productivity ratio per remaining information worker).
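A trivial sketch of that arithmetic, with made-up numbers (the output figure and headcounts are hypothetical):

# Productivity measured as output per information worker.
# If automation holds output constant while headcount falls,
# output per remaining worker rises mechanically.
output = 100.0       # units of output, assumed constant
workers_before = 10
workers_after = 8    # two roles eliminated

print(output / workers_before)  # 10.0 units per worker
print(output / workers_after)   # 12.5 units per worker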
Based on my own experiences with that market, I am convinced the next 5 years will see a dramatic increase in applied artificial intelligence (disclosure: that’s what my company sells, so I’m biased, but I’m also looking at concrete customer numbers).
This, by the way, is also a strong incentive for centralized service provision among organizations that would traditionally not consider it, usually for privacy reasons. We keep raising the intelligence of our software, and that requires levels of computing power that make it cost-inefficient to install the hardware at the client’s site.