October 09, 2002
Technology and Opportunity

Keynote Talk at the Francisco Partners Fall 2002 Investors Conference
The Inn at Spanish Bay, Pebble Beach, CA
Slides

J. Bradford DeLong
U.C. Berkeley and NBER

October 8, 2002

Version 1.1

Last week somebody told me of his thoughts, back in the spring of 2000, when he read Yale Professor Robert Shiller's book, Irrational Exuberance. "A lot of good points here. This guy is broadly right," he remembered thinking to himself. Then he came to the passage where Shiller said that even a 50 percent decline in the S&P Composite was not out of the question. "Naaah!" he thought. "Now he's gone too far. Shiller's overstating the downside."

Well, we have had our 50 percent decline in the S&P Composite from its early 2000 peak. And we are still not back to historical price patterns: the current earnings yield on the S&P is 3 percent. The historical average yield is more like 5 to 7 percent, depending on the phase of the business cycle and the state of interest rates. It is true that historical average pricing patterns never made any sense: they made diversified equities overwhelmingly attractive relative to debt for any long-run investment strategy. But anyone who wants to make large leveraged bets on the long-run rationality of stock market pricing needs to remember the old saw: "the market can remain irrational longer than you can remain solvent."

We live in interesting times indeed. If three years ago we thought that we were all geniuses--having temporarily forgotten the other old saw: "Don't confuse being a genius with investing in a bull market"--today it is clear that making wise investments is more than usually difficult and subject to more than usual amounts of risk.

But the task you all face is not insuperable. First of all, the underlying technological revolution that is creating the "new" information-age economy and that powered the bull market of the 1990s is still at work, still proceeding as powerfully as ever. An officially-measured nonfarm business labor productivity growth rate of 5 percent in 2002 is not out of the question. An officially-measured nonfarm business labor productivity growth rate in excess of 4 percent is guaranteed. Crude corrections for the biases in official statistics lead to the conclusion that the average American worker will be perhaps 6 percent more productive in 2002 than he or she was in 2001. Of course, 6 percent productivity growth coupled with 3.5 percent output growth means that American workers will be working 2.5 percent fewer hours in 2002 than in 2001--and since most of that reduction comes not from fewer overtime hours but from fewer jobs, the unemployed are not pleased. But such an upward surge in productivity is almost unheard of while the unemployment rate is still rising. Most of these productivity gains will be captured by consumers and workers in lower prices and higher wages, but some will be captured by firms in higher profits--and the faster productivity growth we think we see in the 2000s than in the 1970s and 1980s means that the profit slice of the pie will be absolutely larger.
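
A rough way to check the hours arithmetic in the paragraph above is the approximate growth-accounting identity that output growth equals productivity growth plus hours growth. A minimal sketch in Python, using only the 6 and 3.5 percent figures assumed above:

    # Rough identity: output growth ~= productivity growth + hours growth
    # (exact in logs; approximately additive for small rates).
    productivity_growth = 0.06   # assumed ~6 percent productivity growth, 2001 to 2002
    output_growth = 0.035        # assumed ~3.5 percent output growth
    hours_growth = output_growth - productivity_growth
    print("Implied change in hours worked: %.1f percent" % (100 * hours_growth))  # about -2.5 percent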

Second, we expect the underlying technological revolution to continue, and to continue at an unheard-of pace. Yale's William Nordhaus estimates that productivity in computation has improved at a steady pace of 60 percent per year since the end of the 1940s. Physical law imposes no limits on such productivity increases for at least another decade. And data transmission and storage have their own versions of Moore's law. Sixty percent per year productivity increases for more than half a century with no end in sight in sectors of the economy that account for 6 percent of final demand--let's compare that to the historical record. The original British industrial revolution: how big was it? In 1780 it took hand-spinning workers 500 hours to spin a pound of cotton. By 1840 it took machine-spinning workers only 3 hours to perform the same task--a rate of technological progress of 8.5 percent per year sustained across more than half a century (Freeman and Louca (2001)) in sectors that accounted for 3 percent of final demand. Our current technological revolution is seven times as fast. It is twice as salient, measured by the share of its products in total demand. And it has gone on for at least as long. That makes me think that our current technological revolution is, relative to the size of the economy in which it takes place, at least fifteen times as large as the industrial revolution studied in school.
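
To make the back-of-the-envelope comparison concrete, here is a minimal sketch in Python that reproduces the arithmetic from the figures quoted above; the "how much bigger" multiplier is nothing more sophisticated than the ratio of growth rates times the ratio of shares of final demand:

    # British industrial revolution: hand-spinning (1780) vs. machine-spinning (1840).
    hours_1780 = 500.0   # hours to spin a pound of cotton by hand
    hours_1840 = 3.0     # hours by machine
    years = 1840 - 1780
    spinning_growth = (hours_1780 / hours_1840) ** (1.0 / years) - 1.0
    print("Spinning productivity growth: %.1f percent per year" % (100 * spinning_growth))  # roughly 8.5-9 percent

    # Information-technology revolution (Nordhaus's estimate for computation).
    it_growth = 0.60          # roughly 60 percent per year since the late 1940s
    it_salience = 0.06        # IT sectors' rough share of final demand
    spinning_salience = 0.03  # cotton textiles' rough share of final demand

    # Crude size multiplier: (ratio of growth rates) x (ratio of shares of final demand).
    multiplier = (it_growth / spinning_growth) * (it_salience / spinning_salience)
    print("Crude size multiplier: about %.0f times" % multiplier)  # in the 13-15 range, depending on rounding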

And--this is the third point--industrial revolutions create huge mismatches between supply and demand, as newly-wealthy consumers demand things that do not yet exist and as innovative producers see ways to produce what has been demanded in vastly cheaper ways. Instead of a smooth flow of supply and demand, the economy becomes full of turbulence. Air pockets. Vortices. And each such vortex is the source of a fortune for those keen-eyed and clear-thinking enough to see it coming.

For example…

Let's cast our minds back a century and more to the railroadization of America. Like much of telecom, software, and even hardware today--consider that Intel now spends upward of $2 billion for a single fabrication plant--railroads promised enormous economies of scale: build it once, and then run any number of trains over it. Like our modern data processing and data communications technologies, the railroad was an enormous technological leap: it lowered the cost of transporting bulk goods by land by a factor of a hundred. Like our internet boom, there was extraordinary and wild enthusiasm for railroad investments: if they could capture as profit just one-tenth of the transportation cost savings the railroads had achieved… And like our internet bust, there were moments when investors in New York and London suddenly realized that they had been total fools and idiots to sink their money into a railroad running west from Duluth across northern Minnesota when next to nobody lived in northern Minnesota, or would live there for a generation.

The biggest of the railroad frauds took place in the 1870s, when it appeared that the Union Pacific was bribing a third of the U.S. Congress, and that the Central Pacific--run by Crocker, Hopkins, Huntington, and Stanford--had paid an extra $40 million to a construction company owned by--guess who?--Crocker, Hopkins, Huntington, and Stanford. Money that the largely-British investors in the Central Pacific had thought was going into earth-moving and track-laying went, instead, to form the core endowment of a great West Coast University. The biggest of the railroad busts came in the 1880s, when the share of the American labor force engaged in building railroads looks like it fell from 7 percent to 2.5 percent in the course of a year and a half. The biggest of the railroad reorganizations came in the 1890s, when J.P. Morgan and Company, and its clients, picked up bankrupt railroads with greatly distressed debt for a song, and went about finding managers who could run them efficiently and profitably--as they could, for without the burden of amortizing the debt and equity issued during the previous bubble the new managers found that they could slash rates. And as they slashed rates, demand picked up.

As newly recapitalized railroads engaged in vicious price wars, and as the cost of transporting bulk goods by rail fell through the floor in the aftermath of corporate reorganizations, something else happened. Sears, Roebuck and Montgomery Ward discovered that--with the new post-reorganization much-cheaper railroad freight rates--all of a sudden the cost of shipping consumer goods to rural America was no longer a crippling burden. Mail a catalog to every rural household in America, offering them big-city goods at the near big-city prices that greatly reduced freight rates made possible, and rake in the money.

Or think of meatpacking. Swift and Armour. The New York Central, Pennsylvania, and New York, New Haven, and Hartford Railroads had crammed their losses down the throats of their ex-bondholders as they converted their debt to equity. Their new lower pricing structure meant that you could ship cows from the Midwest to New England and compete with local beef. Or, rather, you could mass-slaughter the beef in Chicago, ship it dressed to Boston, and undercut local small-scale Boston-area slaughterhouses by a third at the butchershop. This seemed to Swift and Armour to be a very good business plan. It promised to produce large profits for entrepreneurs and investors and a much better diet at lower cost for consumers.

But what if the Massachusetts legislature were to require--for reasons of health and safety, of course, with the desire to protect the jobs of Massachusetts voters who worked in Massachusetts slaughterhouses or on Massachusetts farms the furthest thing from their minds--that all meat sold in Massachusetts be inspected, live and on the hoof, by a Massachusetts meat inspector, in Massachusetts, immediately before its slaughter? Then Swift and Armour's business model--their profits, the lower prices of beef and higher standards of living for Massachusetts consumers--evaporates.

The turn-of-the-century U.S. government took this particular bull by the horns. The Food and Drug Administration preempted state health and safety regulations affecting interstate commerce. Without this particular move in the system of market governance, you wouldn't have had Chicago's turn-of-the-twentieth-century meatpacking industry.

We can see how the analogy applies today. For the Baldwin Locomotive Company of Philadelphia, read "Intel." For the Carnegie Steel Corporation that made the rails, read "Cisco." For the Central Pacific with its repeated bankruptcies, read "Global Crossing"--although somehow I don't think we are likely to have a Winnick University at the end of all this. Just as the shakeouts and the mass bankruptcies of the railroads in the late nineteenth century had everything to do with financial management and irrational exuberance, and nothing to do with the technological opportunities and long-run economic benefits of rail transportation, so today's shakeout has little to do with the ultimate economic benefits that data processing and data communications are going to bring.

And the next stage of the analogy? Where are today's counterparts of Sears, Roebuck and Montgomery Ward, who will find that modes of retail distribution that had previously been unthinkably costly are now gold mines? Where are today's counterparts of Swift and Armour, who will build technologically-sophisticated systems--the equivalent of assembly-line slaughterhouses and refrigerated transport cars--on top of vastly cheapened high-bandwidth links? Where are today's equivalents of Singer Sewing Machines, which found in its early years that it could not sell sewing machines without sending a salesman to give each wholesale customer an hour's worth of hands-on training in how to teach people to use the machine, and which would have found it impossible to bear the load of keeping its sales force in motion around the rural Midwest without the cheap railroad ticket?

I'm a professor. So I'm allowed to say that I leave these extensions of what I have said as exercises for the students.

Thank you.

Posted by DeLong at October 09, 2002 12:44 PM

Comments

So what happens to the economy when automation becomes a viable substitute for human labor, rather than a complement? Eg, imagine AI robots whose rental price is lower than the subsistence wage.

Posted by: Neel Krishnaswami on October 9, 2002 01:08 PM

The same thing that happened to the economy when machine spinning replaced hand spinning or any other technology came along that allowed fewer workers to produce more stuff. Also, the same thing that happened to the economy when trade allowed companies to replace $10/hour domestic workers with $0.50/hour foreign workers.

Posted by: richard on October 9, 2002 01:58 PM

'That makes me think that our current technological revolution is, relative to the size of the economy in which it takes place, at least fifteen times as large as the industrial revolution studied in school.'

Holy moley.

Posted by: Jason McCullough on October 9, 2002 03:15 PM

The 15 times as large an impact leaves me scratching my head. Wouldn't it be more proper to compare increases in productivity (generally) over the two equivalent periods? If you don't, you can easily ignore subsidiary effects. Now, if I believe (and understand) d^2, then productivity comparisons are pretty meaningless, but should point at least in the right direction.

Picking and comparing the IT industry to weaving, rather than "machinery" (which is, in my opinion, a far more valid comparison) makes the point, but I don't quite see it as fair or useful (except to make a pedagogical point, I suppose).

I agree that the rate is faster (which is a result of the telecommunications revolution sparked by Bell). But the net impact? How does one properly measure such a thing?

Posted by: B on October 9, 2002 05:00 PM

richard: this seems qualitatively different to me. If there is no field in which human labor is more productive than robotic labor, then it's a technology that won't get used -- no one spins cloth using handlooms (or 1830s-era machine looms, for that matter) anymore. But what does it mean to have a society in which there is no human labor participating in the economy? A world in which robots are taxed to provide for the human population? Maybe humans are cybernetically augmented until they can participate in the economy? I simply can't picture it. I feel like a 10th century writer trying to imagine a world without serfdom, but this is a world that I don't think is more than a few decades off....

Posted by: Neel Krishnaswami on October 9, 2002 07:08 PM

'If there is no field in which human labor is more productive than robotic labor'

I think it's unlikely the world will be overrun by computerized psychologists.

Posted by: Jason McCullough on October 9, 2002 08:33 PM

Re:

>>The 15 times as large an impact leaves me scratching my head. Wouldn't it be more proper to compare increases in productivity (generally) over the two equivalent periods?<<

If you mean average economy-wide productivity, I think the answer is yes and no. No, because we're interested in turbulence as well as average growth, and so the dispersion of growth rates across sectors matters.

So I think that (a) total productivity growth in the leading sector, times (b) the salience of the leading sector in the economy, is the right way--or a right way--to go.

Posted by: Brad DeLong on October 9, 2002 09:29 PM

> no one spins cloth using handlooms

That's what I thought until a trip to Ghana. Near our hometown, there's a rare type of cotton crop - a wild and distinctive variety which if marketed successfully could be the next big fabric... It has continued to be spun using handlooms for the past century. I talked to a young woman who has taken over the family farm over the past year. Very energetic and entrepreneurial, she had big plans to increase the yield of the farm and expand production. She was just getting around to finding 'new technology' and looking for small business loans...

I was aghast that handlooms are still the order of the day.

Posted by: Koranteng Ofosu-Amaah on October 9, 2002 09:32 PM

I find it very troublesome indeed that Moore's Law is being compared to loom productivity. This isn't my normal Cambridge quibble about aggregation of capital (though I make that too, and thanks to the guy who read it).

It's more a point that you cannot justify the "fifteen times" number unless you are going to count processing cycles as being the equivalent of cotton. Surely this is wrong; processing cycles are an input to the production process far more removed from the output of a consumable good than cotton is. A more meaningful analogy would be to look at the speed at which a competent secretary could revise a document.

In unrelated news, the word "attractive" in the following passage:

>>It is true that historical average pricing patterns never made any sense: they made diversified equities overwhelmingly attractive relative to debt for any long-run investment strategy. <<

is being made to do a lot of work; specifically, it is having to:

1) cover a lot of surprisingly strong assumptions about ergodicity of investment decisions
2) cover the assumption of a mean/variance model for risk aversion (pace Kahneman, for example)
3) cover the assumption that the historically measured ex post excess return was a good measure of the ex ante expected return
4) elide a number of tricky problems in the allocation of credit and the demographics of the savings market.

I count that a fourfold increase in the productivity of the word "attractive", which seems pretty good going even by the standards of Moore's Law.

Posted by: Daniel Davies on October 10, 2002 02:03 AM

Expanding on the less snotty of my comments above, if I remember correctly, Nordhaus measures computer productivity on the basis of a MIPS standard. I regard this as a highly misleading measure, because it ignores the concurrent development of high-level languages. Nordhaus counts this development as an *additional* benefit to the increase in MIPS available; I think that this is crazy.

Fundamentally, as I've intimated above, you don't actually use MIPS in your consumption. Let's take the example of a video game rather than anything physical to simplify matters. MIPS are more like an "input" into a computer program written in a high-level language. The combination of a program and MIPS produces as outputs a modulated stream of bits, which when combined with some more MIPS and a display unit, produces a consumption good.

Measuring computing power in MIPS wouldn't be a problem if the physical productivity of computer programs had remained constant. It hasn't. It's fallen massively. Modern high-level computer languages are massively wasteful of MIPS because they have to do incredibly complicated things at a level of abstraction which makes them tractable to humans (I'm talking about the distinction between "source code" and "binaries" for Lessig fans). There is no way in which you could write something like Windows if you had to use the same machine code John von Neumann used for his Fourier transforms. In order for something like Windows to be possible at all, it has to be written in a way in which far more MIPS are needed to produce the modulated bitstream. I haven't seen any work by Nordhaus or anyone else looking at the productivity of computer programs in MIPS terms which might compensate for this problem, which would seem to be to be serious enough to make the 60% growth estimate massively out.

In other words, Nordhaus' measure only works if we were to assume every MIPS was being used in the same superoptimised manner in which they were used when MIPS were incredibly scarce and precious resources. The productivity of an input to the production process depends on its price, which depends on its productivity ... which makes me think that this point is less tangential to the Cambridge critique than I had originally believed.

Posted by: Daniel Davies on October 10, 2002 02:32 AM

Daniel: you certainly can write a modern operating system in assembly language. QNX is an example of an operating system that is just that. Furthermore, most operating system kernels (including Windows, Linux and Mac OS X) are written in C, which is only modestly higher level than assembly. It's not a matter of feasibility, so much as it is just old-fashioned substitution of a cheaper input (MIPS) for a more expensive one (human labor).

The way you make this substitution is by substituting a higher-level language for a lower-level one. This means that the productivity improvement is at least 60% a year, because programmers could always continue to write in the old style.

Inverting your argument, more MIPS also let you create new products that could not have existed otherwise. For example, I used to work for Bob Shiller's company CSW, which produced (among other things) repeat-sales home price indexes. The models used to create these just could not have been run 20 years ago because the MIPS weren't there.

Posted by: Neel Krishnaswami on October 10, 2002 03:40 AM

>>Furthermore, most operating system kernels (including Windows, Linux and Mac OS X) are written in C<<

Give over Neel; this is a dodge. While they are written in C, they are not written in anything resembling the kind of optimised machine-code-like C that would be needed to make MIPS the limiting factor. If you follow sensible procedures relating to modularity and code reuse, then it's a matter of semantics whether you're using C (with a lot of libraries) or some higher level language which happens to share the C syntax; it's still the case that a lot of the MIPS you're using are being used in checking for cases other than the one which is your current concern; that simply wasn't the case for the old mainframe programmers. In the interests of readable source code, many of the MIPS that you can throw at a program these days are spent in dead cycles, running through parts of the operating system that have nothing to do with the program you're actually using.

I think we're coming at two different angles to the same point here, however; the bottleneck in computing is not MIPS, so increases in MIPS do not stand as a good proxy for increases in productivity. Think about it as a nonlinear constrained optimisation problem; MIPS are a constraint which is for the most part non-binding, so increases in MIPS are for the most part not going to have any direct effect.

I considered your argument about MIPS as a lower bound on productivity, but I don't think it stacks up for the majority of computer programs in actual use. Take my example of a video game; how would it help to use the maximum MIPS on a Pentium III to create a version of Taito's Space Invaders which moved the aliens at approximately a million times faster than the screen refresh rate? Your CSW point is important mainly because it shows how unusual it is to find an economically important application in which "computations" are the direct output of computing. Usually, they're an intermediate stage in a much slower production process. Broadly speaking, as soon as you introduce a GUI, I would expect that MIPS become irrelevant to improvements in the productivity of your program.

Posted by: Daniel Davies on October 10, 2002 05:58 AM

Further to the above, if I understood and remembered my reading of Brooks' "The Mythical Man-Month", I think I'm correct in saying that the decision to *have an operating system at all* underlines the fact that the increase in MIPS is going to lead to a less than proportionate increase in productivity. Surely the entire point of having an operating system is that you have spare MIPS to allocate to the purpose of optimally scheduling tasks which compete for a scarce system resource which is not MIPS (memory and/or disk accesses, for example)?

This would suggest that, in all probability, disk retrieval speed equivalent would make a better proxy for computer productivity than MIPS-equivalent. Which would presumably still give some pretty fantastic figures for productivity, but not the same ones. And since the platoon marches at the speed of the slowest unit, I begin to suspect that the MIPS 60% is an *upper* bound on productivity growth.

Posted by: Daniel Davies on October 10, 2002 06:06 AM

Daniel, your main point -- that most tasks aren't heavily compute bound -- is exactly why more MIPS are helpful. If you have a heavily compute-bound task, you can't afford to substitute MIPS for human labor, and have to spend a lot of expensive programmer time optimizing your program to run within your time budget. If the task isn't compute-bound, your programmer can use a very high level language to quickly solve the problem. This increases the number of tasks completed per unit time, which is the real measure of productivity. As we get more MIPS, the fraction of compute-bound tasks declines, and you can substitute MIPS for labor more and more heavily by using high-level languages.

Space Invaders is probably a good example. If I had to write such a game today, I would use a high-level language with rich libraries to do it, and I would be extremely astonished if it took me more than a few hours to do. If I had to write it in assembly, it might take a month or two. That's where the productivity increase is coming from. It would be profoundly wasteful of memory and cycles, compared to the original, but I could create it much more quickly.

Posted by: Neel Krishnaswami on October 10, 2002 07:36 AM

I think I put it too strongly; I'm not suggesting that MIPS aren't very wonderful things, or that they aren't part of the story. I'm sure that the relationship between MIPS growth and productivity growth is monotonically positive. I'm suggesting that increases in MIPS are only a decent proxy for increases in productivity during those periods you describe; when an increase in MIPS suddenly removes the constraint on a class of applications.

Put it this way, sticking with the Space Invaders example. In the last twelve months, MIPS available to you may have grown by around 60% (assuming this kind of Moore's Law relationship). How much faster can you write that Space Invaders game today than you could last November?

In fact, let's dramatise this; taking your estimate, Taito launched Space Invaders in (I think) 1978. Assuming it would have taken you 30 days of 10-hour days to write this in assembler in 1978 and one hour to do it today, that's a growth rate of just under 26% per annum. Which is still pretty good, of course… although I'd emphasise that this point guesstimate doesn't mean anything; I'm just doing it in order to make the point that the way in which we arrived at it doesn't have anything to do with MIPS.

>>It would be profoundly wasteful of memory and cycles, compared to the original, but I could create it much more quickly.<<

Yes, exactly; this is what I mean when I say that I don't think we're in disagreement. Would you agree that the most relevant measure of productivity would be something more closely linked to the speed at which you create a given program, or more generally, could use a computer to achieve a given cognitive task? If we disagree, it's about the extent to which MIPS are a useful proxy for this quantity.

Posted by: Daniel Davies on October 10, 2002 08:23 AM
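
For what it's worth, the Space Invaders guesstimate above can be checked mechanically. A minimal sketch in Python, using the same assumed figures (roughly 300 hours of assembler work in 1978 versus about an hour today) rather than any actual data:

    # Implied annual growth in programmer productivity on this one task.
    hours_then, hours_now = 300.0, 1.0   # 30 days of 10-hour days in 1978 vs. roughly 1 hour now
    years = 2002 - 1978
    rate = (hours_then / hours_now) ** (1.0 / years) - 1.0
    print("Implied growth: %.0f percent per year" % (100 * rate))  # about 26-27 percent, depending on the span assumed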

and, double-posting like the quintessential internet loon I am, what do you think of my argument relating to RAM and disk-space above? I can see how high-level languages could let you substitute MIPS for labour; I have more of a problem with substituting MIPS for storage capability, in any way which did not seriously compromise the productivity of a MIPS.

Posted by: Daniel Davies on October 10, 2002 08:28 AM

But wait - isn't Neel coming at the productivity question from the wrong side? I actually suspected this before his last post, in which he gives away the game. Who cares whether Space Invaders takes a day or a month? Yes, that's a factor of 30 gained through MIPS, but that's the productivity of _one person_. The whole point of the computing revolution - and the reason it took almost 20 years to really impact the economy - is that the gains come from the increased productivity of everyone using well-designed programs. The economically important productivity gain isn't on the part of programmers, it's on the part of users - and a lot of factors come between MIPS and end-user productivity.
Now - I'm an architect, and the strides made in CAD just since I entered school a dozen years ago are stunning - and largely MIPS-driven. And, unlike a lot of computer-driven tasks, some of mine are purely MIPS - rendering of a 3D model is all about chip speed (and buses, etc., but you get the point). But it's not a straight line productivity improvement, because there are still enormous portions of building that model that are the same no matter the chip speed, no matter the program language; or rather, the improvements permitted by increased MIPS (like windows that snap into walls but not couches) are not exponentially more productive than they were 2 years ago, even if the chips are.
I'm not suggesting that productivity gains haven't been huge; I'm just trying to show how deceptive MIPS can be, and that the programming discussion is almost irrelevant.
One final point - OK, two final points:
1. As everyone must be aware, there's speculation that, although MIPS haven't hit a wall, demand for MIPS has, and that's part of the reason for the slowdown in PC sales. Now I'm not saying for a moment that this status will remain quo for long - some killer app will come along and everyone will run out to buy a new PC - but I think it indicates all the factors other than MIPS that relate to computer productivity.
Which leads directly to the second point, which is that the productivity bottleneck is in design, interface, and user training. We all know that most people don't use 90% (whatever) of the features in MSWord, and not just because some of the features are of dubious utility. The big gap between the productivity potential embodied in Word and the actuality is (almost) entirely due to the 3 issues I cited. And it's not clear that they're being addressed by anyone (except, in some ways, Apple and other smaller players). The old fear was that once cars got too complicated, no one would be able to work on them, and the industry would crash. Well, thanks to technology, no one except trained mechanics needs to work on cars anymore, and the net benefits are huge to everyone but weekend greasemonkeys. But since everyone in corporate America has a computer on his or her desk, that model doesn't apply. Or rather part of it does - no one needs to know DOS anymore, just as no one needs to adjust a carburetor anymore. But the training gap - and the willingness to use the tools - are both bigger drags on computer productivity than MIPS ever will be again.
Was that 3 points? Nobody expects...

Posted by: JRoth on October 10, 2002 08:42 AM

'Measuring computing power in MIPS wouldn't be a problem if the physical productivity of computer programs had remained constant. It hasn't. It's fallen massively. Modern high-level computer languages are massively wasteful of MIPS because they have to do incredibly complicated things at a level of abstraction which makes them tractable to humans (I'm talking about the distinction between "source code" and "binaries" for Lessig fans). There is no way in which you could write something like Windows if you had to use the same machine code John von Neumann used for his Fourier transforms. In order for something like Windows to be possible at all, it has to be written in a way in which far more MIPS are needed to produce the modulated bitstream. I haven't seen any work by Nordhaus or anyone else looking at the productivity of computer programs in MIPS terms which might compensate for this problem, which would seem to me to be serious enough to make the 60% growth estimate massively out.'

As a counterbalance, those high level languages have led to significant drops in the price of development. Speed increases can be used to either improve application speed or reduce development costs, both of which benefit consumers (subject to market power constraints for development cost drops).

'and, double-posting like the quintessential internet loon I am, what do you think of my argument relating to RAM and disk-space above? I can see how high-level languages could let you substitute MIPS for labour; I have more of a problem with substituting MIPS for storage capability, in any way which did not seriously compromise the productivity of a MIPS.'

If it's not in RAM, you have to get it off the disk, which is slower. If it's not on disk, you have to get it off the godforsaken backup, which is even slower. Increases in size (unbelievably large) and speed (very large) of storage have led to real gains, even for the average secretary banging out a document.

I think we've hit a wall on office-type applications, but that's anecdotal.

Posted by: Jason McCullough on October 10, 2002 09:29 AM

Oh, administration and maintenance costs have dropped through the floor due to the higher-MIPS powered switch in development languages, too.

Posted by: Jason McCullough on October 10, 2002 09:32 AM

Enough sidetracking already!

The question was: How can this power (MIPS, telecommunications bandwidth, etc.) be used for economic good and not for evil?

I suggest new topics:
Amazon is the new Sears Roebuck.
Targeted business information (inventory control, enterprise data, sell-through) is available sooner and will become instantaneous.

Resume discussion.

Posted by: Pontif on October 10, 2002 09:49 AM

>>If it's not in RAM, you have to get it off the disk, which is slower. <<

Exactly. So computer productivity, in terms of the speed of performing computer tasks, will progress at the rate of the solution to a constrained optimisation problem, the constraints being MIPS, RAM, disk capacity and disk access speed. Which was where I came in; it's extremely unlikely that MIPS has been the limiting constraint for the entire last sixty years, so the Nordhaus MIPS-based 60% growth rate is quite possibly a serious overestimate.

Posted by: Daniel Davies on October 10, 2002 09:54 AM
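
One way to see why a fast-growing but non-binding resource contributes little is an Amdahl's-law-style sketch. In the Python fragment below, the 60 percent MIPS figure is the one under discussion; the other growth rates and the time shares are purely hypothetical, chosen only to illustrate the shape of the argument:

    # If a task's time is split across several resources, the overall speedup is
    # dominated by the slow-improving components, not by the fastest one.
    growth = {"cpu_mips": 0.60, "ram_bandwidth": 0.35, "disk_access": 0.15}   # hypothetical annual improvements
    time_share = {"cpu_mips": 0.2, "ram_bandwidth": 0.3, "disk_access": 0.5}  # hypothetical shares of task time today

    # After one year, each component's slice of the task shrinks by its own improvement factor.
    time_next_year = sum(share / (1.0 + growth[r]) for r, share in time_share.items())
    speedup = 1.0 / time_next_year - 1.0
    print("Overall task speedup: %.0f percent per year" % (100 * speedup))  # well under 60 percent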

Could someone please show me anyone in the computer industry who actually uses MIPS? Instructions per second makes no sense unless you also state how much is done per instruction. The comparison is only useful within an architecture, and even then, not always. Instead, applications are benchmarked. It would be a bit more rational to substitute SPEC scores or database queries per second, or any of the actual benchmarks used in industry. That gives you the net increase in ability to do something useful--which is what we are after (we being a pretty broad term here).

Even that is tricky, because some things are negatives. The use of powerpoint to present graphs is one...my feeling is that the introduction of animations into powerpoint led to a net decrease in how much information was transmitted, as did the rise of shitty default backgrounds. I think this feeling is shared....

The measure of productivity in these cases is only tenuously tied to computer speed. The use of computers for editing of documents greatly increases speed...a somewhat nicer computer program (WYSIWYG) does not. The measure of the system ought to be how fast you can type in a document, and do whatever you need to do with it. The existence of Clippy used more processing, but did nothing to increase production of office documents.

For instance, rather than Space Invaders, take the game Quake. A decent computer can run Quake at a speed of 300 fps (frames per second), but can run Quake 3 (Quake with a graphics update) at perhaps 50. But fundamentally, both are the same basic game. So is the total system more or less productive? id Software is basically the same size, so programming time did not change.

What is the gain there? The same argument can be made for many systems. The only proper measure is the net increase in ability to make stuff. In some things, that has increased mightily, and in others only modestly.

By the way, (hard drive) memory usage has increased tremendously--much, much faster than Moore's-law levels--recently at least. 100% per year for the past few, 60% before that. It mostly allows me a big mp3 collection and the ability to use RCS on documents. More productive--I dunno--but lots more fun.

B

Posted by: on October 10, 2002 10:05 AM

'But fundamentally, both are the same basic game.'

In terms of gameplay, yes. A quick glance at monitors (and speakers) running Q1 and Q3 side-by-side will disabuse you of any other equivalences, though.

Posted by: Jason McCullough on October 10, 2002 01:25 PM

>>>
and, double-posting like the quintessential internet loon I am, what do you think of my argument relating to RAM and disk-space above? I can see how high-level languages could let you substitute MIPS for labour; I have more of a problem with substituting MIPS for storage capability, in any way which did not seriously compromise the productivity of a MIPS.<<<

In the initial draft of my response, I actually addressed this, but then deleted it as tangential. Thanks for giving me an excuse to resurrect it. You are absolutely correct that the limiting factor is the slowest/smallest of RAM, disk, access times, CPU and bandwidth.

However, it's possible to trade these off against each other over a surprisingly wide band. A modern operating system is a much bigger beast than the OS/360 that Brooks wrote about. But that's because it does vastly more. Your average modern operating system uses amazingly sophisticated, aggressive algorithms to use CPU to avoid using memory bandwidth, and to use RAM to avoid touching disk, and in general to avoid the bottlenecks in the system. For example, my PC has not touched disk to load programs in weeks, because their program code has all been cached in RAM. Now, these sorts of contortions would not be worthwhile if you were writing to the bare metal. But if you amortize it over hundreds of programs, then the effort becomes worthwhile. That's why a big, modern OS will tend to make programs faster than if they were written to the bare metal. Obviously, you can't trade things off indefinitely, so at some point you run into the real resource constraints, but you can substitute space for computation over a surprisingly wide band.

I'm designing a new programming language now, and I've been surprised by how few of the resource constraints that the references I've consulted describe as significant actually bind me. For example, I have a really sophisticated type system.[*] Typechecking would have been a really painful task on the machines I used a decade ago, but it's now below the perceptual threshold on my two-year-old PC.

I don't actually know if productivity growth due to computer improvements is 20%, 40% or 60%. All I know is that it's huge, and it's feeding on itself -- you can use more sophisticated techniques for compilation and program analysis, which increase final performance *and* have a positive impact on development time.

[*] For the nonprogrammers here: a type system is a proof system associated with a programming language, that can detect certain kinds of inconsistencies in a program when it is compiled, rather than when it is run. As usual, the more expressive and specific the class of things your type system lets you assert about your programs, the harder it becomes to write a typechecker.

Posted by: Neel Krishnaswami on October 10, 2002 02:43 PM
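
To illustrate the footnote for nonprogrammers, here is a tiny hypothetical example, written in Python with optional type annotations (a feature added to the language well after this discussion; any statically typed language would serve, and this is not the language being designed above). A typechecker rejects the inconsistency before the program ever runs:

    # A typechecker reads the annotations and flags the call below at check time;
    # run naively, the program would just produce a nonsensical result.
    def total_cost(price: float, quantity: int) -> float:
        return price * quantity

    total_cost("ten dollars", 3)   # flagged by the typechecker: a string is not a float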

Brad, is there any way in which we could forward Neel's comments and some of the others on this thread to Nordhaus and you could post his response? I think we've reached a point of reflective equilibrium here; we know that the productivity increase is potentially very large, but the difference between 20% and 60% is several multiples of the Industrial Revolution, and given that we've not had any major social upheavals remotely on a par with that period so far, I think it's germane.

In case it's not obvious, I defer to Neel's comments on all remaining technical issues, by the way.

Posted by: Daniel Davies on October 10, 2002 03:35 PM

A BRAVE NEW WORLD
I explain why we have nothing to fear from a coming AI world here. There is no threat of subsistence wages in the offing. Phew.

Posted by: Ram Ahluwalia on October 12, 2002 11:07 PM

Perhaps we might be grateful that there appears to have been no analogue to the federal land grants that fed much of the transcontinental railway mania in the States (more at http://coldspringshops.blogspot.com).

Posted by: Stephen Karlson on October 16, 2002 03:57 PM

>>Just as the shakeouts and the mass bankruptcies of the railroads in the late nineteenth century had everything to do with financial management and irrational exuberance, and nothing to do with the technological opportunities and long-run economic benefits of rail transportation, so today's shakeout has little to do with the ultimate economic benefits that data processing and data communications are going to bring.<<

How do economists account for the mismatch between such long-term goods and the short-term irrationality needed to create them?

Posted by: Paul J Kelly on October 21, 2002 07:58 AM