November 05, 2003

Moore's Law Continued

Intel now thinks it sees how to achieve another four-fold increase in transistor density:

Intel Claims Breakthrough in Chip Making: ...With today's transistor gates - which consist of a piece of material that functions like a water faucet for electrical current - approaching thicknesses of just five atomic layers, computer chips have come to require more power, which causes them to run much hotter. Intel, the world's largest chip maker, has been struggling with the problem of excess heat as it has moved from etchings as small as 130 nanometers to the even narrower 90-nanometer limit.

Intel's chips have been running significantly hotter with each generation and there have been recent reports that the problem has caused a delay in its most advanced version of the Pentium, the Prescott. The new Intel technology would not be in use until about 2007, perhaps three generations of chip advances into the future. The industry is just now making the transition to 90 nanometers. After that it hopes to scale down to 65 nanometers, followed by a leap to just 45 nanometers, where the new material, which Intel refuses to identify, would come into play...
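The "four-fold increase in transistor density" follows from simple geometry: density scales roughly with the inverse square of the linear feature size, so halving the feature size quadruples the transistors per unit area. A quick sanity check on the node progression mentioned in the article (my arithmetic, not Intel's):

```python
# Transistor density scales roughly as the inverse square of the
# process feature size: halving the linear dimension quadruples the
# number of transistors per unit area.
nodes = [130, 90, 65, 45]  # process nodes in nanometers

for old, new in zip(nodes, nodes[1:]):
    density_gain = (old / new) ** 2
    print(f"{old}nm -> {new}nm: ~{density_gain:.1f}x density")

# Going from 90nm to 45nm gives (90/45)**2 = 4 -- the four-fold
# increase the article describes.
```

So the 45-nanometer node where the unidentified new material comes into play is exactly the point where density has quadrupled relative to the 90-nanometer transition happening now.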

Posted by DeLong at November 5, 2003 08:00 AM | TrackBack

Comments

This is really fascinating. Someone said that the microchip was the first truly advanced thing mankind has produced (not me, I would go for the knitted sweater; try to make one if you don't think it's advanced) - a microchip is like a map of the whole world, with every road represented by a connection and every single building on it represented by a transistor.

And the dollars-per-multiplication diagram in Prof. DeLong's State of the Business Cycle was nice too. If it had displayed a longer stretch of history, how would the advent of Briggs's logarithm tables have looked in that diagram?

Posted by: Mats on November 5, 2003 09:26 AM

____

The material, most likely, is synthetic diamond.

http://www.wired.com/wired/archive/11.09/diamond.html

Posted by: Josh Narins on November 5, 2003 09:33 AM

____

> And the dollars-per-multiplication diagram in Prof.
> DeLong's State of the Business Cycle was nice too. If it
> had displayed a longer stretch of history, how would the
> advent of Briggs's logarithm tables have looked in that
> diagram?

Good question, but I'm not sure I would know what the answer would mean. I think you could argue that *demand* for multiplies is so different now from what it was then (or even in the 1850s). Right now, we're using a lot of our multiplies to dematerialize the world, and that's the really big change.

But one other point from that original post strikes me as dead wrong:

http://www.j-bradford-delong.net/movable_type/bcff/ff_2003-11.html

I think the idea that "real" hardware investment looks anything like his last figure is just insane. The software curve is far more believable. Hardware capability tells you about our computational capacity, but not necessarily about our computational usage. If actual cycles used had been the measure plotted, I'd expect to see a much flatter curve there.

Posted by: Jonathan King on November 5, 2003 09:52 AM

____

Jonathan, I think you're right that there are potentially issues with 'real' computer hardware investment, but I don't think that invalidates Brad's point from the business cycle post.

There doesn't seem to be any doubt that the price per unit of computing power has been plummeting, so $6 billion (roughly the annualized nominal change in computer investment in Q3) represents a lot more computing than the same expenditure in 1996. (I'd note that the computer hardware price index -- which with 1996=1 would be around 0.17 based on the advance GDP figures -- isn't falling as fast as cost per xIPS or xFLOPS.) Other computer-related stuff is just flatly a lot cheaper now than in, say, the mid-90s -- displays, for instance.
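The deflation arithmetic behind that point can be sketched in a few lines. The figures below are the round numbers from the paragraph above (roughly $6 billion annualized nominal change, a hardware price index of about 0.17 with 1996 = 1), not official BEA data:

```python
# Hedged sketch of the deflation arithmetic: converting a nominal
# change in computer investment into 1996-equivalent "real" terms.
# Both inputs are approximate figures from the comment, not official data.
nominal_change_bn = 6.0   # $bn, annualized nominal change, Q3 2003
price_index_2003 = 0.17   # hardware price index, 1996 = 1 (approx.)

# The same nominal outlay buys roughly 1/0.17, or about 5.9 times,
# the computing power it bought at 1996 prices.
real_change_1996_bn = nominal_change_bn / price_index_2003
print(f"~${real_change_1996_bn:.1f}bn in 1996-equivalent computing")
```

That is the sense in which $6 billion of nominal spending today "represents a lot more computing" than the same expenditure in 1996 - and the gap would be wider still if the deflator tracked cost per xIPS or xFLOPS rather than hardware prices.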

You rightly raise the issue of what the computing power is being used for, though I'd add that how and not just that the additional cycles are used matters, too. Some of it, clearly, *is* enabling new (or expanded) 'marketable' computation services, and describing that as a higher 'real' economic investment doesn't bother me at all.

I see a grayer area in how one values (for instance) the computational power thrown at seemingly inconsequential user-interface elements and what have you -- à la the progression from Office 2000 (Pentium 75+) to Office 2003 (Pentium III 233+). Since the functional differences in, say, Word or Excel over the last several Office releases seem to require an advanced degree in marketing to discern, the question arises of how, if at all, the upgrade expenditures expand productive potential.

Other questions seem to arise in making sense of payments under licensing schemes for some 'enterprise' software (which probably amounts to the bulk of the expenditure). Anyway, I can see translating software expenditure changes into 'real' investment as being tricky.

This isn't necessarily beyond the reach of some clever analysis, but I have no real idea how the price indexes might reflect that.

Posted by: Tom Bozzo on November 5, 2003 11:05 AM

____

And by the time you figure out how to quantify computational value (think of the ecosystem grid, by the way, as the most data-intensive application on the horizon), it will be obsolete:

http://www.theregister.co.uk/content/61/33711.html

(I have no clue what a British 'boffin' is, but I hope it is not insulting.)

Posted by: Nobody on November 5, 2003 02:53 PM

____

It may be that they'll be leapfrogged by IBM in processor technology. The complexity of the x86 series of processors is such that this technology will be necessary for Intel before it's necessary for IBM's PowerPC line.

Posted by: Randolph Fritz on November 5, 2003 10:51 PM

____
