## February 22, 2003

### Notes: Interesting Math Calculations

A suggestion for an "Interesting Math Calculations" problem--something I don't know the answer to offhand (both because I don't know how small semiconductor circuit features can be before quantum mechanics ceases to be your friend, and because I don't know how large semiconductor circuit features are today).

How long can Moore's Law go on? Starting from the average distance between atoms in a silicon crystal, find the time when chip features will be (supposedly) one atom wide...
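A minimal sketch of that calculation. The atom spacing, current feature size, and halving time below are assumptions supplied for illustration (~2.35 Å Si-Si bond length, 0.1 µm features, feature size halving every 18 months), not figures from the post:

```python
import math

# Assumed inputs (not from the post above):
atom_spacing_A = 2.35   # angstroms, approximate Si-Si bond length
feature_A = 1000.0      # angstroms, assumed current feature size (0.1 micrometer)
halving_years = 1.5     # assumed Moore's-law halving time for feature size

# Number of halvings needed to shrink from today's feature to one atom,
# then convert halvings to years.
doublings = math.log2(feature_A / atom_spacing_A)
years = doublings * halving_years
print(f"~{years:.1f} years until one-atom-wide features")
```

Under these assumptions the answer comes out to a bit over a decade.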

Posted by DeLong at February 22, 2003 02:42 PM | TrackBack

I recall an article on this subject in the NYTimes a couple of years ago; it's now in their paid archives. IIRC a good search string would be "ultimate laptop." A scientist hypothesized about the ultimate limits to physical matter's computational power. He ended up with a gedanken experiment about a quantum computer that is basically a blob of plasma, where the limiting factor is heat dissipation. Moore's Law no longer applied. It was fascinating; you should spend the $2 and look it up.

Posted by: Charles on February 22, 2003 11:35 PM

I reckon that you can no longer ignore quantum effects when the sizes and speeds (to be exact, the momenta: mass times speed), or the times and energies, are in the Heisenberg uncertainty range. If my understanding is correct, when the speed of the electrons passing through a gate, times the size of the gate, is less than the Planck constant divided by the mass of the electron and twice pi (3.14...), then we cannot know which side of the gate the electron is on, and so it cannot be used to control a process.
Probably the effects would be felt at larger sizes, but my English writing is too clumsy to refine my answer.
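The criterion above can be turned into a rough length scale: quantum effects dominate once the gate is smaller than hbar divided by the electron's momentum. A sketch, where the electron speed is an assumed illustrative value (the comment does not give one):

```python
# Rough sketch of the commenter's criterion: the gate becomes useless
# when gate size < hbar / (m_e * v). The electron speed is an assumption.
hbar = 1.0546e-34   # J*s, reduced Planck constant
m_e = 9.109e-31     # kg, electron mass
v = 1.0e5           # m/s, assumed electron speed in the channel

d_quantum = hbar / (m_e * v)   # meters: gate size where position
                               # uncertainty matches the gate itself
print(f"quantum-limited gate size ~ {d_quantum * 1e9:.2f} nm")
```

With these numbers the scale comes out on the order of a nanometer, i.e., tens of atoms, consistent with the worry that the limit is felt well before features are literally one atom wide.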

DSW

Posted by: Antoni Jaume on February 23, 2003 10:19 AM

Here's a good primer on Moore's Law:

http://arstechnica.com/paedia/m/moore/moore-1.html

Posted by: part zwei on February 23, 2003 11:36 AM

The essay you are looking for is titled "There's Plenty of Room at the Bottom," by Richard Feynman. It is in his book *The Pleasure of Finding Things Out* under the title "Computing Machines in the Future."

Posted by: Jonlongstrider on February 23, 2003 01:16 PM

The Ars Technica article is nice, and the guys who run the page link here every so often.

The point of the Ars Technica article is that Moore's Law is not that chips halve in size every 18 months, or get twice as fast--read it, it is really good. But pretending we haven't read it, we can make a nice little set of calculations.

Assume that the "deflation" rate for chip feature size is 45% per year (~70/1.5)*. Current state-of-the-art features are 0.1 micrometer, or 1000 Å--about 400 atoms long. Quantum effects--the kind we want to avoid--will be pretty serious at 20 angstroms; thinner than that, and things don't behave very solid. How long will that take?
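Using exactly the numbers in that paragraph (1000 Å today, a 20 Å limit, 45% shrinkage per year), the timing works out as:

```python
import math

# Numbers from the paragraph above:
start_A = 1000.0    # angstroms, current feature size
limit_A = 20.0      # angstroms, where quantum effects get serious
deflation = 0.45    # fractional shrink per year

# Each year the feature size is multiplied by (1 - deflation);
# solve (1 - deflation)**years == limit/start for years.
years = math.log(start_A / limit_A) / math.log(1.0 / (1.0 - deflation))
print(f"~{years:.1f} years to reach 20 angstroms")
```

That gives roughly six and a half years at the stated deflation rate.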

Quantum effects never stop, and are pretty important at all scales. It may get harder to make chips when the non-quantum effects start disappearing--things like resistivity!

The next tricky question is how power density will change. Current microprocessors dissipate 40 W over a 120 sq. mm die--more, sometimes. Die size is roughly constant, but power goes up as feature size shrinks. You can then plot the power density over time. Ouch.
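A hedged sketch of that plot, using the comment's figures (40 W over 120 sq. mm, constant die size) and assuming, purely for illustration, that power scales inversely with feature size while features shrink 45% per year:

```python
# Assumption (for illustration only): power scales as 1/feature_size.
power_w = 40.0      # watts today, from the comment
die_mm2 = 120.0     # sq. mm, die size held constant
deflation = 0.45    # fractional feature shrink per year

for year in range(7):
    feature_frac = (1.0 - deflation) ** year       # fraction of today's feature size
    density = (power_w / feature_frac) / die_mm2   # W per sq. mm
    print(f"year {year}: {density:6.2f} W/sq.mm")
```

Under that (aggressive) scaling assumption, power density grows from about a third of a watt per square millimeter to an order of magnitude more within the same six-to-seven-year window--hence the "ouch."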

Now you can also look at the power consumption vs. time.

*The rule of 70 is a very convenient trick: a quantity growing (or shrinking) at r percent per year doubles (or halves) in roughly 70/r years.

Those are the sorts of questions I would talk about. It also puts the question in the context of lesson 13.

Posted by: Brennan Peterson on February 23, 2003 10:09 PM

What are the prospects for chips running calculations with light waves rather than electrons?

Posted by: Anarchus on February 24, 2003 08:33 AM

Light waves? Very good--people are working on that rather intensively, and a new fibre structure looks really cool: it uses an interleaved lattice to block all but a pure wavelength (no harmonics, so no dispersion). Try searching the National Post for "hollow core fiber."

We're currently at 0.13 to 0.08 µm. The problem isn't at the gates, but rather how close together two wires are. The thickness (in absolute number of atoms) of the insulation is then the limiting factor for current methods: you can't shrink past a certain number of atoms or the insulator doesn't insulate. We're pretty close, and then you have bigger problems than quantum effects.

Posted by: libertarian uber alles on February 24, 2003 08:49 AM

What is most probable is that by the time quantum effects are unavoidable, computation will depend on those quantum effects. They will no longer be unwanted perturbations but the mechanism through which the computation happens. And if some of the things I've read about qubits are valid claims, we're in for some very exciting achievements.

DSW

Posted by: Antoni Jaume on February 24, 2003 01:27 PM

Intel seems to think that 22 nm is going to be the minimum feature size.

A VP from Intel gave a talk at a SF Bay Area event. Here's the URL to his presentation: