Monday, Oct. 06, 1997
CHIPS AHOY
By JOSHUA COOPER RAMO
Computer scientists, like the characters in Russian novels, tend to fall into two camps: the optimists and the pessimists. The pessimists grouse in books, at industry conferences and to every journalist in sight that the computer revolution has gone about as far as it can go. They argue that the size of the atom--and the electrons that surround it--puts a limit on how many transistors can be squeezed onto the surface of a silicon chip. The optimists, represented by Intel billionaire Gordon Moore, believe chips will keep getting smaller and faster at a predictable rate (which Moore famously described, in 1965, as a doubling of capacity every 18 months). The success of the computer industry is due in large part to the fact that Moore has been right. But even the optimists of microprocessing recognize that if Moore's Law is going to continue to hold, chipmakers will need a breakthrough.
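The doubling rule Moore described can be sketched as simple arithmetic; this is an illustrative model of the trend as the article states it, not a physical law, and the starting capacity and 18-month period are the only inputs:

```python
# Moore's observation, as stated above: capacity doubles roughly every
# 18 months (1.5 years). Starting capacity of 1.0 is an arbitrary baseline.
def capacity(years_elapsed, start=1.0, doubling_period=1.5):
    """Relative chip capacity after a given number of years."""
    return start * 2 ** (years_elapsed / doubling_period)

# After 6 years, capacity has doubled four times: 2**4 = 16x the baseline.
growth = capacity(6)
```

At that pace, a decade yields roughly a hundredfold increase, which is why even a small slowdown in the doubling rate matters so much to the industry.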
They just got two. First, Intel, the world's largest semiconductor firm, announced that its engineers had discovered a Houdini-like trick for stuffing twice the quantity of digital information in the same physical space on a chip. Then last week IBM unveiled what may be an even more significant advance: its researchers had found a way to replace the aluminum conductors in their microprocessors with copper, which is cheaper and faster. Says IBM vice president John Kelly, who has been experimenting with copper chips since the 1980s: "This is the big one."
It is big, and not only because copper is relatively cheap. The real breakthrough is that copper conductors will make it simpler to build much smaller chips. This is a big relief to chipmakers, who were, as the pessimists suspected, having a tough time pushing electrons through smaller and smaller aluminum conduits, which become less conductive as they shrink. IBM had been working patiently on the problem since scientists realized a decade ago that to move to the next level of miniaturization (to wiring .25 microns wide, about 400 times thinner than a human hair), they would need to abandon aluminum.
Copper was an obvious replacement, but it had a couple of problems that seemed insurmountable. The first emerged when scientists tried to lay copper onto silicon. The tiny copper atoms filtered into the porous silicon like hot coffee dripping through a percolator. Copper is so conductive that just one hyperkinetic atom could "poison" the entire silicon surface.
The second problem was more subtle: a wire that is .25 microns wide is so small that once you've built it, you can't touch it. So instead of trying to unroll tiny wires onto silicon chips, microprocessor engineers laid down a thin sheet of metal and etched away everything they didn't want. What was left were microscopic paths of metal just wide enough to carry a current. But while chipmakers had developed any number of ways to etch aluminum, no one had yet figured out how to etch copper. That, IBM suspected, would require inventing a whole new kind of chemistry. "Doing that became something of a Holy Grail within the industry," says Drew Peck, a semiconductor analyst at Cowen & Co.
A decade later, after following nearly as many blind alleys as there are paths on a chip, Big Blue had solved both problems. For starters, its researchers invented a special coating that rests between the copper and the silicon, a kind of microchip Gore-Tex that allows just enough electricity to pass between layers. They also perfected an etching technique that will not only work with copper but will also take chipmakers to even smaller realms, from .20 microns to .05 microns--Moore's next frontier.
Intel, where Moore still works three days a week, advanced chip technology in a different direction. Two weeks ago, the company announced an important innovation in "flash" memory chips--the kind of memory that is used in everything from digital cameras to those tapeless tape recorders that store short messages ("Buy Intel stock") on a chip. Flash, like most computer memory, records information in a binary state, usually represented as a one or a zero. Intel's innovation was to find a way to store twice as much binary data in a single flash memory transistor. This is roughly akin to fitting twice as many people in your car without feeling more cramped or using more gas. Intel engineers work this magic with new, exquisitely sensitive devices that track the electrical state of their chips.
Intel says it will ship the first of its superchips this fall, including a new 64-megabit flash memory cartridge that seemed impossible just two years ago. IBM, which will keep its copper technology a secret for a while, will sell its first "pennies" in 1998.
What are ordinary computer users going to do with all this new power? Both firms say smarter, faster computers could deliver any number of futuristic bonuses, from computers that take dictation to desktop machines that switch on instantly, with no annoying "boot up" delay. Neither of these chip technologies will, in itself, render obsolete that new $3,000 PC you bought last week. But Moore's Law, in time, will.
--With reporting by Lisa Granatstein/New York