Can America Stay Ahead in the Tech Superpower Race?

By Nancy Perkins, Contributor

Heartbreak Hill is a 600-meter ascent beyond the 20-mile mark of the Boston Marathon. Many say this otherwise nondescript elevation in the Massachusetts landscape offers the most difficult stretch in the entire course. By the time runners take the hill, they will have exhausted the glycogen reserves in their muscles. What happens next is familiar to any runner: a searing pain slices into the thighs, calves and lungs as the body signals the brain that it has reached its limits and must stop. Runners call this "hitting the wall."

On April 19, 1965, Morio Shigematsu ran into this wall as he negotiated the tree-lined road up Heartbreak Hill. But the 25-year-old college student from Japan steeled himself against the pain, and — with an explosion of will — held his pace alongside two other runners in his cluster. Shigematsu went on to win the Boston Marathon with a brilliant finish and a new course record of two hours, 16 minutes and 33 seconds.

The 35th anniversary issue of Electronics Magazine hit the stands the day Shigematsu claimed the victory wreath at the Boston Marathon. On page 114 of the special issue was a paper entitled "Cramming More Components onto Integrated Circuits," written by a young engineer named Gordon E. Moore, then working for the California-based Fairchild Semiconductor Corporation. In his paper, Moore made two rather keen observations: first, that the complexity of microchips had doubled each year since they were invented in 1959; and second, that the cost per component falls almost in inverse proportion to the number of components packed onto a chip. Moore then predicted that the US semiconductor industry would continue to build chips twice as complex, and in effect twice as powerful, as those of the previous generation each year for another ten years.
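
A quick back-of-the-envelope sketch makes the compounding concrete. The starting component count N_0 below is an arbitrary placeholder rather than a figure from Moore's paper, and C is simply an assumed constant of proportionality:

\[
N(t) = N_0 \cdot 2^{t},
\qquad
N(10) = 2^{10} N_0 = 1024\,N_0,
\qquad
\text{cost per component} \approx \frac{C}{N(t)}
\]

Doubling every year over the ten-year horizon Moore projected multiplies a chip's complexity roughly a thousandfold, while the cost of each component falls by about the same factor.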

An Industrial Rosetta Stone

Few saw it back then, but Moore had actually stumbled upon an industrial Rosetta Stone, a more or less predictable arc along which chip manufacturers could expect to produce exponentially better integrated circuits at a fixed cost. In spite of its huge implications, however, Moore's paper elicited no fanfare, no popular welcome. Indeed, at the time of its publication, most of America was still transfixed by the previous evening's news footage of the fighting in Vietnam, or — at least in Boston that warm spring day — by Shigematsu's record-breaking finish. But Moore's observations would outlast the '60s. They would outlive the war in Vietnam. They would even outlast Shigematsu's course record, which was broken just a few years later.

When Moore co-founded Intel in 1968, the company turned the basic framework of his 1965 paper into a corporate mantra. Eventually, Moore's Law, as it came to be known, would evolve into the preeminent basis for measuring progress and competition in an industry that is now among the most powerful forces behind America's claim to technological supremacy in our time.

Headed for Heartbreak Hill

Today, no one questions that the advances made by the American microchip industry in the 1960s are at the heart of the Information Age. The microchip has made possible innovations ranging from the personal computer, the World Wide Web, and text messaging to laser-guided bombs, human genome sequencing, and nanotechnology. Questions have, however, been raised as to how long the industry can maintain the mind-boggling rate of progress it has kept up for over 50 years now, and whether the basic premises behind Moore's Law — which have kept America at technology's cutting edge for over five decades — will hold beyond the current decade. These questions are underscored by China's aggressive — and remarkably successful — efforts to build up its own semiconductor sector. Considered industry laggards until a few years ago, China's semiconductor plants now turn out chips that are less than a generation behind those of the US. The Semiconductor Manufacturing International Corporation (SMIC), a Chinese company, is now among the world's top producers of microchips.

In his book Physics of the Future, theoretical physicist Michio Kaku asserts that by around 2020 technologies driven by Moore's Law will allow the American semiconductor industry to produce chips with transistors that are a mere five atoms wide. The achievement will constitute both a technological landmark and a barrier. Beyond this point, Kaku says, Moore's Law will end. Quantum effects will come into play and electrons will begin to leak out of such tiny structures. There are those who say that when this happens America will have lost a tremendous edge.
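
For a rough sense of scale, and assuming a textbook figure of about 0.2 nanometers per silicon atom rather than anything taken from Kaku's book, a transistor feature five atoms wide works out to

\[
5 \times 0.2\ \text{nm} \approx 1\ \text{nm},
\]

about a nanometer. At that thickness a barrier can no longer reliably confine electrons; they tunnel straight through it, which is the leakage Kaku describes.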

Re-Thinking a Cornerstone

But the fact of the matter is that people have been predicting the end of Moore's Law for decades now. In spite of these predictions, science and creative engineering have time and again found ways to extend the law's relevance — and, not surprisingly, its profitability. Global semiconductor sales topped $291 billion last year, according to the Semiconductor Industry Association (SIA), and are expected to reach a massive $319 billion this year. Observers believe the ever-growing demand for microchips will continue to drive sizeable revenue growth for the industry over the next few years. Many argue that this alone should be sufficient incentive for the US to remain ahead.

Equally important, however, is the fact that federal policy treats the American microchip industry as a vital cornerstone not only of America's economy, but also of its defense. This has enormous implications both for the future of the industry and for related scientific research. The Semiconductor Research Corporation (SRC) and the National Science Foundation (NSF) alone contribute a combined total of around $20 million each year to semiconductor research and development (R&D). The annual allocation supports R&D activities at nearly 40 universities in 20 states across the US. Prevailing regulations impose stringent requirements on these grants, and that has apparently helped: by demanding device and system performance beyond the state of the art, some analysts say, government funding has created a research environment that is uniquely conducive to competition and innovation. And competition and innovation have always been defining themes in the great, unfolding narrative that is modern America.

Whether competition and innovation will — like the burst of a runner's determination — drive America to retain its status as the world's biggest technological superpower when the US semiconductor industry confronts its own version of Heartbreak Hill a few years from now remains to be seen. For now, there is no reason to believe it won't.

About the author

Nancy Perkins, a full-time mommy wannabe, has been a freelance online writer for two years now. She loves sharing information on health, business, technology, fashion, women's issues and motherhood. Nancy lives life to its fullest each day and is dreaming of retiring on an island she will someday own.
