37 Years of Moore's Law
From 2300 transistors on an Intel 4004 chip to the forest of 2 billion transistors residing on the latest generation of Intel microprocessors, Gordon E. Moore’s famous law has guided the steady shrinking of transistors and their consequent density on microchips. That doubling about every two years is how we went from Pong in 1972 to the astonishing real-time rendering of hair that moves according to the laws of real-world physics in the video game Heavenly Sword in 2007.
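As a rough sanity check on that doubling claim, the sketch below projects a transistor count from the article's figures (2300 transistors in 1971, doubling every two years over 37 years). The function name and the fixed doubling period are illustrative assumptions, not anything from the article; a strict two-year cadence lands within a small factor of the 2 billion figure, which suggests the effective doubling period over that span was a bit under two years.

```python
# Illustrative projection only; figures (2300 transistors, 37 years)
# come from the article, the two-year doubling period is the assumption
# being tested.
def projected_transistors(start_count, years, doubling_period=2.0):
    """Project a transistor count under a fixed doubling period (in years)."""
    return start_count * 2 ** (years / doubling_period)

projection = projected_transistors(2300, 37)
print(f"{projection:,.0f}")  # on the order of hundreds of millions
```

A strict two-year doubling yields roughly 850 million transistors, a factor of two or so short of 2 billion; matching 2 billion would require doubling about every 1.9 years.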
Enabling such technological marvels has been the sharp growth of available processing power and the commensurate decline in the cost of DRAM over the past 37 years [see timeline]. Though Moore made his prediction back in 1965, it was the concurrent invention of the microprocessor and DRAM in 1971 that sparked the complementary arcs shown below. Because the semiconductor industry advances in lockstep, this curve, though based on Intel chips, is likely representative of the general trend.
Technological advances like personal computers, the Internet, and video games drove these complementary slopes of cost decline and processing-power growth. But as in the classic chicken-and-egg scenario, those slopes in turn drove subsequent technological advances. Some might balk at the US $399 price tag of an iPhone, but in 1971 its then strictly hypothetical 128 MB of DRAM would have set the consumer back about $50 688 in 2008 dollars. In recent years, processing power has hit a plateau: to keep ramping up performance in line with the expectations set by Moore's Law, companies like Intel and AMD have turned to multicore processors. The future path of this graph, therefore, is not necessarily predictable.