Keeping Up with Moore's Law

Introduction

New lines of laptops, smartphones, tablets, and other computing gadgets roll out constantly. Each device is marketed as faster, smaller, and more efficient than its predecessor. Consumers flock to stores to keep up with the “latest and greatest” technology. Many wonder how the computer industry has been able to maintain such growth for so long. The answer lies in Moore’s Law, the observation that the number of transistors on an integrated circuit doubles approximately every 24 months (1). Because the distance between transistors on an integrated circuit is inversely proportional to processing speed, Moore’s Law implies that processor speed due to transistor density alone doubles every 24 months. When improvements in the speed of individual transistors are factored in, net processor performance doubles roughly every 18 months (1). The rate predicted by Moore’s Law represents exponential growth, a pace of development that few other industries can rival.
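
To make the doubling concrete, here is a minimal sketch in Python of the growth Moore’s Law describes. The 1971 Intel 4004 discussed later in this article is used as a starting point; its roughly 2,300-transistor count and the exact 24-month doubling period are illustrative assumptions, not figures taken from this article.

    # Sketch: exponential transistor growth under Moore's Law.
    # base_count ~2,300 (Intel 4004, 1971) is an assumed illustrative figure.
    def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
        """Projected transistor count, doubling every `doubling_years` years."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, f"{transistors(year):,.0f}")

Run over four decades, the projection climbs from thousands of transistors to billions, which is the scale of the modern processors discussed below.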

Background

Moore’s Law is named after Intel co-founder Gordon E. Moore. In a 1965 article in Electronics magazine, Moore observed that the number of components in integrated circuits had doubled every year from 1958 to 1965:

“The complexity for minimum component costs has increased at a rate of roughly a factor of two per year… Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer” (2).

At the time, Moore predicted that this growth rate could be maintained for at least ten years. Since then, his prediction has held true: transistor density has increased exponentially for the last half century (Fig. 1) (3). In fact, the law has become an industry standard and is often used to set long-term research and development targets. Companies that produce transistors use the rate predicted by the law as a benchmark for their progress, assuming that competitors will do the same. In this way, Moore’s Law is a self-fulfilling prophecy.

The specific formulation of Moore’s Law has also changed over time to improve its accuracy. Notably, in 1975, Moore revised his projection to a two-fold increase in transistor density every two years (4). David House, a colleague of Moore’s at Intel, later factored in the increasing performance of individual transistors to conclude that the overall performance of circuits would double every 18 months (5). This is the rate most commonly cited today.

Although Moore wrote only of transistor density in his 1965 paper, other capabilities of digital electronic devices are strongly connected to Moore’s Law. Processing speed, pixel density, sensor strength, and memory capacity are all related to the number of components that can fit onto an integrated circuit (6). These capabilities have also improved exponentially since their inception. The effects of Moore’s Law thus have far-reaching implications for technological and social development, and touch many sectors of the global economy.

Past

The history of computers is closely tied to the increase in computing power, and is therefore shaped greatly by Moore’s Law. As computing power grew exponentially, the capabilities and functionality of computing devices increased with it, leading to more capable machines such as the cellphone and the personal computer. The history of computing is often divided into four generations.

First generation computers, built from 1940 to 1956, were based on vacuum tubes, a precursor to the modern transistor (7). A vacuum tube is a device that controls electrical current in a sealed container. Early vacuum tubes were roughly the size of light bulbs, and first generation computers built from many vacuum tubes often took up entire rooms. Their size also meant that they used a great amount of electricity and were expensive to operate. Computers built in this era were used primarily for low-level operations and could only solve simple problems one at a time. Input took the form of punch cards and paper tape, and output took the form of paper printouts. An example of a first generation computer is UNIVAC (UNIVersal Automatic Computer), the first commercial computer, which was used by the U.S. Census Bureau for early census calculations (7). Although Moore referred to transistor-based computers in his 1965 assessment, first generation computers with vacuum tubes loosely followed the exponential growth rate estimated by Moore’s Law. As first generation computers advanced, vacuum tubes became smaller, faster, and more energy efficient.

By 1956, transistor technology had begun to replace vacuum tubes, a development that marked the advent of the second generation of computers (7). Transistor technology proved far superior to the vacuum tube: transistors were faster, smaller, cheaper, and more energy efficient. Although transistors also generated heat, they emitted significantly less than vacuum tubes. In addition, unlike vacuum tubes, transistors did not need to warm up to a certain energy level before operating, so second generation computers required little to no start-up time. Second generation computers still used punch card input and print output but could perform more complex operations at faster speeds, and they were relied on for heavier tasks such as control of the first nuclear reactors (7). Despite these advances, most second generation computers were still too complicated and expensive for domestic use. Although modern computers apply transistors in new ways, such as integrated circuits and microprocessors, the transistor technology of second generation computers is essentially the same technology that modern computers use. In accordance with Moore’s Law, transistor technology has improved exponentially, with more transistors fitting into smaller spaces.

The development of integrated circuits is the major factor that distinguishes third generation computers from second generation computers. Third generation computers became widespread around 1964 (7). An integrated circuit consists of a large number of transistors fitted onto a single chip of semiconductor material, typically silicon. Integrated circuits decreased the space between transistors and greatly increased the power and efficiency of computers. The transistors of third generation computers were packed densely enough that most third generation machines were roughly the size of computers today. In addition, increased processing power enabled third generation computers to take input from a keyboard and produce output on a monitor. Third generation computers were also the first to use operating systems, which allowed devices to manage many applications at one time with a central interface to monitor computer memory. The IBM 360, introduced in April 1964, is commonly considered the first third generation computer (7). Because of their decreased size and lower production costs, third generation computers were the first to become available to larger audiences.

As integrated circuits became smaller, more circuitry could fit into computing devices. The fourth generation of computers is based on the development of the microprocessor, a single silicon chip containing thousands of integrated circuit components (7). The Intel 4004 chip, developed in 1971, was one of the earliest fourth generation computer chips (Fig. 3) (7); it fit the central processing unit, memory, and input and output controls on a single chip. The development of the microprocessor falls in line with the predictions of Moore’s Law, as transistors became ever smaller and more densely packed. The processors of early fourth generation computers could fit entirely in the palm of a hand. Although third generation computers were accessible to the general public, it was not until the early fourth generation that home computer use became widespread. In 1981, IBM introduced its first home computer, and in 1984, Apple introduced the Macintosh (7). As fourth generation computers developed, microprocessors continued to decrease in size and increase in speed and efficiency, and they came to be used in far more than desktop computers: laptops, hand-held devices, gaming consoles, and other gadgets all adopted the rapidly developing technology. Fourth generation computers also saw the development of networks, links between multiple computers and processors that allow them to perform common operations together, which eventually led to the development of the internet.

Present

Today, computers are in their fourth generation. However, since computing power has continued to double roughly every 18 months, modern computers are vastly more capable than the first fourth generation machines developed in 1971. Intel’s recent Core i7-3770 microprocessor, released in April 2012, has 1.4 billion transistors on a 160 mm² die, with four processing cores that each run at 3.4 GHz at standard power (8). This is substantially more powerful than the Intel 4004, the first fourth generation processor from Intel, which ran at 740 kHz.
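
As a rough check of the figures above against Moore’s prediction, the short script below compares the 4004 and the i7-3770. The clock speeds and the i7’s transistor count come from the text; the 4004’s transistor count (about 2,300) is an outside assumption added for illustration.

    # Back-of-the-envelope comparison: Intel 4004 (1971) vs. Core i7-3770 (2012).
    clock_4004_hz = 740e3        # from the text
    clock_i7_hz = 3.4e9          # from the text (per core)
    transistors_4004 = 2300      # assumed illustrative figure
    transistors_i7 = 1.4e9       # from the text

    years = 2012 - 1971
    doublings = years / 2        # Moore's Law: one doubling every two years

    print(f"Clock speedup: {clock_i7_hz / clock_4004_hz:,.0f}x")
    print(f"Transistor growth: {transistors_i7 / transistors_4004:,.0f}x")
    print(f"Moore's Law projection: {2 ** doublings:,.0f}x over {years} years")

The transistor count has grown by a factor of roughly six hundred thousand, within an order of magnitude of a pure two-year-doubling projection, which is the kind of agreement Fig. 1 illustrates.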

Modern-day microprocessors can be found in a range of devices, from Android smartphones to gaming consoles and household appliances.

Networks also play an increasingly important role in computing today. As of 2012, roughly 8.7 billion devices were connected to the internet (9). The internet delivers information between computers, greatly increasing the amount of information any single computer can access. Beyond data access, networks can also increase computing power through cloud computing: the sharing of hardware and software resources over a network, which gives the computers in the network capabilities beyond those of any individual machine.

Today, many prototypes for fifth generation computers have been developed. The most well known of these is Watson, an artificially intelligent computer system developed by IBM. Watson is capable of interpreting questions in natural language and producing a response. Its hardware consists of “a cluster of ninety IBM Power 750 servers…with a total of 2880 POWER7 processor cores and 16 Terabytes of RAM. Each Power 750 server uses 3.5 GHz POWER7 eight core processors” (10). In 2011, Watson competed on the TV show Jeopardy! against former champions and won the first place prize.

Future

The future of computer development depends on whether the growth rate estimated by Moore’s Law can be maintained. In 2005, Gordon Moore stated that the law “can’t continue forever. The nature of exponentials is that you push them out and eventually disaster happens” (16). The computer industry will face many challenges if it is to maintain the growth standard set by Moore’s Law.

One such challenge is overcoming Rock’s Law, also known as Moore’s Second Law. Rock’s Law observes that as computers become faster and more efficient, fulfilling Moore’s Law becomes increasingly expensive and difficult for producers (11). Research and development, manufacturing, and test costs increase steadily with each new chip generation, and the capital cost of keeping pace with Moore’s Law grows exponentially over time. Although Rock’s Law is not a direct limitation of Moore’s Law, it does erode the incentive for companies to continue keeping up with Moore’s Law. Another challenge is that as transistor density increases, so does the internal heat of the processor, which can lead to overheating as well as excessive energy use. Michio Kaku estimates that silicon microchips can only withstand the heat of transistors spaced as little as 5 nm apart (12). In addition, as computers get more powerful, the accompanying increase in energy consumption becomes harder to justify.
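
Since Rock’s Law describes exponential cost growth, it can be sketched the same way as Moore’s Law itself. The snippet below is purely illustrative: the commonly quoted four-year doubling period for fabrication costs and the 1995 baseline figure are assumptions, not numbers taken from this article or from (11).

    # Illustrative Rock's Law projection: fabrication-plant cost doubling
    # every four years. The baseline ($1B in 1995) is an assumed figure.
    def fab_cost_usd(year, base_year=1995, base_cost=1e9, doubling_years=4.0):
        """Projected fab cost, doubling every `doubling_years` years."""
        return base_cost * 2 ** ((year - base_year) / doubling_years)

    for year in (1995, 2003, 2011):
        print(year, f"${fab_cost_usd(year) / 1e9:.0f}B")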

Moore’s Law also faces fundamental physical limitations. For example, the speed of light and the quantum size limit together constrain the maximum possible speed of a processor of a given size. When the speed of light, the quantum scale, the gravitational constant, and the Boltzmann constant are considered, the maximum performance of a laptop with a mass of one kilogram and a volume of one liter is approximately 5.4258 × 10^50 logical operations per second on approximately 10^31 bits (13).
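
The headline figure in (13) can be reproduced from the Margolus-Levitin bound that Lloyd applies, which limits a system of average energy E to at most 2E/(πℏ) elementary logical operations per second. Taking E = mc² for the one-kilogram laptop, as Lloyd does, gives:

    % Sketch of the bound in (13), following Lloyd's use of the
    % Margolus-Levitin theorem with E = mc^2 for a 1 kg laptop.
    \[
      N_{\mathrm{ops}}
        = \frac{2E}{\pi\hbar}
        = \frac{2mc^{2}}{\pi\hbar}
        = \frac{2 \times (1\,\mathrm{kg}) \times (2.998 \times 10^{8}\,\mathrm{m/s})^{2}}
               {\pi \times 1.055 \times 10^{-34}\,\mathrm{J\,s}}
        \approx 5.4258 \times 10^{50}\ \mathrm{ops/s}.
    \]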

Although it is generally agreed that Moore’s Law will eventually collapse, the exact time of collapse is uncertain. In 2003, Intel predicted the collapse would come between 2013 and 2018 (14). It should be noted, however, that within the last 30 years the predicted collapse of Moore’s Law has been pushed back decade by decade. Some academics place the limits of Moore’s Law much farther in the future: Lawrence Krauss and Glenn D. Starkman predict an ultimate limit roughly 600 years away (15).

Ultimately, the continuation of Moore’s Law will depend on the computer industry’s ability to develop new technologies that overcome short-term limits, including the susceptibility of silicon to heat. Current research has started to explore a fifth generation of computers, expected to be based primarily on the development of quantum processors (7). A quantum processor makes use of the interactions between particles at the quantum scale. Specifically, quantum processing takes advantage of the fact that photons and atoms can exist in multiple states simultaneously, and these states can be used to store information and perform computation. By bringing components down to the quantum level, quantum processing greatly increases the component density of processors and therefore offers a significant increase in performance. In addition, quantum computers promise lower heat output and energy consumption than their silicon-based counterparts.
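
As a loose illustration of why quantum states pack information so densely, the toy sketch below (plain Python with numpy, simulating no particular quantum processor) builds the state vector of n qubits placed in an equal superposition: the number of amplitudes the state carries grows as 2^n.

    import numpy as np

    def equal_superposition(n_qubits):
        """State vector of n qubits in equal superposition: 2**n amplitudes."""
        dim = 2 ** n_qubits
        return np.full(dim, 1.0 / np.sqrt(dim))

    for n in (1, 2, 10, 20):
        print(f"{n} qubits -> {equal_superposition(n).size:,} amplitudes")

Twenty simulated qubits already require over a million amplitudes to describe classically, which hints at the density advantage described above.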

As processors continue to increase in power, more advanced capabilities of computers will become possible. Most notably, computers with artificial intelligence will likely be developed. Artificial intelligence is an emerging technology that enables devices to act in a manner that simulates human intelligence (17). Artificially intelligent computers are able to interpret natural language input, learn, and organize information. IBM Watson, developed in 2010, is a current prototype of an artificially intelligent computer (Fig. 4). Although Watson has some of the capabilities of artificial intelligence, it is limited by its hardware and computing power. As processors become more advanced, however, truly artificially intelligent devices that can simulate the power of the human brain may become possible.

Conclusion

Moore’s Law is a hallmark of the field of computer science. It is because of this law that technological and social change has been so rapid in recent decades, and it suggests that change will continue at a rapid pace in the immediate future.

Contact Michael (Siyang) Li at siyang.li.16@dartmouth.edu.

References

1. Expanding Moore’s Law (2002). Available at http://www.cc.gatech.edu/computing/nano/documents/Intel%20-%20Expanding%20Moore’s%20Law.pdf (10 March 2013).

2. M. Hill, N. Jouppi, G. Sohi, Readings in Computer Architecture (Morgan Kaufmann Publishers, San Francisco, ed. 1, 2000), p. 57.

3. R. Schaller, Moore’s Law: Past, Present, and Future. IEEE Spectrum. 53-59 (June 1997).

4. G. Moore, Electron Devices Meeting, 1975 International. 21, 11-13 (1975).

5. Excerpts from A Conversation with Gordon Moore: Moore’s Law (2005). Available at ftp://download.intel.com/museum/Moores_Law/Video-Transcripts/Excepts_A_Conversation_with_Gordon_Moore.pdf (10 March 2013).

6. N. Myhrvold, Moore’s Law Corollary: Pixel Power (07 June 2006). Available at http://www.nytimes.com/2006/06/07/technology/circuits/07essay.html?_r=2& (10 March 2013).

7. The Five Generations of Computers (02 January 2010). Available at http://www.csupomona.edu/~plin/EWS425/computer_generations.html (11 March 2013).

8. Intel Core i7-3770 Processor Specifications (2012). Available at http://ark.intel.com/products/65719/ (11 March 2013).

9. R. Soderbery, How Many Things Are Currently Connected To The “Internet of Things” (IoT)? (7 January 2013). Available at http://www.forbes.com/sites/quora/2013/01/07/how-many-things-are-currently-connected-to-the-internet-of-things-iot/ (11 March 2013).

10. Is Watson the smartest machine on earth? (10 February 2011). Available at http://www.csee.umbc.edu/2011/02/is-watson-the-smartest-machine-on-earth/ (12 March 2013).

11. B. Schaller, The Origin, Nature, and Implications of “Moore’s Law” (26 September 1996). Available at http://research.microsoft.com/en-us/um/people/gray/Moore_Law.html (12 March 2013).

12. M. Humphries, Theoretical physicist explains why Moore’s Law will collapse (30 April 2012). Available at http://www.geek.com/chips/theoretical-physicist-explains-why-moores-law-will-collapse-1486677/ (12 March 2013).

13. S. Lloyd, Ultimate physical limits to computation. Nature. 406, 1047-1054 (31 August 2000).

14. M. Kanellos, Intel scientists find wall for Moore’s Law (01 December 2003). Available at http://news.cnet.com/2100-1008-5112061.html (12 March 2013).

15. L. Krauss, G. D. Starkman, Universal Limits on Computation (26 April 2004). Available at http://arxiv.org/abs/astro-ph/0404510 (12 March 2013).

16. M. Dubash, Moore’s Law is dead, says Gordon Moore (13 April 2005). Available at http://news.techworld.com/operating-systems/3477/moores-law-is-dead-says-gordon-moore/ (12 March 2013).

17. S. Istvan, What is Artificial Intelligence? (1997). Available at http://www.ucs.louisiana.edu/~isb9112/dept/phil341/wisai/WhatisAI.html (12 March 2013).
