Article, January 2005
Computers Of The Future
by Pim Borman, Web Editor & APCUG Representative
SW Indiana PC Users Group
For the last 40 years or so, computer chips have closely followed Moore’s Law, which states that the number of transistors on a chip doubles every 18 months. The corresponding increase in computing performance has been enormous, but chip manufacturers are beginning to reach the physical limits of miniaturization. Intel’s latest chip, “Prescott,” an improved version of the 55-million-transistor Pentium 4 with 125 million transistors, was delayed by production difficulties and proved to be only marginally faster. Cramming more transistors into a given area by reducing their size leads to increasing electrical leakage and crosstalk; it also increases heat generation. Some recent microprocessors consume over 100 watts, generating more heat per square centimeter than a laundry iron on the cotton setting (W. Wayt Gibbs, writing in Scientific American, November 2004, pp. 96-101). Increased computation speeds have also been, to a large extent, the result of clever changes in computer architecture that allow the chip to execute multiple instructions per clock tick. We are finally reaching the inevitable end of Moore’s Law.
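The growth described above is easy to illustrate with a back-of-the-envelope calculation. The sketch below simply assumes the idealized 18-month doubling period; real chip generations, like Prescott, also grow for other reasons, such as added cache.

```python
# Back-of-the-envelope illustration of Moore's Law:
# transistor counts double roughly every 18 months.
def transistors(start_count, years, doubling_months=18):
    """Project a transistor count forward by the given number of years."""
    doublings = (years * 12) / doubling_months
    return start_count * 2 ** doublings

# Starting from the 55-million-transistor Pentium 4, one
# 18-month doubling lands near Prescott's 125 million
# (the real figure also reflects design changes, not pure scaling).
projected = transistors(55e6, 1.5)
print(f"{projected / 1e6:.0f} million transistors")  # 110 million
```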
Intel has already announced that it will no longer distinguish its microprocessors by clock speed, which is, after all, only part of a system’s performance characteristics. In addition, starting next year, all Intel chips will have not one but two “cores” that allow higher computation speeds through parallel processing; AMD already ships such chips. There is nothing new about fast computing with parallel processors; the fastest computers in the world are now built from thousands of processors operating in parallel to perform specific tasks, such as playing world-champion chess. But all current software for home and office use will have to be rewritten to take advantage of parallel processing.
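Why would software have to be rewritten? A minimal sketch: a traditional program is one instruction stream that a second core cannot help with, while a parallel version must explicitly split its work across processes. The example below uses Python’s multiprocessing module purely as an illustration; the function names are my own.

```python
from multiprocessing import Pool

def square(n):
    return n * n

def serial_sum_of_squares(numbers):
    # Traditional single-core code: one instruction stream,
    # so a second core sits idle.
    return sum(square(n) for n in numbers)

def parallel_sum_of_squares(numbers, workers=2):
    # Restructured for a dual-core chip: the work is divided
    # among worker processes that can run on separate cores.
    with Pool(workers) as pool:
        return sum(pool.map(square, numbers))

if __name__ == "__main__":
    data = list(range(1, 1001))
    # Both versions compute the same answer; only the parallel
    # one can use both cores.
    assert serial_sum_of_squares(data) == parallel_sum_of_squares(data)
```

The results are identical, but getting them in parallel required restructuring the program, which is exactly the burden the article says existing home and office software would face.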
Many users, as well as software companies, may decide it is not worth the hassle. If you have an up-to-date computer, it probably responds faster to your inputs than you can provide them, unless you are a game freak or use industrial-strength graphics or database programs. Customers would be better served by improved security and simplified operation.
Not by coincidence, W. Wayt Gibbs wrote another article in the same issue of Scientific American (Nov. 2004, pp. 80-87) about future computers that use photons (light) instead of electrons (electricity) to perform computations and to connect the CPU with memory. Many technical problems remain to be solved, including the challenge of bringing the cost down, but it seems likely to be the computer technology of the future. Photons move many times faster than electrons and generate hardly any heat. All the rest is engineering detail!
Pim Borman (email@example.com) is Web Editor and APCUG Representative for the SW Indiana PC Users Group, Inc. (http://swipcug.apcug.org). This article appeared in the November 2004 issue of the P-SEE URGENT, newsletter of SWIPCUG.
There is no restriction against any non-profit group using this article as long as it is kept in context, with proper credit given to the author. The Editorial Committee of the Association of Personal Computer User Groups (APCUG), an international organization of which this group is a member, brings this article to you.