Moore’s Law: Turning A Corner

February 1, 2005


A Trends Perspective from The Hurwitz Analyst Team

One of the most important trend drivers of information technology is Moore’s Law. It originated in a “destined to become famous” prediction made in 1965 by Gordon Moore, who went on to co-found Intel, in a paper entitled “Cramming more components onto integrated circuits”. Moore suggested that the number of transistors per integrated circuit would grow exponentially, doubling every couple of years. His prediction turned out to be very close to reality: over the subsequent 40 years, the number of components doubled roughly every 18 months, and thus the power of CPU chips also doubled at that rate. Of course, Moore never suggested that this rate of growth would continue for four decades and, if reports are correct, was surprised when it did.

The Wider Impact

In reality, the impact of this regular improvement in capability was greater than is often acknowledged. It wasn’t just CPU power that increased at that rate, but also switching speeds, bus speeds, memory speed and capacity, disk density and speed, and ultimately both fiber and wireless bandwidth. It would have been convenient if all these collaborating technologies had marched in step, but of course that didn’t happen (and probably could not have happened).

It is easier to think of Moore’s Law as improving the capability of computers and communications by a factor of 10 every 5 to 6 years. This is roughly the same as doubling every 18 months, but the factor of 10 is important because it is an “order of magnitude” improvement. Such an improvement will often transform the nature of computer applications.
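To see how the two rules of thumb line up, here is a minimal Python sketch of our own (the calculation is ours, not part of the original analysis): it shows that doubling every 18 months compounds into a little over tenfold growth in five years.

```python
# Minimal sketch (ours, not from the original article): checks that doubling
# every 18 months compounds into roughly an order of magnitude over 5-6 years.

def growth_factor(years, doubling_period_years=1.5):
    """Capability multiple after `years`, given one doubling per period."""
    return 2 ** (years / doubling_period_years)

for years in (5, 6):
    print(f"After {years} years: ~{growth_factor(years):.1f}x the capability")
# After 5 years: ~10.1x the capability
# After 6 years: ~16.0x the capability
```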

Moore’s Law can thus be held responsible for many step-wise improvements in the IT industry. The move from mainframes to client/server and from character-based interfaces to GUIs can be laid at its door.

Moore’s Law and Databases

Another consequence is the growth in the amount of data that can be managed within a database, which tends to multiply by a factor of 1,000 roughly every 5 to 6 years. (We can think of this as the product of CPU speed, memory size/speed and disk size/speed: 10 x 10 x 10 = 1,000.) So in 1990, databases holding multiple megabytes of data were manageable; by 1995/6, gigabyte databases were possible; by 2001/2, terabytes became manageable; and soon (2006/7) we can expect petabyte databases to be manageable.
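As a back-of-the-envelope illustration (our own sketch, with an assumed 10 MB starting point rather than a figure from the article), the following Python snippet compounds the three component improvements into the megabyte-to-petabyte progression described above.

```python
# Illustrative sketch (our own; the 10 MB starting point is an assumption):
# compounding three rough 10x component improvements gives the ~1,000x jump
# in manageable database size every 5 to 6 years.

component_gain = {"cpu": 10, "memory": 10, "disk": 10}  # assumed 10x each
step = 1
for gain in component_gain.values():
    step *= gain
print(f"Combined gain per 5-6 year step: {step}x")       # 1000x

manageable_bytes = 10 * 1024**2                          # ~10 MB, circa 1990
for year in (1990, 1996, 2002, 2007):
    print(f"{year}: roughly {manageable_bytes:,} bytes manageable")
    manageable_bytes *= step
```

The dates and the starting size are indicative only; the point is how quickly the three factors compound.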

The positive consequences of this data-oriented trend include the ability for organizations to collect almost all their data into huge data warehouses, and query it to derive business intelligence. The negative consequences include the tendency to let the whole area of data management spin out of control. IT users (and vendors too) tend to be profligate in their use of the power that technology delivers so inexpensively.

The End of the Road?

Having said all of this, the end of the road for Moore’s Law may now be in sight – at least as far as silicon technology is concerned. Technically, the problem is that the miniaturization process that delivers the regular increases in power has just about reached its limit.

At an IBM analyst conference in Austin last year, Bernie Meyerson, who heads up IBM’s semiconductor R&D, gave a good explanation of what was happening at the semiconductor level. As you might expect, miniaturization is a little more complicated than simply shrinking components to a smaller size. In fact, there are 15 dimensions, or characteristics, that determine the behavior of the microcircuitry, and they all need to be kept in step. While the going was good, the chip manufacturers managed, one way or another, to do exactly that.

That was fine until things got really, really small and researchers had to deal with layers of material no more than five atoms thick. When you get down to that level, as chips now have, life gets much more complicated. The major problem that emerges is current leakage, which causes the chip to heat up considerably and ultimately puts a block on progress. This is not the only problem, but it is the most difficult one to address and the one that IBM, Intel and AMD are now wrestling with.

Now that this point has been reached, further miniaturization is not achievable without considerable innovation, particularly a different approach to power consumption and a search for other ways of increasing processing power. To put it simply, from 2005 onwards, Moore’s Law is no longer guaranteed.

IBM and the PowerPC Chip

IBM, to its credit, saw this problem coming quite a few years ago and prepared for it in the way it developed its PowerPC chip technology. This is why – just in case you hadn’t noticed – IBM is now doing so well with its PowerPC chip and the other chip products that derive from it. Meyerson’s team at IBM implemented innovations in on-chip design to cater for power management and to address other miniaturization challenges. Beyond that, the design of IBM’s PowerPC chip took a “holistic route”, with functionality added to better enable other IBM technology initiatives – such as the development of its hypervisor technology and the introduction of logical partitioning on its servers.

With the extraordinary escalation in chip power, other possible areas of CPU and hardware architecture enhancement have been ignored. For example, in the early to mid-1990s there was much debate over the relative merits of CISC and RISC chips. The RISC idea was that, with well-designed compilers and an intelligently chosen instruction set, you could significantly improve the efficiency of the chip. At the time, it didn’t really matter, because you could never get anywhere close to doubling the power of a chip even once by that strategy. What actually mattered was the industrial cost of chip manufacture, which naturally depended on volume. For that reason Intel, with its x86 family of chips, kept a firm grip on the market by focusing on miniaturization.

The game has now changed and it is really quite difficult to predict what the full impact on the IT market as a whole will be. IBM has – after many years of trying – managed to gain an advantage over Intel and this may bring huge changes in its wake.

Disruption

It is best not to try to predict the consequences too soon, because what is happening is truly disruptive. The IT industry never quite managed to anticipate the computer power that chip technology delivered, and thus it never prepared for it in any meaningful way. But neither did it ignore what was going on. Many businesses, without even knowing it, are built on the assumption of an on-going level of growth in computer power.

An obvious example is the PC manufacturing and retailing business. In the advanced economies, more PC power means quicker PC obsolescence, and this makes PC volumes reasonably predictable. The PC is, to a great extent, an assembly of commodity components, and PC manufacture is based on this reality. But the PC market is now being disrupted in quite a few different ways.

First, there is now an Open Source influence which is already biting into the PC market in some countries (Latin America and China are good examples). There is also a PC management issue, which matters both to home users (mostly because of security problems) and to corporations (because of desktop and laptop management costs). The issue differs between the home and the workplace, and it may therefore result in a splintering of the PC market. Beyond this, the much-predicted-but-not-yet-arrived explosion and integration of computer usage in the home, for entertainment and/or home management, will undoubtedly influence the evolution of the PC in the home. To add to this, we have trends in the use of tablet PCs and laptops, which may or may not move off in a different direction.

When one takes a close look at the situation, its complications are clear. Is this good for Microsoft or bad for Microsoft? That’s not yet clear. Is it good for Dell or bad for Dell? Good for HP or bad for HP? It is too early to call. It is easy to come to the same conclusion if we look at any other broad market that is influenced directly by the chip market – such as the mobile device market or the service provider market. Too many trends are in collision, and no companies seem to be prepared.

Right now the only thing we are convinced of is that IBM has taken the fight to Intel in the chip market and this is a market in which the rules have changed. How other markets will be affected is yet to be seen.

 
