It was IBM's then-chairman Thomas J. Watson who said in 1943 that he thought there was a market in the world for no more than about five computers. Today the quote seems astounding, but remember that Watson was speaking at a time when the most powerful computers ran on vacuum tubes and filled large rooms. Few people in the World War II era predicted the spread of computing into small devices, a development that caught nearly everyone by surprise as it unfolded in later years.
Today we may think smaller and smaller devices are where it's at, but we're also entering a golden age of supercomputing. These large machines use thousands of processors and play a growing role in the computationally intensive modeling of everything from weather systems to nuclear fusion. Developments proceed at an uncanny pace. An IBM supercomputer called Roadrunner, which five years ago was clocked at more than one quadrillion floating-point operations per second, is now considered obsolete. Based at Los Alamos National Laboratory, Roadrunner is still blazingly fast, but its 296 server racks consume far too much electricity.
The Dome project
Energy efficiency is the key to next-generation supercomputers, and it's one of the core principles of the Dome project, which IBM is working on in partnership with the Netherlands Institute for Radio Astronomy (Astron). The plan is to create computing systems that can handle all the data produced by the soon-to-be-constructed Square Kilometer Array, a spectacular telescope that will be spread over 3,000 individual receivers extending across South Africa and Australia. So intense will be the data stream when the Square Kilometer Array goes online in 2024 that it will generate the equivalent of today's Internet traffic twice each day.
A telescope like this isn't the domed edifice at the top of a mountain we're all familiar with. The SKA's receivers are to be scattered over wide swaths of territory and linked by fiber-optic cable to the supercomputers at a central processing site. The SKA should be fifty times more sensitive than any other radio telescope, able to investigate everything from planets around other stars to the period just after the Big Bang, when the first galaxies began to emerge. The data from all these individual dishes will be pooled and analyzed in a technique called interferometry.
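In rough terms, interferometry works by cross-correlating the signals that two receivers record from the same source: the time lag at which the correlation peaks reveals the geometric delay between the antennas, which in turn encodes the source's position on the sky. A minimal sketch in Python (the noise signal, 17-sample delay, and array sizes here are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A distant radio source looks like broadband noise; simulate it,
# then "observe" it at two antennas with a geometric delay between them.
n_samples = 4096
source = rng.normal(size=n_samples + 64)

delay = 17  # hypothetical delay, in samples, between the two antennas
antenna_a = source[:n_samples]                  # signal at antenna A
antenna_b = source[delay:delay + n_samples]     # same signal, arriving earlier at B

# Cross-correlate the two recordings; the lag of the correlation peak
# recovers the delay, which is what encodes the source direction.
corr = np.correlate(antenna_a, antenna_b, mode="full")
lags = np.arange(-(n_samples - 1), n_samples)
recovered_delay = lags[np.argmax(corr)]
print(recovered_delay)  # recovers the simulated 17-sample delay
```

The real SKA correlators will do this for every pair among thousands of receivers, at enormous bandwidths, which is exactly why the project needs supercomputing on the scale described here.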
Power and data
A lot of new technology is required to do this kind of work, for the 10-nation collaboration that is putting the SKA together is building a tool that will be gathering data from up to 13 billion light years away. IBM talks about green supercomputing, meaning systems that can keep the power requirements low while transferring vast amounts of data through new optical methods. Also in the cards at the Dome project will be advanced data-storage systems and memory technologies. The SKA will need to store up to 1,500 petabytes of data per year. By contrast, the ultra high-tech Large Hadron Collider in Switzerland stores some 15 petabytes per year.
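The gap in that storage comparison is easy to quantify with the figures quoted above; a quick back-of-the-envelope check:

```python
# Figures from the text: SKA ~1,500 petabytes/year, LHC ~15 petabytes/year.
ska_pb_per_year = 1500
lhc_pb_per_year = 15

ratio = ska_pb_per_year / lhc_pb_per_year
pb_per_day = ska_pb_per_year / 365

print(ratio)       # 100.0 -- the SKA would store about 100x the LHC's annual data
print(pb_per_day)  # roughly 4 petabytes of stored data every day
```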
IBM's move in the direction of new, low-power chips for this purpose means the trend of using computer simulation to conduct science, and in many cases to replace costly physical experiments, will continue. Cray's new Blue Waters supercomputer, for just one example, will simulate more than 60 million atoms while analyzing what happens inside the HIV capsid.
From curing intractable diseases to predicting earthquakes, the golden age of supercomputing will benefit us all. Last year, sales of supercomputers jumped 29 percent to $5.6 billion, a trend unlikely to slow anytime soon. We consumers tend to be fixated on the latest smartphones and tablets, but when it comes to transforming our lives, big computing is where the action is.
Paul A. Gilster is the author of several books on technology. Reach him at firstname.lastname@example.org.