Massive Computing System Making In-Roads On Superconductor Behavior

DN Staff

December 5, 2005


In a demonstration of massive computing power, researchers at Oak Ridge National Laboratory have employed two Cray supercomputers, with roughly 6,000 processors delivering 43 trillion floating-point operations per second, to make a breakthrough in the study of superconducting materials.

The computing study, performed at the lab's National Center for Computational Sciences, showed definitively that superconductivity can be described through a mathematical approach known as the 2-D Hubbard model. Equally significant, the research opens the door to using such models and supercomputers in astrophysics, climate modeling, and nanoscience.
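For readers unfamiliar with the model, the standard single-band form of the 2-D Hubbard Hamiltonian is shown below; this is a textbook expression, not a formula quoted from the ORNL study.

$$
H \;=\; -t \sum_{\langle i,j \rangle, \sigma} \left( c_{i\sigma}^{\dagger} c_{j\sigma} + c_{j\sigma}^{\dagger} c_{i\sigma} \right) \;+\; U \sum_{i} n_{i\uparrow} n_{i\downarrow}
$$

Here $t$ is the amplitude for an electron to hop between neighboring sites $\langle i,j \rangle$ of a two-dimensional lattice, $U$ is the energy cost of putting two electrons on the same site, and $n_{i\sigma} = c_{i\sigma}^{\dagger} c_{i\sigma}$ counts electrons of spin $\sigma$ at site $i$. The competition between hopping and on-site repulsion is what makes the model rich enough to capture superconducting behavior, and expensive enough to demand a supercomputer.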

"The way we were able to apply 43 teraflops of computing power showed we can accomplish breakthrough science in a variety of scientific domains at a similar scale," notes Jeff Nichols, director of the National Center for Computational Science at Oak Ridge.

The machines, a 1,024-processor Cray X1E and a 5,088-processor Cray XT3, are housed in 64 refrigerator-sized cabinets in a 40,000-square-foot computer room at Oak Ridge.

Dense packing of so many processors, however, creates its own set of challenges, the lab's computing specialists say. Operating all the units consumes as much as 2 MW of electrical power. Moreover, the Cray X1E's densely packed processors run so hot that they must be cooled with an exotic method in which liquid Fluorinert is sprayed directly onto the chips, drawn off, and recirculated.

Scientists at the lab say they needed the 6,000 processors working in unison, and in particular the 1,024 vector processors of the Cray X1E, to prove that the Hubbard model could be successfully applied to superconductivity. Vector processing, in which an entire vector of numbers is processed with a single operation, was key to making the Hubbard model work, they say.
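To illustrate the idea in miniature, here is a generic sketch of the kind of loop vector hardware handles well; it is not the lab's actual Hubbard-model code, and the function and variable names are illustrative only.

```c
#include <stddef.h>

/* Generic vector-friendly kernel (illustrative, not ORNL's code):
 * an element-wise multiply-add over long arrays. Every iteration is
 * independent, so a vectorizing compiler for a machine like the Cray
 * X1E can execute each arithmetic step on an entire vector of
 * elements with a single instruction, instead of one scalar at a time. */
void scaled_add(double *restrict y, const double *restrict x,
                double a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];    /* no loop-carried dependence */
}
```

On a scalar processor each trip through the loop is a separate multiply and add; on a vector machine the same operations are applied to whole vectors of operands per instruction, which is why code structured this way runs at markedly higher efficiency.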

"Because of the way the Hubbard Model code was written, we were able to take advantage of vector processing and operate at higher efficiencies," Nichols says. Nevertheless, he adds, the superconductivity simulations took weeks of "compute time."

Computing specialists at the facility say their goal is to provide more supercomputing capability to other scientific endeavors, ranging from fusion simulation to climate modeling, that require huge amounts of compute power and time.

"We're not trying to time-share this to thousands of users," Nichols says. "We want to provide the whole resource to a few users to solve a handful of very high-level problems."

Cray's XT3: 56 cabinets, 5,088 processors
