Despite all the promise of nanoscale computer components and
devices, there is a downside: experts say the circuits and chips can
become less reliable and more expensive to produce, with behavior that varies
from device to device and over the course of each device's lifetime.
The National Science
Foundation (NSF) recently awarded a $10 million, five-year grant to help
address the problem. The funding agency's Expeditions in Computing program,
which funds projects that "promise significant advances in the computing
frontier and great benefit to society," has given the green light to a research
project that rethinks and enhances the role software can play in a new class of
computing devices that are adaptive and highly energy-efficient. Specifically,
the research team is exploring "Variability-Aware
Software for Efficient Computing with Nanoscale Devices," a mission to
develop computing systems that sense the nature and extent of variations in
their hardware circuits and expose these variations to compilers, operating
systems and applications, driving adaptations across the entire software stack.
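To make the idea concrete, the following sketch, written in C, shows one way a runtime layer could consume per-core variation data exposed by the hardware and choose an operating point for each core, rather than assuming every core meets a single worst-case specification. The descriptor fields, the read_variation_descriptor() query, and the sample numbers are hypothetical illustrations for this article, not the Variability Expedition's actual interface.

/*
 * Minimal sketch of variability-aware adaptation in software.
 * All names and values here are hypothetical, not a real API.
 */
#include <stdio.h>

/* Hypothetical per-core variation data that hardware or firmware might expose. */
struct variation_descriptor {
    int    core_id;
    double max_stable_freq_mhz;  /* highest frequency this core sustains reliably */
    double leakage_power_mw;     /* measured static power, varies part to part    */
};

/* Stand-in for a sensor/firmware query; a real system would read
   on-chip counters, fuses, or calibration data instead. */
static struct variation_descriptor read_variation_descriptor(int core_id) {
    /* Fabricated sample values purely for illustration. */
    struct variation_descriptor d = {
        core_id,
        1800.0 - 50.0 * core_id,
        120.0 + 10.0 * core_id
    };
    return d;
}

/* The variability-aware decision: cap the requested frequency at what
   this particular core can sustain, instead of one worst-case limit. */
static double choose_frequency(const struct variation_descriptor *d,
                               double requested_mhz) {
    return requested_mhz < d->max_stable_freq_mhz ? requested_mhz
                                                  : d->max_stable_freq_mhz;
}

int main(void) {
    const double requested_mhz = 1750.0;
    for (int core = 0; core < 4; ++core) {
        struct variation_descriptor d = read_variation_descriptor(core);
        double f = choose_frequency(&d, requested_mhz);
        printf("core %d: running at %.0f MHz (limit %.0f MHz, leakage %.0f mW)\n",
               core, f, d.max_stable_freq_mhz, d.leakage_power_mw);
    }
    return 0;
}

In a real deployment, the same per-core data would also be surfaced to the compiler, operating system, and applications, which is the kind of stack-wide adaptation the project describes.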
Such variability-aware computing systems would benefit the
entire spectrum of embedded, mobile, desktop and server-class applications by
dramatically reducing hardware design and test costs for computing systems,
while enhancing their performance and energy efficiency, researchers say.
Applications like search engines and medical imaging systems would also
benefit, but the project team's initial focus will be on wireless sensing,
software radio and mobile platforms, with plans to transfer these advances to
additional areas over time.
"We envision a world where system components - led by
proactive software - routinely monitor, predict and adapt to the variability in
manufactured computing systems," says Rajesh Gupta in prepared remarks about
the project grant. Gupta is director of the Variability Expedition and a
professor of computer science and engineering at the University of California, San Diego's
Jacobs School of Engineering. "Changing the way software interacts with
hardware offers the best hope for perpetuating the fundamental gains in
computing performance at lower cost of the past 40 years."
As transistors and components on chips get smaller,
semiconductor makers are experiencing lower yields and more variability, which
means more components are being thrown away because they don't meet the
timing-, power- and lifetime-related specifications. Researchers on the program
maintain that, if not addressed, the trend toward parts that can't reliably scale in
capability or cost will cripple the computing and information technology
industries. A fluid software-hardware interface, the researchers maintain,
will mitigate the variability of manufactured systems and make them more
robust, reliable and responsive to changing operating conditions.
Joining Gupta in this effort is a team of computer
scientists and electrical engineers from six universities. A Technical Advisory
Board that includes top executives from Hewlett-Packard
and ARM has been recruited to ensure the
project reflects real-world challenges.
How should a product's real-world performance be judged against its published spec sheet? It is easy to assume that the product will operate to spec, but what variables should be considered, and is accounting for them a designer obligation or a customer responsibility? Or both?