Despite all the promise of nanoscale computer components and
nanoscale devices, there is a downside: Experts say the circuits and chips
become less reliable and more expensive to produce, and their behavior varies
from device to device and over the course of each device's lifetime.
The National Science
Foundation (NSF) recently awarded a $10 million, five-year grant to help
address the problem. The funding agency's Expeditions in Computing program,
which funds projects that "promise significant advances in the computing
frontier and great benefit to society," has given the green light to a research
project that rethinks and enhances the role software can play in a new class of
computing devices that are adaptive and highly energy-efficient. Specifically,
the research team is exploring "Variability-Aware
Software for Efficient Computing with Nanoscale Devices," a mission to
develop computing systems that sense the nature and extent of variations in
their hardware circuits and expose these variations to compilers, operating
systems and applications, driving adaptations across the entire software stack.
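The idea of sensing per-part variation and letting software adapt can be illustrated with a small sketch. Everything below is a hypothetical example for illustration only: the sensing function, the profile fields, and the numbers are assumptions, not the Variability Expedition's actual software or any real hardware API.

```python
# Hypothetical sketch of variability-aware task placement. All names,
# values, and the sensing interface are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class CoreProfile:
    core_id: int
    max_mhz: int        # highest frequency this core sustains reliably
    error_rate: float   # observed fault rate for this core

def sense_cores() -> list:
    # Stand-in for hardware sensing; a real system would read on-chip
    # monitors. These made-up profiles show manufacturing variation:
    # core 1 landed on a slower, flakier region of the die.
    return [
        CoreProfile(0, 2400, 1e-9),
        CoreProfile(1, 2100, 1e-7),
        CoreProfile(2, 2600, 1e-9),
    ]

def place_task(cores, needs_reliability: bool) -> int:
    """Pick a core: reliability-critical tasks go to the core with the
    lowest error rate, throughput tasks to the fastest core."""
    if needs_reliability:
        best = min(cores, key=lambda c: c.error_rate)
    else:
        best = max(cores, key=lambda c: c.max_mhz)
    return best.core_id

cores = sense_cores()
print(place_task(cores, needs_reliability=True))   # lowest-error core
print(place_task(cores, needs_reliability=False))  # fastest core
```

The point of the sketch is the division of labor the researchers describe: hardware reports what it actually is, and the scheduling decision moves up into software instead of assuming every core is identical.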
Such variability-aware computing systems would benefit the
entire spectrum of embedded, mobile, desktop and server-class applications by
dramatically reducing hardware design and test costs for computing systems,
while enhancing their performance and energy efficiency, researchers say.
Applications like search engines and medical imaging systems would also
benefit, but the project team's initial focus will be on wireless sensing,
software radio and mobile platforms with plans to transfer these advances to
additional areas moving forward.
"We envision a world where system components - led by
proactive software - routinely monitor, predict and adapt to the variability in
manufactured computing systems," says Rajesh Gupta in prepared remarks about
the project grant. Gupta is director of the Variability Expedition and a
professor of computer science and engineering at the University of California, San Diego's
Jacobs School of Engineering. "Changing the way software interacts with
hardware offers the best hope for perpetuating the fundamental gains in
computing performance at lower cost of the past 40 years."
As transistors and components on chips get smaller,
semiconductor makers are experiencing lower yields and more variability, which
means more components are being thrown away because they don't meet the
timing-, power- and lifetime-related specifications. Researchers on the program
maintain that the trend toward parts that can't reliably scale in capability
or cost will cripple the computing and information technology industries if
not addressed. A fluid software-hardware interface, the researchers maintain,
will mitigate the variability of manufactured systems and make them more
robust, reliable and responsive to changing operating conditions.
Joining Gupta in this effort is a team of computer
scientists and electrical engineers from six universities. A Technical Advisory
Board that includes top executives from Hewlett-Packard
and ARM has been recruited to ensure the
project reflects real-world challenges.