Globalization, outsourcing and other relatively recent phenomena pose serious challenges for product development organizations, including aggressive scheduling fueled by intense time-to-market pressure, the need to change designs rapidly and as late as possible in response to shifting market conditions or customer requirements, and a constant push for design innovation and low-cost manufacturability.
“The needs of today’s product design teams have outstripped the capacity of current processes,” asserts Ulrich Mahle, vice president of worldwide marketing and product development at CoCreate Software. “Design teams are getting behind.”
For years, engineering design managers concerned with lagging productivity could opt for faster servers and workstations, but the era of single-core processors with progressively faster clock speeds has ended.
Power, Heat and Floor Space
The constant need for more computer performance led to a proliferation of servers that occupy costly space in data centers, draw considerable power and generate a significant amount of heat per square foot. To relieve space, power and heat pressures, enterprises are replacing servers based on single-core, single-processor architectures with servers based on multi-core chips.
Intel technology evangelist R.M. Ramanathan says improvements in processing capability have resulted from increases in operating frequency (from 5 MHz in 1983 to 3 GHz in 2002), advances in process technology and increases in instructions per cycle. Those factors are still relevant, he says, but “new thinking” is needed if performance increases are to continue, and a prime example of such new thinking is packing multiple execution cores onto a single die to build multi-core processors.
Placing multiple processor cores on a single die and running them at lower clock speeds yields greater performance with lower power consumption and less heat buildup.
Ramanathan says that when multiple cores are available, each can run at a lower frequency, and together they can divvy up the power normally assigned to a single core. When the clock frequency of a single-core processor is increased by 20 percent, for example, the result is a 13 percent performance gain and a 73 percent increase in the chip’s power requirements. Decreasing clock frequency by 20 percent reduces power usage by 49 percent and lowers performance by 13 percent. When a second core is added, however, a 20 percent reduction in clock frequency delivers 73 percent more performance while using no more power than the single-core processor did at maximum frequency.
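Those figures can be reproduced with a rough back-of-the-envelope model. The scaling rules in the sketch below are illustrative assumptions rather than published Intel formulas: dynamic power is taken to grow roughly with the cube of clock frequency (since supply voltage scales with frequency), and single-core performance is taken to gain only about 0.65 percent per 1 percent of added clock because of memory and I/O stalls.

```python
# Illustrative model of the frequency/power/performance trade-off described
# above. The scaling rules are assumptions chosen to match the quoted figures,
# not published Intel formulas.

def power(freq_ratio):
    """Relative chip power at a given clock ratio (1.0 = baseline)."""
    return freq_ratio ** 3          # ~cube law under voltage/frequency scaling

def perf(freq_ratio):
    """Relative single-core performance at a given clock ratio."""
    return 1.0 + 0.65 * (freq_ratio - 1.0)   # sublinear: memory and I/O stalls

# Single core, clock raised 20 percent
print(perf(1.2), power(1.2))        # ~1.13x performance, ~1.73x power

# Single core, clock lowered 20 percent
print(perf(0.8), power(0.8))        # ~0.87x performance, ~0.51x power

# Two cores, each clocked 20 percent lower, on a perfectly parallel workload
print(2 * perf(0.8), 2 * power(0.8))  # ~1.74x performance, ~1.02x power
```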
One example of the multi-core approach is the “Cell” microprocessor, developed jointly by IBM, Sony and Toshiba and used in Sony’s PlayStation 3. It features eight synergistic processing units (SPUs) and a 64-bit central processing core based on IBM’s Power Architecture technology. The OS-neutral RISC chip offers top clock speeds above 4 GHz and is said to offer from 10 to 50 times the performance of current-generation PC processors. Its performance can be tailored for specific applications by increasing or reducing the number of SPUs.
Intel’s Core 2 Duo processor, available in new desktop and mobile computers, is up to 40 percent faster and 40 percent more energy efficient than Pentium processors.
Different Tasks for Different Cores
An operating system can schedule different tasks on different cores, allowing each task to run without interference from the others, especially if the operating system and the application code are optimized for multi-core processors.
Optimized applications can split a task into multiple, smaller tasks and run them in separate threads. A task that needs extensive processing power — a graphics algorithm, for example — can run in one thread while other operations occur simultaneously.
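As a minimal sketch of that idea, the hypothetical snippet below splits one large job into chunks and hands them to a pool of worker processes so the operating system can schedule the pieces on separate cores; the heavy_kernel function is a stand-in for a compute-intensive step such as a graphics or tessellation algorithm, not code from any of the products discussed here.

```python
# Minimal sketch: split one big task into smaller tasks that run in parallel.
from concurrent.futures import ProcessPoolExecutor  # processes sidestep Python's GIL

def heavy_kernel(chunk):
    """Placeholder for a compute-intensive step, e.g. processing one body."""
    return sum(x * x for x in chunk)

def run_parallel(data, workers=4):
    # Split the input into one chunk per worker; the OS schedules each
    # worker process on its own core, so the chunks run concurrently.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(heavy_kernel, chunks))

if __name__ == "__main__":
    print(run_parallel(list(range(1_000_000))))
```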
For design engineers, multi-core processing provides a way to handle larger assemblies faster, among other benefits. “Speed buys time,” says CoCreate’s Mahle. “Time to optimize a design across more design cycles. Time to release to market earlier. Or time to keep the window open for design changes longer before transitioning to manufacturing.”
Last September UGS Corp. reported that its NX digital product development software demonstrated a performance increase of up to 50 percent for certain tasks running on multi-core processors from Intel and AMD.
Chuck Grindstaff, UGS executive vice president, PLM, says software applications require a “multi-threaded” architecture to take advantage of multi-core technology. NX is built on UGS’ Parasolid, which is multi-threaded component software for geometric modeling.
Lee Fisher, worldwide CAE business and alliances manager at Hewlett-Packard, points to the recent proliferation of dual-core servers and workstations and notes that quad-core products will be readily available this year. He advises design managers to adopt a modular upgrade strategy and to seek configuration and “tuning” support from vendors.
Building Flexibility into Design Processes
No single productivity solution is right for every enterprise. “Engineering managers see multi-core processors as something they are going to have to take advantage of, but not all of them feel that (upgrading) is a short-term requirement,” says Bill Carrelli, vice president of strategic marketing at UGS.
“What we hear them ask more often is ‘How do I build flexibility into my design processes, so that I can quickly adapt existing designs to meet new, customer-driven requirements as needed, or make changes to designs late in the design cycle?’”
Carrelli says engineers want the ability to grab pieces of existing designs and build innovation on top of them. “To do that they need to find ways to integrate more easily with the multiple software products that exist for representing, storing, managing and accessing design data, and bring all of that together as a starting point.”
More work remains to be done in that regard, according to CIMdata Senior Consultant John MacKrell. “Tools have not been readily available that make it easy to completely develop whole, complex products,” he says, referring to assemblies that may include electronic and mechanical components plus software.
“Many individual capabilities have been available for some time, but typically not in a comprehensive, integrated product development environment,” MacKrell says, adding that software developers are heading in that direction. “New and expanded capabilities to support product design are rapidly advancing so that designs can be developed much more quickly and to higher quality.”
Immersive Product Development
MacKrell calls for “Immersive Product Development” deployed within highly integrated PLM environments built on tightly coupled, synergistic solutions that encompass data management, knowledge management, workflow and process management, computer-aided design, analysis and simulation, project management, visualization, collaboration and other capabilities.
“Globalization will increasingly mean product design by teams that may be dispersed around the world. This will require not only intellectual property creation and management but also enabling simultaneous and multi-disciplined design collaboration by these dispersed team members,” says Henry Potts, vice president and general manager, Systems Design Div., Mentor Graphics.
In EDA, Mentor Graphics sees functional verification as the number one bottleneck, with low-power design management an especially critical concern thanks to the demand for smaller, feature-rich consumer electronics. “The ability to capture and verify low-power design intent throughout the entire IC design flow will be essential and EDA companies are responding with low-power standards, methodologies, and pre-synthesis verification technologies,” says Robert Hum, vice president and general manager of Mentor Graphics’ design verification and test division.
CAD users can look forward to applications based on “functional design” principles, according to Andrew Anagnost, senior director of CAD, CAE and ETO Products in Autodesk’s Manufacturing Solutions Division. He describes functional design as a new approach to engineering design focused more on functional requirements than on geometric modeling.
“(CAD) software should enable engineers, not distract them from their design process,” he says. “In order for manufacturers to continually drive productivity and innovation, CAD software should make it fast and easy to create and use digital prototypes throughout the entire design-through-manufacturing process.”
Anagnost says 3D workflows currently allow engineers to build parts and assemblies based on real-world design input such as load, speed or power. “With a workflow driven by functional design, engineers can leverage the benefits of these 'virtual’ real-world conditions to rapidly build digital prototypes that validate design functions and catch errors before they reach the manufacturing floor.” The results, he says, will include accelerated design cycles and higher-quality designs, with cost savings from both fewer physical prototypes and fewer errors.
“Interoperability between 2D and 3D environments, as well as between CAD and other key software products, will play a more prominent role as CAD converges with CAM tools and designers need to share more information between disparate systems,” Anagnost says. “The Airbus A380 delays announced earlier this year, which resulted from software incompatibilities between converged business units, illustrate the need for improved interoperability within the industry as a whole; the event cost Airbus billions in revenue and production delays. Software developers will continue to improve interoperability with their own respective technologies, as well as with other key software products within a given industry.”
“The days of homogeneous CAD environments within an enterprise and its value chain are gone,” says UGS’ Carrelli. “Development organizations need to be able to integrate data from multiple systems and design from there — and that’s very different from 10 years ago. Organizations need a collaborative environment with the ability to tie in other members of the value chain to help answer questions, evaluate processes, and determine the downstream effects of decisions made early in the design cycle — whether the design path is good, or whether the project is going to run into problems.”
Carrelli says multidisciplinary simulation — design integration across multiple applications — is necessary for understanding the ramifications of early design decisions in terms of structural integrity, or dynamic response, or the way suppliers respond.
“Several of our customers are looking at integrating multiple systems to drive flexibility and innovation,” Carrelli adds, “and there are cultural as well as technology issues involved. The technology tools are actually more mature, relative to the cultural issues. Customers are not looking to their vendors to make a great (technological) leap. The fundamental elements are in place and available, though they need to evolve to the next level, or generation.”
Carrelli suggests some design engineers may have to reshape their thinking as their companies rework their processes. “Feedback occurs more quickly now,” he notes. “Where in the past an engineer could create a design and send it downstream to be validated later, or possibly modified by manufacturing engineers, designers are now working in a much more collaborative environment. Some engineers may not appreciate the thought that someone is looking over their shoulder. They may feel that their freedom to think and design is being restricted.”
Functions may initially take longer because more information is being managed, according to Carrelli, and benefits from changes in processes may accrue in repeat cycles. “Organizations today are designing with multiple eyes, and there may be extra steps involved,” he says.
Design organizations need — and vendors will provide — the ability to handle larger amounts of data and multiple types of data, plus the ability to integrate multiple domains, such as electrical and mechanical assemblies, with more functionality than is available today.
“The third component, in addition to electrical and mechanical, is software,” Carrelli says. “From automobiles and airplanes to consumer products, the complexity of systems is growing because of software. Designers have to know how mechanical systems are going to respond to software changes. For accurate simulation of product performance, designers have to keep electrical, mechanical and software domains in mind.”
Model-based design, which eliminates hand-coding of embedded software, has been used effectively in the automotive and aerospace industries and is now being adopted elsewhere for communications, electronics and industrial automation applications, according to Ken Karnofsky, marketing director at The MathWorks. “In the EDA world, there is a great deal of talk about modeling hardware at higher levels of abstraction, but the true challenge is that design flaws introduced at specification aren’t detected until late in the process,” Karnofsky says, adding that multi-domain system models become executable specifications and provide the basis for design elaboration, automatic code generation, and earlier detection and correction of design flaws.
“The solution to tomorrow’s system design challenges won’t come solely from any one of the traditional tool categories, but from an interdisciplinary collaboration among experts in technical computing, EDA, and embedded software to deliver the full potential of a complete tool chain,” Karnofsky adds. “This will provide great end-to-end value to embedded software and electronic system developers, while creating growth opportunities for all commercial tools in the workflow.”