Bursts of improvements in clock speeds and graphics capabilities have long upped the ante for better-performing CAD and CAE tools. But a new generation of multi-core processors and on-board graphics engines is setting the stage for a change in how engineers create, iterate and test product designs. Thanks to these new hardware advances, development teams are able to move highly specialized simulation and analysis work further upfront in the design process, while collaborating on and visualizing more complete models.
CAD, CAE and PLM software vendors, over the last 18 months, have been retooling their product lines to take advantage of many of these new platforms. Dual-core and multi-core chip architectures from leaders like Intel Corp. and Advanced Micro Devices can offer up to eight computing cores on a single workstation and provide up to 64 Gbyte of memory, bringing supercomputer-class horsepower to readily available and reasonably priced PCs from mainstream providers like Hewlett-Packard Co. and Dell Computer Corp. In addition to turbo-charging the raw speed of design tools, these dual- and multi-core workstations can handle multiple, high-performance tasks simultaneously. This allows engineers to move away from a serial design approach to more concurrent practices without having to move into the exclusive and cost-prohibitive domain of highly specialized supercomputers.
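As a rough sketch of what "handling multiple high-performance tasks simultaneously" means in practice, the snippet below runs two independent jobs in separate worker processes so each can occupy its own core. The workloads (`mesh_part`, `render_preview`) are hypothetical stand-ins for real CAD/CAE operations; the pattern, not the math, is the point.

```python
# Sketch: parceling independent engineering jobs out to separate cores.
# mesh_part and render_preview are hypothetical stand-ins for real
# CAD/CAE workloads, used only to illustrate task-level parallelism.
from concurrent.futures import ProcessPoolExecutor


def mesh_part(elements: int) -> int:
    # Stand-in for a compute-heavy meshing pass.
    return sum(i * i for i in range(elements))


def render_preview(frames: int) -> int:
    # Stand-in for an on-board graphics/rendering job.
    return sum(i for i in range(frames))


def run_concurrently() -> tuple:
    # On a multi-core workstation, both jobs can run at the same time,
    # each on its own core, instead of one after the other.
    with ProcessPoolExecutor(max_workers=2) as pool:
        mesh_job = pool.submit(mesh_part, 10_000)
        render_job = pool.submit(render_preview, 10_000)
        return mesh_job.result(), render_job.result()


if __name__ == "__main__":
    print(run_concurrently())
```

On a single-core machine the two jobs would time-slice; on a dual- or multi-core workstation the operating system can schedule them onto separate cores, which is the shift from serial to concurrent work the vendors describe.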
“The price/performance curve has solved a lot of problems for us — we’re now able to do things on a routine basis that in the past were specialty runs, required special programs and even days and weeks to execute,” says Ken Amann, director of research at CIMdata Inc., a consultancy specializing in engineering and product development. “With the ability to bring significantly more computing power to bear, we can run larger, more complex applications … and bring things forward in the design process, particularly in the area of simulation.”
Pushing simulation to the front of the design sequence has a multitude of benefits. For one thing, it takes time out of the equation by eliminating the lag between when an engineer comes up with a design and when analysis experts, working with their own software and systems in isolation from any subsequent design changes, report back on its viability. It also lets designers test options as they go, which surfaces problems earlier in the process and frees the analysis experts to focus their talents and computing horsepower on large-scale simulation problems.
“Now designers can take a quick look to see if something will cause problems later on and adjust their design upfront before spending a lot of effort,” Amann says. “That way, they catch problems faster and they can exercise more options.”
Pushing the Limits at 80 GFLOPS
The way Intel sees it, its Core Microarchitecture, introduced a little over a year ago, really changes the game for how CAD and CAE tools can be used. Workstations found on most desktops today are powered sufficiently to serve as platforms for CAD operations, but they lack the ability to partition highly compute-intensive tasks and parcel them out to multiple processing engines. Not so with workstations based on Intel’s Core Microarchitecture and quad-core technology (an extension of that microarchitecture). Systems based on this technology can deliver up to eight cores, or computational engines, that pack nearly 80 GFLOPS of performance — a measure of floating-point operations per second that not too long ago only a high-end supercomputer could accomplish, says Wes Shimanek, Intel’s manager of strategic marketing for workstations. Previous-generation Intel-based workstations delivered approximately 12 GFLOPS of performance, he says.
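The 80 GFLOPS figure is consistent with a simple peak-throughput estimate: cores multiplied by clock rate multiplied by floating-point operations retired per cycle. The per-core numbers below (a clock near 2.5 GHz, four floating-point ops per cycle) are illustrative assumptions, not published Intel specifications.

```python
# Back-of-envelope check on the GFLOPS figures quoted in the article.
# The clock rates and FLOPs-per-cycle values are assumptions chosen to
# show how such peak numbers are derived, not Intel's official specs.
def peak_gflops(cores: int, ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak = cores x clock (GHz) x FLOPs per cycle per core."""
    return cores * ghz * flops_per_cycle


# Two quad-core sockets at ~2.5 GHz, 4 floating-point ops/cycle per core:
eight_core = peak_gflops(cores=8, ghz=2.5, flops_per_cycle=4)  # 80.0
# An earlier dual-core workstation at ~3.0 GHz, 2 ops/cycle per core:
previous = peak_gflops(cores=2, ghz=3.0, flops_per_cycle=2)    # 12.0
```

Real applications rarely sustain theoretical peak, but the ratio between generations is what matters for the workflow argument here.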
This arsenal of raw computing horsepower not only speeds up traditional simulation work and allows for the testing of larger models, it also sets the stage for engineers to work differently, tapping something Shimanek dubs a virtual workbench, which can handle both CAD and CAE operations simultaneously. “The virtual workbench helps engineers move from serial to simultaneous workflows, reducing the time between an idea and a finished product,” he says. “The virtual workbench can also play a role in minimizing unacceptable material costs that can dramatically impact profitability, particularly for mass production.”
Consider, for example, an engineer designing a flange for an automotive application. Suppose the hole in the flange is placed too close to the edge, where it could potentially cause problems with engine stability. In a traditional development scenario, the engineer would model the flange part on the desktop and later ship it off to another group to perform simulation testing on the design. Only then would it become apparent that there were problems with the flange design in light of the entire engine assembly. “Typically, you’d design on a workstation and then ship off the CAE model to a cluster and get in line with other jobs and wait to hear back,” Shimanek says. “When you move to an eight-core workstation with 80 GFLOPS of performance, you do that iteration locally, which allows the engineer to iterate quicker and speeds the overall design process.”
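The kind of quick local check this workflow enables can be as simple as a rule-of-thumb screen run before any full simulation. The sketch below is hypothetical: the 1.5x-diameter edge-distance rule and the flange geometry are made up for illustration, not taken from any real design standard.

```python
# Hypothetical rule-of-thumb screen an engineer might run locally before
# submitting a full simulation: is there enough material between the
# hole and the flange edge? The 1.5x factor and the dimensions below
# are illustrative assumptions, not an actual engineering standard.
import math


def edge_distance_ok(hole_x: float, hole_y: float, hole_dia: float,
                     flange_radius: float, factor: float = 1.5) -> bool:
    """Require at least `factor` x hole diameter of material between
    the hole's edge and the outer edge of a circular flange."""
    center_offset = math.hypot(hole_x, hole_y)
    material = flange_radius - (center_offset + hole_dia / 2)
    return material >= factor * hole_dia


# A hole drilled near the rim fails the screen; moved inboard, it passes.
edge_distance_ok(40.0, 0.0, 8.0, 50.0)  # False: only 6 mm of material
edge_distance_ok(20.0, 0.0, 8.0, 50.0)  # True: 26 mm of material
```

Catching a failure like this at the desktop, before the model ever joins a cluster queue, is the iteration-speed gain Shimanek describes.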
ESI Group, a provider of physics-based simulation software, sees huge potential in the Intel Core Microarchitecture. It has a partnership with Intel to leverage the chip maker’s software tools to rearchitect its Visual Environment Version 3.0 platform to take full advantage of Intel’s multi-core processors. To be released later this year, Version 3.0 of its simulation tool will offer up to a four- to five-times performance boost over the previous version, says Jean-Louis Duval, ESI Group’s worldwide business manager for Enterprise Integrated Solutions. “By taking advantage of multi-core and building a parallelized version of our software, we can be more efficient and manipulate bigger and bigger models,” he says.
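"Building a parallelized version" of a solver typically means a data-parallel pattern: split the model's elements into chunks, compute each chunk on its own core, then combine the partial results. The sketch below illustrates that pattern only; `stiffness_contribution` is a hypothetical stand-in for real per-element solver work, not ESI's implementation.

```python
# Data-parallel sketch: divide a model's elements across worker
# processes, one chunk per core, then combine the partial results.
# stiffness_contribution is a hypothetical stand-in for real FEA work.
from multiprocessing import Pool


def stiffness_contribution(chunk: list) -> float:
    # Stand-in for per-element computation in a real solver.
    return sum(e * e for e in chunk)


def split(elements: list, parts: int) -> list:
    # Divide the element list into roughly equal chunks, one per core.
    size = (len(elements) + parts - 1) // parts
    return [elements[i:i + size] for i in range(0, len(elements), size)]


def assemble(elements: list, workers: int = 4) -> float:
    # Each chunk is solved in parallel; the partial sums are combined.
    with Pool(workers) as pool:
        partials = pool.map(stiffness_contribution, split(elements, workers))
    return sum(partials)


if __name__ == "__main__":
    print(assemble(list(range(1000))))
```

Because the chunks are independent, adding cores scales the chunk count rather than the wall-clock time per chunk, which is why multi-core hardware lets such software "manipulate bigger and bigger models."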
CAE vendors also see the new hardware platforms paving the way for them to offer multi-disciplinary simulation toolsets, where engineers can perform motion, structural, acoustic or crash-testing analysis in the same environment instead of running each type of test in separate, disconnected tool sets, says Doug Peterson, senior vice president of product development for MSC Software. Traditional hardware was not robust enough to support this kind of integrated analysis, so crash testing might be done on one system while thermal analysis was handled by a completely different one, with little, if any, integration between the two.
The ability to study a whole product and not just the individual effects can go a long way in solving quality problems earlier in the design process, while delivering a more accurate view of testing, which in the end, results in getting product to market faster. “The end result is we’re finally getting towards what everyone wanted all along — the ability to consider as much as we can on a computer as opposed to in a physical lab,” says Bob Williams, product manager at Algor Inc., maker of finite element analysis (FEA) software.
Athletic shoe maker adidas Group is taking advantage of hardware advances to do more simulation at a local level as part of its product development processes, according to Gerd Manz, the company’s global head of engineering. Adidas uses simulation tools from SIMULIA, the newly named Dassault Systemes subsidiary that was formerly Abaqus, to gain a deeper understanding about the performance aspects of footwear mechanics and as part of its development efforts around the technically complex aspects of shoes — for example, (foot-)balls and hardware, he says. Virtual testing helps adidas visualize mechanical behavior of shoes prior to investing in tooling and ultimately helps shrink its lead times in introducing new offerings (see photo, page 64).
“With recent developments in PC hardware (dual-core chips, second processors and 64-bit processing), we’re using more desktop solutions for simulations,” Manz says. “A nice side effect is that pre- and post-processing simulation work, as well as regular office work can be handled locally on the same station. So now, hardware is no longer a limiting factor when it comes to spreading the usage of simulation.”
Beyond simulation software, multi-core processors — and additionally, advances on the graphics processor side — have also provided a boost to 3D CAD programs, specifically their ability to perform visualization of complete, large-scale assemblies. In addition, CAD vendors are taking advantage of advances in graphics engines like those offered by NVIDIA Corp. to build support for rendering and other realistic surfacing capabilities directly into their applications, says Sanford Russell, product manager for NVIDIA’s Professional Brand.
“It used to be to see what a car would look like, you’d send it to an offline rendering program and come back in an hour to see the results — it wasn’t part of the CAD experience,” he says. “With more powerful GPUs (graphics processing units), you can run these rendering applications in real time and CAD vendors are building technology into their core applications to run this as opposed to making it a batch process.” NVIDIA is also releasing products on the simulation side: It recently rolled out its Tesla high-performance GPU Computing Solution, which can handle crash analysis, fluid flow and other simulation scenarios.
SolidWorks Inc. and UGS PLM Software are among the CAD leaders tapping into the higher-performing GPU and multi-core capabilities. The new SolidWorks 2008 release, for example, includes RealView rendering functionality, made possible thanks to the improved graphics, while cosmetic threads for drawing performance and the FloWorks analysis functionality have been architected to exploit multi-core processing, says Greg Jankowski, SolidWorks’ director of customer service and strategic planning. For its part, UGS has done some work on its CAE solvers, PLM Components and Parasolid modeling kernel to adapt them for multi-core chip architectures and is working to parallelize its entire product line, a company spokesman says.
The ability to visualize larger and more complex assemblies coupled with more prevalent CAE testing on a local level ultimately combine to make it easier for engineers to get their job done and accomplish more in the design process. “There’s a greater linking of the realities of design which results in greater confidence,” the UGS spokesman says. “Engineers now also have a lot better visualization into what they’re going to make before they actually make it, which allows them to try what might have been considered wild hairs in the same amount of time. This increases cycles of learning and lets them get more out of the design process than they would have otherwise.”