AI in Engineering Makes Rapid Gains, Says Nvidia Product Manager

DesignCon keynote delves into how AI is streamlining design engineering and improving manufacturing productivity.

Spencer Chin, Senior Editor

January 23, 2025

7 Min Read
AI is playing a key role in making engineering and manufacturing more efficient.
Accelerated computing, physics-informed AI, and generative AI are streamlining engineering and redefining the boundaries of modern manufacturing. (Image: Digital Art/The Image Bank via Getty Images)

Not surprisingly, AI and machine learning are hot topics at the DesignCon show, being held January 28th through 30th at the Santa Clara Convention Center. In the second major keynote, on Wednesday, January 29th, John Linford, Principal Technical Product Manager at Nvidia, will discuss the impact of accelerated computing, physics-informed AI, and generative AI in redefining the boundaries of modern manufacturing. The talk delves into how these technologies empower designers and engineers to overcome constraints, enabling faster innovation, more sustainable practices, and unprecedented levels of precision and creativity.

Recently, Design News conducted an e-mail interview with Linford on the progress and challenges of implementing AI in engineering. The interview is below.

DN: There has been a lot of press on the improvements and capabilities of generative AI.  But what progress is being made in integrating generative AI or other forms of AI into engineering and scientific product design that requires knowledge of physics and physical sciences?

Linford: AI and accelerated computing are transforming industries, and integrating AI into engineering and scientific product design is a rapidly growing area of innovation. Independent software vendors recognize this potential and are therefore integrating AI into their product portfolios. Frameworks like Nvidia Modulus have made AI an integral part of the end-to-end manufacturing workflow, spanning design, simulation, and analysis.


Designers are now using AI-enhanced tools to create products for industries like automotive, manufacturing, aerospace, and energy. AI helps optimize material layouts for maximum strength and minimal weight while accounting for physical constraints like stress, strain, fatigue, and thermal limits. AI surrogate models for thermal diffusion, fluid flow, and linear deformation provide real-time product performance feedback as the design evolves, enabling designers to anticipate physical figures of merit like drag or surface pressure before performing high-fidelity simulations.
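The surrogate-model idea can be sketched in a few lines: run the expensive solver at a handful of design points, fit a cheap approximation, and query it interactively as the design changes. The "simulation" below is a hypothetical stand-in for a high-fidelity solve, not a Modulus model.

```python
# Sketch of a surrogate model for real-time design feedback (illustrative only;
# the expensive_simulation function is a made-up stand-in, not any Nvidia tool).
import numpy as np

def expensive_simulation(x):
    # Placeholder for a high-fidelity solve, e.g. drag vs. a shape parameter x.
    return 0.5 * x**2 + 0.1 * x + 2.0

# Run the expensive solver at only a few design points.
samples = np.linspace(0.0, 2.0, 5)
results = expensive_simulation(samples)

# Fit a cheap quadratic surrogate to those runs.
coeffs = np.polyfit(samples, results, deg=2)
surrogate = np.poly1d(coeffs)

# Real-time feedback: evaluate the surrogate at an unseen design point
# instead of re-running the expensive solver.
x_new = 1.3
prediction = surrogate(x_new)
```

In practice the surrogate would be a neural network trained on many simulation runs, but the workflow is the same: pay the simulation cost once, then query cheaply during design iteration.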

AI is also transforming simulation-based design processes by reducing computation time and improving accuracy in simulations of complex physical systems. Everything manufactured is first simulated, and physics-informed neural networks (PINNs) enable simulations that are orders of magnitude faster than traditional approaches. PINNs enhance AI models by incorporating the governing equations of physics directly into the machine learning framework. This produces surrogate models that combine physics-based causality with simulation and observed data, enabling real-time prediction. Nvidia Modulus includes PINN architectures appropriate for external aerodynamics, fluid flow, and other applications.
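The physics-informed loss at the heart of a PINN can be illustrated with a deliberately tiny example: instead of a deep network, a one-parameter ansatz is fit by minimizing the squared residual of a governing ODE at collocation points, with no labeled training data at all. This is a conceptual sketch of the loss construction, not Modulus code.

```python
# Toy physics-informed fit: recover the decay rate K in du/dt = -K*u by
# minimizing the ODE residual, the same idea PINNs apply with deep networks.
import numpy as np

K = 2.0  # true decay rate the "network" should discover

def model(theta, t):
    # One-parameter ansatz u_theta(t) = exp(-theta * t); a real PINN would
    # use a neural network here.
    return np.exp(-theta * t)

def physics_residual(theta, t):
    # Residual of the governing equation du/dt + K*u = 0. The derivative is
    # taken analytically for this ansatz; PINNs use automatic differentiation.
    u = model(theta, t)
    du_dt = -theta * u
    return du_dt + K * u

def loss(theta, t_colloc):
    # Physics-informed loss: mean squared ODE residual at collocation points.
    return np.mean(physics_residual(theta, t_colloc) ** 2)

# Minimize the physics loss by gradient descent -- no simulation data needed.
t_colloc = np.linspace(0.0, 1.0, 50)
theta = 0.1  # deliberately wrong initial guess
lr, eps = 0.5, 1e-5
for _ in range(500):
    grad = (loss(theta + eps, t_colloc) - loss(theta - eps, t_colloc)) / (2 * eps)
    theta -= lr * grad
# theta converges toward the true rate K because the residual vanishes there.
```

The same structure scales up: swap the ansatz for a network, the analytic derivative for autodiff, and the ODE for the Navier-Stokes or heat equations, and optionally add a data-mismatch term to blend observations with physics.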


DN: What progress is being made in AI tools? How will these tools be able to handle known challenges such as power requirements and thermal issues?

Linford: The key metrics for chip manufacturers are power, performance, and area (PPA). To meet PPA requirements, a chip manufacturer must enact a long chain of optimization propagation that begins with design and ends with manufacturing. Each link in this optimization chain is constrained by physical limits. Optimization tools like Nvidia cuOpt can rapidly offer solutions that respect these limits. cuOpt uses CUDA-accelerated solvers relying on heuristics, metaheuristics, and optimization techniques to compute solutions for complex routing problems under a wide range of constraints. Integrating generative AI and accelerated computing into the design process enables designers to quickly evaluate a larger range of possibilities than was previously possible.
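The flavor of constrained routing that such solvers attack can be shown with a toy greedy heuristic: visit the nearest unvisited stop while honoring a hard limit on total route length. This is a minimal stand-in for the heuristic search that GPU solvers accelerate at scale; it is not the cuOpt API.

```python
# Toy nearest-neighbor heuristic for a routing problem with a constraint
# (illustrative sketch only -- real solvers like cuOpt handle far richer
# constraint sets with GPU-accelerated metaheuristics).
import math

def nearest_neighbor_route(depot, stops, max_route_len):
    """Greedily visit stops by distance, stopping when the next leg would
    exceed max_route_len (a simple hard constraint)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    route, total, pos = [depot], 0.0, depot
    remaining = list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: dist(pos, s))
        step = dist(pos, nxt)
        if total + step > max_route_len:
            break  # constraint hit; leftover stops need another vehicle
        route.append(nxt)
        total += step
        pos = nxt
        remaining.remove(nxt)
    return route, total

route, length = nearest_neighbor_route((0, 0), [(1, 0), (2, 0), (0, 3)],
                                       max_route_len=5.0)
```

Metaheuristics improve on such greedy starts by perturbing and re-scoring candidate routes, which is exactly the kind of massively parallel evaluation that maps well to GPUs.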

DN: Can you provide examples of where physics-inspired AI is able to reduce design iterations or enable solutions to complex engineering tasks that could not be easily or quickly solved with existing software or hardware previously?


Linford: Physics-inspired AI is transforming the automotive, manufacturing, aerospace, and energy industries. For example, Siemens Gamesa used the Nvidia Modulus framework to train a PINN that led to 4,000x faster wind turbine wake optimization compared with traditional approaches. This speedup enables large-scale, detailed wind farm layouts that optimize turbine placement and maximize energy output. [Source: https://blogs.nvidia.com/blog/siemens-gamesa-wind-farms-digital-twins/]

Physics-informed AI also enables real-time digital twins: high-fidelity design tools combining product definition, simulation, optimization, and analysis into a unified, interactive, digital representation of the product. Real-time performance is only possible with AI and enables designers to create rapidly, understand holistically, test aggressively, and explore a broader range of designs than was previously possible. This design freedom enables more energy-efficient products while simultaneously reducing the computational cost of the design cycle, resulting in a more sustainable and effective product overall.

For example, Wistron — a global designer and manufacturer of computers and electronics systems — created digital twins to accurately predict airflow and temperature in test facilities where temperature must be controlled. A simulation that would’ve taken nearly 15 hours with traditional methods on a CPU took about three seconds on an Nvidia GPU running inference with a physics-informed AI model, representing a 15,000x speedup. [Source: https://blogs.nvidia.com/blog/digital-twins-modulus-wistron/]

Siemens Energy uses digital twins to maximize uptime for heat recovery steam generators. These massive machines use hot exhaust gases to boil water. The exhaust gases can cause corrosion, leading to downtime for system maintenance. High-fidelity simulations of multiphase turbulent flow help predict where and when corrosion occurs. Physics-informed AI can infer this flow in seconds, making these simulations feasible. With simulation, unplanned downtime is reduced by up to 70%, saving the industry $1.7 billion per year. [Source: https://resources.nvidia.com/en-us-supercomputing-2021-virtual-theater/siemens-energy-hrsg-digital-twin]

DN: What progress is AI making in state-of-the-art research in areas such as quantum computing or other high-end scientific research? Can this advanced research be done with existing AI tools, or are advanced iterations of these tools needed?

Linford: AI is already enabling state-of-the-art research in many domains of quantum computing, and many of the most difficult scaling challenges in the field will require AI to develop solutions. This includes algorithm design, where generative AI is being incorporated into hybrid algorithms; quantum error correction, where approaches such as reinforcement learning and transformers are being used to infer the right corrections to turn noisy qubits into useful ones; calibration of large quantum systems without the need for exponentially increasing engineering resources; and generation of optimal control pulses to minimize noise. 
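The error-correction problem Linford describes -- inferring the right correction to turn noisy qubits into useful ones -- rests on a classical idea that a toy repetition code makes concrete. The sketch below is a classical majority-vote decoder, not the RL or transformer decoders (or any Nvidia tool) mentioned above; it only illustrates why redundancy plus a good decoder suppresses errors.

```python
# Classical 3-bit repetition-code toy: encode one logical bit redundantly,
# flip bits with probability p, and decode by majority vote. Real quantum
# codes measure syndromes instead of reading qubits directly, and ML decoders
# replace the majority vote -- this sketch only shows the suppression effect.
import random

def encode(bit):
    return [bit, bit, bit]  # one logical bit -> three physical bits

def apply_noise(codeword, p, rng):
    return [b ^ (rng.random() < p) for b in codeword]  # independent bit flips

def decode(codeword):
    return int(sum(codeword) >= 2)  # majority vote = best guess here

rng = random.Random(0)
p = 0.1       # physical error rate
trials = 10000
logical_errors = sum(
    decode(apply_noise(encode(0), p, rng)) != 0 for _ in range(trials)
)
logical_rate = logical_errors / trials
# A logical error needs >= 2 flips, so the rate ~ 3p^2 - 2p^3 = 0.028 < p.
```

The payoff is the same one that motivates ML decoders for surface codes: as long as the physical error rate is below a threshold, adding redundancy drives the logical error rate down, provided the decoder infers corrections quickly and accurately enough.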

Linford: Advanced warehouses and factories use fleets of hundreds of autonomous mobile robots, robotic arm manipulators, and humanoid robots collaborating with people. Implementing increasingly complex systems of sensor and robot autonomy requires coordinated AI training in simulation to optimize operations, help ensure safety, and avoid disruptions.

Nvidia offers enterprises a reference architecture combining Nvidia accelerated computing, AI, Nvidia Isaac, and Nvidia Omniverse technologies to develop and test digital twins for the AI-powered robot "brains" that drive robots, video-analytics AI agents, equipment, and more, handling enormous complexity and scale. This framework brings software-defined capabilities to physical facilities, enabling continuous development, testing, optimization, and deployment.

Fast and accurate weather prediction is critical to supply chain optimization. Nvidia Earth-2 combines the power of AI, GPU acceleration, physical simulations, and computer graphics to develop applications that can simulate and visualize weather and climate predictions at a global scale with unprecedented accuracy and speed. The platform comprises microservices and reference implementations for AI, visualization, and simulation.

Digital biology and generative AI are helping reinvent drug discovery, surgery, medical imaging, and wearable devices. By harnessing emerging generative AI tools, drug discovery teams can observe foundational building blocks of molecular sequence, structure, function, and meaning — allowing them to generate or design novel molecules likely to possess desired properties. With these capabilities, researchers can more precisely curate drug candidates to investigate, reducing the need for expensive, time-consuming physical experiments. Accelerating this shift is Nvidia BioNeMo, a generative AI platform that provides services to develop, customize, and deploy foundation models for drug discovery.

Manufacturing company Kawasaki Heavy Industries' Track Maintenance Platform uses AI and machine learning at the edge and interfaces with railroad-track inspection devices such as digital cameras, lasers, and gyrometric sensors. This AI-based platform enhances manual, visual inspection processes that can be labor-intensive, inefficient, and expensive when freight operations are halted for inspection.

Accelerated computing enables AI, and AI enables quantum computing. AI models trained on quantum data generated from simulators and physical hardware will help unlock useful quantum computing. Broad adoption of AI leads to data generation and model training at scales that were previously impossible, transforming industries and enabling new opportunities.

About the Author

Spencer Chin

Senior Editor, Design News

Spencer Chin is a Senior Editor for Design News, covering the electronics beat, which includes semiconductors, components, power, embedded systems, artificial intelligence, augmented and virtual reality, and other related subjects. He is always open to ideas for coverage. Spencer has spent many years covering electronics for brands including Electronic Products, Electronic Buyers News, EE Times, Power Electronics, and electronics360. You can reach him at [email protected] or follow him at @spencerchin.
