Why Every Engineer Needs to Know about Ray Tracing

Novel chip architecture promises to lead the way in making ray tracing the new standard in delivering realistic quality to design engineers.

Chris Wiltz

August 23, 2018


Nvidia's new Turing chip architecture combines a specialized ray tracing core with Tensor cores for machine learning to deliver high-quality, ray-traced images in real-time. (Image source: Nvidia)

If you're an engineer or developer working with VR/AR/XR, product design, simulation—or if you use CAD in any form—you should add ray tracing to your vocabulary, if it isn't already there. With its new chip architecture, Nvidia is promising to bring the sort of high-quality graphics previously reserved for the highest-end video games and big-budget Hollywood movies into the engineering workflow and product design applications.

Nvidia has unveiled a new series of GPUs, the Quadro RTX, based on a new chip architecture called Turing. It applies specialized processor cores and artificial intelligence to ray tracing to render highly realistic computerized graphics in real time. With ray tracing, design engineers will now be able to do simulations and virtual product design that features real-time rendering of realistic lighting, shadows, and reflections. Any material that can be simulated virtually can look and react to light in a virtual simulation the same way it does in the real world. Previously, design engineers have had to settle for low-quality or merely good-enough renderings, or use farms of CPUs to render high-definition images (which can take minutes or hours, depending on the quality of the processors).


Who Is Ray? And What Is He Tracing?

Though similar, “ray tracing” in terms of computer graphics should not be confused with the term as it is applied to physics (calculating the paths of waves and particles through varying media). The ray tracing we mean here refers to simulating light and the way it behaves in the real world.

A big reason for the fidelity gap between what we see with our eyes and what we see on a computer screen or in a virtual world is how light behaves. In the real world, light bounces, gets blocked, and is absorbed by objects, depending on various properties that any materials engineer will be intimately familiar with—all of which affect how that object appears to us.

Computer graphics, as they are commonly seen today, mimic this effect with rasterization. Rasterization, in essence, creates images by turning the data (color, position, texture) of each polygon of an object into the individual pixels of a digital image. It's very good and can create photorealistic imagery in its own right—especially with today's high-powered computers. And the big plus is that it can be done very fast computationally.
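To make the idea concrete, here is a minimal, illustrative sketch (not any particular GPU's implementation) of the core rasterization step: deciding which pixels a triangle covers, using the standard edge-function test.

```python
def edge(ax, ay, bx, by, px, py):
    """Signed-area test: which side of the edge A->B the point P falls on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the set of (x, y) pixel coordinates covered by the triangle."""
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside if the pixel center is on the same side of all three edges
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    return covered
```

A real GPU runs millions of these coverage tests in parallel and then shades each covered pixel from its stored color and texture data—which is why rasterization is fast, but also why it only approximates how light actually reaches the eye.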

Ray tracing, by contrast, works by simulating light acting on virtual objects. It actually “traces” the path of virtual light rays (typically one or more per pixel) as they interact with virtual objects—bouncing off them, being absorbed, blocked, etc. The result is that you get a much more realistic rendering of an object because the virtual light is behaving the way that real light does.
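The principle fits in a few lines of code. This is a bare-bones illustrative sketch (one sphere, one light, one bounce—not a production renderer): cast a ray, find where it hits, and compute brightness from the angle between the surface and the light.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to a sphere hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic discriminant; a == 1 for a unit direction
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(origin, direction, center, radius, light_dir):
    """Trace one ray and return a brightness in [0, 1] (Lambertian shading)."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray escaped into the background
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    # Brightness = cosine of the angle between surface normal and light
    return max(0.0, sum(n, l_ := 0) if False else sum(n * l for n, l in zip(normal, light_dir)))
```

A full renderer repeats this for every pixel and recursively spawns new rays at each hit for reflections and shadows—which is exactly where the enormous computational cost comes from.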

This is not a photograph. It's a ray-traced image created by artist Enrico Cerica using OctaneRender software. Ray tracing allows for details such as distortion in the glass, light diffusion in the windows and floor, and realistic light reflections off various objects. Now, imagine images and environments like this being rendered in real time. (Image source: Nvidia / Enrico Cerica)

It's not a new idea. Ray tracing has existed as a concept at least since the 1960s, when Arthur Appel, a researcher at IBM, examined the idea in a paper titled, “Some techniques for shading machine renderings of solids.” According to the paper's abstract: “Some applications of computer graphics require a vivid illusion of reality...If techniques for the automatic determination of chiaroscuro with good resolution should prove to be competitive with line drawings, and this is a possibility, machine generated photographs might replace line drawings as the principal mode of graphical communication in engineering and architecture.”

Prettier, and Now Faster

Look at all that ray-traced evil! (Image source: Marvel Studios / Disney) 

The longstanding issue with ray tracing is that it is very computationally intensive to pull off and hasn't been feasible for rendering real-time graphics. It has, however, found plenty of other uses in arenas where graphics can be pre-rendered. If you've been to the movies recently, you've witnessed the magic of ray tracing. If you marveled (pun intended) at Thanos' battle with The Avengers in Infinity War, you've seen the quality of imagery that ray tracing can create—particularly in the alien planet environments and in Thanos' character himself. 

Nvidia's new hardware aims to take all the work out of ray tracing, which could substantially cut costs and eventually open up its accessibility to a wide range of enterprise applications. “Ray tracing is going to revolutionize enterprise applications, cinematic experiences, and immersive VR,” Bob Pette, VP of professional visualization at NVIDIA, told Design News via email. “Now, artists and designers can create, view, and interact with content that is practically indistinguishable from reality, all in real time. Professional design application developers are jumping at the chance to bring real-time ray tracing, massive speed increases, and other benefits of Turing to their customers.”

The Turing Test

At the heart of Nvidia's new Turing architecture are processors dedicated to ray tracing called RT Cores. Each RT Core is designed to accelerate the processing of light and sound in 3D environments at a speed of 10 giga rays per second (a giga ray is 1 billion light ray calculations). Coupled with the RT Cores are a series of Tensor Cores—processors designed specifically for AI and machine learning processing. By combining the ray-tracing optimization of the RT Cores with the AI capabilities of the Tensor Cores, Nvidia says its Turing architecture can accelerate features like removing signal noise (denoising), adjusting image resolution to screens to preserve quality (resolution scaling), and converting video frame rates (video re-timing).
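A rough back-of-envelope calculation (our own arithmetic, not an Nvidia figure) shows why AI denoising matters even at that throughput: dividing the ray budget across every pixel of every frame leaves only a few dozen rays per pixel, so the Tensor Cores clean up the noise left by the relatively sparse sampling.

```python
GIGA_RAYS_PER_SEC = 10e9  # Nvidia's quoted Turing throughput

def rays_per_pixel(width, height, fps):
    """Ray budget per pixel per frame at 10 giga rays per second."""
    return GIGA_RAYS_PER_SEC / (width * height * fps)

# At 1080p and 60 frames per second: roughly 80 rays per pixel.
# At 4K (3840 x 2160) and 60 fps: roughly 20 rays per pixel.
budget_1080p = rays_per_pixel(1920, 1080, 60)
budget_4k = rays_per_pixel(3840, 2160, 60)
```

Offline film renderers can afford thousands of rays per pixel; a real-time budget of tens is why Turing pairs raw ray throughput with learned denoising.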

Specifically in terms of VR applications, the Turing architecture can also assist positional tracking, eye tracking, and foveated rendering—all of which are key methods of creating an immersive visual experience for VR users.

Who better to demonstrate real-time ray tracing than Captain Phasma? Epic Games created this demo to showcase the possibilities of real-time ray tracing with its Unreal Engine. Notice the quality of light reflections in Phasma's armor.

Dassault Systèmes and Autodesk, both big names in enterprise design, have already signed on as partners to begin using Turing-based GPU hardware for ray tracing. Autodesk will be implementing Turing into its Arnold GPU renderer for ray tracing. And Dassault Systèmes is planning to leverage RTX GPUs in its 3DEXPERIENCE CATIA suite for design in electrical, mechanical, systems, and fluid engineering—particularly for accelerating VR rendering and design validation applications. Another partner of interest to engineers, Siemens, is pledging to use Turing in its NX PLM software for applications such as AI-based denoising as well as MDL support.

The pricing for ray-tracing graphics cards is still cost-prohibitive for some, particularly at the consumer level. The lowest-end Turing GPU, the GeForce RTX 2070, is targeted at video gamers and priced at $499, while the highest end, the RTX 2080 Ti, retails for $1,199. But as prices come down and more engineers and designers embrace emerging applications in virtual product design and simulation, it's easy to see ray tracing becoming a new standard in CAD and other design applications. And let's not forget that it's going to make those games you play in your downtime look a lot more intense.

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, and robotics.

