Interesting observation, Naperlou. The vector processor was definitely played down compared to the emphasis on optimizing the software for multicore architectures. Nevertheless, the outcome is the same: enabling heavy-duty simulation work, which greatly improves engineers' ability to run larger and more intense simulations on complex assemblies--all good for the development of more sophisticated products.
This is another good example of a software need pushing the hardware to new heights. The inclusion of the vector processor is really interesting and is a throwback to the mainframe era. In the 1980s, using ANSYS and other similar codes, we purchased a vector processing facility to attach to our IBM mainframes. This was a great approach to speeding up engineering codes. Many engineering codes are more amenable to speedup with a vector processor than with a massively parallel machine.
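To illustrate why: much of the inner-loop work in engineering codes is elementwise array arithmetic, exactly the pattern vector hardware accelerates. A minimal sketch of one such kernel, the classic AXPY operation (y = a*x + y), is below; the function name and data are illustrative, not taken from any particular code.

```python
# AXPY (y = a*x + y): every "lane" is independent, so a vector
# processor can stream through the multiplies and adds in lockstep
# with no inter-element coordination -- unlike a massively parallel
# machine, which must partition and synchronize the work.

def axpy(a, x, y):
    """Elementwise y = a*x + y over two equal-length sequences."""
    return [a * xi + yi for xi, yi in zip(x, y)]

result = axpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
print(result)  # [12.0, 24.0, 36.0]
```

Real vectorizing compilers (and libraries such as BLAS) recognize this pattern automatically; the point is only that the data dependencies, not the programming model, make such loops a natural fit for vector units.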
A new service lets engineers and orthopedic surgeons design and 3D print highly accurate, patient-specific, orthopedic medical implants made of metal -- without owning a 3D printer. Using free, downloadable software, users can import ASCII and binary .STL files, design the implant, and send an encrypted design file to a third-party manufacturer.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That's where the "smart" machine comes in. A smart machine is one with some processing capability--simple, or in some cases complex--that lets it adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what's possible with smart machines, and what tradeoffs need to be made to implement such a solution.