This is another good example of a software need pushing the hardware to new heights. The inclusion of the vector processor is really interesting and is a throwback to the mainframe era. In the 1980s, using ANSYS and other, similar codes, we purchased a vector processing facility to attach to our IBM mainframes. This was a great approach to speeding up engineering codes. Many engineering codes are more amenable to speedup with a vector processor than with a massively parallel machine.
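To make that "more amenable" point concrete, here is a minimal C sketch (my own illustration, not anything from ANSYS) of the kind of dense inner loop that engineering solvers spend most of their time in. Every iteration is independent, which is exactly the pattern a vector unit exploits:

    #include <stdio.h>
    #include <stddef.h>

    /* daxpy-style kernel (y = a*x + y), the kind of dense inner loop
     * that dominates finite-element and similar engineering solvers.
     * Each iteration is independent, so a vector processor -- or an
     * auto-vectorizing compiler targeting SIMD -- can process many
     * elements per instruction. */
    static void daxpy(size_t n, double a, const double *x, double *y)
    {
        for (size_t i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        double x[4] = {1, 2, 3, 4}, y[4] = {10, 20, 30, 40};
        daxpy(4, 2.0, x, y);
        for (int i = 0; i < 4; ++i)
            printf("%g ", y[i]);   /* prints: 12 24 36 48 */
        return 0;
    }

A massively parallel machine can run the same loop too, but its payoff comes from distributing much coarser-grained work; a fine-grained loop like this one maps onto vector hardware almost for free.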
Interesting observation, Naperlou. The vector processor was definitely played down compared to the emphasis on optimizing the software for multicore architectures. Nevertheless, the outcome is the same: enabling heavy-duty simulation, which greatly improves engineers' ability to run larger and more intensive simulations of complex assemblies--all good for the development of more sophisticated products.
In an age of globalization and rapid change driven by scientific progress, two of our society's (and economy's) main concerns are satisfying the needs and wishes of the individual and conserving precious resources. Cloud computing caters to both.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine comes in. A smart machine is one with some processing capability--simple, or in some cases complex--that lets it adapt to changing conditions. Such machines suit a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, and consumer goods. This discussion will examine what’s possible with smart machines and what tradeoffs need to be made to implement such a solution.
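As a rough illustration of what “adapting to changing conditions” can look like in practice, here is a hedged C sketch. The read_part_jam_rate and set_conveyor_speed functions are hypothetical stand-ins for the machine's real sensor and drive interfaces, not any particular vendor's API:

    #include <stdio.h>

    /* Hypothetical stand-ins for the machine's sensor and drive
     * interfaces; on real hardware these would talk to I/O registers
     * or a fieldbus. Stubbed here so the sketch compiles and runs. */
    static double read_part_jam_rate(void) { return 1.0; }  /* jams/min */
    static void set_conveyor_speed(double s) { printf("speed -> %.2f\n", s); }

    /* One pass of a simple adaptive loop. The "smart" part is that the
     * machine changes its own operating point in response to measured
     * conditions rather than running a fixed program. */
    static void control_step(double *speed)
    {
        double jams = read_part_jam_rate();
        if (jams > 2.0 && *speed > 0.2)
            *speed -= 0.05;    /* back off when trouble is detected */
        else if (jams < 0.5 && *speed < 1.0)
            *speed += 0.02;    /* creep back up when running clean */
        set_conveyor_speed(*speed);
    }

    int main(void)
    {
        double speed = 0.8;    /* commanded speed, 0.0 to 1.0 */
        for (int i = 0; i < 5; ++i)
            control_step(&speed);
        return 0;
    }

The tradeoff hinted at above shows up even in this toy: the thresholds and step sizes have to be tuned per application, which is exactly the engineering cost that a "dumb" fixed-program machine avoids.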