Interesting observation, Naperlou. The vector processor was definitely played down compared to the emphasis on optimizing the software to support multicore architectures. Nevertheless, the outcome is the same: enabling heavy-duty simulation work, which greatly improves engineers' ability to run larger and more intensive simulations on complex assemblies--all good for the development of more sophisticated products.
This is another good example of a software need pushing the hardware to new heights. The inclusion of the vector processor is really interesting and is a throwback to the mainframe era. In the 1980s, using ANSYS and other similar codes, we purchased a vector processing facility to attach to our IBM mainframes. This was a great approach to speeding up engineering codes. Many engineering codes are more amenable to speedup with a vector processor than with a massively parallel machine.
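To illustrate why codes like these vectorize so well: the inner loops are usually the same arithmetic applied across long contiguous arrays, which is exactly the pattern vector hardware accelerates. Here is a minimal sketch in Python/NumPy (my own illustration, not from any ANSYS code) contrasting a scalar loop with its vectorized form--the classic "axpy" operation:

```python
import numpy as np

def axpy_scalar(a, x, y):
    """Scalar loop: one multiply-add per iteration, element by element."""
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def axpy_vector(a, x, y):
    """Vectorized form: the whole-array expression maps directly
    onto vector (SIMD) hardware, no per-element loop overhead."""
    return a * x + y

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print(axpy_vector(2.0, x, y))  # [ 6.  9. 12.]
```

The vector version wins whenever the data is regular and the operation is uniform, whereas a massively parallel machine only pays off if the problem can be decomposed into many largely independent tasks--which many tightly coupled engineering solvers cannot.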
Truchard will be presented the award at the 2014 Golden Mousetrap Awards ceremony during the co-located events Pacific Design & Manufacturing, MD&M West, WestPack, PLASTEC West, Electronics West, ATX West, and AeroCon.
In a bid to boost the viability of lithium-based electric car batteries, a team at Lawrence Berkeley National Laboratory has developed a chemistry that could double an EV’s driving range while cutting its battery cost in half.
For industrial control applications, or even a simple assembly line, that machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine comes in. The smart machine is one that has some simple (or, in some cases, complex) processing capability that lets it adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.
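The "adapt to changing conditions" idea can be made concrete with a toy feedback loop. The sketch below is purely hypothetical (the `adapt_speed` function, target value, and gain are my own invention, not from any specific product): a conveyor controller that nudges its speed based on a measured load, where a "dumb" machine would run a fixed program regardless of conditions.

```python
def adapt_speed(current_speed, load, target_load=0.7, gain=10.0):
    """Hypothetical smart-machine behavior: adjust conveyor speed
    so the measured load (0.0-1.0) tracks a target setpoint.

    A proportional correction: speed up when underloaded,
    slow down when overloaded, clamped to actuator limits."""
    error = target_load - load
    new_speed = current_speed + gain * error
    return max(0.0, min(100.0, new_speed))  # respect hardware limits

# Overloaded -> the machine backs off; underloaded -> it speeds up.
print(adapt_speed(50.0, 0.9))  # 48.0
print(adapt_speed(50.0, 0.5))  # 52.0
```

Even this trivial example shows the tradeoff the discussion raises: the smart version needs a sensor, a processor, and tuning (the gain and setpoint), all of which add cost and complexity a fixed-program machine avoids.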