I think Zihong Lin said something like: people moving from single-core programming to multicore now have to think about partitioning the software. When I hear partitioning, I think architecture. So regardless of whether it's a firmware or software design, single core or multicore, upfront planning and architecture should always be done before coding. I've seen many poor designs that resulted from a lack of pre-planning. I'm trying to build my software skills, so I think I'll check this tool out.
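The "partition first" idea can be sketched in a few lines. This is a minimal illustration (not from the article): the work is divided into independent chunks up front, and each chunk is handed to its own worker. Threads stand in for cores here; in C or RTOS firmware each worker would be a task pinned to a core. The function names and chunk counts are my own, purely for illustration.

```python
# Minimal sketch of partitioning a data-parallel job across workers.
# Assumption: the work (here, summing a list) splits into independent chunks.
import threading

def parallel_sum(data, n_workers=4):
    # Partitioning/architecture step: divide the data into independent chunks.
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    results = [0] * len(chunks)

    def worker(idx, chunk):
        results[idx] = sum(chunk)  # each worker writes only its own slot

    threads = [threading.Thread(target=worker, args=(i, c))
               for i, c in enumerate(chunks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Merge step: combine the per-chunk results.
    return sum(results)

print(parallel_sum(list(range(1000))))  # → 499500, same as sum(range(1000))
```

The point is that the partitioning decision (chunk boundaries, no shared writes) is made before any worker code runs, which is exactly the kind of upfront design the comment argues for.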
Interesting article. My experience is mainly in firmware/FPGA/HDL design, and I grew up not using schematic capture or graphical tools for development. The main reason was that coding offered more flexibility. I'm hoping the new generation of graphical tools is much better.
I don't want to sound like a shill for one of the advertisers of Design News, but in addition to these fantastic solutions from PolyCore Software and Texas Instruments, there is another Texas-based company, National Instruments, that has been evangelizing the utility of parallel, graphical, data flow programming that is a perfect fit for multi-threaded and multi-core processes. With its popularity in the engineering disciplines, there is no lack of fans for LabVIEW, but over the past 30 years it has taken a back seat to the popularity of other languages such as C and Java in the non-engineering markets. Perhaps the rise of additional graphical programming tools like Poly-Platform and others will slowly move us past the bottleneck of text-based development environments and popularize the elegance of graphical programming.
Beth, yes, this type of tool is needed. It is akin to programming an FPGA with a soft processor: you then profile the application to determine which functions can be moved to the FPGA fabric. At least with multicore you do not have the same problem you might have with multicomputers, i.e., separate machines joined by an interconnection network. That will probably be the next item (there are lots of computers in a modern car, for example). Perhaps we will see some consolidation there.
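The profile-then-offload workflow described here can be sketched briefly. This is a hypothetical example, not anything from the article: `run_pipeline` and `filter_samples` are made-up stand-ins for an application and its compute-heavy inner loop, and the profiler output is simply scanned to find the hot function that would be a candidate for the FPGA fabric (or another core).

```python
# Hypothetical sketch: profile an application, then rank functions by
# cumulative time to pick offload candidates. Function names are invented.
import cProfile
import io
import pstats

def filter_samples(samples):
    # Stand-in for a DSP-style inner loop that dominates the runtime.
    return [0.5 * a + 0.5 * b for a, b in zip(samples, samples[1:])]

def run_pipeline():
    # Stand-in for the whole application under test.
    data = list(range(10000))
    for _ in range(50):
        data = filter_samples(data) + [0]  # keep the length constant
    return data

profiler = cProfile.Profile()
profiler.enable()
run_pipeline()
profiler.disable()

# Rank by cumulative time; the top user functions are offload candidates.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats()
report = out.getvalue()
print("filter_samples is a hotspot:", "filter_samples" in report)
```

In an FPGA flow the same ranking would tell you which function to recode as a hardware block; in a multicore flow, which function to move to its own core.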
Making the transition from single core to multicore programming is no easy task, and design and software engineers are going to need a helping hand. Software capabilities always lag behind what the power of the hardware promises. It took some time before visualization software, CAD tools, simulation software, and other development platforms were able to take advantage of the parallel processing capabilities of multicore servers. In fact, that transformation is still underway. I would expect the same with this kind of platform.
Truchard will be presented with the award at the 2014 Golden Mousetrap Awards ceremony during the co-located events Pacific Design & Manufacturing, MD&M West, WestPack, PLASTEC West, Electronics West, ATX West, and AeroCon.
In a bid to boost the viability of lithium-based electric car batteries, a team at Lawrence Berkeley National Laboratory has developed a chemistry that could double an EV's driving range while cutting its battery cost in half.
For industrial control applications, or even a simple assembly line, that machine can go almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine would come in. The smart machine is one that has some simple (or complex in some cases) processing capability to be able to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.