Ah, complication equals profit when this many units start dying.
And spending all that cost to take a car engine that gets 7% of its fuel's energy to the wheels up to 8% is just not smart. Far better to go directly to EV drive, with its 20-65% efficiency depending on the power source.
The only efficient way to use an ICE is running it at constant speed driving an alternator, which doesn't require complicated junk to be efficient and is about as clean as an ICE can be. The Lotus EV Range Extender is an example of the future of the ICE in vehicles.
Far cheaper are lighter unibodies, better aero, and even chopping off a wheel; that gets you into the 100 mpg and up class. Even the smaller SUVs could do this if they wanted to.
Can anyone explain this to me? In 1972 I bought a new Dodge Colt 1600, made by Mitsubishi. It could seat 4 normal-sized adults, regularly got 40+ miles to the gallon, and had enough get-up-and-go that I got a ticket for doing 77 in a 60 mph zone. (I was younger then, so cut me some slack.) It had a 2-bbl carb and coil ignition, so I imagine the exact same vehicle would perform better today with fuel injection and electronic ignition. The car cost about $2200 as I recall, and the only extraordinary maintenance I had was replacing the differential at about 55 thousand miles.
For some reason other cars liked to run into it while parked, sitting at a stop sign, or just driving normally. After the 4th such occurrence we got rid of it. My question: how is it that a car like that could be built in the early '70s, without onboard computers, 40-70 pounds of wire, etc.? Are we not smarter today? I'm just asking.
Compilers have been/are being updated to support multicore for the appropriate platforms.
However, multicore processors may be used without the hard requirement for a multicore-aware compiler. It depends on the actual chip architecture, as well as the software application design and partitioning.
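The point above can be sketched with a plain example: no special compiler support is needed if the application itself partitions the work into independent tasks and lets the OS schedule them across cores. This is an illustrative Python sketch (the function names and the toy workload are mine, not from the discussion):

```python
# Sketch: exploiting multiple cores without a "multicore-aware" compiler.
# The application partitions the work; the OS scheduler spreads the
# worker processes across the available cores.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Toy CPU-bound task: sum the integers in [lo, hi)."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    # Split [0, n) into one chunk per worker.
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1000))  # same answer as sum(range(1000)): 499500
```

The compiler here knows nothing about cores; the parallelism comes entirely from the software design and partitioning, which is the point being made above. Compiler-level support (auto-vectorization, OpenMP-style pragmas) is a separate, complementary route.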
Good example of evolution at work. Improvements in one place, i.e. multicore chips, lead to improvements in others like powertrains.
I thought this required changes in compiler design to take advantage of multicore chips. Perhaps that was in the case of parallel programming. I am not sure that applies here, but perhaps someone could speak up on that.
I would also expect the multicore design to be extended to additional cores, just like our workstations are now quad-core, hex-core, or even octo-core designs. I think this saves power on the chip as well, since multiple cores reduce the need for higher clock speeds.
Robots that walk have come a long way from simple barebones walking machines or pairs of legs without an upper body and head. Much of the research these days focuses on making more humanoid robots. But they are not all created equal.
The IEEE Computer Society has named the top 10 trends for 2014. You can expect the convergence of cloud computing and mobile devices, advances in health care data and devices, as well as privacy issues in social media to make the headlines. And 3D printing came out of nowhere to make a big splash.
For industrial control applications, or even a simple assembly line, that machine can go almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine would come in. The smart machine is one that has some simple (or complex in some cases) processing capability to be able to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.
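The adaptation described above can be sketched as a simple feedback loop: the machine monitors a condition and adjusts its own operating point. This is a hypothetical illustration (the feed-rate controller, its gains, and limits are assumptions for the sketch, not from the article):

```python
# Illustrative sketch of a "smart" machine adapting to changing conditions:
# a proportional controller that slows the feed rate when the observed
# defect rate rises above a target, and speeds back up when it falls.
def adapt_feed_rate(feed_rate, error_rate, target=0.02, gain=0.5):
    """Return a new feed rate, nudged toward keeping errors at `target`."""
    adjustment = gain * (target - error_rate)
    # Clamp to an assumed safe operating range of the machine.
    return min(1.0, max(0.1, feed_rate + adjustment))

rate = 0.8
for err in [0.01, 0.05, 0.10]:  # simulated sensor readings
    rate = adapt_feed_rate(rate, err)
print(round(rate, 3))  # -> 0.75: the machine has throttled itself back
```

A "dumb" machine runs the same fixed program 24/7; the tradeoff for this kind of self-adjustment is the added processing capability, sensing, and validation effort the smart machine requires.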