We've all seen Moore's Law called a dead man walking over and over. Usually the limits of materials or of physics are identified as the culprit. Ten years ago, I thought Moore's Law was through; it's amazing how it keeps going.
Regarding the downscaling of semiconductor components, surely there has to be a knock-on effect from the closer proximity of the chips' legs to one another? And has RoHS limited further miniaturisation owing to "tin whiskers"?
Eventually we will be doing processing using photons instead of electrons. I would expect that to enable Moore's law to continue a while longer. I would also anticipate materials other than silicon, such as graphene, being applied. Nanoscale engineering a la "Diamond Age" is not far off.
Ideally, other materials could be used as replacements for silicon, but that has never happened on a large scale. Silicon is abundant and cheap, has great economies of scale, enjoys a huge base of manufacturing systems that are geared to it, and offers relatively low power consumption. Gallium arsenide was for many years considered as a potential successor to silicon in some quarters, but even the great Seymour Cray failed with it. The old joke in the supercomputer industry was "gallium arsenide is the future of the industry, and it always will be."
The fundamental issue challenging Moore's Law is device physics. As on-chip feature sizes shrink into the single-digit-nanometer region, leakage currents get significantly worse, impairing the ability of transistors to switch without expending a lot more power. The end-around to this problem has been to try out new materials. This is what Intel did a few years ago when it introduced its high-k gate dielectric with metal gates. That buys Moore's Law a little more runway, but eventually semiconductor manufacturers will hit what's called the "fundamental limits of physics" problem: feature sizes get so small that switching is handled by only a few atoms. At that point, performance becomes non-deterministic and all bets are off.
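For a sense of why leakage matters, here is a rough back-of-the-envelope sketch of the usual CMOS power split: dynamic switching power plus static leakage power. The capacitance, voltage, and leakage figures below are assumptions chosen only to illustrate the trend, not data from any real process node:

```python
# Rough back-of-the-envelope CMOS power model -- all numbers are made up for
# illustration, not measurements from any actual process.
# Total power ~ dynamic switching power + static leakage power:
#   P_dynamic = alpha * C * V^2 * f   (activity factor, switched capacitance, supply voltage, clock)
#   P_static  = V * I_leak            (leakage current climbs sharply as gates shrink)

def chip_power(alpha, c_switched_farads, v_supply, freq_hz, i_leak_amps):
    p_dynamic = alpha * c_switched_farads * v_supply ** 2 * freq_hz
    p_static = v_supply * i_leak_amps
    return p_dynamic, p_static

# Older node: dynamic power dominates, leakage is a minor line item.
dyn, stat = chip_power(alpha=0.2, c_switched_farads=50e-9,
                       v_supply=1.2, freq_hz=3e9, i_leak_amps=5.0)
print(f"older node:  dynamic {dyn:.0f} W, leakage {stat:.0f} W")

# Scaled node: lower C and V shrink dynamic power, but leakage current climbs,
# so static power starts eating the budget that scaling was supposed to free up.
dyn, stat = chip_power(alpha=0.2, c_switched_farads=30e-9,
                       v_supply=0.9, freq_hz=3e9, i_leak_amps=40.0)
print(f"scaled node: dynamic {dyn:.0f} W, leakage {stat:.0f} W")
```

With these illustrative inputs the older node burns roughly 43 W switching and 6 W leaking, while the scaled node drops to about 15 W of switching power but leaks around 36 W, which is the squeeze that new gate materials were meant to relieve.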
Regarding the move to multicore: as you've written, that's very important. Freescale is taking a page here from the Intel and AMD playbook, where in 2005 Intel made its famous "right-hand turn" from single core to multicore in response to the power budgets for single-core microprocessors threatening to rise above 150W.
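To see the first-order arithmetic behind that turn, here is a hedged sketch: dynamic power scales roughly with C·V²·f, and achievable clock speed scales roughly with supply voltage, so chasing throughput through frequency costs far more power than adding a core. The baseline figures and the assumption that a 1.5x clock needs about 1.5x the voltage are illustrative simplifications, not Intel or Freescale data:

```python
# First-order sketch of the single-core vs. multicore power tradeoff.
# Assumptions (illustrative only): dynamic power ~ C * V^2 * f, and supply voltage
# must rise roughly in proportion to the target clock frequency.

def dynamic_power(c_switched_farads, v_supply, freq_hz):
    return c_switched_farads * v_supply ** 2 * freq_hz

BASE_C, BASE_V, BASE_F = 20e-9, 1.1, 2.0e9   # assumed baseline single core (~48 W)

# Option A: chase throughput with a 50% higher clock (and ~50% more voltage headroom).
p_fast_single = dynamic_power(BASE_C, BASE_V * 1.5, BASE_F * 1.5)

# Option B: keep the clock and add a second identical core, which gives comparable
# throughput on parallel workloads at the original voltage.
p_dual_core = 2 * dynamic_power(BASE_C, BASE_V, BASE_F)

print(f"one core at 1.5x clock:  {p_fast_single:.0f} W")   # ~163 W, past a 150 W budget
print(f"two cores at base clock: {p_dual_core:.0f} W")     # ~97 W
```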
Truchard will be presented the award at the 2014 Golden Mousetrap Awards ceremony during the co-located events Pacific Design & Manufacturing, MD&M West, WestPack, PLASTEC West, Electronics West, ATX West, and AeroCon.
In a bid to boost the viability of lithium-based electric car batteries, a team at Lawrence Berkeley National Laboratory has developed a chemistry that could possibly double an EV’s driving range while cutting its battery cost in half.
For industrial control applications, or even a simple assembly line, such a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine comes in: a machine with some simple (or, in some cases, complex) processing capability that lets it adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.
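As a purely hypothetical illustration of that adaptability, the sketch below shows about the smallest version of a "smart" behavior: a control loop that reads a sensor and throttles the line instead of running one fixed program. The function names (read_temperature, set_conveyor_speed) and the thresholds are placeholders, not any real machine's API:

```python
# Hypothetical minimal "smart machine" loop: read a sensor and adapt behavior
# to changing conditions rather than executing one fixed program.
import random
import time

NOMINAL_SPEED = 1.0          # fraction of full line speed
MAX_SAFE_TEMP_C = 80.0

def read_temperature():
    # Stand-in for a real sensor driver; returns degrees Celsius.
    return 60.0 + random.uniform(-5.0, 30.0)

def set_conveyor_speed(fraction):
    # Stand-in for a real actuator command.
    print(f"conveyor speed -> {fraction:.2f}")

def control_step():
    temp = read_temperature()
    if temp > MAX_SAFE_TEMP_C:
        # Adapt: slow the line proportionally to the overshoot instead of
        # tripping the whole machine and losing the shift.
        overshoot = temp - MAX_SAFE_TEMP_C
        set_conveyor_speed(max(0.2, NOMINAL_SPEED - 0.05 * overshoot))
    else:
        set_conveyor_speed(NOMINAL_SPEED)

if __name__ == "__main__":
    for _ in range(5):
        control_step()
        time.sleep(0.1)
```

The tradeoff hinted at in the text shows up even here: the extra processing, sensing, and software maintenance all cost money and power, which is only worthwhile when the task truly changes from cycle to cycle.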