We've all seen Moore's Law called a dead man walking over and over. Usually the limits of materials, of physics are identified as the culprit. Ten years ago, I thought Moore's Law was through. Amazing how it continues on.
Regarding the downscaling of semiconductor components, surely there must be a knock-on effect from the legs of the chips sitting in ever-closer proximity to one another? And has RoHS limited further miniaturisation owing to 'tin whiskers'?
Eventually we will be doing processing using photons instead of electrons. I would expect that to enable Moore's law to continue a while longer. I would also anticipate materials other than silicon, such as graphene, being applied. Nanoscale engineering a la "Diamond Age" is not far off.
In principle, other materials could replace silicon, but that has never happened on a large scale. Silicon is abundant and cheap, has great economies of scale, enjoys a huge base of manufacturing systems geared to it, and offers relatively low power consumption. Gallium arsenide was for many years considered a potential successor to silicon in some quarters, but even the great Seymour Cray failed with it. The old joke in the supercomputer industry was "gallium arsenide is the future of the industry, and it always will be."
The fundamental issue challenging Moore's Law is device physics. As on-chip feature sizes get down into the single-digit-nanometer region, leakage currents get significantly worse, impacting the ability of transistors to switch without expending a lot more power. The end-around to this problem has been to try out new materials. This is what Intel did a few years ago when it introduced its high-k metal gate transistors. That buys Moore's Law a little more runway, but eventually semi manufacturers will hit what's called the "fundamental limits of physics" problem: feature sizes get so small that switching is handled by only a few atoms. At that point, behavior becomes non-deterministic and all bets are off.
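To see why leakage matters so much at small geometries, here is a first-order sketch of the classic CMOS power model, P = a·C·V²·f + V·I_leak. All of the numbers below are made up for illustration only (they are not real process data): scaling shrinks C and V, which helps the dynamic term, but leakage current grows sharply and the static term starts to dominate.

```python
# Illustrative first-order CMOS power model. The constants are hypothetical,
# chosen only to show the trend the comment describes.

def total_power_watts(c_farads, v_volts, f_hz, activity, i_leak_amps):
    """P = a*C*V^2*f (dynamic switching) + V*I_leak (static leakage)."""
    dynamic = activity * c_farads * v_volts ** 2 * f_hz
    static = v_volts * i_leak_amps
    return dynamic + static

# Hypothetical "older node": higher voltage, modest leakage.
old = total_power_watts(c_farads=1e-9, v_volts=1.2, f_hz=2e9,
                        activity=0.1, i_leak_amps=0.05)

# Hypothetical "scaled node": C and V shrink, but leakage current is 20x higher.
new = total_power_watts(c_farads=0.5e-9, v_volts=0.9, f_hz=2e9,
                        activity=0.1, i_leak_amps=1.0)

print(f"older node: {old:.3f} W")   # dynamic term dominates
print(f"scaled node: {new:.3f} W")  # static leakage now dominates
```

With these toy numbers the dynamic power drops with scaling, yet total power goes up because leakage swamps the savings, which is exactly why new gate materials were needed.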
Regarding the move to multicore, as you've written, that's very important. Freescale is taking a page here from the Intel and AMD playbook: in 2005 Intel made its famous "right-hand turn" from single core to multicore in response to the power budgets for single-core microprocessors threatening to rise above 150 W.
What should be the perception of a product’s real-world performance with regard to the published spec sheet? While it is easy to assume that the product will operate according to spec, what variables should be considered, and is that a designer obligation or a customer responsibility? Or both?
Biomimicry has already found its way into the development of robots and new materials, with researchers studying animals and nature to come up with new innovations. Now thanks to researchers in Boston, biomimicry could even inform the future of electrical networks for next-generation displays.