I will be hoping to hear whether you're familiar with the Leap sensor: 0.01 mm repeatable position precision, between 32 and 200 frames per second, and it uses between 2% and 5% of a generic PC's CPU time running the algorithms that actually extract position data from the sensor's data stream... Range limitation is the chief drawback (at the moment); their initial product is designed as a desktop-worksurface solution and monitors only an eight-cubic-foot volume (which STILL manages to yield around 3.25 gigabits of position data, if you lay a ten-micrometer grid across the back surface of an approximately two-foot-radius field of view)... But that will be tomorrow's discussion, re "Fundamentals of Image Sensors", right?
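As a side note, the grid estimate in that comment can be sanity-checked with a few lines of arithmetic. This is a rough sketch under my own assumptions, not anything from a Leap spec: it treats the "back surface" as a flat disc of two-foot radius sampled on a square ten-micrometer grid, one bit per grid point. Under those assumptions the count comes out near 1.2e10 points; the comment's 3.25-gigabit figure presumably rests on a different reading of the surface geometry or encoding.

```python
import math

# Back-of-the-envelope check of the grid-point count (assumptions noted above).
FT_TO_M = 0.3048
radius_m = 2 * FT_TO_M      # two-foot radius, ~0.61 m
spacing_m = 10e-6           # ten-micrometer grid spacing

area_m2 = math.pi * radius_m ** 2          # flat-disc area, ~1.17 m^2
grid_points = area_m2 / spacing_m ** 2     # one point per 10 um x 10 um cell

print(f"disc area:   {area_m2:.3f} m^2")
print(f"grid points: {grid_points:.3e}")
```

Changing the assumed surface (a hemisphere, a square aperture, a smaller radius) moves the result by small multiples, which is likely where the different figures come from.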
In an age of globalization and rapid scientific progress, two of our societies' (and economies') main concerns are satisfying the needs and wishes of the individual and conserving precious resources. Cloud computing caters to both.
In industrial control applications, or even on a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That's where the "smart" machine comes in. A smart machine is one with some processing capability, simple or in some cases complex, that lets it adapt to changing conditions. Such machines suit a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, and consumer goods. This discussion will examine what's possible with smart machines, and what tradeoffs must be made to implement such a solution.