What I like most about this technology is its size: it's dramatically smaller than the other multispectral cameras I've written about in the past, and it's a chip-level solution that even does post-processing filtering on-chip. I think the need for this technology will only grow as design features keep getting smaller and as devices mix more material types.
A large number of applications could take advantage of this technology: industrial machine vision and inspection of chips, boards, and electronics sub-assemblies; several kinds of R&D, including component failure analysis labs; medical labs of various kinds; and medical equipment manufacturing. It could also be used for materials detection of various kinds, possibly in security applications, as well as for detecting counterfeit components made of inferior materials.
Having seen spectroscopy systems in the semiconductor industry in the 1980s, this seems like about as small a package as I can ever remember. Is this indeed smaller than the current state of the art? Has anyone else used a system on a chip approach like this one, Ann?
Hi Ann, this camera system looks very interesting and powerful for many applications. What is the typical spectral range of these components, and how does this one differ? I work with concentrated solar power systems, and I wonder whether these devices would work for ray-tracing analysis of beam radiation. Thank you, and great article.
Interesting that it may be able to detect counterfeit components made of inferior materials. Right now, components coming back as returns are inspected by the human eye. There is a training program and certification for inspectors. But that process can't catch everything. Returns are the vulnerability area that lets counterfeit components into the legitimate-component bloodstream. A camera that can see better than the human eye could be a big help.
Chuck, there are other multispectral sensor chips of varying sizes, architectures, and wavelength ranges--usually IR, or IR plus visible light--but Imec's combination of a hyperspectral spectroscopy sensor with a regular visible-light image sensor is unusual. It may also be unique. It's certainly one of the smallest I've seen, but I don't read German and I wouldn't be surprised if there are others (Germany is a leader in machine vision and imaging technology). That said, Europe's Imec is a leading-edge research institution, and they have a lot of firsts to their name.
Aldo, I'm not exactly sure what your question is. As the article states, the prototype chip's hyperspectral filter has 100 spectral bands between 560nm and 1,000nm. The filter bandwidth ranges from 3nm at 560nm to 20nm at 1,000nm, and the transmission efficiency is approximately 85 percent.
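For readers who want a feel for those filter numbers, here is a minimal sketch of the band layout. Note the assumptions: the release doesn't say the 100 band centers are evenly spaced, nor that the bandwidth grows linearly from 3nm to 20nm; both are illustrative guesses, not Imec's published design.

```python
# Rough sketch of the hyperspectral filter's band layout.
# ASSUMPTION: the 100 band centers are evenly spaced between
# 560 nm and 1,000 nm (the release doesn't specify the spacing).
low_nm, high_nm = 560.0, 1000.0
n_bands = 100
spacing_nm = (high_nm - low_nm) / (n_bands - 1)  # ~4.44 nm between centers
centers_nm = [low_nm + i * spacing_nm for i in range(n_bands)]

# ASSUMPTION: bandwidth interpolates linearly from 3 nm at 560 nm
# to 20 nm at 1,000 nm; the real profile could be nonlinear.
def bandwidth_nm(center_nm):
    frac = (center_nm - low_nm) / (high_nm - low_nm)
    return 3.0 + (20.0 - 3.0) * frac
```

Under those assumptions, adjacent band centers sit about 4.4nm apart, which is wider than the 3nm bandwidth at the blue end and much narrower than the 20nm bandwidth at the red end.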
Thanks, Ann! This is amazing technology. Taking a look at the original release, Imec mentions integration times as short as 2 ms, which, not counting transfer time, would equate to their specified max of 500 fps. As for their hyperspectral filter, it has a maximum of 100 band settings at an effective slew rate of 2,000 bands per second, which puts it around 500 us per band -- darn fast. I'm guessing that is the reaction time of the electro-optics they are using for their proprietary filter. Either way, this is a really impressive piece of technology that should find rapid acceptance in all sorts of machine vision applications.
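The arithmetic in that comment can be checked in a few lines. This uses only the figures quoted from the release (2 ms integration, 2,000 bands per second, 100 bands) and, as the comment notes, ignores readout and transfer overhead.

```python
# Back-of-the-envelope check of the figures quoted from Imec's release.
integration_time_s = 2e-3            # shortest quoted integration time: 2 ms
max_fps = 1.0 / integration_time_s   # ignoring transfer time

n_bands = 100                        # maximum band settings of the filter
band_rate_per_s = 2000.0             # quoted effective slew rate
dwell_per_band_s = 1.0 / band_rate_per_s   # time per band setting

# Stepping through all 100 bands at that rate takes ~50 ms,
# so a full hyperspectral sweep runs at roughly 20 cubes per second.
full_sweep_s = n_bands * dwell_per_band_s
```

So the 500 fps headline rate applies to a single band setting; sweeping all 100 bands caps a full spectral cube at about 20 per second under these assumptions.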
William, thanks for the input. I'm not a specialist in this area, but I have a retentive reporter's memory, and those specs sure looked leading edge, if not bleeding edge, to me. The one that first caught my eye, besides the integration of spectroscopy plus imaging on a chip, was the 6x maximum frame rate of the 4 MP image sensor. It's good to hear that the hyperspectral sensor's frame rate is also wicked fast. The one thing we didn't get info on is the price. Since this is a prototype, it may go through some design shrinks for the first production runs that will help lower the price.