You are covering FPGAs now - any comments on using GPU / Nvidia CUDA technology to do the image processing?
I'll only cover GPU technology briefly. Certainly it is a very interesting technology, and many people have used it successfully to speed up image processing. Both FPGAs and GPUs have a radically different interface from CPUs, in terms of how you program them and how you get data on and off the processing unit.
My presentation today will focus mainly on FPGAs and how they work for embedded vision. I think GPUs work well as coprocessors, although plenty of industry and research engineers have very strong feelings on the FPGA vs. GPU debate.
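To illustrate the "getting data on and off the processing unit" point, here is a minimal CUDA sketch of offloading one simple image-processing step (a brightness adjustment on a grayscale buffer). The kernel name, frame size, and gain value are all illustrative assumptions, not anything from the talk.

```cuda
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Illustrative kernel: scale each 8-bit pixel by a gain factor, clamped to 255.
__global__ void brighten(const unsigned char *in, unsigned char *out,
                         int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = in[i] * gain;
        out[i] = (unsigned char)(v > 255.0f ? 255.0f : v);
    }
}

int main(void)
{
    const int n = 640 * 480;            // hypothetical VGA grayscale frame
    size_t bytes = n * sizeof(unsigned char);

    unsigned char *h_in  = (unsigned char *)malloc(bytes);
    unsigned char *h_out = (unsigned char *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = (unsigned char)(i % 256);

    // Data movement is explicit: allocate device buffers, copy the frame in,
    // launch the kernel, copy the result back out.
    unsigned char *d_in, *d_out;
    cudaMalloc(&d_in, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_in, d_out, n, 1.2f);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("first pixel: %u -> %u\n", h_in[0], h_out[0]);

    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}
```

Those explicit host-to-device and device-to-host copies are a concrete example of the interface difference from ordinary CPU code: on a CPU the frame is simply in memory, while a GPU coprocessor requires you to stage data across the PCIe bus around each batch of work.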
The company says it anticipates that high-definition video for home security and other uses will be the next mature technology integrated into the IoT domain, hence the introduction of its MatrixCam devkit.
Siemens and the Georgia Institute of Technology are partnering on an applied research project, part of the federally backed America Makes program, to address limitations in the current additive manufacturing design-to-production chain.
Most of the new 3D printers and 3D printing technologies in this crop break some boundary, whether it's build-volume-per-dollar ratio, multi-material printing techniques, or new material types.
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts, and all are archived, so if you can't attend live, you can attend at your convenience.