|When combined with the new Cloud IoT Edge, Google's Edge TPU allows engineers to build and train machine learning models in the cloud and have them run actions and predictions at the edge in mobile devices and embedded systems. (Image source: Google)|
Google's Tensor Processing Units (TPUs) are the tech giant's proprietary entry into the ongoing battle of processors specialized for machine learning and artificial intelligence applications. Until now, engineers wanting to leverage the power of TPUs had to either hope for a day when Google allows them to be purchased outright (don't hold your breath) or turn to Google's cost-prohibitive, cloud-based TPU services.
Now, Google has unveiled a new chip, the Edge TPU—a purpose-built processor for running machine learning applications at the edge in embedded systems. Coupled with a new software stack, Cloud IoT Edge, the chip lets enterprises train machine learning models using Google's cloud-based TPUs, then deploy and run those models directly on an edge-based processor.
Naturally, Google would prefer that engineers deploy those cloud-trained models on the Edge TPU, but the models can also be executed on GPU and CPU accelerators. The Edge TPU runs machine learning models built with TensorFlow Lite, Google's open source framework for mobile and embedded devices. Cloud IoT Edge itself is optimized to run on mobile and embedded systems via operating systems like Linux and Android Things.
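The cloud-to-edge workflow described above can be sketched in a few lines of Python. This is a minimal illustration of the general TensorFlow Lite pipeline, not Google's exact Edge TPU tooling (targeting the Edge TPU itself additionally involves model quantization and Google's Edge TPU compiler): a toy Keras model stands in for a cloud-trained model, is converted to the TensorFlow Lite format, and is then run with the lightweight TFLite interpreter of the kind an embedded device would use.

```python
import numpy as np
import tensorflow as tf

# 1. "Train in the cloud": a trivial Keras model stands in for a real one.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(np.random.rand(32, 8), np.random.rand(32, 1), epochs=1, verbose=0)

# 2. Convert the trained model to the TensorFlow Lite flat-buffer format
#    used on mobile and embedded targets.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# 3. "Infer at the edge": run the converted model with the lightweight
#    TFLite Interpreter instead of the full TensorFlow runtime.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.random.rand(1, 8).astype(np.float32))
interpreter.invoke()
prediction = interpreter.get_tensor(out["index"])
print(prediction.shape)
```

In a real deployment, step 3 would run on the embedded device itself, with the converted model file shipped to it from the cloud.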
In a blog for Google, Injong Rhee, VP of IoT at Google Cloud, said that in designing the Edge TPU, Google was “hyperfocused on optimizing for 'performance per watt' and 'performance per dollar' within a small footprint.”
Rhee continued, “Edge TPUs are designed to complement our Cloud TPU offering, so you can accelerate [machine learning] training in the cloud, then have lightning-fast [machine learning] inference at the edge. Your sensors become more than data collectors—they make local, real-time, intelligent decisions.”
Google is targeting its Edge TPUs and Cloud IoT Edge squarely at engineers developing enterprise applications, citing the advantages of processing speed for operations and predictions that come without the need for cloud connectivity as well as enhanced security. “Cloud IoT Edge can process and analyze images, videos, gestures, acoustics, and motion locally on edge devices instead of needing to send raw data to the cloud and then wait for a response,” Rhee wrote.
|The Edge TPU is smaller than a penny. (Image source: Google)|
No official specs for the Edge TPU have been released, so it's not yet clear how it stacks up against other options, such as GPU-based accelerators, or against Google's own cloud-based TPUs. At the Google I/O developer conference this past May, Google announced the latest version of the Cloud TPU (version 3), which the company says is capable of 420-teraflop processing speeds.
It's unlikely that the Edge TPU offers the same level of processing as the latest Cloud TPU; Google has said it needs liquid cooling in its data centers to achieve the Cloud TPU's high level of performance. However, third-party benchmark tests performed by German machine learning company RiseML found the second version of the Cloud TPU to perform on par with Nvidia's powerful V100 GPUs.
Google is currently offering a development board that includes an Edge TPU, an NXP brand CPU, and a secure element provided by Microchip. It is also working with several partners—including NXP, ARM, Nexcom, Nokia, and ADLINK Technology—to develop devices that utilize the Edge TPU and Cloud IoT Edge.
One such partner is French connected car startup XEE, which will be looking to use Edge TPUs for advanced data processing inside connected vehicles. In a statement released on Google's blog, Romain Crunelle, CTO at XEE, said, “Cloud IoT Edge and Edge TPU will help us to address use cases such as driving analysis, road condition analysis, and tire wear and tear in real time and in a much more cost efficient and reliable way. Enabling accelerated [machine learning] inference at the edge will enable the XEE platform to analyze images and radar data faster from the connected cars, detect potential driving hazards, and alert drivers with real-time precision.”
LG CNS, a subsidiary of LG that focuses on providing IT services, is looking to leverage the new chip and cloud service to augment its manufacturing execution systems (MES). “Our Intelligent Vision Inspection solution enables us to deliver enhanced quality and efficiency in the factory operations of various LG manufacturing divisions,” Shingyoon Hyun, CTO of LG CNS, said on Google's blog. “With Google Cloud AI, Google Cloud IoT Edge, and Edge TPU, combined with our conventional MES systems and years of experience, we believe Smart Factory will become increasingly more intelligent and connected.”
Google's Edge TPU development kits are currently in beta and available through an early access program. Engineers interested in getting their hands on one can apply through Google. The company is especially encouraging applicants working in the manufacturing, oil and gas, transportation and logistics, healthcare, commercial building, and retail industries.
Chris Wiltz is a senior editor at Design News covering emerging technologies, including VR/AR, AI, and robotics.