Can Apple Use Its Latest AI Chip for More Than Photos?

Apple calls its new A13 Bionic chip the most powerful ever placed in a smartphone. But will the company ever use its machine learning chips for more than taking photos?
Apple says the A13 outperforms other leading smartphone chips. (Image source: Apple) 

If anything can be taken from Apple's latest iPhone announcement it's that the company clearly views photos as the killer app for any iPhone. Apple said its latest smartphone chip has been optimized for machine learning, but right now much of that technology looks to be applied to image capture.

In addition to a new dual-camera setup, Apple's latest iPhone, the iPhone 11, comes with upgraded chip hardware that Apple calls the fastest ever to appear in a smartphone. The new A13 Bionic combines a CPU and GPU (along with Apple's proprietary “Neural Engine”) and is designed and optimized for running machine learning applications. It is also the second generation of Apple's A-series chips to use a 7-nanometer manufacturing process, cramming in 8.5 billion transistors that Apple says are engineered to balance performance and power efficiency.

The CPU component contains machine learning accelerators and is capable of running up to 1 trillion operations per second. The A13 is designed so that machine learning models can be scheduled across the GPU, CPU, and Neural Engine depending on the needs of the application. At this week's 2019 Apple Event, the company said this allows for optimized efficiency in both processing and power across applications such as natural language processing and image classification.
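To make the scheduling idea concrete, here is a deliberately simplified sketch, not Apple's actual scheduler or any real iOS API, of how a framework might route different kinds of work in a model to the compute unit best suited to it, falling back to the CPU when a unit is unavailable. All names here are hypothetical.

```python
# Toy dispatcher: route each layer of a model to a preferred compute unit,
# loosely mirroring the idea of scheduling work across a CPU, a GPU, and a
# neural engine. Illustrative only; not Apple's implementation.

PREFERRED_UNIT = {
    "conv": "neural_engine",   # dense ML layers suit the Neural Engine
    "matmul": "gpu",           # large parallel math suits the GPU
    "control": "cpu",          # branching/control flow stays on the CPU
}

def schedule(layers, available=("cpu", "gpu", "neural_engine")):
    """Assign each (name, kind) layer to a compute unit, falling back to CPU."""
    plan = []
    for name, kind in layers:
        unit = PREFERRED_UNIT.get(kind, "cpu")
        if unit not in available:
            unit = "cpu"  # the CPU is always available as a fallback
        plan.append((name, unit))
    return plan

model = [("embed", "matmul"), ("conv1", "conv"), ("branch", "control")]
print(schedule(model))
# [('embed', 'gpu'), ('conv1', 'neural_engine'), ('branch', 'cpu')]
```

On a real iPhone, developers don't write such a dispatcher themselves; the point of Apple's design is that the framework makes these placement decisions automatically based on the model and the workload.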

Better Machine Learning...For Better Photos

Though the company touted the power of the new A13 and the potential for developers to leverage its capabilities, it offered little demonstration of the chip's power at the Apple Event. A glimpse at an upcoming game titled Pascal's Wager showed off the graphics capability of the A13's GPU, but gave those looking for a comparison to other hardware little to go on. Following the event, however, Gokhan Avkarogullari, an Apple software engineer who has managed the iOS, tvOS, and watchOS graphics driver teams, took to Twitter to elaborate on the advancements made in the A13's GPU and how Apple's Metal graphics API fits in with them.

The second mention of the A13's abilities was more of a proof of concept, as Apple talked about an upcoming iPhone feature called Deep Fusion that leverages machine learning to improve image quality in photos. The iPhone 11 takes nine total photos and analyzes them pixel by pixel, combining the best of each image to create a single image optimized to have as much detail and as little noise as possible. Apple called it “computational photography mad science,” to cheers and applause from the event audience. Rather than a live demonstration of Deep Fusion, however, the Apple Event offered only a look at a photo that Apple said was taken using the technology.
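Apple has not published how Deep Fusion works, but the general idea of combining several frames pixel by pixel to suppress noise can be sketched in a few lines. The toy below, which is not Apple's algorithm, fuses nine noisy grayscale "exposures" by taking the per-pixel median, so a random outlier in any single frame gets voted out.

```python
# Toy multi-frame fusion sketch (NOT Apple's Deep Fusion, which is
# unpublished): combine several noisy grayscale frames into one image
# by taking the per-pixel median, which suppresses random noise.
from statistics import median

def fuse(frames):
    """frames: list of equally sized 2D lists of pixel values."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [
        [median(f[r][c] for f in frames) for c in range(cols)]
        for r in range(rows)
    ]

# Nine 2x2 "exposures" of the same scene, one with a noisy pixel
frames = [[[100, 50], [50, 100]] for _ in range(8)]
frames.append([[255, 50], [50, 100]])  # outlier in the top-left pixel
print(fuse(frames)[0][0])  # the outlier is voted out: prints 100
```

The real feature is far more sophisticated, weighting detail and texture per region rather than applying one simple statistic, but the sketch shows why capturing nine frames gives the pipeline something to work with.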

With claims that the A13 outperforms other leading smartphone chips, such as Qualcomm's Snapdragon series, seeing the chip applied to such a narrow use case may disappoint those who aren't strictly photography enthusiasts or iPhone filmmakers. Certainly, however, the chip will also lend itself to improvements to Apple's virtual assistant, Siri, as well as to augmented reality features in the iPhone that may rely on object or facial recognition.

Is It Time for a New Product?

Rumors have circulated for years that Apple could be developing its own virtual reality or augmented reality headset. But the company has never announced formal plans, or even hinted at them, at any of its events. Chip hardware like the A13, however, could be an ideal fit for some sort of mobile VR or AR headset. Facebook's Oculus Quest headset, for example, runs on a Qualcomm Snapdragon 835, two generations behind the latest Snapdragon 8-series chips and far below the performance Apple claims for the A13.

Since releasing the A11 Bionic chip alongside the iPhone 8, Apple has made a point of highlighting the machine learning and graphics capabilities of its smartphone chips. Time will tell whether the A13 is just another iterative step or part of a longer road in Apple's strategy. But even at this phase, if Apple's claims are to be believed, these chips seem much too powerful to be confined solely to the worlds of smartphones and photography.

The iPhone 11 comes at a time when Apple investors are looking for the company to unveil its next big innovation. Customers are holding onto their iPhones longer, driving a decline in iPhone sales, and analysts see the iPhone 11 as Apple's chance to reverse that trend. Speaking to CNN, David McQueen, an analyst at ABI Research, said it will take a big change for Apple to recapture customers' imaginations. "Apple tends to perform well when it changes the design of the iPhone in a drastic way," McQueen told CNN. "However, it cannot do this every year."

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, and robotics.
