Apple's New A11 Chip is the Real iPhone X Innovation

The A11 Bionic chip inside of the new iPhone X points to Apple's future ambitions for advanced machine learning and augmented reality.

Chris Wiltz

September 14, 2017

(Image source: Apple)

It doesn't matter whether you prefer iOS, Android, or another option, there's no denying the enormous impact smartphones have had on our lives. Smartphones have grown into full-fledged computing platforms – enabling entirely new business models ranging from digital health to mobile VR. The gaming market in particular has enjoyed huge returns thanks to the computing power offered by today's smartphones.

On the 10th anniversary of the iPhone's release, Apple has decided to skip over nine and jump straight to 10 with the release of the iPhone X (pronounced “iPhone 10”). And while fans will be delighted with the larger “Super Retina” display, wireless charging, and improved cameras, the real innovation is going on under the hood, with a new chip, the A11 Bionic, that is optimized for next-gen technologies like advanced machine learning and augmented reality (AR).

The A11 Bionic replaces the A10 processor from previous iPhone models and will be the core of the new iPhone 8 and 8 Plus as well as the X. But it's in the X where the chip really shows its capabilities, and sheds some light on the future directions Apple may be taking. Manufactured using a 10 nm FinFET process, the system-on-a-chip packs 4.3 billion transistors, an integrated 64-bit, six-core CPU (two performance cores and four efficiency cores), and a three-core GPU. Apple says the A11 can deliver up to 70 percent greater performance and up to 30 percent faster graphics than previous iPhone chips.

Philip Schiller, Apple’s Senior Vice President of Worldwide Marketing, announces the A11 Bionic at The Apple event on September 12, 2017. (Image source: Apple).

The key feature that Philip Schiller, Apple’s Senior Vice President of Worldwide Marketing, demonstrated for an audience at Apple's latest unveiling event earlier this week was FaceID, which allows users to unlock their iPhone X simply by looking at it. On the surface this would appear to be the next step up from pattern-based passwords or fingerprint-based TouchID. But the underlying facial recognition for FaceID is rather sophisticated.

The iPhone X features an array of new sensors, including a flood illuminator, an infrared camera, and a dot projector. When a user looks at their iPhone, the flood illuminator detects the user's face, the infrared camera captures an image, and the dot projector casts 30,000 infrared dots to map the user's face. All of this information feeds into neural networks built right into the A11 chip, which create a mathematical model of the user's face. That model is checked against a stored model, and if the two match, the phone unlocks. All of this happens in real time, without any server- or cloud-based computing.
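Apple hasn't published FaceID's internals, but the matching step described above boils down to comparing a freshly computed face model against the enrolled one. Below is a minimal sketch of that idea in Swift; the FaceTemplate type, the cosine-similarity metric, and the 0.85 threshold are all hypothetical stand-ins for illustration, not Apple's actual implementation.

```swift
import Foundation

// Hypothetical sketch of the matching logic described above.
// Every name here is illustrative; FaceID's real pipeline is not public.
struct FaceTemplate {
    var embedding: [Float]  // the "mathematical model" of a face
}

// Compare a freshly computed model against the enrolled one.
// Cosine similarity stands in for whatever metric Apple actually uses.
func matches(_ candidate: FaceTemplate, enrolled: FaceTemplate,
             threshold: Float = 0.85) -> Bool {
    let dot = zip(candidate.embedding, enrolled.embedding).map(*).reduce(0, +)
    let magCandidate = sqrt(candidate.embedding.map { $0 * $0 }.reduce(0, +))
    let magEnrolled = sqrt(enrolled.embedding.map { $0 * $0 }.reduce(0, +))
    return dot / (magCandidate * magEnrolled) >= threshold
}
```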

The A11 chip has a “neural engine,” optimized for machine learning processing, built directly into it – a dual-core design capable of performing 600 billion operations per second, according to Apple. Schiller told the keynote audience that Apple engineers trained the neural networks in the A11 chip with over a billion faces. The result is a system that can actually learn faces, according to Schiller. The neural engine can recognize your face despite changes in hairstyle or accessories like hats and glasses, and can even adapt as your face changes over time (if you grow a beard, for example). Schiller said the system can't be tricked by photos, and that Apple even used sophisticated Hollywood special FX masks to train it to recognize phony faces. Essentially, the system has developed its own uncanny valley – the natural sense we humans have whereby the more human and realistic something attempts to appear, the more readily even a tiny flaw triggers our brains into recognizing it as fake.
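The adaptation Schiller described could, in principle, be as simple as nudging the stored model toward each confident match, so that gradual changes like a growing beard keep matching. Continuing the hypothetical sketch above (again, an illustration of the concept, not Apple's method):

```swift
// Hypothetical continuation of the sketch above: blend a small fraction
// of each confidently matched face model into the enrolled template,
// letting the stored model track gradual changes in appearance.
func adapt(enrolled: inout FaceTemplate, with candidate: FaceTemplate,
           learningRate: Float = 0.05) {
    for i in enrolled.embedding.indices {
        enrolled.embedding[i] = (1 - learningRate) * enrolled.embedding[i]
            + learningRate * candidate.embedding[i]
    }
}
```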

Apple was also keen to demonstrate the improved performance the A11 offers for augmented reality – demonstrating several apps and games created with its ARKit software development kit. Apple also announced that ARKit will now support face tracking, allowing developers to leverage the new camera and sensors, along with the power of the A11 chip to create AR experiences capable of integrating and responding to users' facial movements.
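For developers, that face-tracking support surfaces through ARKit's session API. The sketch below uses ARFaceTrackingConfiguration and ARFaceAnchor, the classes Apple introduced for face tracking; the wrapper class and the console output are just illustrative scaffolding.

```swift
import ARKit

// Minimal face-tracking session using the iPhone X's TrueDepth sensors.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking only works on hardware with the TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this whenever its model of the user's face updates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes reports how strongly each expression (jaw open,
            // smile, brow raise, etc.) is currently expressed, from 0 to 1.
            if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
                print("Jaw open: \(jawOpen)")
            }
        }
    }
}
```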

With its combination of machine learning optimization and an emphasis on delivering AR experiences, it's clear that Apple is setting up the iPhone as a powerful platform for delivering new levels of interactive experiences. Coupling ARKit with Core ML, Apple's platform for developers looking to integrate machine learning into their iOS apps, could allow the iPhone X to run some very complex apps that combine machine learning and augmented reality.
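As a rough sketch of what that pairing looks like in code, the snippet below hands each ARKit camera frame to a Core ML image classifier via the Vision framework. SceneClassifier is a hypothetical compiled Core ML model; the frame-handling glue is our own, and a real app would throttle inference rather than classify every frame.

```swift
import ARKit
import Vision

// Sketch: running a Core ML model (hypothetical `SceneClassifier`)
// against the camera frames behind an ARKit session.
class SceneLabeler: NSObject, ARSessionDelegate {
    lazy var request: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(for: SceneClassifier().model)
        return VNCoreMLRequest(model: model) { request, _ in
            if let top = (request.results as? [VNClassificationObservation])?.first {
                print("Saw \(top.identifier) (confidence \(top.confidence))")
            }
        }
    }()

    // Called for every new camera frame the AR session captures.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedImage is the raw pixel buffer behind the AR view.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage)
        try? handler.perform([request])
    }
}
```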

Right now the A11 Bionic's more powerful GPU could be seen merely as a move to improve graphics performance, but don't forget that GPU makers like Nvidia have been touting GPUs, rather than CPUs, as the way forward for handling the heavy processing demands of sophisticated AI and machine learning apps. Perhaps in future iterations of the iPhone, consumers will see the integrated GPU leveraged for machine learning processing as well as AR graphics.

The multiplayer battle game The Machines by Directive Games is one example of the AR experiences the iPhone X is capable of.

Last month Google, Apple's largest competitor on the software side of the smartphone market, announced its own AR development kit, ARCore – a spinoff of its Tango AR platform, now expanded for easier delivery across a variety of Android devices. Google clearly saw the writing on the wall: demand for AR is going to be hardware agnostic, and users are going to want AR on whatever device they own.

Apple's ARKit was a bit ahead on this front, allowing developers to deploy AR apps across most newer iPhone models. But judging by the power of the A11 Bionic chip, Apple is working to position itself not only as the most diverse and widely distributed AR platform, but also the most powerful. How Apple's machine learning and AR efforts will stack up against Google's ARCore and the search giant's TensorFlow platform for machine learning development remains to be seen. But the battle lines for the next generation of smartphones are clearly being drawn around machine learning and AR.

iFixit Takes You Inside Apple's iPhone 8
In an ESC Silicon Valley 2017 keynote presentation, Kyle Wiens, CEO of iFixit, will tear down Apple's iPhone 8, uncovering its cutting-edge AR (augmented reality) sensor and comparing it to the competition while discussing the tradeoffs and decisions made by Apple's product designers. Kyle will compare the iPhone to smaller production devices and take you on a tour inside this season's hottest electronics. ESC Silicon Valley takes place Dec. 5-7, 2017. Click here to register today!

Chris Wiltz is a senior editor at Design News covering emerging technologies including VR/AR, AI, and robotics.

