Apple's New A11 Chip is the Real iPhone X Innovation

The A11 Bionic chip inside of the new iPhone X points to Apple's future ambitions for advanced machine learning and augmented reality.

On the 10th anniversary of the iPhone's release, Apple has decided to skip over nine and jump straight to 10 with the release of the iPhone X (pronounced “iPhone 10”). And while fans will be delighted with the larger “Super Retina” display, wireless charging, and improved cameras, the real innovation is going on under the hood, with a new chip, the A11 Bionic, that is optimized for next-gen technologies like advanced machine learning and augmented reality (AR).

The A11 Bionic replaces the A10 processor from previous iPhone models and will be the core of the new iPhone 8 and 8 Plus models as well as the X. But it's in the X where the chip really shows its capabilities and sheds some light on future directions Apple may be taking. Manufactured using a 10 nm FinFET process, the system-on-a-chip has 4.3 billion transistors and an integrated 64-bit, six-core CPU (two performance cores and four efficiency cores) as well as a three-core GPU. Apple says the A11 can deliver up to 70 percent greater performance and up to 30 percent faster graphics than previous iPhone chips.
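Developers don't target the performance and efficiency cores directly; on iOS, the scheduler generally places work on cores based on quality-of-service hints. A minimal sketch using standard Grand Central Dispatch APIs (the task functions are hypothetical stand-ins):

```swift
import Foundation

// Hypothetical stand-ins for real work, for illustration only.
func renderARFrame() { /* latency-sensitive work */ }
func syncPhotoLibrary() { /* deferrable background work */ }

// High quality-of-service work is eligible to run on the A11's
// two performance cores.
DispatchQueue.global(qos: .userInteractive).async {
    renderARFrame()
}

// Low quality-of-service work can be kept on the four efficiency
// cores to save power.
DispatchQueue.global(qos: .utility).async {
    syncPhotoLibrary()
}
```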

Philip Schiller, Apple’s Senior Vice President of Worldwide Marketing, announces the A11 Bionic at the Apple event on September 12, 2017. (Image source: Apple)

The key feature that Philip Schiller, Apple’s Senior Vice President of Worldwide Marketing, demonstrated for an audience at Apple's latest unveiling event earlier this week was FaceID, which allows users to unlock their iPhone X simply by looking at it. On the surface this would appear to be the next step up from pattern-based passwords or fingerprint-based TouchID. But the underlying facial recognition for FaceID is rather sophisticated.
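Apple hasn't exposed FaceID's internals, but third-party apps reach it through the same LocalAuthentication framework long used for TouchID. A minimal sketch of that developer-facing flow:

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Check whether biometric authentication is available
// (FaceID on iPhone X, TouchID on earlier devices).
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, authError in
        if success {
            print("Authenticated")
        } else {
            print("Failed: \(authError?.localizedDescription ?? "unknown error")")
        }
    }
}
```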

The iPhone X features an array of new sensors, including a flood illuminator, an infrared camera, and a dot projector. When a user looks at their iPhone, the flood illuminator detects the user's face, the infrared camera takes an image, and the dot projector sends out 30,000 infrared dots to map the face. All of this information feeds into neural networks built right into the A11 chip, which create a mathematical model of the user's face. That model is checked against a stored model, and if it's a match, the phone unlocks. All of this happens in real time without any server- or cloud-based computing.
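Apple hasn't published the matching algorithm, but the flow described above (derive a mathematical model of the face, then compare it against an enrolled one on-device) resembles a feature-embedding comparison. A purely illustrative sketch, in which every type, name, and threshold is hypothetical:

```swift
import Foundation

// Hypothetical face "mathematical model" as a feature vector;
// the real representation is not public.
typealias FaceEmbedding = [Float]

// Cosine similarity between a freshly captured embedding and the
// enrolled one stored on the device.
func cosineSimilarity(_ a: FaceEmbedding, _ b: FaceEmbedding) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (normA * normB)
}

// Unlock only if similarity clears a threshold; 0.9 is an
// arbitrary placeholder, not Apple's value.
func shouldUnlock(captured: FaceEmbedding, enrolled: FaceEmbedding) -> Bool {
    return cosineSimilarity(captured, enrolled) > 0.9
}
```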

The A11 chip has a “neural engine,” optimized for machine learning processing, built directly into it – a dual-core design capable of performing 600 billion operations per second, according to Apple. Schiller told the keynote audience that Apple engineers trained the neural networks in the A11 chip with over a billion faces. The result is a system that can actually learn faces, according to Schiller. The neural engine can recognize your face even with changes due to hairstyle or accessories like hats and glasses, and can even adapt as your face changes over time (i.e., if you grow a beard). Schiller said the system can't be tricked by photos, and that Apple even used sophisticated Hollywood special FX masks to train it to recognize phony faces. Essentially, the system has developed the ability to tell a real face from a fake one.
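The kind of on-device inference the neural engine accelerates is also available to developers through Apple's Vision and Core ML frameworks, introduced with iOS 11. A short sketch using Vision's face-detection request (detection only; FaceID's recognition model itself is not exposed to developers):

```swift
import Vision
import UIKit

// Detect face rectangles in an image entirely on-device.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        print("Found \(faces.count) face(s)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```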
