One of the most talked-about features of the long-awaited Apple iPhone X is the phone’s new
TrueDepth Camera system, which forms a key component of what some reviewers are calling
Apple’s most important smartphone in years. The TrueDepth system’s 7-megapixel front-facing
selfie camera, which AppleInsider says stacks up competitively against the camera for the Samsung
Galaxy Note8, supports facial recognition and animated emoji (“Animojis”), two of the
iPhone X’s signature features. Here’s a look at how the TrueDepth Camera system works, some of
its most important applications, and why it’s a potential game changer.
How TrueDepth Works
The TrueDepth Camera system works through a process similar to the one Microsoft’s Kinect uses
for motion sensing. The process begins when a flood illuminator bathes your face in infrared light,
which lets the system work even in low-light conditions. A dot projector then casts more than 30,000
infrared dots onto your face, and a dedicated infrared camera captures the dot pattern, which is used
to construct a 3D map of your face.
The resulting information is then transmitted to the iPhone X’s A11 Bionic processor chip, which
contains an on-device neural network. The network has been trained on over a billion images,
allowing it to distinguish one 3D facial map from another. In this way, TrueDepth represents an
advance over previous front-facing cameras, which captured only 2D images and were far less
sensitive to differences in depth.
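To make the capture-and-match flow above concrete, here is a minimal Python sketch. Everything in it is an illustrative assumption, not Apple’s actual implementation: the grid sampling, the summary-statistic “embedding,” and the tolerance are stand-ins for the real dot projector, neural network, and matching logic.

```python
import math

NUM_DOTS = 30_000  # approximate number of infrared dots Apple cites


def project_dots(face_depth_fn, num_dots=NUM_DOTS):
    """Simulate the dot projector plus infrared camera: sample the depth
    of the face surface at roughly num_dots grid positions, yielding a
    sparse 3D depth map."""
    side = int(math.isqrt(num_dots))
    return [face_depth_fn(x / side, y / side)
            for y in range(side) for x in range(side)]


def embed(depth_map):
    """Stand-in for the A11's neural-network embedding: reduce the depth
    map to a tiny feature vector (here, just mean and variance)."""
    n = len(depth_map)
    mean = sum(depth_map) / n
    var = sum((d - mean) ** 2 for d in depth_map) / n
    return (mean, var)


def matches(enrolled, candidate, tol=1e-3):
    """Compare two embeddings; 'unlock' only if every feature is close."""
    return all(abs(a - b) <= tol for a, b in zip(enrolled, candidate))


# A curved "face" surface versus a flat spoof at the same average depth:
face = lambda x, y: 1.0 + 0.1 * math.sin(6 * x) * math.cos(6 * y)
flat_mask = lambda x, y: 1.0

enrolled = embed(project_dots(face))
print(matches(enrolled, embed(project_dots(face))))       # the same face
print(matches(enrolled, embed(project_dots(flat_mask))))  # a flat spoof
```

The point of the sketch is the role depth plays: a flat image at the right average distance matches on mean depth but not on depth variation, which is exactly the information a 2D camera cannot capture.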
One of the most important applications of TrueDepth is supporting Face ID, a biometric
authentication tool that represents an advance on Touch ID, used in previous iPhone devices.
Touch ID, which relies on the user’s fingerprint, employs 2D landmarks to distinguish one user from
another. In contrast, Face ID uses TrueDepth to recognize 3D landmarks, which are mathematically
more complex than 2D landmarks. As a result of this improvement, Apple claims that the odds of a
random person being able to unlock your phone through Face ID are about 1 in 1,000,000,
compared with 1 in 50,000 for Touch ID.
Apple tested this claim by working with Hollywood mask developers to see if realistic masks could
fool Face ID. Other experts have raised the possibility that Face ID might be fooled by 3D-printed
head replicas. In one case, a child who resembled their parent was able to access their parent’s
phone through Face ID.
Despite these potential workarounds, Face ID represents a clear advance on Touch ID, and as the
technology improves, it should become better at using depth detection to distinguish a true user’s
face from an impersonator’s. Meanwhile, the iPhone’s on-device neural network improves on its own
by learning from the history of the user’s facial scans, distinguishing the user’s face more precisely
the more scans are run.
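One plausible mechanism for that kind of on-device learning is template averaging: after each accepted scan, fold the new measurement into the stored template so it tracks gradual changes in the user’s appearance. The sketch below is an assumption for illustration, not Apple’s documented approach, and the embedding values are made up.

```python
# Hypothetical adaptive enrollment: maintain the stored template as a
# running average over all accepted scans so far.

def update_template(template, new_embedding, count):
    """Fold one new embedding into a template built from `count` prior
    scans, returning the running average over count + 1 scans."""
    return tuple((t * count + e) / (count + 1)
                 for t, e in zip(template, new_embedding))


template = (1.0, 0.002)  # illustrative (mean, variance) embedding
scans = [(1.01, 0.0021), (0.99, 0.0019), (1.02, 0.0022)]

for i, scan in enumerate(scans):
    template = update_template(template, scan, i + 1)

print(template)  # template now reflects all four measurements equally
```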
A second game-changing application of the TrueDepth Camera system is support for animated
emojis, or “Animojis,” derived from 3D images of the user’s face. Animojis enable you to create
customized, animated 3D emojis based on your facial expressions. You can even record an audio
message, with the Animoji’s mouth moving in sync with yours as you speak.
Animojis represent the cutting edge of mobile emoji technology, placing Apple ahead of its
competition on this front. They also point toward further innovations. Developers are already
using TrueDepth to create face capture apps that go beyond Animojis to support augmented reality
applications. TrueDepth is also being used to create more realistic 3D Snapchat filters. And
Hollywood animators may soon be able to incorporate TrueDepth into more realistic digital models of