Craig Federighi demonstrates Face ID on iPhone X.
Federighi spoke with TechCrunch to clear up some questions and concerns about Apple’s biometric authentication solution that have cropped up in the days following Face ID’s debut and first public demonstration on Tuesday.
Starting with Face ID’s data backbone, Federighi said Apple collected a “quite exhaustive” library of face scans from consenting subjects that was subsequently used to train the system. To maximize accuracy, Apple captured high-quality depth maps of each person’s face from various angles.
“Phil mentioned that we’d gathered a billion images and that we’d done data gathering around the globe to make sure that we had broad geographic and ethnic data sets. Both for testing and validation for great recognition rates,” Federighi said. “That wasn’t just something you could go pull of [sic] the internet.”
He went on to say that Apple is holding on to the high-fidelity depth map data for further algorithm training, noting the dataset is needed for system optimization. As can be expected of a valuable, and potentially sensitive, asset, Apple’s facial scan library is protected.
The same can be said about user data. Media outlets, pundits and potential buyers have raised concerns over what Apple does with data generated from a Face ID scan. Some have suggested data is sent to the cloud for processing, while others pondered whether Apple keeps a repository of scans to train its neural network.
“We do not gather customer data when you enroll in Face ID, it stays on your device, we do not send it to the cloud for training data,” Federighi said.
Just like Touch ID data, Face ID data is not retained by Apple; that information lives in the iPhone X’s Secure Enclave. Also like Touch ID, raw Face ID image data is processed and stored as a mathematical model that cannot be reverse engineered back into a “model of a face.”
Developers are also not privy to facial scan data through ARKit. Instead, apps leveraging the API receive a depth map rather than raw sensor data, which can be used for photographic effects and other software features, Federighi said.
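In practice, the developer-facing surface Federighi describes maps to Apple’s ARKit face-tracking APIs, which expose per-frame depth data and a face mesh but nothing resembling Face ID’s enrollment data. The sketch below is illustrative only, assuming iOS 11+ and TrueDepth hardware; the class name and property access are the author’s, not Apple’s sample code.

```swift
import UIKit
import ARKit

// Minimal sketch of what third-party apps can see on a TrueDepth device.
// ARKit surfaces depth maps and a face mesh — not the raw IR dot pattern
// or the mathematical model Face ID stores in the Secure Enclave.
class FaceDepthViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking is only supported on TrueDepth-equipped hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let depth = frame.capturedDepthData {
            // AVDepthData: per-pixel depth values, suitable for
            // photographic effects such as portrait-style blur.
            _ = depth.depthDataMap
        }
        if let faceAnchor = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first {
            // A triangle-mesh approximation of the face for AR overlays;
            // it carries geometry, not identity data.
            _ = faceAnchor.geometry.vertices
        }
    }
}
```

This cannot run outside a device with a TrueDepth camera, but it shows the boundary Federighi draws: apps get geometry useful for effects, while the biometric data used for authentication never leaves the Secure Enclave.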
Further, TrueDepth’s dot projector and infrared receiver are designed to work only at short distances, a hardware limitation noted by industry analysts in the weeks leading up to iPhone X’s unveiling. In practice, this means Apple’s phone is not capable of scanning the faces of passersby.
The executive noted that iPhone X boasts a redesigned Secure Enclave capable of performing the re-training procedures that power Face ID’s adaptive features. As mentioned during Tuesday’s keynote, Face ID is able to adapt to changes in a user’s face. For example, the system will work when a user grows a beard, wears a hat or puts on glasses. As Federighi noted in an email to a curious customer, Face ID even works with most sunglasses.
There are limitations to the system’s capabilities. Face ID will be unable to recognize a user whose face is obscured by a mask or niqab, the report says. In such cases, users can opt to enter a passcode instead.
A day after Face ID was unveiled, U.S. Senator Al Franken penned a letter to CEO Tim Cook requesting additional information on the system, including questions touching on data acquisition and retention. Beyond information already available through Apple’s PR materials and website, Federighi’s answers today address many of Franken’s concerns.
Finally, Federighi said Apple intends to release a white paper on Face ID for security researchers and others interested in learning more about the cutting-edge facial recognition technology.