How Face ID Works On Apple’s iPhone X – Forbes
Apple’s new smartphone, iPhone X, uses a recognition system called ‘Face ID’ to unlock your device and authorize payments. How does it check that your identity is authentic?
“It compares two face images and determines how similar they are,” explains Professor Anil Jain, who studies pattern recognition and computer vision at Michigan State University. “In the simplest way, that’s what face recognition does. You enrol your face, just like in Touch ID you enrol your fingerprint.”
Face ID uses a combination of light projectors and sensors to take several images of your facial features. Collectively called the ‘TrueDepth camera system’, Apple says these technologies work together to build a ‘detailed depth map of your face to recognize you in an instant.’
To set up Face ID, you follow the on-screen instructions, which involve moving your head in a circle so a camera can take multiple shots of your face for a 3D map. It uses infrared (IR) light to illuminate your face while capturing the images, to work day or night, outside or indoors. IR spans wavelengths of electromagnetic radiation (commonly known as ‘light’) just beyond the visible spectrum, so the face-scanning illumination won’t dazzle you in the dark.
“Infrared light is a non-visible illuminator, which will compensate for the effect of low background or ambient light, and very high or bright sunlight,” says Jain. “So infrared illumination helps in enabling the face recognition to work in non-favorable conditions.”
Here are the steps for image capture:
- The proximity sensor and ambient light sensor help the TrueDepth camera system determine how much illumination will be needed for face recognition;
- The flood illuminator produces infrared (IR) light, part of the electromagnetic spectrum that’s invisible to the naked eye, to illuminate your face;
- The dot projector produces more than 30,000 dots of invisible IR light to create a three-dimensional map (for area and depth) of your facial landscape;
- The infrared camera captures images of the dot pattern and the IR light that’s been reflected back from your face.
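The dot projector works on the principle of structured-light depth sensing: a dot on a nearby surface (like the tip of your nose) appears shifted further in the camera’s view than a dot on a distant surface (like your ear), and that shift, or disparity, reveals depth. The sketch below illustrates the triangulation idea only; the baseline, focal length and disparity values are made up, and this is not Apple’s actual algorithm:

```python
# Toy structured-light depth recovery: depth is inversely proportional
# to the disparity (pixel shift) of each projected dot as seen by the
# IR camera. Baseline and focal values here are hypothetical; real
# systems use calibrated hardware parameters.

def depth_from_disparity(disparity_px, baseline_mm=10.0, focal_px=600.0):
    """Triangulate a dot's depth in millimetres from its pixel shift."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px

# Closer surfaces produce larger disparities, hence smaller depths.
dot_disparities = {"nose_tip": 24.0, "cheek": 21.0, "ear": 18.0}
depth_map = {name: depth_from_disparity(d)
             for name, d in dot_disparities.items()}
# depth_map["nose_tip"] → 250.0 mm, the closest point of the three
```

Repeating this triangulation for each of the 30,000+ dots is what turns a flat IR image into the ‘detailed depth map’ Apple describes.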
Your face is a ‘biometric’ — a measurable biological characteristic. Some other biometrics used in security include fingerprints, voice and the irises of eyes.
All biometric authentication systems basically compare two complex patterns and calculate how similar they are. Those patterns may be the waveforms in your voice, your fingertip ridges, the colored structures of your iris, or your landscape of facial features.
When a biometric system is set up, a computer — such as the processor in your smartphone — will capture and store the reference patterns, known as a template or ‘enrolment image’. Then when you want to access a device (e.g. unlock your phone), you present the computer with a ‘verification image’.
“Internally it computes a score between 0 and 1,” says Jain. “If it’s closer to 1, that means it’s the same fingerprint or face. If it’s closer to 0, it’s not the same person.”
Because the enrolment and verification images are never identical (capture conditions differ), your phone uses a threshold to decide whether they are similar enough to count as a match. A comparison score of 0.7 might be close enough in some scenarios, for example, and that minimum score is not a fixed number.
“If you are just unlocking the phone, the internal threshold which your mobile phone company uses could be relatively low — they could set it at 0.5 or 0.6,” says Jain, adding that the number can be high in certain contexts, like when you pay for expensive products. “If this is a secure transaction when you are buying a $10,000 necklace at Tiffany’s, maybe the threshold will go up to 0.9.”
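Jain’s sliding-threshold idea can be sketched in a few lines of code. The context names and threshold values below are illustrative, drawn from his examples rather than from any published Face ID specification:

```python
# Illustrative decision rule: the similarity threshold rises with the
# stakes of the action. The values are hypothetical, echoing Jain's
# examples (0.5-0.6 for unlocking, up to 0.9 for a costly purchase).
THRESHOLDS = {
    "unlock": 0.5,    # low stakes: unlocking the phone
    "payment": 0.9,   # high stakes: authorising an expensive purchase
}

def authenticate(similarity_score, context="unlock"):
    """Accept the match only if the score clears the context's bar."""
    return similarity_score >= THRESHOLDS[context]

authenticate(0.8, "unlock")    # True: 0.8 clears the 0.5 bar
authenticate(0.8, "payment")   # False: a big purchase demands >= 0.9
```

The same face, scoring the same 0.8, unlocks the phone but would be rejected for the high-value transaction.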
All the calculations happen too fast for you to notice, thanks to the power of smartphone processors. iPhone X uses a ‘Neural Engine’ that, as Apple marketing chief Phil Schiller boasted, “can perform over 600 billion operations per second, and it’s used to do real-time processing of Face ID recognition.”
Here are the steps for face recognition:
- The IR images are sent from the camera to iPhone X’s ‘Neural Engine’ computer processor to build a 3D mathematical model (map) of your face;
- The 3D model or ‘verification image’ is presented to the computer’s algorithms and compared against your stored template or ‘enrolment image’;
- The processor calculates whether the verification and enrolment images match, based on a comparison score of similarity between your images;
- The phone authenticates your identity and unlocks (or authorises a payment) if the comparison score is higher than a certain threshold value.
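Put together, the enrol-then-verify loop amounts to comparing two feature vectors and thresholding the score. The sketch below uses cosine similarity and tiny four-dimensional vectors as stand-ins; both choices are common in face recognition generally, not confirmed details of Face ID’s internals:

```python
import math

def cosine_similarity(a, b):
    """Score near 1 means the two face encodings closely match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(template, probe, threshold=0.7):
    """Return True if the probe encoding matches the enrolled template."""
    return cosine_similarity(template, probe) >= threshold

# Toy 4-D encodings standing in for real high-dimensional face features.
enrolled = [0.9, 0.1, 0.4, 0.2]            # stored at enrolment
same_person = [0.85, 0.15, 0.42, 0.18]     # slightly different capture
stranger = [0.1, 0.9, 0.1, 0.8]

verify(enrolled, same_person)   # True: score is close to 1
verify(enrolled, stranger)      # False: score falls well below 0.7
```

Note how `same_person` differs slightly from `enrolled`, as a real re-capture would, yet still scores close enough to pass; that tolerance is exactly what the threshold tunes.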