In Philip Pullman's fantasy trilogy 'His Dark Materials', every character has a daemon, an animal that manifests a person's inner self. For us, the daemon would be our smartphones, carrying our souls in the palm of our hands. But how do we protect something so precious from prying eyes? With these small devices holding everything from access to our corporate identities to our secret lives, smartphone companies have been stressing the need for screen locks to keep others out of our phones. Over the years, they have offered everything from number locks and patterns to fingerprint scanners, both to protect phones and to authenticate users in place of passwords or to initiate payments.
Of late, there has been a move towards phones recognising the user's face and then unlocking the screen. Samsung, for instance, has experimented with iris scanners and settled on face unlock, where the phone matches the face it sees against what it has on record. Many other Android phones, too, use face unlock in one form or another. Apple, meanwhile, uses Face ID, which is more complex than face unlock on Android phones, as it relies on more layers of data.
Kaiann Drance, V-P, product marketing, Apple, recently told me how Apple's technology is different. Face ID, Drance explained, uses some of Apple's most sophisticated technologies: the TrueDepth camera, the secure enclave on the chip, and the neural engine. In simple terms, TrueDepth looks for a face when you wake the iPhone by raising it or simply tapping the screen; once a face is detected, Face ID confirms the user's attention and intent to unlock by checking that the eyes are open and directed towards the camera. This makes spoofing nearly impossible, reducing the chance of a false match to about one in a million. On top of this, the phone locks itself after five failed attempts to unlock using Face ID.
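The flow Drance describes, detect a face, confirm attention, then allow a limited number of match attempts before falling back to the passcode, can be sketched roughly as follows. This is a hypothetical illustration only; the class, function names, and the exact-match comparison are all stand-ins, as Apple's real implementation runs inside the Secure Enclave and is not public:

```python
# Hypothetical sketch of a Face ID-style unlock flow.
# All names are invented for illustration; the real system is not public.

MAX_ATTEMPTS = 5  # after five failed matches, require the passcode

class FaceUnlock:
    def __init__(self, enrolled_template):
        self.enrolled_template = enrolled_template  # stored facial map
        self.failed_attempts = 0

    def try_unlock(self, capture):
        """`capture` is a dict with hypothetical keys:
        'face_detected', 'eyes_open', 'gaze_on_screen', 'template'."""
        if self.failed_attempts >= MAX_ATTEMPTS:
            return "passcode_required"
        if not capture["face_detected"]:
            return "no_face"
        # Attention check: eyes open and directed towards the camera
        if not (capture["eyes_open"] and capture["gaze_on_screen"]):
            return "no_attention"
        # Stand-in for the real fuzzy biometric match
        if capture["template"] == self.enrolled_template:
            self.failed_attempts = 0
            return "unlocked"
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            return "passcode_required"
        return "no_match"
```

Note how the attention check runs before any matching at all: a face that is detected but not looking at the camera never even reaches the biometric comparison.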
But those who have used Face ID on an Apple phone, first introduced with the iPhone X, will have noticed that it works even in a dark room. This is because the TrueDepth camera system has an inbuilt dot projector that puts 30,000 invisible dots on the user's face to build a unique facial map. This works along with the flood illuminator to identify a face even in the dark. According to Drance, facial authentication technologies that don't have this layer don't work well in the dark, and can potentially be spoofed by a photograph, as they have no depth information or facial map to match against.
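Why does depth data defeat a photograph? A toy illustration: if you reduce a facial map to a set of per-point distances from the camera, a real face shows real variation (the nose sits closer than the ears), while a printed photo is essentially a flat sheet. The function and threshold below are invented for illustration, not Apple's method:

```python
# Hypothetical illustration of why depth data defeats a flat photograph.
# Face ID builds a detailed map from ~30,000 projected infrared dots;
# here a "depth map" is just a list of per-point distances in centimetres.

def has_depth(depth_map, tolerance=0.5):
    """A real face has varying depth; a printed photo is essentially flat."""
    return max(depth_map) - min(depth_map) > tolerance

real_face_cm = [48.0, 50.5, 52.0, 49.2, 51.8]  # distances vary across the face
photo_cm     = [50.0, 50.1, 50.0, 49.9, 50.0]  # flat sheet of paper

print(has_depth(real_face_cm))  # True: pass on to actual face matching
print(has_depth(photo_cm))      # False: reject as a likely spoof
```

A 2D camera, by contrast, sees only the pattern of light and colour, which a good photograph reproduces faithfully; it has no equivalent of this check.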
Face ID and other such technologies are now gaining widespread acceptance as a secure authentication method. In India alone, a clutch of financial services apps, from ICICI to Paytm Money and HDFC Bank, have started using it, removing the need to enter a PIN for transactions. I personally use it to secure my blood sugar data in the OneTouch app. On Apple devices, many passwords will also autofill once the phone detects the user's face. Interestingly, Apple does all of this on the device; no data needs to be backed up to the cloud. Even natural changes to the face are factored in: the system has trained multiple neural networks to check for changes every time it looks at the user and adapt accordingly. Drance said this meant training the network with over 2 billion images, including infrared and depth images collected in extensive studies.
In the coming months, you can expect more devices to unlock when they see the user's face, even as this becomes a standard way across devices to verify that a user is genuine. This will also open up new use cases. If a computer can verify a user this accurately, even job and visa interviews could be conducted remotely, and many situations that now require personal presence could be handled over videoconferencing. The possibilities are endless.