Why is the iPhone’s FaceID lawful while any company gets fined a million for “the same thing”? (And how this might benefit you)
Updating (and expanding upon) the "household exemption"
For years, we've been bitterly complaining about the sudden and radical restriction imposed by the data protection authorities on biometric controls in the business environment.
We already outlined the timeline and proposed a solution to unblock the situation in previous posts.
Meanwhile, we've watched the blows land everywhere: football clubs, supermarkets, companies…
And during all this time, every now and then someone would ask me… so what about Face ID on the iPhone…? WHAT? HUH, WHAT?
What about my fingerprint on my Samsung, bitch?
So?
Well, the other day, in a document by the CNIL (the French data protection authority) that has gone largely unnoticed, I read the most convincing explanation I've seen in all this time.
Since many people will want to read just the explanation about the iPhone, I’ll get straight to the point: we’ll talk about the example and the implications.
Only afterward, we’ll dive into the technical-legal stuff. But pay close attention to that part, because it could allow us to solve the fingerprint control issue in companies (aka “The Impossible”).
Uh-huh, I see you’re smiling like the little “Face ID” face… let’s continue.
You’re reading ZERO PARTY DATA, the newsletter that covers the news of this crazy, crazy world from a data protection perspective, by Jorge García Herrero and Darío López Rincón.
In the spare time this newsletter leaves us, we like to solve complicated issues in personal data protection. If you’ve got one of those, give us a little wave. Or contact us by email at jgh(at)jorgegarciaherrero.com.
Ladies and gentlemen… the CNIL!
The CNIL says that the GDPR does not apply when these two conditions are met (both must apply):
· First – The processing is initiated at the data subject’s initiative (in this case, the phone owner), and is carried out under their control and on their behalf;
· Second – The processing takes place in a separate environment, meaning there is no possibility for the provider or any third party to access or intervene in those data: the provider has supplied the means of processing, but can neither access nor act upon the data.
Why?
Because the household exemption applies (you know, the exceptional rule by which the GDPR does not affect the average citizen doing things within the… domestic sphere, so to speak).
And because that same exemption can also benefit companies that provide the technical means to the data subject, under those two conditions.
And as an example of this, it cites precisely the case of the iPhone’s Face ID (or its Android equivalent; it makes no difference).
“Biometric authentication on multifunction mobile phones: this is the case when the processing is carried out solely at the user’s discretion, with local storage and encryption of their biometric data. In fact, the processing is carried out at the individual’s discretion, and the data remains fully under their control.”
In plain terms:
Apple can do 80 times a day via your iPhone something that neither LaLiga, nor Osasuna, nor 90% of companies can do: verify biometrically that you are you by processing your biometric data (your face or your fingerprint).
Why?
Because they do it in an enclave separate from the mobile’s operating system (Secure Enclave/Trusted Execution Environment), out of reach of third parties’ claws (app developers, companies interested in identifying you or bombarding you with ads) and Apple’s own claws, since Apple deliberately renounces access to that enclave by design.
So, mind you: does Apple process your biometric data?
Of course it does. If it didn’t, your girlfriend, Darth Vader, or your dog could unlock your phone with their face or finger. And they can’t.
–Your engineer brother-in-law will say no, that no data is processed, but he only says that because he doesn’t understand the law—.
Apple (or the respective provider) has designed the system, one of the most secure on the market since its facial recognition rarely fails, so that all Apple, and any third party relying on Face ID for its own processing, ever receives is a token carrying the affirmative or negative result of the authentication, with no access to the biometric template.
The commercial potential of this approach is obvious.
A company that has provided its employees with corporate smartphones (or is willing to do so for this purpose) and wants to implement biometric entry or attendance control (or reactivate the one it had to shut down because of that 2023 AEPD guide) would only need to deploy corporate NFC/BLE/FIDO2 sensors to receive the identity confirmation from its employees.
The employee would authenticate on their corporate phone, the biometric processing would take place locally, and the company would only receive the identity confirmation/denial token.
Simple.
Now we have a document from an authority (still in the public consultation phase, mind you) that, in my view, provides solid reasoning.
The whole update about the household exception, here.
Jorge García Herrero
External Data Protection Officer for Freepik
One very important part of how Apple does the processing is that the Secure Enclave sits on the CPU itself and has no exposed wires to connect to: all of its connections to the CPU are made at non-surface levels, while the other connections the CPU needs to function run above and below.
Also, Apple's Face ID tracks infrared reflections off your eyeballs to determine when you are looking at the device.
Early Samsung face unlock could be fooled with a Facebook profile picture taped to a 2-liter bottle of soda.