4. Need is All You Need
The problem with the AEPD's argumentation is that its approach clashes with reality. And the reality is that there are situations that cry out for biometric identity control.
The GDPR is built on a risk-based approach and does not prohibit processing that poses risks to individuals’ rights: it requires minimizing such risks and implementing appropriate safeguards until the risk becomes acceptable.
If this is not possible—and only then—does it close the door.
4.1 What does the CJEU say about this notion of “necessity”?
To no one's surprise: "it depends."
Let’s take a look.
For the purposes of applying the GDPR, the CJEU defines the concept of “necessity” as an autonomous concept under EU law, distinct from that of the Member States.
· In this regard, there is the old Lindqvist judgment (C-101/01) from 2003. Lindqvist explicitly establishes the autonomy of the concept of necessity in the context of data protection and its linkage to the objectives of Directive 95/46. To equalize the level of protection across all Member States, the concept of necessity cannot vary in content from one to another.
Mini-conclusion: With all due respect, excluding, in the abstract or for any case, the necessity of biometric controls in the workplace does not appear to be something the AEPD can or should do.
· In 2022, in the case with the unpronounceable name (C-184/20), the CJEU literally applies Recital 39 of the GDPR: the requirement of necessity is fulfilled when the pursued objective of general interest cannot reasonably be achieved with equal effectiveness by less intrusive means.
· In 2023, in the already mentioned “Ministerstvo na vatreshnite raboti” case (C-205/21), applying Directive 2016/680, the CJEU analyzes the processing of special categories of data (such as biometric and genetic data) in a way that permits it only "when it is strictly necessary." The “strict” nature of necessity implies that this will be assessed with particular rigor.
Mini-conclusion: No one questions the requirement for a higher standard of compliance when processing special category data. But this statement simultaneously acknowledges that such processing IS possible: only it must be minimized to what is strictly necessary.
You’re reading ZERO PARTY DATA, the newsletter that looks at the news of this crazy, crazy world from a data protection perspective, by Jorge García Herrero and Darío López Rincón.
In the spare time this newsletter leaves us, we like to solve complicated issues in personal data protection. If you’ve got one of those, give us a little wave. Or contact us by email at jgh(at)jorgegarciaherrero.com.
4.2 That’s all well and good, but paper can hold anything, you poor fool — can you give me an example of one of those “Objective Needs”?
First, a non-example: time clock attendance control. There are hundreds of non-biometric alternatives to achieve that purpose.
That’s not what I’m talking about here.
The usual cliché example would be critical infrastructure such as a nuclear power plant or a water treatment facility, or those cinematic-risk situations like a BSL lab (one of those handling pathogens like Ebola) or weapons depots.
Which leads us to an amusing problem: the obvious path to legitimize the processing there would be substantial public interest (Article 9.2(g) GDPR), but I’m not aware of a single case in which Spanish legislation meets the high standard of specificity and safeguards set by Constitutional Court Judgment 76/2019 to serve as its basis.
That’s why I prefer to point to more down-to-earth examples—those are the ones that really suffer from the issue.
There are companies embedded in supply chains with an exceptionally high level of compliance: upstream there’s a powerful company (or a consortium of companies) imposing security standards that must be met. If you don’t comply, you’re out of the chain.
In other cases, there are critical points in operations that are subject to attacks or sabotage and that could compromise economic activity (and the public interest, or public safety or health, depending on the nature of the activity).
Where I come from, that has a name: objective contractual necessity.
This is important: in my experience, when this objective need is present, the matter is uncontroversial—none of the data subjects questions the legitimacy of the control, because the context leaves no room for doubt.
Further: stepping outside the subject of biometric control for a moment, there are jobs that entail the worldwide dissemination of direct identifiers, image and voice of those who perform them: politicians, CEOs and other publicly exposed executives of multinationals, elite athletes, TV presenters, actors, musicians, models, dancers, influencers…
There’s no clear solution for the international transfer of data of such persons… other than explicit consent (erm, general or, gulp, periodic or, argh, given case by case).
Reality forces us to understand—as indeed happens—that certain jobs inherently involve waiving part of one’s privacy.
These situations must be capable of being lawfully channeled under the regulations.
And of course, they can be—as we will now see.
5 The Problem of the AEPD’s “Three-Body” Argument
To the well-known terror of using employee consent, we must add the slow death by a thousand cuts caused by the AEPD’s “three-step reasoning.”
Fortunately, the AEPD’s three-step reasoning is a circular argument.
Let’s recall the three steps:
First. When the purpose of the biometric processing you intend doesn’t fall under any of the exceptions in Article 9.2 GDPR, you have to rely on consent—there is no other path.
Why?
Because this is unequivocally stated by the EDPB in its Guidelines 5/2020 on consent (para. 99): you must go with explicit consent.
“Article 9(2) does not recognize ‘necessary for the performance of a contract’ as an exception to the general prohibition on processing special categories of data. Therefore, controllers and Member States addressing this situation must consider the specific exceptions listed in Article 9(2)(b) to (j). If none of the exceptions listed in (b) to (j) apply, the only lawful exception for processing such data is to obtain explicit consent in accordance with the conditions for valid consent under the GDPR.”
Second. Employee consent must be used only exceptionally and with the utmost caution. Why? Because their freedom is in question, on two levels:
There is a “clear imbalance” —Recital 43 GDPR— between the employer and the employee.
As the Supreme Court notes, employment is a scarce commodity, and an employee is highly likely to give the requested consent, tied as it is to their employment contract—if only to avoid anything from overt or covert retaliation to the ever-present elephant in the room: dismissal.
Moreover, the explicit consent the employer needs for biometric control becomes embedded or linked to the employment contract. Without such explicit consent, the employer would be in breach.
These are two different angles on the same issue: the freedom of consent. The GDPR requires that “utmost account” be taken of whether the consent given by the data subject, tied to the performance of a contract (or to its continuation, i.e., “if you don’t consent, you’re fired”), is truly free.
And this comes from Article 7.4 GDPR:
“When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.”
Article 7.4 is aimed at services—like Facebook, which makes you consent to unrelated data processing in order to use the service.
The AEPD applies Article 7.4 by saying: the employer is in a dominant position over the employee. And it requests consent for biometric control, conditioned on performance of the contract. Since things have always been done without biometric controls, these are not, strictly speaking, necessary. Therefore, consent will only be considered free if a functional non-biometric alternative is offered.
At this point, the die is cast.
Third. For consent to be considered free, the data subject must have a non-biometric alternative. If such an alternative exists, the AEPD argues, then necessity and proportionality are nullified. If Article 5 (necessity) is not met, then lawfulness of processing (Article 6) cannot even be considered.
Only that’s not quite true.
That reasoning only holds if the control wasn’t necessary in the first place.
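For readers who think in code: the chain of reasoning just described, and the point at which objective necessity breaks it, can be sketched as a toy decision function. This is purely illustrative—the function, its parameters, and its labels are mine, not the AEPD’s or the EDPB’s:

```python
def consent_path_outcome(objectively_necessary: bool,
                         non_biometric_alternative: bool) -> str:
    """Toy sketch of the reasoning discussed above.

    The AEPD's chain: no Art. 9.2(b)-(j) exception -> explicit consent;
    consent tied to the contract is only "free" if a non-biometric
    alternative is offered; but if an alternative exists, the control
    was not necessary, so Arts. 5 and 6 fail. The chain only closes
    when the control was not objectively necessary to begin with:
    per Guidelines 5/2020 (para. 32), Art. 7.4 does not apply to
    processing that is necessary for the performance of the contract.
    """
    if objectively_necessary:
        # Art. 7.4 GDPR does not apply (Guidelines 5/2020, para. 32):
        # the link to the contract is not weighed against the freedom
        # of consent; only general civil-law defects remain to rule out.
        return "explicit consent viable"
    if non_biometric_alternative:
        # An alternative exists -> the biometric control fails necessity
        # and proportionality (Art. 5), so Art. 6 is never even reached.
        return "processing unlawful: not necessary"
    # No alternative offered -> consent is not freely given (Art. 7.4).
    return "consent not free"
```

Note how the circularity only bites in the two lower branches: once `objectively_necessary` is true, the alternative-or-no-alternative dilemma never arises.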
5.1 Where’s the Problem?
The problems with the AEPD’s reasoning begin in the second step:
The extreme caution imposed by the GDPR on using employee consent and consent linked to contracts arises, as we’ve seen, from the imbalance of power between the parties.
But two things must be noted here:
Its use is subject to extreme caution, yes. But it’s not impossible. Otherwise, it would simply be prohibited—and I am not aware of such a prohibition. Yet the AEPD seeks to render the explicit consent path inapplicable in all cases.
Not only that, but—interpreted a sensu contrario—Article 7.4 is not applicable when the data processing in question is necessary for the performance of the contract.
Don’t make that face, dear reader—I’m not making this up: this is clearly stated in the much-revered Guidelines 5/2020 on consent (para. 32):
“Article 7(4) is only relevant where the requested data is not necessary for the performance of the contract (including the provision of a service), and the performance of that contract is conditional upon obtaining that data based on consent. Conversely, if the processing is necessary for the performance of the contract (including the provision of a service), then Article 7(4) does not apply.” (emphasis in the original from the Article 29 Working Party)
That’s why the key is that the processing (the control) is objectively necessary.
Because then, Article 7.4 GDPR does not apply to—among other things—linked consent. And in assessing the freedom of the consent, its connection to contract performance need not be considered.
Among other things.
5.2 And What About the Famous Employer/Employee Imbalance?
The imbalance between employee and employer always exists—whether biometric control is necessary or not.
What the law—not just the GDPR—forbids is the abuse of that imbalance.
And linked consent is not merely tolerable here: in this case, it must be embedded in the employment contract for the contract’s performance to be lawful.
Of course, the burden of proof falls on the employer to demonstrate that, in the specific case, the performance of the job requires biometric identity control.
But I repeat (for the last time): if the processing is objectively and effectively necessary, then Johnny the worker will have to explicitly consent to submit to biometric control to carry out his duties (whatever they may be)—even if he tells the company to take a hike and goes off to work independently, without bosses, through a cooperative or a labor-owned company.
Or as a freelancer, whatever.
If there truly is necessity, the “imbalance” factor becomes secondary in this specific context. And that is precisely why the EDPB says what it says about Article 7.4 GDPR.
Unpopular opinion from 30 years on the ground: An abusive biometric control propped up by coerced employee consent will almost certainly come with a trail of noncompliance, once you lift the rugs.
As we’ve seen in several AEPD rulings over the past year and a half involving multiple breaches.
But that’s not the case I’m arguing here.
My point is: a company that does things properly and has an objective need to impose biometric control should not face major issues in doing so.
Lawfulness will follow naturally: the real challenge will lie elsewhere, in the impact assessment.
5.3 What About the Collective Agreement?
Note that the collective agreement path is, of course, viable, but—from a strict necessity standpoint—it’s more imperfect, among other reasons because it affects all employees within its scope, and necessity demands minimization at all levels, starting with the number of data subjects affected.
Moreover, there’s this trend of treating collective agreements like some miracle that turns water into wine—or, in our case, turns processing into “necessary.” That illusion was firmly shut down by the CJEU last December.
Watch out for that.
5.4 But let’s just stop right there, you bald nazi son-of-a-b*tch: what about protection for the helpless worker if Article 7.4 doesn’t apply? How is their freedom to consent assessed when the processing is necessary?
Let’s remember: many elements of the GDPR were already present long before the advent of automated data processing.
One of them is consent.
And its defects, as regulated by civil law.
That’s why, when we remove from the equation the need to take “utmost account” of the fact that consent is tied to the performance of a contract when evaluating its freedom—because the processing is necessary—we are not leaving the employee at the mercy of a murderous beast: the general defects of consent (error as to the object, the person, or the cause) and of the will (duress, intimidation) under general civil law must still be ruled out.
This has to be done in any case.
Was the consent given under intimidation or violence?
Was it given for the correct object or something else?
Was there deceit or fraud in obtaining it?
In all these cases, the employee’s consent would be null or voidable, under the law—even if Article 7.4 GDPR does not apply.
So no, I’m not advocating for a return to Dickensian times.
6.- The End?
Not so fast, my friend… the last word is yet to be spoken…
Jorge García Herrero
Lawyer and Data Protection Officer