Biometric data is information at its most sensitive. Health and physical characteristics not only carry with them the very concept of our personhood and humanity; they are also often immutable and, therefore, permanently identify us. I can change my email address or my password, and I can even get a new legal name if I want to, but my fingerprints and my retinas will never change. Once someone gains access to that information, they not only have access to something that confirms my identity, they have access to my identity itself. It’s why, when we think about dystopian futures, biometric data almost always makes an appearance.

Laws surrounding the safe collection, use, and transfer of biometric data are not new: Illinois had one as far back as 2008, when it passed the Biometric Information Privacy Act, or BIPA. The law, like many that would follow it, aimed to give individuals the tools necessary to understand the consequences of giving away their biometric data. The premise is that, as long as they understand both that a company is collecting their biometric data and that this data is extremely sensitive, individual data subjects will be more cautious about providing their consent. The results have been mixed, though: plaintiffs have filed more than 200 BIPA-related class actions since 2008, but there is little indication that Illinoisans have meaningfully changed their approach to the collection of their biometric data.
A Move Towards Mass Collection
GDPR made some movement towards standardizing the protection of biometric data when it created heightened requirements for collecting sensitive data. Article 9 of GDPR holds that “the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person” is flatly prohibited unless one of a short list of narrow legal exceptions applies or the data subject provides “explicit consent,” which means actually signing a form, checking an additional box, or taking some other unambiguous action that denotes agreement. Relatively speaking, it’s a very high burden: the epitome of the “friction” that marketing departments are always telling us to get rid of.

There’s an important catch here (there always is). Insofar as the collection of biometric data serves a substantial public interest, it is not only permissible, it doesn’t even require consent. In other words, if EU Member State governments, or the EU and its institutions (which, although not bound by the GDPR, are bound by a very similar set of laws enacted last December), can demonstrate that they’re collecting biometric data for a public interest purpose, they can engage in wholesale biometric data collection, with only minor exceptions.
“But wait,” you might say, “won’t that just create an incentive for governments to build enormous biometric databases and simply claim they’re designed for national security or public health or something?” How cynical can you be? Of course they’d never create a massive biometric database of every individual entering or leaving the EU, regardless of country of origin, to be stored and cross-referenced by EU officials in the course of carrying out duties which, conceivably, also include tracking and surveillance.

This new repository is called the Common Identity Repository, or “CIR,” pronounced “seer,” because, honestly, who doesn’t want an all-knowing, AI-managed biometric database called “Seer”? We’ve mentioned that this has not exactly been the best month for the EU in terms of upholding its newly claimed mantle as the protector of a free, rational Internet. But it almost seems like the EU Parliament is working in the opposite direction from GDPR’s central thrust. By collecting and, more importantly, categorizing and processing billions of biometric data points, the EU is effectively saying that it wants to follow the American model: the FBI and Homeland Security already operate a similar, giant biometric database in the US. Of course, it was the rejection of the US model that prompted the creation of the GDPR, but who’s keeping score?
Too Much, Too Late
The problem with databases like CIR is that they represent a collection of personal information so vast, so valuable, and so impossible to duplicate that they become the top target for data theft and state-sponsored attacks. Biometric authentication has become standard for most high-security or high-sensitivity spaces; retina and face scans are nearly as ubiquitous in airports as metal detectors. And as biometric authentication becomes more pervasive (it does unlock your phone, after all), the attractiveness of a large set of biometric data only grows. Unfortunately, we know from experience that personal data kept by governments in this form is a ripe target for hacking. Whether it’s multiple breaches per year at the Defense Department or the single massive breach of personnel records at the Office of Personnel Management, large government stores of data reliably draw intense hacking activity.
The EU will, undoubtedly, claim that it will implement the “most robust security protocols ever” for accessing the data, and that may even be true. But when one stated purpose of the database is to facilitate faster, more efficient border checks by EU officials, you know two things: 1) the database will be accessed by hundreds, if not thousands, of lower-level staff who will actually perform the checks, and 2) the systems will need to be user-friendly enough to allow those lower-level staff to meaningfully engage with the underlying data to verify identity. In short, there will be more than ample opportunity for staff errors, data mishandling, and third-party access.

This isn’t to say that governments can’t, or shouldn’t, maintain stores of personal data. Nor is it the case that the mere collection of biometric data is an inevitable step towards a cyberpunk future where we all wear tracking devices and driving goggles, for some reason. But the initial impulses behind BIPA and GDPR were correct: biometric data presents substantial risks to individual privacy and to digital autonomy. CIR and similar caches of biometric data, whether in the hands of a government or an employer, create a disproportionate power dynamic that can, if unchecked, have seriously harmful effects. We have to create, and stick to, procedural and substantive safeguards governing how this sensitive data is used and why. If we don’t impose controls now, we may not have the option later.