Employee Monitors and Big Brother at Work

Although we consistently discuss the importance of managing data about customers or partners, it’s crucial to pay attention to a key population in your internal datasets – your employees.  Your team generates an enormous volume of data simply by showing up to the office (HR data, payment information, personal login details and passwords, etc.), and the work that they do, in many ways, generates the information that drives your business.  Understanding how their work affects dataflows at the company is not only appropriate, it’s extremely helpful, and can unlock new ways to achieve efficiencies, develop programs, and identify future stars.

But, people, this?

[Image: the Emotiv Insight EEG headset]
“We put the ‘you’ in ‘dystopian!’  Wait, no, I meant…”

Emotiv’s headsets are wearable EEG monitors that track brain waves and other brain activity.  Emotiv is just one of a number of brands working towards integrating biometric informatics into the workplace.  The stated purpose is identifying how and when a mind is actively engaged with work and focused, rather than distracted.  This is another one of those instances where the technology itself is morally neutral: it can be used to identify or treat brain-related issues such as sleep disorders, a use that presents virtually zero ethical or compliance-based risk.  But let’s reiterate that these devices exist precisely so that employers can track employee brain functioning and determine who is being efficient and who isn’t.

Tracking or monitoring employees is always a risky endeavor, but tracking their brain functions brings things to an entirely new level.

[Image: S.H.I.E.L.D.]
Not the ideal employee-data strategy.

In order to justify the collection of any personal data, GDPR requires a controller to have a lawful basis, and the data must be processed, stored, and shared in a manner that is transparent to the data subject.  In other words, you have to explain why the data is necessary for a lawful purpose, and then use the data only for that purpose.  Employee data is a special case (for reasons we’ll see), and collecting biometric data is also a special case (because obtaining sensitive data involves lots of procedural hurdles), but when you collect them together, you’re essentially doubling down on risk.

So let’s look at two key issues regarding employee tracking to understand the risks it poses.

What is the lawful basis?

The first question about any data processing must be “what is the lawful basis” for the data, which is a very different question from “why are you collecting this data.” How did you get the authority to collect the data?  If you can’t answer this question, you can’t collect the data.  In the law, because everything has to be in Latin, we’d say “quo warranto,” meaning “by what right?”

[GIF: Romans]
Even the Romans got tired of Latin.

Some employers may say “well, that’s simple.  My employees, my data!”  That… that’s not a good approach.  Not only is it dangerously Petruchian in character, it’s also just incorrect.  The American view of business data has always been that PII is about customers, not employees, which is why internal data gets treated as usable at will.  That is definitively not the approach under GDPR: every act of processing personal data, regardless of source, requires a lawful basis.  And so if you intend to monitor your employees, what’s the basis?

“Consent!”  That’s the easiest answer, but it’s not a good one.  GDPR recognizes the obvious power disparity between employers and employees, a disparity that makes actual, meaningful consent all but impossible.  Will an employee ever truly feel like they can say “no, boss, you can’t track me” without fear of retribution?  Or that they can decide to withdraw their consent at any time (as is their right under GDPR) without fear of consequence?  Of course not.  Which is exactly why the key guidance from EU regulators warns against it:

It is unlikely that an employee would be able to respond freely to a request for consent from his/her employer to, for example, activate monitoring systems such as camera-observation in a workplace. . . . Therefore, WP29 deems it problematic for employers to process personal data of current or future employees on the basis of consent as it is unlikely to be freely given. For the majority of such data processing at work, the lawful basis cannot and should not be the consent of the employees (Article 6(1)(a)) due to the nature of the relationship between employer and employee.

This isn’t to say that you can’t find a lawful basis for monitoring employees, but it had better be a stronger basis than “this will help us increase efficiency.”  Almost any activity, no matter how invasive, can be framed as a move towards efficiency.  If you want to convince a regulator (or a tribunal) that your processing was lawful, it’ll take something more specific, more closely tailored.  Are you monitoring employee location at a nuclear power plant to ensure that they’re not in potentially risky areas for lengthy periods of time?  Sounds great!  Are you monitoring their brain waves so you can see who wasn’t paying attention during the last conference call?  Sounds less great!  Reality typically falls between those two extremes: just make sure you aren’t falling on the Black Mirror side of the line.

[GIF: Black Mirror, “Metalhead”]
Still, productivity is up this quarter…
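
To see what “specific and closely tailored” might look like on the data-handling side, here’s a minimal sketch in Python (the names are entirely hypothetical, not any real compliance tool) of a record of processing activities for the nuclear-plant example above: the purpose and lawful basis are written down before anything is collected, and any later use that falls outside the declared purpose gets refused.

```python
# A minimal sketch (hypothetical names, not a real GDPR library) of pairing a
# category of employee data with its documented purpose and lawful basis, and
# refusing any use that falls outside that purpose.
from dataclasses import dataclass, field


@dataclass
class ProcessingRecord:
    data_category: str                 # e.g. "badge access logs"
    purpose: str                       # the specific, documented purpose
    lawful_basis: str                  # e.g. "legal obligation", "legitimate interest"
    permitted_uses: set[str] = field(default_factory=set)

    def authorize(self, proposed_use: str) -> bool:
        """Purpose limitation: only uses declared up front are allowed."""
        return proposed_use in self.permitted_uses


badge_logs = ProcessingRecord(
    data_category="badge access logs",
    purpose="limit time spent in radiation-controlled areas",
    lawful_basis="legal obligation (workplace safety)",
    permitted_uses={"safety-zone dwell-time alerts"},
)

print(badge_logs.authorize("safety-zone dwell-time alerts"))  # True
print(badge_logs.authorize("performance reviews"))            # False: out of scope
```

The code isn’t the point; the point is that the purpose and the basis get documented before a single byte is collected, and every later use is checked against them.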

Data Minimization and Purpose Limitation

We talk a lot about “lean data,” the concept that the most important data is not the most voluminous data, but the data that provides the key insight.  It’s an idea that fits in neatly with the GDPR’s emphasis on data minimization – keeping only the data that you need to accomplish your goals.  This is an idea that gets a lot of pushback, particularly from employers.  The objection is largely that whatever information is generated by a business belongs solely to the business, and that there’s no reason to worry about employee-generated data if employees created it in the course of their employment.

[Image: Jeopardy! contestants]
“But you’re sitting on a gold mine, Trebek!”
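
If it helps to picture what data minimization looks like in practice, here’s a toy sketch in Python (field names invented for illustration) of stripping an employee-generated record down to only the fields the documented purpose actually needs before it gets stored or analyzed.

```python
# A toy sketch of data minimization: keep only the fields required for the
# documented purpose and drop everything else. Field names are hypothetical.
NEEDED_FOR_SAFETY_ALERTS = {"employee_id", "zone", "entry_time", "exit_time"}


def minimize(record: dict, needed_fields: set) -> dict:
    """Return a copy of the record containing only the needed fields."""
    return {k: v for k, v in record.items() if k in needed_fields}


raw = {
    "employee_id": "E-1042",
    "zone": "reactor-hall",
    "entry_time": "2021-03-04T09:12",
    "exit_time": "2021-03-04T09:41",
    "badge_photo": "<bytes>",   # not needed for dwell-time alerts
    "heart_rate": 88,           # sensitive, and unnecessary for this purpose
}

print(minimize(raw, NEEDED_FOR_SAFETY_ALERTS))
# {'employee_id': 'E-1042', 'zone': 'reactor-hall',
#  'entry_time': '2021-03-04T09:12', 'exit_time': '2021-03-04T09:41'}
```

Lean data is exactly this: the four fields that answer the question, not the forty that happen to be lying around.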

Remember that personal data isn’t “personal” in the sense that it belongs exclusively to the private domain.  It’s personal in the sense that it derives from a person – including an employee.  So imagine now that you’ve collected months’ worth of brain wave data about your employees after you’ve obtained, justified, and documented a lawful basis for its use.  What are you going to do?

You’ll analyze it, of course.  The Emotiv headset, after all, promotes itself as a means of identifying productivity and engagement in wearers.  Uses aside, we should all be able to agree that this is amazing technology – the kind of thing that might have been unthinkable even a single generation ago.  But now that it’s here and we’re deploying it, are we really making the right use of it?  That is, will it provide lots of data or will it provide lean data?

Here we run into another conundrum of technology: the discernment of meaning.  The massive amount of data you possess when you map brainwave activity of dozens of people over time will, no doubt, provide all manner of interesting metrics.  But interesting things are not necessarily meaningful or useful things.  A slinky is interesting, but it isn’t meaningful.

[GIF]
Usually.

How will you know whether the data provides value?  That takes critical analysis, and a very deep understanding of how the data relates to the actual humans who produce it.  Again, it’s what we call the “humanized approach” in Data Leverage.  A depersonalized, “it’s all just data” method might see that one worker has lower brain activity over the course of the day, and could put them in line for a negative review.  But what if that person is simply highly efficient?  Or has periods of extremely high productivity followed by hours of far less productive time?  Does that person merit a bad review, or being passed over for promotion?  Without understanding who the person is and how they work, the data can mislead, even if it doesn’t lie.
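
As a toy illustration (the numbers are invented), consider two workers whose “focus scores” average out to exactly the same value.  A mean-based report can’t tell them apart, and a naive one might even penalize the person who does their best work in bursts.

```python
# Invented numbers: same average "focus score", very different working patterns.
from statistics import mean, pstdev

steady = [60, 62, 58, 61, 59, 60, 61, 59]   # consistent, moderate engagement
bursty = [95, 92, 30, 28, 96, 27, 94, 18]   # intense bursts, long recovery

for name, scores in [("steady", steady), ("bursty", bursty)]:
    print(name, "mean:", round(mean(scores), 1), "spread:", round(pstdev(scores), 1))

# Both means are 60.0. A "rank by average engagement" review treats the two
# identically, and a "flag the low readings" review punishes the bursty worker,
# even if the bursts are when the real work gets done.
```

That gap between what the metric shows and what the person actually does is precisely where the humanized approach earns its keep.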

The point here is that gee-whiz technology and masses of data can be powerful tools, or they can mislead you.  The determinant, as always, is you, and how you choose to deploy those tools.  Slapping on a brain-monitoring headset might give you information about how your employees think, but it won’t provide value unless you know how to think – about data, about the meaning of the data, and about the ethics of what you’re doing.  You need to be able to turn data into knowledge and information into wisdom.  That is the ultimate lesson of the humanized approach: without the human element, it’s all just data.
