One of the questions we hear most frequently is “what are we doing wrong?” We almost always try to flip that question around into “what can we do better,” because we’re big believers in the notion that providing goals, rather than chastising, creates the right kind of mindset about data privacy and managing a data partnership. But, to be fair, it’s also very important to identify when something is actually going wrong, and come up with ways to course-correct. Simply assuming that you’ve got it all figured out is a great way to end up in a lawsuit.
To that end, we’ve compiled seven of the biggest mistakes we see companies making with their efforts at data privacy, and bundled them here for you to read, immediately say “we don’t do that,” and then six hours later think to yourself “Wait, no, we…yep. We definitely do that.”
1. Procrastination, or “We’re fine for right now”
This is the one that we see most commonly, and it’s the one that is hardest to overcome: simply doing nothing. Many times it’s accompanied by a statement that “we’re going to get there eventually” or “we just don’t have the time for this right now” or “we’re directionally good, we just need to operationalize at a later date.” (That last one is a personal favorite because it combines doing nothing with jargon.)

The biggest problem with this mindset is that it has a subtle calming effect that creates far larger problems later. If you’ve consistently stated that you will deal with privacy sometime in the future, you’re ignoring the fact that it frequently takes months to get a privacy program in place, and the longer your company goes without one, the more complicated the process will be. You’re also running the risk that, in the period before you take any steps on privacy, something (anything) goes wrong. That’s the worst-case scenario, and you don’t want to find yourself telling infuriated shareholders “at least we were directionally good.”
2. Mailing it in
This is where you copy and paste a privacy policy from AWS or Macy’s or some other business totally unrelated to yours, slap together an internal security policy or a WISP (a written information security program), and then declare victory. Privacy: solved! To put this in context, imagine if HR had a two-paragraph policy on employee conduct that it had copied from Wikipedia and that was the extent of its efforts. You would be (rightly) concerned about employee wellbeing, recordkeeping, and compliance. There’s no reason to treat data privacy as though it is simpler or less deserving of time or effort. In fact, given that data touches on every aspect of your business, from employee personal information to valuable derivative data assets, paying lip service to data privacy makes no sense. Worse, stating that you’re taking steps to preserve data privacy but not actually doing it is the classic example of what the FTC considers an actionable unfair or deceptive trade practice. In other words, they’ll sue you.
3. Flat-out lying
What can I say? Some businesses really don’t tell the truth about what they’re doing, or why. But it isn’t just the Cambridge Analyticas of the world, actively covering up their activities and engaging in shadowy, politically motivated shenanigans. In fact, I would argue that many companies are not being truthful about what they’re doing out of ignorance more than malice. Their internal communications about what is being done are in short supply, and when people do talk about privacy, they’re essentially just talking about cybersecurity, which is decidedly not the same thing. This creates a crisis where no one actually knows what they’re talking about but they keep on talking anyway, deepening the problem. The solution is an easy one: don’t lie. The method for reaching that solution, though, is an involved process of education, training, and ongoing review of operations. But the effort is well spent if it prevents your company from inadvertently being untruthful.

4. Data Hoarding
We’ve talked about the mass consumption of data before, and why it presents serious obstacles to a good data privacy program. The GDPR makes clear that minimization is a central principle of data privacy, an essential step in safeguarding the privacy rights of natural persons. Companies routinely find themselves in trouble with regulators for taking in more data than they should. And yet, without fail, we talk with businesses that boast about having three hundred data fields on each customer or real-time tracking of client location. Why? Why, when the risks of hoarding data are so clear, is this mass-collection, mass-storage mindset so prevalent?
Perhaps because imitation is the sincerest form of flattery, and companies have seen Google, Facebook, and Amazon voraciously consuming data on a global scale for two decades. The thought goes that, if it’s good enough for Google, it’s good enough for us. Except that makes no sense. Why would the business model of a near-trillion-dollar company that makes its income largely through selling predictive services and ads (a process that both demands and creates data) work for your business, which almost certainly sells a good or service? It can’t be that emulating Google works for your business simply because you want it to.
It’s not only a bad idea from a regulatory perspective to accumulate data this way (and, let’s be clear, it’s a very bad idea from a regulatory perspective). It’s bad business in most cases as well. Time and again, when we talk with clients or other businesses about their data sets, we hear boasts about the massive cache of data the company has and how it’s more comprehensive than any competitor’s. But when we analyze the value proposition for each component of the data sets, it’s almost always the same outcome: of the many, many fields of data, only about 10-20 are valuable to the company. The rest are merely interesting or, worse, useless. It makes no sense to incur the costs of buying useless data and running the risk of a regulatory action for doing so. Be mindful about your data: more is not always better.
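If it helps to make that audit concrete, here is a minimal sketch in Python of the exercise we’re describing, with entirely hypothetical field names and process mappings: list what you collect, list what your business processes actually use, and flag everything else as a candidate for deletion (or for never collecting in the first place).

```python
# Minimal data-minimization audit sketch. All field names and process
# mappings below are hypothetical; substitute your own data inventory.

# Every field you currently collect on a customer.
collected_fields = {
    "name", "email", "billing_address", "purchase_history",
    "realtime_location", "browser_fingerprint", "device_contacts",
    "social_media_handles", "income_estimate",
}

# The fields each business process actually needs in order to function.
fields_used_by_process = {
    "order_fulfillment": {"name", "email", "billing_address"},
    "recommendations": {"purchase_history"},
    "fraud_review": {"email", "billing_address"},
}

# Anything collected but never used is cost and risk, not an asset.
used = set().union(*fields_used_by_process.values())
unused = collected_fields - used

print(f"Collected: {len(collected_fields)} fields")
print(f"Actually used: {len(used)} fields -> {sorted(used)}")
print(f"Candidates for deletion: {sorted(unused)}")
```

Even a toy version of this exercise tends to be sobering: the genuinely valuable slice of the dataset is small, and everything else is cost and regulatory exposure.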

5. Idle Data
I grew up frequently hearing the phrase “idle hands are the devil’s workshop,” because I’m one of five children and, let’s be honest folks, you need to keep that many kids busy if you want your house to remain standing.
It’s a meaningful saying, for a lot of reasons. But the thrust of the maxim is that idleness is a short route to trouble. That concept applies with equal force to data: if it’s sitting idle, it presents a problem. As a corollary to the last point, the existence of idle data raises the question “why do we have this in the first place?” And, in fact, untouched data is often the kind of surplusage that doesn’t serve a meaningful business purpose, and need not be kept. “We paid $10K to have real-time satellite images of every private airport in the state! I forget why, though…”
But the other side to the sin of idleness is that unused data is unleveraged data. So before you put datasets on the chopping block, try to determine whether they represent an opportunity for a data partnership or an unused value proposition. For instance, that airport imagery: on its own it may be relatively worthless to you, but when you look through your data inventory (from when you identified all your internal data assets), do you have another dataset to pair with it to create a novel use case? Say, your privately owned data set about airport usage paired with publicly available oil prices and locality-based real property prices? Is there a relationship between increased private aircraft flights in an area and property values that you can identify across states to set out local and macro-level trends? Do you think, perhaps, that hedge funds or investors might want to, maybe, pay for that kind of data?
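To show what that pairing exercise might look like in practice, here is a small, illustrative sketch in Python using pandas. Every locality, flight count, and price below is invented; the point is only the pattern of joining an otherwise idle dataset to a public one and asking whether the combination tells you anything useful.

```python
# Hypothetical illustration of pairing an "idle" dataset with another one.
# All figures are invented; the point is the join-then-correlate pattern.
import pandas as pd

# Privately held dataset: private-aircraft flights per month, by locality.
flights = pd.DataFrame({
    "locality": ["Ashford", "Brandt", "Corliss", "Dunmore"],
    "monthly_private_flights": [42, 118, 15, 77],
})

# Publicly available dataset: median residential property price, by locality.
property_prices = pd.DataFrame({
    "locality": ["Ashford", "Brandt", "Corliss", "Dunmore"],
    "median_home_price": [310_000, 690_000, 245_000, 455_000],
})

# Join the two on locality and check whether the series move together.
combined = flights.merge(property_prices, on="locality")
correlation = combined["monthly_private_flights"].corr(combined["median_home_price"])

print(combined)
print(f"Flights vs. home prices correlation: {correlation:.2f}")
```

If a quick, throwaway analysis like this surfaces a real relationship, the dataset may be worth packaging for a partner; if it doesn’t, that’s one more argument for letting the data go.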
The point is that allowing data to sit idly is of no use to anyone, especially you. If your data isn’t doing work for your business, and you aren’t doing work to make it useful, you’re simply creating a cycle of purposeless acquisition. Get the data to work, or don’t get the data at all.
6. “I’m not a lawyer, but…”
I love it when people say this because, like “Bless your heart” or “No offense,” you know it means that whatever comes next is going to be utterly, completely awful.

I am in no way suggesting that every issue needs a lawyer as the final decisionmaker. I’m not even suggesting that a lawyer needs to be in the room for every decision. I’m suggesting that, in those instances when a lawyer should be in the room, a lawyer should, you know, actually be in the room. We can go on at length (and have) about how complicated the GDPR is, or why CCPA compliance will require ongoing oversight of data partnerships, or whether blockchain is the answer to all your legal problems (spoiler: it isn’t).
The point, rather, is that legal guidance about data privacy has to fit seamlessly into the entire operational plan. You can’t request a memorandum blessing a proposed processing activity or data partnership as the cherry on top of the sundae and expect the entire operation to magically comply with the law. Instead, you need lawyers who will engage with you from the very beginning, creating an approach to data privacy that permeates your everyday activities and acts as a safeguard against misuse or mistakes. When you treat legal advice about data privacy as an afterthought, you wind up having to retrofit costly, time-consuming privacy structures into your business plan that could, and should, have been integrated more easily. Don’t make that mistake.
7. God Mode
This is the most serious of all the major sins, and rolls many of them up into one.
The biggest problem with this approach is that it exists because of a pernicious view that privacy, responsibility, and oversight are all someone else’s problem. “Yes, objectively I realize that this is a problem, but it isn’t affecting me, personally, right now, and so I’m going to continue moving ahead.” It’s very Francis Urquhart-y.

“God Mode” as a specific instance refers to Uber’s ability and willingness to track its users’ activity and location for up to five minutes after their ride finished. It was a breathtaking invasion of privacy, and it allowed for monitoring that was not necessary to Uber’s business purposes, not authorized by the terms of service, and, in the wake of a settlement with the FTC, not even legally permissible. Not good.
More broadly, we consider “God Mode” to be an approach to data and privacy that subordinates personal autonomy, privacy, and human dignity to a dubious belief that any activity is permissible if it has some commercial utility. The problem with this theory is that it is self-defeating for virtually everyone who employs it. Unrestrained usage of personal data, if uncovered, destroys trust with customers and invites wrath from regulators. If undiscovered, it gradually erodes internal boundaries around data usage, ultimately producing the costly, unnecessary data hoarding we discussed above and the problematic “creepy” behavior we’ve discussed before. More than this, God Mode inculcates a sense of separateness that undermines the relationships necessary to build lasting partnerships, both with consumers and other businesses. We talk about this at length in Data Leverage when we discuss the “dehumanized” approach to using data and its risks.
Ultimately, you have to choose how to use data, and God Mode is one option — sometimes even a legal one — among many. It’s simply up to you to find the right option.