Fads are a big part of January. Everyone wants to hashtag their efforts at keeping up their New Year’s Resolution (#NoCheeseMonth, #NoCarbMonth, #NoFunMonth), and our collective refusal to acknowledge that the holidays are over means we’re all still desperate for distractions. But nobody minds, because we all love fads — it’s why we get obsessed with one-hit wonders and dance routines (and, occasionally, both at the same time).

Yesterday, we got to see the confluence of a few of our favorite fads: megaconferences, big privacy promises, and clashes between tech giants. At CES, the Consumer Technology Association’s huge annual conference, representatives from Facebook, Apple, and others sat on a panel together. The topic, unsurprisingly, was privacy, and how these companies are adjusting to the changing political and regulatory landscape. There were the usual promises about “taking privacy seriously,” and recognition that governments are becoming increasingly concerned about the disproportionate power that businesses have to collect, and monetize, personal data.
And then, more than once, Facebook’s privacy czar Erin Egan made the claim that Facebook is just as protective of user privacy as Apple is.

Uh…..What?
Who knows why Facebook opted for that as a talking point, because even if it were true (which, you know, it isn’t), absolutely no one believes it. It’s a scenario where saying something repeatedly just makes listeners angry rather than convincing them you’re right, like your uncle who keeps insisting that Starland Vocal Band was the greatest group of the ’70s. (It was obviously the Bay City Rollers.) But given the last few years, it’s hard to imagine why Facebook would make a claim so bold, and so brazen, in a public setting.
Part of it is that Facebook and Apple are in entirely different businesses. Apple is a merchandise and services company that happens to have an extremely potent data collection practice, while Facebook is essentially an advertising sales company driven by a massive personal data collection effort. It’s natural, if not desirable, for Facebook to have practices designed to pull in as much information about its users as possible, if only to be able to better market itself to companies that want to buy advertising. But there is simply no question that the wholesale data consumption at Facebook is on a scale comparable really only with Google, and that allegations (and, really, proof) of malfeasance have dogged Facebook for at least a decade. Apple? Not so much.
The reality is that Facebook can’t make much of an argument about privacy, no matter how many times they repeat a claim. Consider their new “Privacy Checkup,” which claims to provide users with the tools and information they need to create the kind of privacy controls best suited to them. It’s interesting, certainly, and the UI has been made easier and friendlier, but there’s a catch: it changes the privacy settings only as they relate to other people on Facebook, not to Facebook itself. In other words, you can make sure that facial recognition is turned off, or that your creepy next-door neighbor doesn’t get to see your posts, but you still have just about the same level of control over what Facebook does with your data as you ever did: effectively bubkes.

So why does Facebook say things like this? Why make a claim that’s so close to being just flatly untrue that it risks being called out? It’s a game, really. A language game, and it’s one, in the privacy sphere, that’s been underway for a very long time.
Quit Playing Games
The concept of language as a game traces back to famously tousle-haired and famously ornery Austrian philosopher Ludwig Wittgenstein, who always looks to me like a cross between a (somehow) crankier Peter Capaldi and a (somehow) moodier Samuel Beckett. Moods aside, Wittgenstein was a brilliant, epoch-defining thinker who changed how we conceive of our use of language. According to his theories, all communication relates to its context, and so a word or phrase has no independent meaning: language is not a standalone tool that reflects reality or truth. Instead, language is a tool that we use, and it only becomes meaningful through the way we use it in a particular circumstance. For instance, if I shout “Traitor!” at you, I might be challenging your loyalty to your country, accusing you of switching your allegiance to a football team once they reach the playoffs, or making a Star Wars reference. You only know based on where we are, what we’re doing, the nature of our relationship, and so on. For Wittgenstein, when we communicate with one another, we’re playing the communication game, constructing its rules, and conveying ideas all at the same time.
Super! Except what happens when we’re playing different games? What if I use language in an attempt to induce the listener to believe I will do one thing, when in reality I intend the opposite, or something very different? Obviously, our communication is flawed, and wherever we take our interaction, it will carry the taint of that initial lie. More charitably, what if we’re merely talking past one another because our words carry different meanings based on the contexts from which we came to our meeting with one another? How could we communicate with one another in that situation and expect to reach the desired outcome? At least one of us is going to be disappointed, and maybe both.
This is where we find ourselves when it comes to privacy: Facebook isn’t playing the same game that we are, at least when it comes to privacy. When they say “privacy checkup,” we hear “control over my privacy and what is shared,” but Facebook means “control over other users’ activity.” Facebook’s meaning is unclear because they never come right out and explain what they mean, and because privacy is a very complicated subject that makes contextualization difficult.
Think about it this way: when we normally deal with a company as individual people, it’s in the context of a purchase and sale. There’s little room for confusion because our context is clear: buyer and seller. Starbucks says “$4.00 for a latte,” I say “$*@$# fine, take my money,” and everyone walks away clear about what happened. But Facebook (like other social media or tech companies) operates at the very center of our personhood, our identity. The difference is that Facebook exploits our misunderstanding, monetizing personal data at the expense of the very privacy it claims to promote.
“Privacy,” of course, is an extremely loaded term. When Facebook says it, they use it in the context of a commercial enterprise embedded in a complex regulatory scheme; when we say it, we’re talking about who gets the right to peer into our life. Two games, two sets of rules, two sets of meanings. That’s why we’re frustrated with Facebook, but it’s also why Facebook says that it’s just as good as Apple: we’re all talking about different things, in different contexts, for different reasons.

Clearing up the Mess
GDPR didn’t clear up the confusion, and the linguistic hot mess that is the CCPA certainly won’t help either. What we need is a new taxonomy of privacy, a shared approach to talking about privacy that gives everyone a common context with mutually intelligible rules and parameters, if not outcomes. It’s a process that begins with changing the way we expect, and require, businesses to communicate about what they’re doing. The incessant legalese, the convoluted terms, and the byzantine clickthrough structures all have to go, for a start. From a consumer-facing perspective, that’s simply a prerequisite.
The only way that happens is if we change the way businesses think about privacy more generally. The conversations about privacy have to shift away from “what do we have to do about this privacy business” to “what do we have to do to make privacy our business.” That change is what GDPR was meant to inspire, but it has yet to materialize, even a little. It’s more than just privacy by design, although PbD is absolutely essential. The real change comes when businesses and individuals alike recognize that privacy doesn’t destroy the ability to deliver goods and services to individuals: we all bought things before our IoT toaster spied on us, and we’ll continue to do so if the surveillance stops.
This year, we’ll spend a fair amount of time talking about strategies for doing that, including promotion of verified answers, identifying sources of truth and trust, and delivering on easily-made privacy promises. For now, though, it starts with speaking the same language when it comes to privacy. When that happens, I’d be very interested in hearing what Facebook has to say.
