The Internet went into full “give me a Drudge Report siren” mode last week about Facebook’s announcement that it anticipates a fine of three to five billion dollars from the FTC in the coming months. The fine stems from Facebook’s violation of a 2011 consent decree with the Commission related to privacy practices, tracking, and usage of data. It would be the single largest fine by the FTC, the single largest penalty imposed for privacy issues in the United States, and a massive statement about priorities in a post-GDPR world.
It’ll also probably change nothing.

Look, five billion dollars is a lot of money — remember when President Trump and Congress shut down the government for a month over $5.7 billion? — but it’s only a lot of money in context. Facebook, as of March 31, 2019, has eleven billion dollars in cash on hand and six billion dollars in accounts receivable. And, of course, that doesn’t even count the $3bn in reserves that Facebook has set aside against the fine it anticipates arriving any day now. Oh, and did I mention that, after it announced its financial performance for the first quarter and disclosed its view on the massive, looming fine, its shares jumped six percent?
What gives? The FTC’s largest fine to date was the $22.5 million it levied against Google in 2012, so how can a penalty 200 times bigger not have an effect? The answer is that Facebook is so big, so profitable, and so insulated against monetary penalties that it would take a fine perhaps ten times as large to make a difference. In fact, Kara Swisher suggests just that: tacking another zero onto the fine. Hitting Facebook for a mere fraction of its yearly profits isn’t going to have an effect, especially when this fine is an enforcement of a consent order entered into eight years ago. If Facebook continues with business as usual, reaping untold profits in the meantime, and then loses only some portion of its cash on hand in 2027, why would it change its behavior?

Playing the Game
All of the focus on the fine is, in our view, beside the point and, given the size of the fine, that should tell you something about how far off the mark our attention is. Facebook’s decision to announce the reserves it set against the potential fine serves two purposes. First, it reassures investors that the company is prepared for the financial effects of FTC enforcement actions and can weather the economic consequences. As we said above, investors responded by buoying the company’s shares and giving their tacit blessing to current leadership remaining in place. And, of course, if the fine somehow comes in below $3bn, Facebook has a windfall and its stock price will surge.
This well-worn strategy has political roots, which should be unsurprising given Facebook’s recent spate of political hires, including a former Deputy Prime Minister and an author of the PATRIOT Act. And politics has become something of a second nature to Facebook, as recent scrutiny of the Cambridge Analytica and Ireland lobbying scandals has made clear. How is this strategy political? Imagine that you have bad news, perhaps stunningly bad. You get in front of the story by making it seem as though the outcome could be even worse, perhaps catastrophically career-ending. Then, when the results are bad, but only “pretty bad,” the public either shrugs its shoulders or has already accounted for the news and moved on.

Meet the New Boss
Let’s ignore all of that, for a moment, and imagine that there is some kind of shareholder revolt leading to Zuckerberg and Sandberg’s ouster from their roles at the head of the company. Imagine, too, that new, outside leadership comes in and declares that it’s a “new day for Facebook, and now we’ll make your privacy our business model!” (Or rather, imagine that someone at Facebook says that again). That would be enough to lead to meaningful change, right?
I suppose that depends on what you think meaningful change means, because your privacy is already Facebook’s business model. Or rather, the negation of your privacy.

We’re at a point when it makes no sense to pretend that Facebook is anything other than a data collection operation with a thin patina of social networking on top of it. And, in itself, data collection is not necessarily bad or immoral, as long as the collection is 1) explicit, 2) explained, 3) voluntary, 4) limited, 5) subject to oversight, and 6) kept within reasonable boundaries. You know: fair. But the very fact of the 2011 consent decree with the FTC demonstrates that Facebook’s operations, even eight years ago, failed to meet those fairness standards and, in many cases, dramatically contradicted them. Consider Facebook’s storage of hundreds of millions of user passwords in plaintext for years — given the plethora of personal data the company maintained, there was absolutely no reason for such a careless risk. And even though the passwords were, apparently, “never improperly accessed,” it’s the cavalier attitude to data protection that is the real issue.
And yet even if we ignore that in our hypothetical scenario, setting aside the shoulder-shrugging approach to protecting personal data, what would change at Facebook? Its business model would still be the massive collection of personal data for use in advertising products and ad targeting. You can’t change what the company does without changing what the company is. Businesses have adopted strategies to try to change their image, but they are still only glossing over the reality of what they do: no matter how many flatscreens it installs in stores or woke, meme-filled tweets its PR team sends, Wendy’s is still a hamburger company that wants to sell you hamburgers. Regardless of who is at the top or what its public mea culpa sounds like, Facebook will still be a data consumption and packaging company.
So, What Now?
There are all kinds of recommendations for how to actually tackle the Facebook problem. As mentioned above, Kara Swisher thinks the fine should be in the 30-50 billion dollar range, just for a start. Sen. Elizabeth Warren wants to split Facebook up into multiple companies. Others are calling for more aggressive investigations like those just announced by the New York Attorney General’s office. Won’t a combination of some or all of those work?
The frustration we feel in our inability to “solve” the problems with Facebook arises from our own cognitive dissonance about simultaneously wanting targeted content and privacy, tailored experiences and anonymity, convenience and autonomy. We’ve imbued ease of access and transactional simplicity (“One Click Buy,” “Use Facebook to Log In”) with a sense of both necessity and inevitability. There’s nothing wrong with ease or a frictionless experience, of course, but it has to be part of a transaction that, fundamentally, respects the rights of both parties. Otherwise, you get a very shiny, friendly-looking clickwrap agreement.
All of this is to say that actual change for Facebook (as opposed to the Fichtean Wrongdoing-Apology-Slightly Modified Wrongdoing cycle that Facebook has performed so far) requires us to change first. We need to decide just what kind of accountability model we want to have in place by deciding what the important values are, and whether they matter more to us than the services or products that Facebook provides. That’s true for businesses as much as individuals: relying on Facebook’s ubiquitous cookies or ad products is a tacit approval of its business model. Do you have to stop? Of course not. Should you? That’s a question for you alone to answer, but it’s probably time to start asking it.
