The news just has not been kind to Facebook over the last three months. First the Cambridge Analytica scandal breaks, then the company runs those wrong-footed commercials (“Sorry for selling access to your data and, you know, for stuff like Brexit”), and now comes the revelation that the apologies needed to go much deeper. Despite an FTC consent order forbidding it, it seems that Facebook really was giving third parties access to personal data for many users, a fact omitted from Mark Zuckerberg’s testimony. The explanation Facebook gives is that users tacitly agreed to that sharing by not changing their privacy settings. As you can imagine, that answer has not been particularly well received.
We’ve made it clear before that this isn’t really about Facebook per se. This kind of controversy could affect any company, because no one has perfect data policies. It’s just that Facebook has been in the news, a lot. In one sense, this is a game of catch-up. It took only a few short years for Facebook to go from clunky, internet-based freshman catalog to ubiquitous, Orwellian platform for sharing Farmville updates (“No one cares about your blueberry harvest, Kyle.”).
American attention to data security and privacy has been sporadic, at best, and it is possible that this recent Facebook drama is simply the news cycle catching up with how data science and data sharing actually work. It may well be that, as the public develops a greater awareness of what their data is and how it is used, there will be a push to get Congress to require greater transparency and greater accountability.
But probably not.
The reality is that there is little political wherewithal to undertake a massive overhaul of the American privacy/data security framework. The FTC will, of course, continue to patrol the outer limits of data-driven behavior. That oversight will necessarily be limited by the Commission’s resources, though, and instead of operating as a general “best practices” enforcement regime, it is likely to continue to focus on companies that lie to users (say, for instance, when you promise users that their data is private and then sell their real-time location and text messages to a Chinese firm. Whoops!).
Assuming that you are not going to do that (and can I emphasize: please don’t do that), accountability and transparency will have to come at the company level. You have to decide to give users the kind of security and openness that they are coming to expect, even in the absence of a statutory command to do so. It is far easier to tell consumers on the front end how you plan to use their data than to explain, after a breach or an unflattering news report, why you did so without their knowledge. Don’t believe me? Then why is Google, which collects far more data than Facebook, not getting pilloried? Because Google is fairly open about the fact that it knows everything about you, and we’ve all just accepted it, even when auto-complete reveals more about us than we’d like.
Openness is important, then, but so is awareness of risk. If you aren’t clear on the types of security and privacy dangers latent in your platforms, you’re unlikely to take meaningful steps to avoid them, and that is the lesson to be learned from Fortnite. If you have children over the age of six, you’ve probably heard of Fortnite, a free-to-play but costly-to-upgrade video game exceedingly popular with kids and millennials who refuse to acknowledge that games never improved after the original Zelda. By using ads promising free in-game credits, fraudsters have been tricking younger players into revealing personal data and payment information.
You may say that doesn’t sound like Fortnite developer Epic Games did anything wrong, and, at present, it doesn’t seem that they did. But the headlines don’t say “Innocent Game Developer Abused by Fraud”; they say “Your Kids Are Getting Scammed on Fortnite.” If you aren’t aware of the risks (for instance, of a phishing scam related to your service), you can’t take steps to avoid them, and then you can’t control the story after it breaks. From what I can see, Epic Games has done a good job of tamping down on this issue and letting users know of the risks, but preliminary groundwork is almost always better than post hoc cleanup.