In a move likely driven by federal pressure and calls for an investigation, AT&T yesterday announced that it will no longer sell its customers’ location data to third parties, including data aggregators who, in turn, sell the information to others. The public response has been to say “It’s about time” immediately after saying “Hold on, AT&T is selling my location data?!”

The practice, which has gone on for years, can have some legitimate uses: for instance, providing location data to roadside assistance companies or financial institutions makes it possible to send a tow truck when you need one or to block a fraudulent transaction before it goes through. It's part of the compromise consumers make when they use technology: you give up some privacy (your location might be shared), and in return you get benefits that let you make meaningful use of your devices and your data.
Hmm? What? The data was sold wholesale with seemingly no restrictions? And much of it ended up on the Dark Web? And it was a favorite resource for bounty hunters? Oh.

Didn’t know that AT&T (and Sprint and T-Mobile) were selling your location data to third parties? You aren’t alone. As the New York Times reported last year, what was putatively a tool to monitor parolees could translate easily to more comprehensive monitoring. The revelation follows a well-trod path demonstrated by data-sharing controversies in recent years:
Stage 1: Consumers understand that a company holds their data, and expect it to be kept confidential unless the government serves a subpoena or secures a warrant.
Stage 2: Consumers understand that companies may just hand data over to the government in national security cases.
Stage 3: Consumers understand that companies may just hand data over to the government.
Stage 4: Consumers understand that companies may just hand data over to the government, and sell data to third parties as necessary to carry out services.
Stage 5: Consumers understand that companies may just hand data over to everyone.
Stages 1-3 of this trajectory are what led the Court of Justice of the EU, in the Schrems case brought over Facebook's data transfers, to conclude that the US-EU Safe Harbor framework did not adequately protect international data transfers. The pattern of American businesses handing data over to the government, particularly the intelligence agencies, convinced the CJEU that the US lacked sufficient guarantees for the security of personal data.
Stages 4-5 are something different altogether. While both the GDPR and the CCPA address the means by which data can be transferred, they do not prohibit the transfer itself. At most, they create boundaries for appropriate conduct: either get the customer's consent to the transfer, demonstrate that the transfer serves a legitimate purpose, or respect the customer's demand to stop transferring data to third parties.

To be fair, detailing all of the causes, occasions, and methods of transfers to third parties is rather difficult, particularly because the complexity and scope of the services telecoms now provide are astonishing when you place them in historical context. In 1989, you could use your telecom carrier's services to call to order a pizza; today, you use them to watch a video linked to a coupon redeemable on an app that tracks your location in real time to bring you a pizza that you'll Instagram and which perfect strangers will downvote because they're #pineapplepizza haters. This requires working with third parties, and, as we may have mentioned, we're believers in the power of data partnerships.
But a customer can’t consent or object to practices they don’t know about. The entire premise of empowering data subjects is giving them control over their digital personhood so that they can’t be exploited or deprived of their right to agency. That’s nice in theory, but for it to work, the law effectively requires a perfect overlap between what a customer knows their data is used for and the scope of conduct to which they’ve consented, something that, as a practical matter, is not really possible.

How do we resolve the crisis? We’ll talk about options in our next post, but whatever the solution, reasonableness has to be a component. If a reasonable person would expect either to be asked for consent before a data transfer or that the transfer would not happen at all, that expectation is the starting point for the inquiry. Not only does this allow real-world expectations to enter into the solution, but it’s also a standard that courts understand how to apply, and frequently do, particularly in Fourth Amendment privacy cases. If we want to empower people, then we have to begin by thinking about what they care about.
soooo…
IF CCPA defines Personal Information as “… unique personal identifier…” AND CCPA defines unique personal identifier as meaning “… a persistent identifier that can be used to recognize a consumer, family, or a device identifier… cookies, beacons, pixel tags, mobile ad identifiers, or similar…”
THEN must the business obtain the consumer’s consent prior to collecting & sharing unique personal identifiers such as third-party cookies?
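Whatever the legal answer turns out to be, the engineering version of the question is a gate: don’t set or share a persistent identifier with a third party until the consumer’s recorded preference allows it. Here is a minimal sketch of that gate in TypeScript; the type names, fields, and the conservative “require prior consent” branch are our own assumptions for illustration, not anything the CCPA’s text dictates.

```typescript
// Hypothetical sketch: gate the collection/sharing of "unique personal
// identifiers" (cookies, beacons, pixel tags, mobile ad IDs) behind a
// recorded consumer preference. Names and the consent model are
// illustrative assumptions, not drawn from the statute or any real
// consent-management platform.

type IdentifierKind = "cookie" | "beacon" | "pixelTag" | "mobileAdId";

interface ConsumerPreference {
  consumerId: string;
  // Has the consumer affirmatively agreed to third-party sharing?
  consentedToSharing: boolean;
  // Has the consumer exercised an opt-out (e.g., "Do Not Sell")?
  optedOutOfSale: boolean;
}

// Decide whether a given identifier may be shared with a recipient.
function mayShareIdentifier(
  pref: ConsumerPreference,
  kind: IdentifierKind,
  recipientIsThirdParty: boolean
): boolean {
  if (!recipientIsThirdParty) {
    // First-party use is a separate analysis; allow it here for simplicity.
    return true;
  }
  // An explicit opt-out always wins over any earlier consent.
  if (pref.optedOutOfSale) {
    return false;
  }
  // The open question from the post: is prior consent required at all?
  // This sketch takes the conservative branch and requires it.
  return pref.consentedToSharing;
}

// Example: a consumer who has neither consented nor opted out.
const pref: ConsumerPreference = {
  consumerId: "abc-123",
  consentedToSharing: false,
  optedOutOfSale: false,
};

console.log(mayShareIdentifier(pref, "cookie", true)); // false under the conservative reading
```

The one design choice worth noting: the opt-out always overrides any earlier consent, so the default for a silent consumer depends entirely on which reading of the statute you adopt.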