The race is on! California jumped to an early lead to get the California Consumer Privacy Act (CCPA 2018) onto their November ballot, but look out! Here comes Vermont, from out of nowhere, to try to be the first State with a GDPR-like law on the books.
In this episode of the Are You DataSmart? podcast, the Ward brothers dive into the differences between each of these proposed privacy regulations, how they differ from the GDPR, and most importantly, weigh in on the likelihood of further States jumping into the race!
Jay: “Are You DataSmart?” the weekly podcast on data security, information management, and all things related to the data you have, how to protect it, and maximize its value. I’m Jay Ward.
Christian: And I’m Christian Ward. Today we’re going to talk about an Americanized, American version, or an American attempt at having GDPR. Jay, what we’re seeing in the news, people are starting to talk about this already. You wrote some excellent blog posts around California that is starting this process. It says that it’s sort of listed as CCPA on the website, but that sounds a lot like the Consumer Credit Protection Act. So explain to me why their naming convention is so terrible, and what is California doing?
Jay: I have to imagine that there are Irish people involved, and that the Irish-Americans are involved in the naming because, you know, we’re just not very creative which is why my first name is James and your middle name is James.
Christian: That’s possible.
Jay: But it’s just there’s not a lot to do. You know, you’ve got consumer and privacy, so you need Cs and Ps, and that’s basically all you can do. You know, the federal government has COPPA, the Children’s Online Privacy Protection Act, and California has CalOPPA, the California Online Privacy Protection Act, it’s a big mess. But the primary point that’s important here is that California is wading into the regulation of data in a way that’s a lot more robust than it has in the past. And, you know, we’ve talked about this before. We’ve said it’s unlikely that there’s gonna be a federal data security or data protection law other than the CLOUD Act, which, dear Lord, there’s a lot to talk about there, but it’s not exactly a data protection act. But California has waded in, as we predicted. You know, the states are gonna get involved, and that’s what we’re looking at. Not just California, as we’ll talk about later.
Christian: And isn’t it true also that, I mean, the whole reason we even have online privacy notices is because of the California law? So didn’t they start this whole thing?
Jay: They did. That came out of the California Online Privacy Protection Act, CalOPPA, which essentially said, “Look, if you’re gonna be doing business in California, you need to have these online privacy notices.” The famous ballot proposition, Prop 65, was a different thing, all those warning labels that, you know, this burrito you’re about to enjoy has chemicals in it, which have been proven to the state of California to cause cancer.
Christian: My most recent one of that was I think a towel, like a beach towel had this like whopping sticker on it. And I’m like, “Guys, I’m at the ‘Jersey Shore.’ I’m literally on the other coast, but sure, why not?”
Jay: No kidding. I had a bottle of water once that I picked up that said that. And I sort of looked at it and I thought about it, and I was like, “Nah, I’m pretty thirsty.” What are they gonna do in California?
Christian: You just described most people’s reaction to every cookie privacy pop-up window. Same exact thing.
Jay: Exactly. And by drinking this, you have consent to all the possible chemicals that we’ve put into your water.
Christian: And all of your privacy going down the drain with your chemicals, but awesome.
Jay: Yeah. That’s the new smart bottles; they tell you all about the drinker.
Christian: When we talk about California and what they’re attempting to do, I read through a few articles about it, but you had covered a couple of the big distinctions. And naturally, I think GDPR, as we dove into various portions of it, has some really great forward-looking concepts that, you know, are trying to get at the heart of transparency and be consistent with that transparent message. How do you see the California law stacking up against this? And what do you think the likelihood is that we’re gonna see California pass it? And then I definitely wanna also talk about Vermont.
Jay: Well, it’s interesting because California has this sort of funky ballot proposition procedure. And we’ve seen some pretty controversial things go on these ballots before. And the way that it works is if you get enough signatures, it goes before the people of California, they vote and there’s no enabling act, there’s no nothing. It’s just a law. And it goes into effect day one.
Christian: It doesn’t require a governor’s signature. It doesn’t require any sort of additional…
Jay: Zip. It is California direct democracy, which is fascinating. But it also, I don’t know why I just did Christopher Walken there. It’s a fascinating change. It’s an interesting way to deal with thorny issues that might have difficulty getting through in Sacramento, because, you know, if you look at this ballot proposition, it’s essentially self-funded by one guy, Alastair Mactaggart, who’s a very prominent and successful real estate developer in California. And then he has two other people involved with him, one of whom is a former CIA agent, and the other one who worked at BlackRock. So you need to figure out which one of them has had more power over the course of their career.
Christian: That’s incredible.
Jay: Yeah. But on the other side…
Christian: I feel like a “Lifetime” movie series coming on.
Jay: I think that’s probably right, or maybe an “E! True Hollywood Story.” But so what happens is, you get these groups that are lined up on the other side and California election law is fascinating. You have to disclose who’s on the other side. And it’s Facebook, Microsoft, Apple, Google, Verizon.
Christian: AT&T I think was also.
Jay: AT&T is in there too. Now, Facebook backed out of opposing the act, even after they gave $200,000 to oppose it, because, you know, as we mentioned, Facebook’s got a couple of other things going on. They’re putting on a clinic about how to drink a glass of water in front of Congress convincingly, and there’s some other stuff about privacy. But there are all of these big industry groups and companies that are opposed to the act. And the way that they’re framing it is: it’s gonna be bad for business. It’s gonna put people out of work.
Christian: Well, that’s the name of the PAC, right? It’s something like the Protect California Jobs PAC. And it doesn’t mention anything about privacy, anything about what’s being done with data. But we wanna save those jobs. I also, by the way, am trying to think of the last time regulation from a data or privacy or confidentiality stance actually hurt jobs. Typically, it means you need an army of additional people to stay compliant. So if anything, it tends to create jobs.
Jay: Well, so the interesting thing here is that the reason why they’re saying it’s gonna hurt jobs is that this law has, among other things, a statutory damages provision. And that’s really powerful. So in California, under their False Advertising Law, the FAL, and their consumer protection law, which are really, really aggressive, they’re basically private attorney general statutes. You can cause just a huge amount of damages in a lawsuit. You can, you know, impose massive damages on a company for, you know, a relatively small amount of misconduct. And, you know, it’s designed, as all of these laws are, to police the outer boundaries of bad behavior. But if you look at the cases in California, the primary case law is about a locksmith; it’s not necessarily just about the big tobacco companies or people who have a massive economic impact. It’s “I don’t like what this one company did.”
And so the idea is, by framing this law as essentially a new variant on the false advertising law or the Consumer Protection Act, the opponents of the CCPA, or the CalCPA, whatever you wanna call it, are saying, “Look, we’ve been down this road before. You know what this looks like. It looks like a lot of costly litigation that doesn’t really get anything done.” So it’s an open question whether or not that’s gonna be successful. They’re pouring a lot of money into it. And even though it’s comfortably on the ballot for November, it’s not a sure thing in my mind that it’s gonna pass.
Christian: Yeah. And when I look at this, look, they have things like the right to know what data is collected, which obviously mirrors the GDPR. The right to know if the data is sold, which I can’t recall. I know GDPR, you’re supposed to talk about who’s processing data, but do you have to actually disclose if it’s sold? Because sold connotes a monetary benefit to the seller. Is that something that GDPR also says?
Jay: It does. Yeah. You need to identify the individuals or the organizations to whom the information has been transferred, like who has it. Or once you’re a recipient of the data, you have to disclose to the individual, “You know, hey, we now have your data.”
Christian: Right. But the reason why I bring that up is, if it’s specifically “sold” and not just “transferred,” transfer is much broader, right? Because there are data brokering arrangements, and I’ve probably done 30 of these over the years, where there is actually no monetary transaction.
Jay: Sharing not selling.
Christian: Sharing not selling, sharing is caring. So the whole thing to me is if this is only when sold, it’s actually very, very easy to go around that because there is an entire advertising network platform obviously built upon private data. So when I use those, by the way, I’m bidding on ads. I’m bidding on ad sales and I can target audiences based on their data. I’m seeing that data, but it’s not necessarily the same thing, because I haven’t paid for the data. I’m paying for the ads. So I think it’s kind of easy for people not to adhere to that.
Jay: Well, this is why lawyers have jobs, right, because laws are not written in a way that gives perfect explanation or perfect clarity as to what’s covered. So you have to imagine that if this law goes into effect, there’s gonna be a slew of litigation, of cases about, you know, is this covered? What constitutes an actual sale of the data? There’s a lot that you would have to think about before you could answer it. And there are other divergences from GDPR. GDPR, for example, talks about legitimate interests. One of the hallmarks of processing under the regulation is that you can process as long as you have a legitimate interest.
Christian: Yeah. And I think, you know, I wanna switch gears here to, well, actually before we get to Vermont. I feel like it’s sort of California and Vermont each vying for it: no, I’ve got this one, I’ll jump in before you can get to the vote in the fall.
Jay: Yeah, there’s definitely like a “hold my beer” here.
Christian: No, Vermont, it’d be “hold my double IPA.” So for this, I think the question you had brought up was damages. And this is the reason why I think the California one might have difficulty passing on the initial ballot: the damages, as you outlined in your article, could just be staggering. And the real crux of it is that they don’t have to prove harm, right? So there’s this massively wide-open thing, particularly when you consider, it doesn’t specifically outline it, but I always like to point out to people that when we talk about records, records compromised, a person’s record, you have to understand that a record actually consists of potentially dozens, hundreds, even thousands of individual data points, right?
So I’m really curious, how does this line up? What does it look like in terms of the potential fine? If someone has a million records breached, and inside of that there are 50 million fields that are individually known or knowable about a person, what does the fine look like for something like that, for a million records, under this California concept?
Jay: Yeah. I mean, I think the way that this statute would be construed is essentially that it’s a transactional issue. Like, if there is a breach with respect to an individual data subject, regardless of the extent to which they’ve been harmed, the statute talks in terms of either a statutory minimum thousand-dollar fine or penalty, or the actual amount of damages. So if all the fields about this data subject have been compromised, and their Social Security number and their credit card number and all that stuff’s been taken, and somebody, you know, charged a first-class flight to Bali, you get that.
But the reason why that’s less interesting to me is that this law sort of rejects some of the case law that’s emerged in the past couple of years, especially after the Supreme Court’s Spokeo decision. It rejects this notion that you need to show an actual, concrete harm caused by the breach. And it says, “Look, if there was access, we’re talking about a $1,000 minimum fine.” And you might think, “Well, a thousand bucks, you know, that’s not a lot if Google had someone that was hacked.” But if Google was breached and there were 50 million records accessed, that’s a $50 billion statutory fine. That is an economy-stopping fine imposed by the state of California. And obviously, that’s an extreme example, but lawyers love those because they make us sound smart. So what do you do in the absence of clearly defined boundaries of what constitutes an actual harm versus minimal, speculative harm? There are a lot of issues that go into these things that are typically left to courts. But when the statute itself says, “Boom, you’re entitled to $1,000,” that discretion is taken away.
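The arithmetic behind that extreme example is worth making concrete. As a rough sketch, assuming the $1,000-per-consumer statutory minimum discussed above (the ballot text’s actual figures and conditions may differ), the exposure scales linearly with the number of affected consumers:

```python
# Back-of-the-envelope sketch of the statutory damages math discussed
# above. The $1,000 floor is the per-consumer minimum from the
# conversation; the ballot text's actual figures may differ.
STATUTORY_MINIMUM_PER_CONSUMER = 1_000  # dollars; no proof of concrete harm needed

def minimum_statutory_exposure(consumers_affected: int) -> int:
    """Lower bound on total exposure: per-consumer minimum times head count."""
    return consumers_affected * STATUTORY_MINIMUM_PER_CONSUMER

# Jay's extreme example: a breach touching 50 million consumers.
print(f"${minimum_statutory_exposure(50_000_000):,}")  # prints $50,000,000,000
```

Even before any actual damages are proven, the floor alone reaches $50 billion at that scale, which is exactly the exposure the initiative’s opponents are pressing on.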
Christian: Yeah. I see that. I think, look, that’s the part where I understand why there are PACs that are attacking this, that just literally the exhaust of a breach would be enough under this particular law to, you know, just completely attack the business model of a lot of these businesses. I wanna jump over to Vermont because I didn’t even know this was like on the horizon. And so we have Vermont earlier, I think it was last week started or enacted a law regarding data brokers. And I find this funny. Obviously, I used to be at the helm of the data for one large data broker here in the United States. And this law is very interesting because they define data broker in a way where this could be stretched in a whole number of different ways. So it says, “Data broker means a business or unit or units of a business, separately or together, that knowingly collects and sells or licenses to third-parties the brokered personal information of a consumer with whom the business does not have a direct relationship.”
That kind of almost means everyone if you really break it down. Because if I am gathering data in any way, shape, or form about my customers, for example, I’m gathering their email addresses, I’ve got some cookies tracking them online, I’m tracking to retarget my products to them, and then I join a data co-op, or I’ve signed on with some sort of additional third-party data tool to augment my data. So once I’ve signed a license to share my initial data with a data broker so they can augment the file and send it back to me, I’ve shared all that data. And there is a license. The definition doesn’t say one way or the other; it’s a license in general. And ultimately, it could get very, very gamey very fast.
Jay: Well, and you touched on something at the outset which we were talking about, which is cookies, right? So you’re tracking the information of individuals with whom you don’t have a direct relationship, as long as that individual visited your website. So we’ve tracked their IP address, we’ve tracked whatever information we can glean from when they were on the website. Let’s imagine the situation where someone goes to your website and they are in the process of creating an account. They’ve entered their name, their address, now you have the IP address as well. And they’ve, you know, clicked on a couple of things that they think they’re interested in, and then they don’t create the account. At what point does all of that count? There’s still no direct relationship, because they haven’t consented to you getting their information. They put it in and they’re like, “I’m gonna back off.” Well, under this definition, that counts.
Christian: You’re gonna see the other side though, people go, “Oh, no, no, we have a relationship. They visited our site. We’re tight. We’re like that relationship is solid.”
Jay: And that’s the interesting thing because the EU has said, “No, no, no, that doesn’t count. Merely visiting a website doesn’t count.” And California has said, “Well, if you’re doing business in California or if it can be sold, if your product can be sold or reach an individual in California, that counts.” There is this very fascinating sort of fight between…or it’s fascinating to lawyers and legal nerds like me, like, between the jurisdictional notion of creating a relationship or what we call substantial contacts or direct contacts under European law, under American law, under state law, all of these things are gonna come together, and you’re gonna see a real fight over what is the type of relationship that counts.
Christian: Yeah. My sense is, now, is this already law? Did the Vermont one pass? Because I see the governor is saying he might veto it. It puts a fee, I believe 100 bucks, on data brokers. And apparently, he’s saying that violates his pledge not to impose new costs on Vermonters, which, it’s absolutely $100. So I’m not sure that’s not just political wrangling. But the reality is, this is the second one. I thought California might be the first. This is definitely akin to the same thing. It covers the same things: what data is gathered, what is shared. You know, I think we’re gonna see a ton of these.
And a lot of times with anything in this arena, we’re gonna see a lot of blue states first, followed by red states. But in the end, you’re gonna see a lot of new laws that are taking some of the best pieces of GDPR. And we’ve been talking about this a lot, Jay. I think, you know, some companies’ answer is to leave Europe and say, “You know, we don’t have a lot of operations there. We’re just gonna pull out of Europe. We’re gonna shut down the sites. We’re gonna pull an ‘LA Times,’ no more ‘LA Times’ throughout Europe.” And, well, at the same time…
Jay: Well, that just did not go well for the “LA Times”.
Christian: I know. And they were called out, you know, leave it to “The Wall Street Journal” to just completely obliterate any of these companies.
Jay: The “Journal’s” reporting was so gleeful. They’re like, “Oh, hmm, well, I guess who’s not ready for the GDPR, isn’t that interesting?”
Christian: Yes. Yes, yes, yeah. That’s the journalistic integrity. The snobbery around were you ready or not. I also loved all the tweets of, you know, “this only shows how difficult this is.” We’re like, “You’ve had years to prepare. Like, years, and still you’re complaining.” Anyway, I think the point is, you’re not gonna pull out of California. It’s the fifth largest economy in the world. It’s literally its own country. So you can’t run from this stuff anymore. Maybe in Vermont you could probably make that argument: I don’t have to be in Vermont. But generally, I can’t avoid California as any…
Jay: Dude, I’m not giving up my Cherry Garcia Ben & Jerry’s. I’d rather comply and pay the $100.
Christian: That’s true, that’s true. Yes, I agree. So I think from our perspective, from the DataSmart perspective, we’re both in violent agreement. This is not going to stop. There will be a continued focus not only on providing the information about what is being gathered by sites, but on returning control to the consumer, the individual, and letting them have transparency into how data is used. And you’re gonna see more and more of this. So while Vermont is definitely trying to, you know, “hold my beer” California, California is gonna make a run at it this fall. But this is not going to slow down. I think we’re gonna see a lot more of this.
Jay: Yeah. And don’t anticipate the federal government coming in any time soon, because even though I think pretty clearly, if it wanted to, it could occupy the field, which is actually a legal term of art about preemption: the federal government could come in and say, “Okay. This is mine now. States are not allowed to make laws that deviate from our standards.” I don’t think that’s likely. I really don’t think that’s likely. So we’re gonna see an interesting degree of variation among the states when it comes to how they approach these issues.
And the question is, what impact is that going to have on the perception among businesses as to how business-friendly the various states are? I mean, we all know that Delaware, you know, sort of took the lead on being the most corporate-friendly state in terms of the law. And then other states like Florida and Nevada were like, “Well, we don’t really wanna come up with our own laws, so we’re just gonna follow along.” I think in Nevada this is literally the law. It says, “Whatever Delaware says, we’re gonna do that or be more lenient.” The question is, what if Nevada or Montana say, “You do whatever you want with data here. We’re open”? And that’s an interesting question because the states love to compete with each other.
Christian: That’s amazing. Yeah. I mean, if this becomes another lever as to why people should, you know, leave their high-tax region or high-regulatory region to pursue their business interest elsewhere, it’s yet another thing that’ll be debated well outside the courtrooms but also in the political circles as well. So that’s it for this episode of, “Are you DataSmart?” Thank you for listening, and certainly, if your DPO isn’t thinking of how they’re gonna deal with each of the individual states and their own laws on privacy, you better get cracking. Thank you.
Jay: Thanks again.