Under Armour, Saks, and Panera all announced major data breaches within the last few days. We are learning more each day about how each company responded, the good and the bad.
In this episode of “Are You DataSmart?”, the Ward brothers dissect health data, like the kind popular in fitness apps, as well as the “don’t throw stones in glass houses” aspect of data breaches. According to a 2015 analysis by Duke University and CFO Magazine, about 60% of larger corporations have been hacked.
It’s time to get DataSmart, and have a plan.
TRANSCRIPT
Jay: “Are you DataSmart?” A weekly podcast on data security, information management, and all things related to the data you have: how to protect it and how to maximize its value. I’m Jay Ward.
Christian: And I’m Christian Ward. And today we’re going to discuss, well, What’s Under Your Armour? The question regarding Under Armour’s breach of about 150 million accounts through their app. Under Armour bought a company called MyFitnessPal a little while back. It’s a very, very popular fitness app, not unlike many out there today that focus on your health, what you eat, your diet, your exercise. And, you know, I can’t help but look at this, Jay. There’s been a lot of press on this, and I think there have been three different disclosed breaches in just a little bit of time here between them, another major retailer, and, I believe, a financial institution. These breaches are coming at us so fast we can’t even really keep up. And I think it’s, as one person put it, just another day in the life of the internet: a major company struggling to maintain protection of its data.
But, you know, when I think about this, it also dawned on me, reviewing the discussion, that, number one, Under Armour did point out that it looks like it was really just usernames, hashed passwords, and potentially users’ actual names, so sort of what people used as their username in the app. So it doesn’t appear that any health data was necessarily stolen, but it does bring up a big point, which is that in reviewing GDPR, we have Article 4(14), which speaks directly to biometric data. And it’s fascinating to me that this is such a great area of innovation in technology, and there are so many great new business models based upon this type of data or looking to leverage it. What’s your initial read on this? Because, you know, I know you were following it as well, and you happened to be down recently at the IAPP Global Privacy Summit.
Jay: Yeah, this was a topic of discussion for people as they were leaving the summit. You know, look, at this stage, there’s not much to know, and people love to rush to judgment. In my not completely informed opinion, because I don’t have all the data, it seems like Under Armour has handled this pretty well. I mean, they’ve done prompt notification to users and to their customers, and they’re requiring password resets. And ultimately, there’s no way to be 100% impervious to a breach, you just can’t do it. The question is, do you have a good plan in place to help you respond? Do you have a breach notification protocol? Do you have a data breach identification protocol? How are you handling zero-days? I mean, there are a lot of things that you can and should do. And so far it does look like Under Armour has been doing a lot of that; well, you know, we’ll see when we have all the information. But I will say that because we’re talking about an app that collects what the European Union’s GDPR would classify, I think in part, as sensitive information, biometric data, health information, that really does up the ante. So this is not just “there was a breach” like we’re talking about at Saks Fifth Avenue and Lord & Taylor; that’s a slightly different thing. Even if it’s the same [inaudible 00:03:35] caliber of information, if it’s usernames and passwords, when we’re talking about a device strapped to your wrist and checking your pulse, it feels a little bit more intrusive.
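To illustrate the “hashed passwords” point above: a breach that leaks only salted, slow password hashes is far less damaging than one that leaks plaintext, because an attacker still has to crack each password individually. Here is a minimal sketch in Python using the bcrypt library; the library choice is an assumption for illustration, though Under Armour reportedly did use bcrypt for most MyFitnessPal passwords.

```python
# Minimal sketch of salted password hashing with bcrypt (pip install bcrypt).
# Storing only hashes means a database leak does not directly expose passwords.
import bcrypt

def hash_password(plaintext: str) -> bytes:
    # gensalt() embeds a random salt and a work factor in the hash itself
    return bcrypt.hashpw(plaintext.encode("utf-8"), bcrypt.gensalt())

def verify_password(plaintext: str, stored_hash: bytes) -> bool:
    return bcrypt.checkpw(plaintext.encode("utf-8"), stored_hash)

stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", stored)
assert not verify_password("guess", stored)
```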
Christian: Yeah. And so as I’m sitting here talking with you, I’ve got a Garmin Fenix watch, which I absolutely love. In addition to tracking my heart rate with the little green LEDs on the back, it also has GPS and knows exactly where I am at pretty much any given moment. It provides waypoint data, which is truly amazing. It’s a phenomenal device, and I’ve tried and loved many of Garmin’s products. But given all the feedback and data these companies have, I think there’s this concern: with privacy regulation, are we going to see a cooling of innovation? You know, when I think about the data these devices collect and what the potential benefit globally is, whether it’s people being notified early of heart irregularities so that we can get preventative medicine to them before they have a heart attack, or the cost of insurance generally. There are people now sharing their car data to try and get better insurance rates because it shows whether or not you speed. You know, there’s an element of intrusiveness here, but there’s also an element of honest-to-goodness better efficiency. And so, with GDPR or any of the new regulations, do you think that starts to cool not just innovation in health, which is certainly what we’re talking about, but other areas as well?
Jay: Yeah, and I think the question is going to be, which regulator strikes first? The general rule in the United States is that if it’s not unlawful, and if it’s not violating a fundamental constitutional right, you can contract for anything you want, and your terms and conditions will effectively do that. If you look at them, they take the law to its furthest limits. But in the European Union, that’s not the case. You know, the right of ownership of personal data and personally identifiable data is a fundamental right. And I’ve said it before: if you want to understand how the European Union treats privacy, think about how we treat the First Amendment. So if your watch was somehow implicating the First Amendment, what would the courts do? Or, better still, when your watches have implicated the Fourth Amendment, because they’re providing the police with details about your location, how carefully do we craft the balance between what we allow devices to do and what protections we give to individuals? So I think there can be some stifling of innovation. I don’t know that there will be, because I think the benefits, and frankly the revenues, from developing these products are so alluring. It’s more likely in my mind that they will try to create a workaround with regulators in terms of how to strike that appropriate balance, but there is always the potential that companies are going to say, “Look, this is just not worth the risk anymore.” A fine of four percent of global turnover is not worth a 1.5% increase in net revenues.
Christian: You know, it’s fascinating. I think the reason fitness data in particular has been so popular is, number one, it offers phenomenal ways to keep track of and stay up to date with your customer. Many in the sports, health, and fitness world have struggled with keeping an ongoing dialogue, whether programmatic or human: dieting programs, food providers, people that will deliver your meals and make sure that you’re hitting the right calorie count, fitness buffs, healthcare professionals themselves asking, “Are you taking your medicine?” There are so many things that these devices could actually help us with, and the ability to capture the data with just a device around your wrist is kind of staggering. And I see a lot of great business opportunities here. Like I said before, we could be talking about lowering medical costs, early identification of issues, early responses, insurance rates, pharmacological advancement, being able to see real-time feedback on how different pharmacological tests are working in individual beta trials. These are all really good opportunities for the advancement of humanity.
And at the same time, balancing that with privacy, as you said, and the right to privacy, is critical. One of the things in the GDPR is the ability for someone to withdraw consent. I think in this particular space, as with any of this highly personal data, it’s going to be fascinating to watch from a database architecture perspective. I don’t think most businesses are really ready for what they’re about to be asked to do by the regulations, which is to honor the consent I gave because I bought a Garmin watch or a Garmin device. I was blown away: I had a Garmin device probably six years ago, one of their first, which I plugged in and used for about a year and then stopped using. When I bought this Fenix a year ago and plugged it in, it connected to the same account, and all of my data from years ago was still there and immediately linked up with the new device.
Jay: That’s that data hoarding that we’ve talked about before.
Christian: Yes, absolutely. And how long can you keep this data? These are the questions for the many companies that are subject to GDPR, or may be subject to it (obviously people buy these products in different jurisdictions and different countries, including the EU), and there’s a lot of opportunity to misunderstand whether their current data architecture can stay compliant. And whether it’s data minimization or the withdrawal of consent: like when I threw that device in the drawer, after it hasn’t been used for 30 days, that might be the time to cleanse it from the database. And that’s the type of thing that I think we have to really be talking about, because we want innovation, but we also really gotta protect that right to privacy and understand what the combination of those two things means for the future of data partnerships and data architectures.
Jay: That’s right. And I think, you know, creating innovative solutions to these issues will make a difference. If it is the case that a device has gone silent for six months because you’ve lost it or because you fell back in love with Krispy Kreme, it’s gonna be what it’s going to be. But maybe you should think about fading that data out, fading the information that’s been stored. Because, Christian, you were talking about how important this information is to help create baselines, but it’s also by its very nature evanescent. Where your health was six years ago does have some value today, but not nearly as much value as what your resting heart rate was this morning. So you have to balance the need for this data against its ongoing usefulness.
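A hypothetical sketch of the cleanse-and-fade idea the hosts describe: a scheduled job that purges raw readings from devices that have gone silent past a retention window. The schema, table names, and 180-day threshold are all invented for illustration, not any vendor’s actual design.

```python
# Hypothetical retention job: purge raw readings from devices that have been
# silent longer than the retention window ("gone silent for six months").
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 180

def purge_stale_device_data(conn: sqlite3.Connection) -> int:
    cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute(
        """
        DELETE FROM readings
        WHERE device_id IN (
            SELECT device_id FROM devices WHERE last_seen < ?
        )
        """,
        (cutoff,),
    )
    conn.commit()
    return cur.rowcount  # number of reading rows removed
```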
But I also think that this is an opportunity to create relationships with customers that are framed around a conversation, to say to them, “Look, we’re going to keep all of this information. We’re gonna keep all of your sensitive biometric information. We’re going to keep information about your weight, your exercise, all of these things. And yes, you are consenting; you have to let us have this information. But we need it. We need it to be able to help you monitor your health. We need it if you want to be able to get the benefits of lower insurance rates, and this is the trade-off. We’re gonna give you this great device that you can’t build in your backyard, you’re gonna give us the data, and here are all of the reasons we’re going to use it.” And maybe there can be a conversation about, “All right, look, I consent to this use, but not to that use.” This is a whole new type of dialogue that we’re talking about creating here. Whether or not it’s a workable economic model remains to be seen, but I think that’s the goal of the regulation: to create opportunities for that conversation to take place.
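That “consent to this use, but not to that use” idea maps naturally onto a per-purpose consent record. A minimal sketch, assuming a default-deny rule; the purpose names and structure are invented for illustration:

```python
# Hypothetical per-purpose consent record: a user can allow one use of their
# data (health monitoring) while refusing another (insurance pricing).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purposes: dict  # purpose name -> granted (bool)
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def allows(self, purpose: str) -> bool:
        # Default-deny: anything not explicitly granted is treated as refused
        return self.purposes.get(purpose, False)

consent = ConsentRecord("user-42", {"health_monitoring": True,
                                    "insurance_pricing": False})
assert consent.allows("health_monitoring")
assert not consent.allows("insurance_pricing")
assert not consent.allows("advertising")  # never asked for, so denied
```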
Christian: Yeah, I definitely agree that the conversation is going to happen. I would also point out, just looking at a CNBC article here, they noted the breach became known around March 25th, and disclosure came within just a few days, as you said. It looks like Under Armour really had a plan, and people have to be very careful not to throw stones in glass houses with, “Oh, they were breached. They weren’t doing what they should be doing.” I mean, at some of these companies, and I know this firsthand, someone walking in with a USB drive and walking out with the entire client base is really possible, and it’s very, very hard to detect or block. Certainly, you’ve got to have a plan in place, and I think the dialogue around these things can’t be, “Oh, they were caught,” or “they were this.” It hopefully has to be more like what this looks like so far: again, we don’t have all the facts, but it looks like Under Armour was very responsive. They started reaching out almost immediately, just after a few days. I would argue it takes at least a few days to analyze the extent of a breach, how it potentially occurred, and who the perpetrators potentially are. You need a little bit of time to digest that, but all of that is spelled out in your privacy framework and in your data protection frameworks. So at least upon initial review, I think, number one, it looks like they’ve done a good job of trying to respond. We hope that this isn’t something where you find out, “Oh, it was nine months before.” That’s the stuff that nightmares are made of, and I think that’s…
Jay: Yeah, but at least it was before May 25th, right? They’re still gonna be judged under the pre-GDPR standard, so…
Christian: Yes, yeah. Well, true. That raises a horrible question, which is, who’s gonna rush to disclose that they had a breach before May 25th? We might get a slew of early-May surprises.
Jay: That’s right.
Christian: Yes, yeah. It’s…
Jay: That’s what we’re telling you, folks, make sure that you get hacked in the next two months [inaudible 00:13:35]
Christian: No, no, you just have to disclose that you were hacked by then; you might have already been hacked. No, these are really good questions. And, you know, my focus, obviously a little bit more from the entrepreneurial side, is that I am in love with some of these datasets and the data partnerships that are possible because of them, but I also recognize how you have to change your approach not only to your data and how you store it, but also to how you can unwind it. Most people combine datasets from lots of different platforms into one disgusting blob of data, and I don’t care how good your architecture is: if you’re not starting to think about [inaudible 00:14:08] layers that track regulatory compliance on a field-by-field basis, who the source was, when consent was given, and when the consent was renewed, you’ve got to rethink your architecture. And that’s definitely coming.
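A hypothetical sketch of that field-by-field idea: every stored value carries its source, when consent was given, and when it was last renewed, so a combined dataset can be unwound when consent for one source is withdrawn. All names here are invented for illustration:

```python
# Hypothetical field-level provenance wrapper: each stored value carries its
# source platform, when consent was given, and when it was last renewed, so a
# combined dataset can be unwound source by source.
from dataclasses import dataclass
from datetime import datetime
from typing import Any

@dataclass
class ProvenancedField:
    value: Any
    source: str              # which platform or partner supplied this field
    consent_given: datetime
    consent_renewed: datetime

profile = {
    "resting_heart_rate": ProvenancedField(52, "fitness_app",
                                           datetime(2018, 1, 3),
                                           datetime(2018, 3, 1)),
    "email": ProvenancedField("user@example.com", "ecommerce_site",
                              datetime(2017, 6, 9),
                              datetime(2017, 6, 9)),
}

# Withdrawing consent for one source removes only that source's fields,
# leaving the rest of the blended profile intact.
profile = {k: v for k, v in profile.items() if v.source != "fitness_app"}
assert "resting_heart_rate" not in profile and "email" in profile
```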
So I think that’s it for this episode of “Are you DataSmart?”, where we reviewed a little bit of the Under Armour breach and, quite frankly, all the breaches to come. It’s important to focus on what the value of data really is, what holding onto that data costs, and whether or not we can strike a good balance between new regulation and innovation.
Jay: Yeah, and especially how do you start the conversation with your customers about what you’re using their data for and how long you’re going to keep it.
Christian: Excellent. Well, thank you all, we’ll talk to you next time on “Are you DataSmart?”
Jay: Thanks again.