New laws are coming up frequently in the US and around the world that raise the stakes on protecting user privacy. In this video, learn what some of them are and how they affect your obligations to users whose data you collect.
- The second goal of a privacy program is preparing for legal compliance. This is the part that most worries the industry leaders who routinely reach out to me for advice on privacy. These folks are under a tremendous amount of pressure to meet quarterly goals, and they have aggressive product roadmaps. On top of all that, they see new laws being discussed and passed every single day, or at least that's what it feels like to them. A lot of these laws pose a serious threat to their roadmaps and, in some cases, to their ability to operate on an ongoing basis. So chances are you've heard the acronym GDPR in the media. You may have even received emails from companies explaining how they're adapting to comply with this complex new law. I've spent the last three years helping companies comply with GDPR, but here is how it first affected me directly as a consumer. I used to work out at a gym where I could sign in using my badge at an electronic kiosk. Once I signed in, the cardio machines would all display my name and my personalized workouts. All I had to do was click on my name; I did not have to worry about remembering my username and password. Post-GDPR, even after I signed in on the kiosk, the treadmills would not display my name. I was told this was for privacy reasons. As a result, I had to enter my username and password all over again. A lot of my friends experienced the same thing. To avoid typing their username and password, several gym users started using guest mode, which meant they could not use their personalized workouts. It also meant the fitness company could not collect any data from those users. So what's the lesson here? 
You know, it's easy to point out that a lot of companies do a bad job on privacy and get away with it. But remember: if companies collecting data won't do a better job on privacy, someone somewhere will eventually pass a law that hurts everybody's ability to connect with their customers. Bad privacy practices are a lose-lose for everyone. As we saw earlier with the VPPA example, the Video Privacy Protection Act that I referenced alongside Netflix, Hulu, and Judge Bork, laws end up having more far-reaching consequences than was initially intended. So let's start with the GDPR in detail. The EU General Data Protection Regulation, or GDPR, is a comprehensive data protection law. The GDPR updated existing EU laws to strengthen the protection of personal data. The European authorities felt this was necessary because of the tremendous technological developments of the last decade and a half or so, the increasingly global nature of businesses, and the more complex flows of personal data. The GDPR replaces the earlier patchwork of national data protection laws with a single set of rules directly enforceable in each EU member state, and it took effect in May of 2018. Similar laws have been passed or are under consideration in the U.S. as well. Now, notice that the intent behind the GDPR was to have one unified law rather than every country in Europe having its own. In the U.S., by contrast, we're seeing a lot of state-by-state laws, which is an interesting observation to keep in mind. The California Consumer Privacy Act, which is set to go into effect on January 1st, 2020, has the following key highlights. It requires companies to notify customers of personal information collection. 
It requires companies to let customers opt out of having their personal information sold, and you as a company are required to allow customers to ask for data deletion. So let's dig into these in some detail. The first CCPA requirement we talked about was notification of personal information collection. What this basically means is that, at or before the point of collection, businesses must notify customers that they are collecting customer personal data, what information is being collected, how it is being collected, how it will be used, and finally, whether and to whom it is being disclosed or sold. So this requirement leans heavily toward transparency. The second CCPA requirement is the personal information sale opt-out. Consumers have a right to opt out of the sale of their personal information, which means if you as a user want to tell a company, "Do not sell my personal information," the CCPA gives you that right. Businesses that collect personal information must post a "Do Not Sell My Personal Information" link on their homepage, which allows consumers to easily exercise that opt-out right. I will add here that a lot of attorneys I've talked to, particularly experts in privacy, have told me they're unsure exactly what "sale" means here, so as you prepare for the CCPA, please make sure you consult with your legal department if you haven't done so already. Finally, personal information removal. Barring certain exceptions, consumers have a right to ask businesses that collected their data to delete it, and the businesses must do so as long as the request can be verified as legitimate. So you're probably thinking that the California law is complex, but let me just tell you, the California law is not alone in the U.S. In June 2019, the state of Maine passed what many in the press consider to be the nation's strictest privacy protection law. 
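The three CCPA obligations just described (notice at collection, sale opt-out, and verified deletion) can be sketched as a small data model. This is a minimal, hypothetical illustration of the workflow, not a real compliance library; every class and method name here is an assumption, and a real implementation would be designed with your legal team.

```python
# Hypothetical sketch of the three CCPA obligations discussed above:
# notice at collection, sale opt-out, and verified deletion.
# All names are illustrative, not from any real compliance library.
from dataclasses import dataclass, field


@dataclass
class ConsumerRecord:
    consumer_id: str
    data: dict = field(default_factory=dict)
    notified_at_collection: bool = False  # notice given at/before collection
    do_not_sell: bool = False             # consumer opted out of sale


class CcpaStore:
    def __init__(self):
        self._records: dict[str, ConsumerRecord] = {}

    def collect(self, consumer_id: str, data: dict) -> ConsumerRecord:
        # Requirement 1: notice must be given at or before collection.
        rec = self._records.setdefault(consumer_id, ConsumerRecord(consumer_id))
        rec.notified_at_collection = True
        rec.data.update(data)
        return rec

    def opt_out_of_sale(self, consumer_id: str) -> None:
        # Requirement 2: triggered by the "Do Not Sell My Personal
        # Information" link on the homepage.
        self._records[consumer_id].do_not_sell = True

    def sellable_records(self) -> list[ConsumerRecord]:
        # Only records whose owners have not opted out may be sold.
        return [r for r in self._records.values() if not r.do_not_sell]

    def delete(self, consumer_id: str, request_verified: bool) -> bool:
        # Requirement 3: deletion proceeds only for verified requests
        # (and, in the real law, barring certain exceptions).
        if not request_verified:
            return False
        return self._records.pop(consumer_id, None) is not None
```

The point of the sketch is that the opt-out flag must be checked at every point of sale, and that deletion is gated on verifying the request, two details that are easy to miss if privacy is bolted on late.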
The state of New York is considering a privacy law that is even stricter than California's. So as you see, if the GDPR seems complex and the California law seems complex, what we are really looking at in the United States is several laws on a state-by-state basis, where different states have different laws, and you as a business may be required to comply with multiple laws at the same time. We are really looking at a very complex U.S. privacy landscape in the years to come. But what does it look like holistically? Let's take a look at the map of the U.S. to get a better visual sense. On this map of the United States, the red in California, Maine, and also the state of Nevada indicates privacy laws that are well along toward passage, or already passed. Then you have green and blue and purple. Any state with color in it means that something is happening from a privacy perspective. When I first got into the privacy discipline, there was no color on this map at all, and there was no real privacy law to worry about as far as the U.S. was concerned. So if this is an indication of things to come, you're going to see a lot more color on this map, and you want to make sure your privacy program is responsive to this upcoming need. But this is just the U.S. Let's look at the map of the world as a whole. When I looked at the map of the world in my early days as a privacy engineer, there was no red. Now the U.S., Canada, and Australia are red. You see a lot of red in Western Europe as well, and a lot of green, yellow, and orange in other parts of the world, too, and if I had to bet, you're going to see a lot more red in the times to come. So again, you want a privacy program that understands that in the U.S. 
and right across the world, privacy is going to be extremely complex, and you will need a program that scales and takes account of all these laws as they develop. As I mentioned earlier, a lot of the business leaders I talk to on an ongoing basis are extremely concerned about the day-to-day consequences of all these privacy laws. In the aforementioned IAPP survey, failing to prepare for data breach notification was ranked the highest GDPR compliance risk. Failure to conduct data inventory and mapping came in a close second. When I say data inventory, I mean: what exactly do you have? How risky is it to hold that data? What could happen if that data gets leaked? All of that was considered a pretty high risk as well. Not obtaining consent and improperly handling international data transfers were tied for third place overall. Among U.S. respondents specifically, not complying with requirements around international data transfers ranked as the top GDPR risk. I'd mentioned at the very beginning that you would need a privacy-aware culture and a cross-functional privacy program. This slide brings home why. None of these risks is confined to one area of your business; you will need participation right across your business, from top to bottom. As an example, data enters your business in multiple ways: different devices, different countries, different customer types. That data is then stored in multiple locations and accessed by multiple teams for several different purposes. Understanding what you have, how it needs to be transferred from A to B, and what to do in the event of a breach is a significant lift. The post-GDPR world reminds me of a moment from more than a decade ago. When the housing market crashed, President George W. Bush stood somberly in the Rose Garden and remarked that "Wall Street got drunk, and now it's gotten a hangover." 
Our current moment in privacy feels a lot like a cold shower of regulations after basking in the warm glow of data for far too long. The message is loud and clear: a privacy program is not a luxury, but a necessity.
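The data inventory and mapping exercise described above (what you have, where it lives, who accesses it, and which transfers cross borders) can also be sketched in code. This is an illustrative sketch only; the field names and risk levels are assumptions, not part of any standard, and a real inventory would be far richer.

```python
# Illustrative sketch of a minimal data inventory, the kind of mapping
# the IAPP survey flags as a top GDPR compliance risk. Field names and
# risk levels are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class DataAsset:
    name: str               # what data you have
    location: str           # where it is stored
    teams: list[str]        # who accesses it, and for what purpose
    crosses_borders: bool   # does it involve an international transfer?
    risk_if_leaked: str     # e.g. "low", "medium", or "high"


def review_queue(inventory: list[DataAsset]) -> list[str]:
    # Surface the assets that need attention first: high leak risk or
    # international transfers, two of the survey's top-ranked risks.
    return [
        asset.name
        for asset in inventory
        if asset.risk_if_leaked == "high" or asset.crosses_borders
    ]
```

Even a toy inventory like this makes the cross-functional point concrete: filling in these fields requires input from engineering, analytics, legal, and every team that touches the data.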