Should the USA Opt-In to Europe's Privacy Regime? (Transcript)


>> STEVE DelBIANCO: We'll begin here in just a moment.

The conversations that are happening around the room, let's see if we can bring those together at the tables for our plenary on privacy.

Take your private conversations public. I see Marilyn Cade wrangling more people from the outside. Thank you.

We now turn to the question of online privacy. The question we'll put to the keynote speaker and the panelists: how would things change if the U.S.A. adopted Europe's approach to online privacy, such as GDPR, opt-in for targeted advertising, and the European Union's right to be forgotten? I'm Steve DelBianco and I'll moderate a panel of experts to try to answer that question. First, we're lucky to have Commissioner Noah Phillips of the Federal Trade Commission to set the stage with a keynote address. The FTC is our nation's top cop when it comes to protecting online consumers and promoting competition on the Internet. The FTC has broad enforcement authority over unfair and deceptive trade practices and then narrow rulemaking authority that is granted specifically by Congress, such as in the Children's Online Privacy Protection Act, or COPPA. As a father of three young children, he tells me he hasn't had a good night's sleep in seven years and was sort of hoping that COPPA stood for something else, the Children Ought To Give Peace To Their Parents Act, and I think he was disappointed in that regard.

Before his appointment to the FTC in May of this year, Noah was chief counsel for Senator John Cornyn of Texas and handled antitrust, constitutional law, privacy, fraud, and intellectual property matters. He also handled issues of increasing relevance to the Internet and Internet governance, such as national security, Fourth Amendment concerns, and due process. Prior to his Senate service, he was a litigator. He is a graduate of Dartmouth with a JD from Stanford. Join me in welcoming FTC Commissioner Noah Phillips.

>> NOAH PHILLIPS: Appreciate it.

Thank you for the kind introduction, Steve.

I would like to thank IGF U.S.A. for hosting me today and for convening policymakers, thought leaders, and folks like all of you to discuss privacy and other critical issues related to Internet policy. I also want to give a special thanks to the Steering Committee members who volunteer their time to make this such a great event. I'm really honored to be here with all of you.

Let me begin by saying that my remarks today reflect my views, not necessarily the views of the Federal Trade Commission or any of my fellow Commissioners. The FTC enforces both competition and consumer privacy law. It is inevitable that in thinking about one we consider the other. More than inevitable, though, it is essential. When we evaluate changes in privacy law internationally or consider the whether and how of privacy legislation here in the United States, competition must be part of that discussion. This is a distinct question, by the way, from whether the values associated with privacy ought to bear on antitrust analysis, a topic for another day. I want to register my concern that laws and legislation intended to promote privacy may build protective moats around companies that sometimes already have a significant amount of data about people, making it more difficult for small companies to grow, for new companies to enter the market, and for innovation to occur. And I want to insist that competition be part of our conversation about privacy. I'm not alone in this. My good friend Caroline Holland, who is now an adviser to one of my fellow Commissioners, wrote recently that "progress on one front, privacy, may lead to regression on the other." Joe Simons, chairman of the FTC, put it this way in testimony before the House of Representatives last week, and I'll paraphrase slightly: quote, if you do privacy the wrong way, you may end up reducing competition; you may create a situation in which you entrench large tech platforms and make it very difficult for new entrants and smaller firms to get the attention of the consumers that they're trying to reach, end quote.

If our concern is warranted, the questions for proponents of new privacy rules then must include: are we willing to allow a reduction in competition or innovation? What competitive price are we willing to pay for greater privacy protection? Are we willing, for instance, to allow the biggest technology companies, lately the focal points of discussion about both privacy and competition, to entrench further? I suspect that this audience is already familiar with the various privacy and data security rules that apply in the United States. For those that read this speech later, both of you, I want to set the stage: the FTC has long been the federal agency with primary responsibility for and experience in privacy policy and enforcement. It has been 20 years since our first major online privacy case, against a company called GeoCities. I want to do an experiment. Raise your hand if you remember actually using GeoCities. Okay. For those that don't, when I was in college, it was a popular web hosting service and one of the most common sites that users would go to in order to access the worldwide web. In 1999, the year after the FTC suit, GeoCities sold to Yahoo, a dominant player in search then. I think that tells you a little bit about the history of competition on the Internet and in search. At any rate, 20 years ago the FTC sued GeoCities for engaging in deceptive practices in connection with its collection and use of personally identifying information, PII, from consumers, including among other things falsely telling users that their PII would only be used to provide them with the advertising offers or products and services that they requested when, in fact, GeoCities sold that information to third parties. The FTC brought the GeoCities privacy suit nearly 30 years after passage of one of the nation's first and most significant data privacy laws, the Fair Credit Reporting Act, which among other things regulates the sharing of consumer information by credit bureaus. As the Supreme Court explained, it was passed for, quote, the dual goals of promoting efficiency in the nation's banking system and protecting consumer privacy, end quote.

Since then, as Steve alluded to, Congress has passed an array of privacy and data security laws and has directed the FTC and other federal agencies to develop a host of new rules, including updates to these, governing the collection, use, sharing, and security of personal information. COPPA, and I did not come up with that other acronym, is another example. I won't bore you with a recitation of every privacy statute and rule we enforce.

The point is, we have significant enforcement authority across a variety of commercial privacy areas, and other federal agencies enforce other privacy laws such as HIPAA, the reason you spend so much time at the doctor's office not seeing your doctor. This is not, as some would have you believe, an accident of history. When you step back, what becomes clear is that we in the U.S. have a principled, risk-based approach, focusing privacy and security rules on the sectors of the economy where Congress determined such rules are most needed. Even within those sectors, our approach recognizes that all entities are not the same, and in many cases the law explicitly permits different types of protection based on the size of the entity, the data being protected, and the like. The United States' risk-based approach imposes the greatest costs on businesses at the points where our democratic process has determined the greatest privacy needs exist, limiting such costs where the need is less.

So while we may argue about whether the risks are being appropriately evaluated, whether leveling the playing field among firms doing different things with different kinds of data makes sense, or whether the quality and quantity of data being collected today ought to change our approach, we should not be confused about whether the American approach to privacy is deliberate or sensible. The American approach stands in contrast to the privacy regime now in place in Europe and the one set to become effective in California in 2020. Fundamentally, these regimes require everybody to adhere to one set of standards and one set of costs that apply to all online entities, regardless of size, regardless of service, and regardless of risk. What might be the cost of such an approach to privacy? By their nature, regulatory regimes create compliance costs that are durable and may become more onerous over time, costs that large companies can bear more easily than their smaller competitors or new entrants. This is what economists call economies of scale. Don't take it from me. As Mark Zuckerberg told Congress recently, quote, I think a lot of times regulation by definition puts in place rules that a company that is larger, that has resources like ours, can easily comply with, but that may be more difficult for a smaller startup to comply with, end quote.

I should note that in the same testimony he signaled his openness to additional regulation. While we may want to protect privacy in new ways, and that's a reasonable question, we do not want the regulatory burden to be so onerous that it excludes potential market entrants or inhibits innovation. At the very least, we need to have an honest conversation about the costs and the benefits.

Take the General Data Protection Regulation, GDPR, Europe's new privacy law. For Europeans, it represents an expression in law of their view that the protection of personal data is a fundamental right, no less. For the U.S., it may provide a test case for how a different privacy regime than ours might work.

My concern is that early signs point to precisely the effects on competition that I fear. According to the Wall Street Journal, when the European Union's Justice Commissioner met with large technology companies prior to implementation, rather than hearing the complaints she reportedly expected, the companies told her they would be compliant. She explained, quote, they have the money, an army of lawyers, an army of technicians and so on, end quote. Facebook, for example, mobilized hundreds of people in what it described as the largest interdepartmental team it had ever assembled. Jedediah Hawkins wrote in Forbes that, quote, ironically big tech companies such as Facebook, Amazon, Apple, Google benefit from a silver lining when it comes to being regulated: what hurts their competitors more only makes them stronger, end quote.

That's not ironic. That's economic. Exactly how economies of scale work.

Resources devoted to compliance can be scaled, and they could have been spent on innovation, wages, and so forth. Beyond standard economies of scale, privacy rules in particular may serve to benefit large incumbents. Consider requiring affirmative consent, and assume that consumers are, as maybe we want them to be, considering the question and being selective with whom they share their information. Consumers are more likely to trust companies that they know. This is an application of what economists call brand effects. And to the extent large incumbents also provide popular services, for instance as a result of the network effects in certain technology markets, the big guys may win again. On May 25th, the date GDPR went live, the Wall Street Journal reported that some digital advertising companies, ad tech companies, relatively small competitors to the large technology companies that dominate online advertising, decided to stop advertising in Europe due to the consent requirement. Advertisers, according to the Journal, quote, were planning to shift money away from smaller providers and toward Google and Facebook, end quote. Bill Simmons, cofounder and chief technology officer of a Boston-based ad tech company, noted the paradox: quote, GDPR is consolidating the control of consumer data onto the tech giants, end quote.

Snap's CEO has also recognized this point about GDPR, stating, quote, there are times in history when regulation has actually entrenched big companies because they're the most capable of complying. I think that's a huge mistake because I think that would inhibit innovation, end quote.

There is also another, more insidious effect: large companies can manipulate legal requirements to their own benefit more easily than smaller competitors. Public choice theory teaches us that the durability of regulatory barriers to entry is one of the reasons that incumbents find it attractive to spend resources on securing legislation and regulation that insulate them from competition. But the benefit to incumbents is not just lobbying for laws that favor them; it is also implementing seemingly neutral rules and regulations in ways that benefit them at the expense of their would-be competitors. Just to be clear, I don't think that this is behind GDPR, and I don't think it is behind California's law or anything that Congress may consider. But it does happen, and it should concern us. I'm also concerned, of course, about stifling innovation. Several papers analyzed the effect of state privacy laws on the adoption of electronic medical records, EMRs, which not only make the exchange of information more efficient but can lead to improvements in the quality of care. The authors of the studies find that these laws, quote, significantly reduced the adoption of EMRs, end quote. That is evidence that privacy regulations have inhibited the proliferation of a technology benefiting consumers, in this case patients. And if the healthcare analogy holds, a poorly designed national privacy regime could impose higher compliance costs across the economy, hindering innovation and, again, increasing barriers to entry. Finally, we may still face other unintended consequences. To go back to Mr. Zuckerberg, he noted that the alternative to the big U.S. technology companies, quote, frankly is going to be the Chinese companies, end quote. It is my understanding from Twitter that Stewart Baker made a similar point here today.

Chinese tech competition is real. Indeed, it is China's national industrial policy, and it may pose more than just a competitive threat. This is something that the ranking Democrat on the Senate Select Committee on Intelligence has also recognized. The upshot is, as we consider the benefits of new privacy protections, we must consider the costs too, to competition, innovation, and other things.

GDPR provides us with a great opportunity to see how a large-scale privacy regime works in practice, and for us in the United States to learn from Europe's experience. We do not yet know the answer. But about the American risk-based approach, we can say one thing for certain already: it has both targeted the areas of greatest privacy need and still permitted a tremendous amount of competition and innovation. The upcoming hearings hosted by the FTC will provide a good opportunity to discuss the important topic of how privacy is related to competition and innovation. I invite everyone here to join and weigh in with comments and proposals for the hearings.

Thank you very much.

>> STEVE DelBIANCO: Thank you, Commissioner Phillips. Your remarks are definitely going to help guide our discussion on how we in America should respond to the E.U.'s new colonialism over our nation's privacy regulation. I wore my U.S.A. flag tie today to help remind our panelists that this is not the Internet Governance Forum global; this is IGF U.S.A. I'll wave the flag whenever we need to.

On our panel we have an outstanding set of experts with different perspectives on this. I'll start on your left: Joseph Jerome, Policy Counsel at CDT, the Center for Democracy and Technology, focusing on legal and ethical questions raised by smart tech and big data. He was an associate in the cybersecurity and privacy practice of a major law firm covering the ad tech, health, and finance sectors, which we talked about.

Next to Joe, Amie Stepanovich. She's the policy manager at Access Now, where she ensures that surveillance and cybersecurity laws respect human rights. Previously she directed the Domestic Surveillance Project for the Electronic Privacy Information Center.

Next we have Grant Nelson, counsel for the Network Advertising Initiative. He does annual privacy reviews of every NAI member company. You'll be busy this year, Grant. Fun fact: he founded a tech charity that gives food pantries the excess leftover food that comes out of banquets and restaurants. The excess alcohol from the cocktail party that follows here will go straight to Grant.

Our fourth panelist, Berin Szoka, from TechFreedom. They're on the leading edge of the hardest policy and legal questions raised by technological change in the United States. Just ask anyone who has opposed him in a debate or hearing. Previously he sharpened that edge at the Center for Internet Freedom, prior to that at the Progress & Freedom Foundation, and before that in communications law practice. A fun fact: today is his birthday.

Let's tee up some questions for the panel. As Commissioner Phillips set the table for us, the U.S. approach has been strong, opt-in privacy protections for data that can cause harm, risk-based, for financial and medical purposes. Opt-out in the U.S. has been the rule for data that's used for targeted marketing and advertising purposes. That, I would stipulate, is how America's online software services have been free to users and how they have come to be dominant and embraced around the world. If we reduce the targeted advertising economic opportunity, I think that revenue decline means that users will see more ads, less relevant ads for sure, more sites that will have to put up walls and charge subscriptions, and sites will have less revenue to pay to create new content or provide new services. The first question, and we'll start with Joe and work our way down: as a general principle, how should the U.S., states, Congress, regulators respond to the E.U.'s cross-border approach with GDPR and opt-in? You may consider: do we want to harmonize with the E.U., maintain our historical approach, or do you have a third way in mind?

>> JOSEPH JEROME: There is a lot to unpack there. I look forward to the back and forth about ad tech in a moment.

I suppose I should say first and foremost, we have long talked about the need for a baseline federal privacy law. We are far behind the eight ball on this at this point. There isn't a third way; we're already being left behind. Mr. Phillips mentioned that it is basically us against the E.U. I would also add it is India working on a data protection regime, Japan having an adequacy agreement with the E.U., and a host of other countries that have data protection laws, and still we wish to sort of stay here and say we'll offer an alternative approach. Our alternative approach has been to do nothing, and we're left in their wake.

You know, it isn't just the GDPR; there is a lot happening at the state level, which I think we'll get into, that reflects a democratic desire for privacy laws. I think we'll focus a lot on what's happening in California, but just in the past year we saw a data broker bill passed in Vermont with new restrictions and transparency requirements on data brokers, and we should not forget that Illinois has a vigorous biometric privacy law that seems timely in light of recent stories and Microsoft's call for basically a broad discussion on what to do about these sorts of technologies.

So all of these things, they're already here. Instead of proactively having a conversation, industry has spent the past 20 years fighting against each and every one of these things. My response is, it is clear that there's a popular interest and a democratic impulse for privacy laws. You know, instead of saying, well, let's just keep having further conversations, it is time to put pen to paper and put ideas out there. It has to be meaningful legislation. At the end of the day, I think companies are calling for certainty and flexibility; that seems to be the impulse of a lot of the comments that the Commissioner has made. That seems to, again, call for a grant of authority to the Federal Trade Commission to do some regulation in the areas that require specific attention. Instead, we have had 20 years of the FTC trying to do common law privacy through consent decrees, which I imagine others on the panel will say is a bad approach. That again calls for the need for actual legislative language. The time is now, folks.

>> STEVE DelBIANCO: We'll bounce now to Grant.

>> GRANT NELSON: If the last 20 years of American tech company progress is considered being left behind, I'm all for it. The European approach is probably going to play out to prove that companies adapt to the economic incentives you set up. If you create an incentive structure that makes it harder for someone to start a new company, for someone to create a product that's free, hopefully ad supported, it will be far harder for you to find a venture capitalist willing to back that, take it to market, and create something free for people to use if there is zero chance that you're able to monetize it down the road.

If the approach of self-regulation, which as a self-regulatory body we're clearly a fan of, if that's considered a failed third way, then I think you have to check the facts. The approach of the NAI has been pretty strong: we have gone after companies, and we have made companies make substantial changes to their policies and practices around privacy and how they collect data. Additionally, if a company is collecting data for targeted advertising purposes and they're a member, we have strong restraints on what they're allowed to do with that data. As for most of the harms, that is, the harm-based approach, the NAI has been listening to and adopting that for, I don't know, 20 years now.

>> STEVE DelBIANCO: Thank you.

>> AMIE STEPANOVICH: I was so busy responding, I didn't want you to hear what I was going to say!

There is a question of whether the U.S. should follow the E.U.'s lead or stick with where it is. I actually do think there is a third approach. We should be implementing a baseline data protection law in the United States. It shouldn't necessarily be the GDPR. The GDPR in Europe has a long history of case law, interpretations, and decisions that provide a really robust structure that the GDPR fits into. A lot of that is not new. Most of the rights were contained in the Data Protection Directive before it; they just weren't complied with, because there was no real motivation for companies to do that.

These are all things that companies should have been doing in Europe for a really long time, and were ignoring.

The approach I think we should take in the U.S. is to learn from the E.U. approach, but to adopt an approach that makes sense in the United States. We put forward a list of things that we think should be in a data protection law at the federal level, baseline, applying to everybody. One of the things we have in that list, in addition to rights, many of which are in the GDPR but not all of which, is incentives for companies to explore and produce business models that are not necessarily based on data. We kind of came to this point: yes, data drives the Internet economy, but it doesn't have to. We don't know. We know that there are large numbers of companies set up that collect large amounts of data because there is an assumption that they're going to make money off of it at some point. Some of them don't even have a thought of how they'll make money off of that data some day. They're bringing it in because of this notion that data drives the Internet. If companies had a financial incentive to explore other business models, we may be able to come up with something more inventive. If we want to have an Internet economy in the United States based on innovation, we need to embrace that there is innovation out there about how to make money online as well. So there is a third approach. It is potentially like the GDPR in that it applies across the Internet, but I don't think we need to adopt the GDPR here, and I don't think we should discuss that. We should talk about what makes sense here, and that's specific rights that apply to everybody. It is not about leveling the playing field between companies; let's level the playing field between users, so that it's not the case that people can afford privacy only if they come from a more privileged background and are left without any protections if they don't.

>> STEVE DelBIANCO: Thank you.

The notion of learning from experience is a welcome idea, although that would require that we wait for some experience to actually happen: waiting for the new compliance regime, which kicked in in May, to have its effects, both intended and unintended, on American-led business models and our risk-based approach. Learning from experience doesn't mean let's just see what European Union companies have done for the last six years, but what GDPR does to American business models online. It will be interesting to see if we can explore what you mean by learning from experience. We'll go to Berin Szoka.

>> BERIN SZOKA: I doubt anyone in this room enjoys Europe more than I do. Put me in a group of any people in Europe, and I'm as happy as can be. But I think we also have to acknowledge that Europe has not distinguished itself in leading the way in developing Internet services.

This is not because Europeans are not brilliant, creative people; they are. I speak French, and often when I go to San Francisco I hear French, German, and other European languages on the street. The United States has created a place where people that want to innovate and develop those services go, and that place does not exist in Europe. There are many reasons for it; it is not simply because of privacy regulation. It also has to do with the labor market, the tax environment, business formation, et cetera.

The lifeblood of the Internet economy, even more than advertising, is data. It is what the services are built on, not just to monetize, but to connect one service to another, to allow you to pull up your calendar in an app. All of the things we do every day on our phones require that data is used seamlessly, and the European Union approach to privacy regulation is utterly against that. It is not surprising, as the question was raised earlier, how many European tech companies can we think of? I love Spotify; I used to use Skype. That's it, for me personally. There are, of course, others; I don't mean to diminish them.

The United States has had a great success story, and the freedom to use data, to innovate without permission, is a big part of that. It is more complicated than opt-in or opt-out, but if you boil it down to that, that's a good first approximation of how this works. I think that the U.S. approach fundamentally is the right one. I don't think it is perfect. I don't think it is going to stay exactly as it is. I don't think that we can pretend that there won't be further legislation, and I think it would be unproductive to avoid talking about that. Here is where I think Amie is right: not only shouldn't we take GDPR off the shelf, literally we can't, no matter how much you like it. It is not developed for the United States' legal system. It takes no account of the First Amendment. GDPR covers all information, and in the United States a lot of that information will receive the highest form of First Amendment protection. Even purely commercial speech still receives protection in the United States in a way that does not exist in Europe. If you were to start from ground zero and ask yourself what lesson we could learn from GDPR, what the United States should do, you end up with a different framework because of the First Amendment and also because of other basic American principles about due process, constitutionally required fair notice, clarity in the law as to what is required before you can be penalized, regulatory takings, and so on, so forth.

The second major difference, and you alluded to this: a lot of things in GDPR are not new. They have been legal requirements in Europe for a long time but, as Amie said, they have been essentially ignored. Why? The enforcement reality in Europe is fundamentally different. In the United States, laws, when written, get enforced, because we have a different legal environment, because we have aggressive regulatory agencies that pursue companies and enforce laws as written, and are not simply exercising their prosecutorial discretion to make sure that unworkable laws are not carried to their logical conclusion. That's what Europe has essentially done for the most part until now. There is one final large, important difference: yes, there are private rights of action under the GDPR, but if you were to take that in the United States and combine it with the U.S. system of class certification, you would have a completely different outcome, because we have a well-developed trial bar designed to take legal rights that are given to individuals and magnify them to the point of reaching enormous damages. You just can't take the GDPR as a starting place. You need to develop an American-based approach. I'll just close by saying, the Obama administration, to its credit, tried to do that. They developed principles in 2012, as everyone knows, that I think were in many ways a good approach to privacy, not exactly how I would have written them myself, but I don't fundamentally disagree with what they did. The question is about operationalization, what the legal mechanisms are to enforce those.

Now this current administration is starting to take comment on the U.S. approach and what it should be. The most important things are enforcement, penalties, who has legal rights and how they're brought to bear in the courts, and the role of the agency. These are all questions that in some ways the Obama administration tried to push to the private sector, tried to encourage private organizations, whether they be Civil Society groups, multistakeholder groups, industry standards-setting groups, to deal with. I think it is fundamentally the right approach, because as the Obama policy noted, it is a mistake to think that the government can develop a one-size-fits-all solution for these problems. In general for Internet governance, we have had multistakeholder formats, a variety of approaches for a variety of different problems.

>> STEVE DelBIANCO: In the interest of equal time for alternate perspectives, Joe, Amie, would you like to respond?

>> JOSEPH JEROME: The history of privacy multistakeholder efforts in the United States is one of abject failure. Absent an enforcement stick or a penalty stick, we saw industry basically slow-walk and agree to nothing.

>> BERIN SZOKA: To be clear, as the Obama administration proposed, you could have a model where you give them a legal incentive to do it: here are the requirements, and we'll let you go develop a safe harbor and we'll talk about it. I'm not saying that we let the market work it out. There should be a dialogue between government and those other bodies.

>> JOSEPH JEROME: That doesn't work until you have a firm baseline in place to actually bring companies to the table.

My argument would be that self-regulation in the United States, when it comes to privacy, doesn't give any substantial rights to individuals, which is something that I think Amie is concerned about; it defaults over and over and over again to just providing more transparency through privacy policies. This sort of transparency is supposed to be the thing that alleviates the harms here. I think from our perspective, you know, we look at ubiquitous data collection. I understand permissionless innovation is a wonderful thing, but it comes with costs. It comes with costs that impact people's autonomy, their individual agency. And over and over again, when we talk about this harm- or risk-based approach, it is so narrow-minded and focused on a narrow sliver of physical and financial harms that it eliminates things that are serious concerns to individuals.

>> STEVE DelBIANCO: Joe, you said the U.S. cannot continue to do nothing, and then you said that our entire approach has been an abject failure. I'd quarrel with that.

>> JOSEPH JEROME: Perhaps that was strong on my part.

>> STEVE DelBIANCO: I believe so. I wish Commissioner Phillips were still here and would reel off the cases that the FTC has brought. The U.S. focuses on harms without apology; that harm-based approach without apology says we spend less time on the things that don't cause consumer harm. You may believe that once I've established that you have a right, then you need to protect that right from harm. From that perspective I see why you're completely unsatisfied with the U.S. approach, but it will be incumbent on all of us to maybe depart from the rhetoric and get down to the reality here of consumer benefits and consumer harms.

>> AMIE STEPANOVICH: My name has been used a lot by people on either side of me. I'll interject.

Before I respond to that, there is an important point that was brought to light: how much we agree on. Should there be more privacy in the United States? I think even people who don't necessarily want there to be recognize that we've reached a point where there is going to be an additional data protection law, regulation, something that goes into place in federal law in the U.S. Companies are pushing for it because they're seeing California, they're seeing potential action in New York and other states, as creating even more burdensome needs for them to act in different states. We're there. Now we're just talking about what that looks like.

I think that's really important, because this conversation actually starts a step before that. If we can take that together as a room, take that step forward and ask what we are looking at, not whether we are looking at something, that impacts the debate.

>> STEVE DelBIANCO: It does. If the driver for national legislation is to overcome and preempt a patchwork of inconsistent state laws and rights of action, then that federal action is driven by the need to counter dumb actions in the states, not because of some fundamental need in the hearts and minds of every American for more rights to privacy.

>> AMIE STEPANOVICH: I think a lot of people in California would object to you calling their new law dumb.

>> BERIN SZOKA: It is dumb. You write a 10,000-word law in a matter of, what, a few weeks, because some wealthy businessman essentially put a gun to the head of the legislature and said, if you don't act, this goes to the ballot. There is much less clapping to that, but hold on a second.

This is a matter of the democratic process. That's an insane way to regulate the entire Internet, to say we're going to put this crazy quilt of ideas out to voters that couldn't possibly understand something so complicated, right? The initiative process is not appropriate for that kind of legislation, and that's exactly what it is. By the way, passing it that way would effectively amend the constitution so it can't be fixed, unless you instead go out and write something in the legislative process. A number of legal academics have looked at it and noted that in many respects it is riddled with errors, inconsistent, poorly drafted. I don't know how you call it anything but dumb. That's not to say there couldn't be better laws, but no one, I hope, will defend that as a serious effort to deal well with privacy. What we should be doing here is avoiding legislation by holding a gun to your head. I'm happy to talk in detail about what that should look like. I'm by no means saying we shouldn't change the current U.S. approach.

>> JOSEPH JEROME: California and GDPR are the gun to the head. Absent them, we do nothing.

>> STEVE DelBIANCO: You're making my point. We're being driven by other laws that have completely unappetizing and unraveling effects on the American business model, not driven by a need for privacy.

>> AMIE STEPANOVICH: Those laws themselves were driven by total inaction, and not only inaction, but pressure against action by major companies, both at the federal and state level, for two decades, as Joe pointed out. Once you reach a certain point, where they're looking at a data breach every single week, looking at Facebook and Cambridge Analytica, that's harm in the form of manipulation of users through advertising. It is not the financial harm that you are talking about when you talk about a harm-based approach, but there is harm there.

>> BERIN SZOKA: Could we deal with data breach for a second?

>> STEVE DelBIANCO: Let's see if we can segue to this notion of which data we're talking about, the question we presented earlier: how should the U.S., both normatively and in terms of being driven by the reality of GDPR, define the personal information that has to be regulated? This is a change for us; we have typically regulated personally identifiable information, and Commissioner Phillips talked about that. The E.U., and now, Joe, California, regulate personal information of any kind that could potentially be linked to a person, or even to a device and not a person. So the scope of what we'll regulate under the GDPR and California approach is all data. I think you'll concede that my shoe size doesn't carry as much risk as my Social Security number. Ought there be a sliding scale, risk-based attention to data, or do you believe that all personal information should be regulated?

>> JOSEPH JEROME: First, I guess I would push back on the notion that we have only regulated PII; we have only done that in the context of ad tech. We have other laws that have broader definitions.

>> STEVE DelBIANCO: Others that have been an abject failure and do nothing.

>> JOSEPH JEROME: I didn't call the other laws an abject failure; I called the generalized approach to privacy an abject failure. I look at this and see good things, and there is a tremendous gap. There is data that falls out of HIPAA. To pivot back to what I was going to say: A, there are plenty of laws that have broader definitions of PII, and there is something to the point of a spectrum, though I think that's difficult to legislate. I also think we have to acknowledge, and maybe as a panel we'll have to agree to disagree, that we're not going to agree on what is sensitive. It is my contention that sensitivity is in the eye of the beholder. The NAI will trumpet the fact that they have, you know, progressive thoughts on what should be sensitive information in advertising

>> GRANT NELSON: I get suggestions every day.

>> JOSEPH JEROME: I'd quibble that we don't have consensus on certain things, like what is precise geolocation. Some may say health information should be sensitive; under the NAI code, I imagine targeting pregnant women for various things, autism tests, that's not covered, that's not sensitive. To me, if my spouse were pregnant and targeted with ads for various things, that may unnerve me. We have to have a conversation on sensitivity, and also I think we have to acknowledge that all information is sensitive, and a data protection framework needs to acknowledge that companies have a responsibility to be good stewards of all data. What that stewardship requires should be flexible.

>> BERIN SZOKA: I think the point you're trying to make, Steve, is that you don't serve privacy interests well if you treat all data as equally sensitive. I think you agree with this on some level, Joe, but the question is how you operationalize it.

If you end up saying all data has to be subject to standards of stewardship, you fail to incentivize companies to devote resources where they're most needed. Sensitivity, I think, is not the place to start. Let me give a better starting place for this.

Let's talk about what is actually within the scope of GDPR. I'm not an expert, so I'll paraphrase this as I understand it.

>> AMIE STEPANOVICH: Nobody is!

>> BERIN SZOKA: Indeed, nobody is! It is not even a finished law; there is a lot of stuff that will either be subject to enormous discretion in the implementation or still has to be dealt with by the national legislatures. But a part of this, at least to some extent, is clear: there is a distinction drawn in the GDPR between anonymous information, meaning that it is impossible to ever identify anyone with that information, in which case it is outside the scope of the GDPR, period, versus deidentifying the information, which, again paraphrasing here, essentially means reducing the likelihood that someone could potentially reidentify it using other pieces of information. Now, obviously, that's true, right? Of course there is always going to be a possibility, for certain classes of information, that you could reidentify it. The problem is, I don't know of any guidance given, and I have done research on this, as to what that means.

If you don't give any guidance on that point, and if you don't, therefore, give any incentive to companies to try to deidentify data, you can wind up with some perverse results, from the perspective not of a business, but of privacy advocates. You say to the company, you know, it is all or nothing: either the data is completely out of the realm of GDPR, or you don't bother to deidentify it at all and keep it basically with all of its attributes in that form. That's one very important example of where failing to do risk-based assessment, or to weigh trade-offs, to think about incentives, may end up producing a scenario where you hold a lot more data, or at least where the big companies that are able to navigate the system and get consent are the ones able to do so. That can't be a good thing for the competitive landscape or for privacy.

>> AMIE STEPANOVICH: So I think Berin has a point if we were only looking at what he's looking at. The problem is, there are other incentives that companies have to do things to minimize the risk to data. We're talking about the privacy space, but there are other incentives to make sure that data is secure, and if somebody breaches that data, that it is as minimally invasive to their users as possible. Even if the company chooses to do nothing and they suffer a data breach, that's an issue. They have further incentives to actually take actions to protect data internally, other than just the privacy regulations. These things operate together; we have to look at them together. We can't say you're not creating any incentive to go halfway or to do some things; those incentives exist in other places.

>> BERIN SZOKA: That's fair but I'm talking about guidance.

>> GRANT NELSON: There are few companies saying, we don't have data. They spend a lot on security; no matter the data you have, you don't want competitors to find it. The companies I work with do, at any rate.

What I have seen across the industry is that when you have a structure saying all data is the same, whether it be a Social Security number or some observation about shoe size, companies say, well, what the heck, we may as well collect everything, spend $10 more on whatever security service we use, and just collect everything and get consent for everything.

I think that sets up the incentive structure that we all seek to avoid. Companies should have a low barrier to entry to start doing things in this space, and as they get more competent, more funds, get bigger, they do more and more interesting things that require more trust, more investment in security, more investment in privacy programs. To say something very nice about the GDPR, that's one thing the GDPR attempted to get right that I think California just missed the boat on.

>> AMIE STEPANOVICH: One more point on sensitivity of data.

A year and a half ago now there was a report on privacy engineering that, if I'm not mistaken, they're looking at doing more with. There is a really important provision in that report, because they talked about whom you should evaluate the sensitivity of data, the risk to data, based on. Traditionally we have always looked at the risk to the company, because that's generalizable: you can see when there is a lawsuit that could be brought, what the harm is to the company, where there is regulation. The risk to the user is different. For example, companies put a great deal of time into securing credit card information; there are a lot of requirements to protect financial data, there are requirements to notify people if financial data is breached in some way, there is a lot of incentive there.

I have talked a lot to people about this. A lot of the sensitivity that people hold is not about credit card data. If my credit card is breached, I can call the company, I know how to cancel it, I know where the recurring payments are, I have that process down. If the searches in my Amazon Echo get breached, or the activity on my Fitbit gets breached, the non-financial data that companies don't have the incentives around, I don't know how to handle that necessarily. I don't know what harms come from it, I don't know how to mitigate it. If the photos on my phone are breached, I don't know what to do to mitigate harm from my photos. I think that movement from risk to the company to risk to the user is going to be really important. We haven't seen that shift happen yet, but we have seen it recognized in that one report. The other point: this is a really white panel. Risk needs to be evaluated from the perspective of the people who are most at risk, most marginalized. You can't have this conversation and think that we're going to get to an end of it here. We have to go out and talk to other people. We're not the people who are most at risk online, ever, and they're not on this stage.

>> BERIN SZOKA: I beg to differ in one respect. Growing up as a closeted gay man, I do appreciate the extreme sensitivity of data and the very real privacy risk. Think about millions of gay teens across America of all colors; I get it. I really don't need to be reminded that there is diversity in the sensitivity of information. When I defend the U.S. approach, I'm not saying it is perfect or saying that the only risks that matter are financial risks. Rather, if you look at the FTC's history of using existing authority, you will see that they have been aggressive. For example, when XY Magazine, targeted to gay teens, was in bankruptcy and up for sale, the Federal Trade Commission intervened to make sure that the customer list was not sold. The current approach in the United States is a lot more, if you will, woke than one may think. We can talk about systematizing it; you may say we need to have a more consistent approach to doing that. But it would not be fair to say that the United States is only concerned with what happens to your credit card number or Social Security number. When I said earlier that the Obama administration did a pretty good job of summarizing the principles of American privacy law, I referred to that. The question, again, is how we operationalize them, and I don't think that either the California approach, which was very rushed and not well conceived, or the European Union approach, not drafted for the U.S. context, will work. In that sense, yes, to answer the first question, there has to be a third way, indisputably, to address these issues.

>> STEVE DelBIANCO: Meanwhile, we have to find a way for companies to comply with GDPR, since it already applies to U.S. companies serving Europeans, whether they're on the continent of Europe or here in Washington, DC. There will be opportunities for European regulators to bring data protection authority actions. We will get to learn from experience as cases are brought.

I want to turn next to what Commissioner Phillips really focused his remarks on. We have talked about the notion of harms-based versus broad. He also suggested that whatever regulatory approach is taken, in Europe or here, we have to be careful about competitive concerns and regulatory barriers like Europe's approach of preferring opt-in. That brings me back, Joe, to the question I put to you at the very first: an opt-in regime is an injury to a startup that can't get opt-in consent to anywhere near the same degree as an established company that people already know. I wonder how, under an opt-in regime, we could have built the Internet economy we have today.

>> JOSEPH JEROME: I guess, first, I can already see the response, though: it is that we have this single-minded fixation on opt-in versus opt-out consent.

A, the GDPR is not all about consent. There are other legal grounds for processing under the GDPR which are always lost in conversations here in the United States, because this opt-in, opt-out distinction is almost exclusively the domain of advertising. I have to remind people, every time people say innovation in these conversations: do they mean actual innovation, better education, smart homes, smart cities? Or are they just talking about a new advertising technology? I think you have to split those two. Not to say

>> STEVE DelBIANCO: Would you concede that search, connectivity, social networks, content are valuable services, Joe? They're not advertising, but they're funded by advertising. I'll ask you, please, to connect the fact that many of the things you use in daily life that are free are funded by advertising.

>> JOSEPH JEROME: A, I have willingly opted out of many of these things. I'm not using a lot of them in my daily life.

You know, I'll concede that a lot of these things have tremendous utility. That said, again, I want to push back on the notion: yes, Google is ad funded and does a lot of stuff, but then what are the other awesome, innovative advertising-funded services you're talking about? I just don't know what they are.

My bigger issue here is, again, the single-minded fixation on opt-in, opt-out. What's important about the GDPR is that it also has a whole lot of components that are about increasing accountability within organizations. What I really think we need to do, and Grant was getting at this a little bit, is to ensure that companies are thinking about and caring about these things from day one. Just because you're a startup doesn't mean you should be cavalier about privacy until you have X number of users or get into X type of data. I realize that that is a tough ask. At the same time, I think the goal is to put some skin in the game.

You know, one of our proposals has been that we would like to see companies, instead of dumping information into privacy policies that no one reads, that can be twisted, you know, just a bunch of legalese, do disclosures where CEOs are testifying to the fact that they considered certain types of risks. And I think that Amie is on point: the risk is not just to the company and to its growth, but the risks to their users and individuals. I think that's the type of forward-looking way to do this.

>> STEVE DelBIANCO: Berin, Grant?

>> BERIN SZOKA: We should talk a lot less about notice and choice; everyone agrees that's not the most important way to protect consumers.

I'm much more concerned with things like purpose specification. What really has driven the Internet, more than advertising, is the fact that you didn't have to know in advance what purpose you were going to use all your data for. Abstracting beyond the GDPR and California, I think one of the essential features of the American approach has been, in a tried and true example, that Google collected a lot of search data and discovered there were a lot of other uses for it: predicting flu outbreaks, unemployment, on a high level. There are things like that with nothing to do with advertising that we potentially end up restricting if we say that you have to specify upfront the purpose for which you're collecting data or, similarly, that you have to minimize the data that you're collecting, as if we know that there won't be any beneficial uses out there. There are many real-world examples of where that approach breaks down. The most important one is the example of Vioxx, where it was discovered several years too late that it was a leading cause of death in the United States and explained a spike in the death rate. As I understand it from health researchers, it would have been possible to identify that trend years earlier had it not been for the rules that covered health data. I'm not saying that makes privacy regulation bad; it just means that we have to take into account that there are real-world trade-offs and think very carefully about limiting purposes, requiring overly specific specification in advance, and requiring people to minimize the use of data, for those unanticipated discoveries, whether they are beneficial discoveries or discoveries of trends that could be bad and that we should do something about.

>> STEVE DelBIANCO: Does anyone here share Commissioner Phillips' concern that this is a barrier to entry for new firms?

>> GRANT NELSON: Absolutely.

To address a thing you said, the concern about startups that are starting to collect information wantonly, whatever: you don't want people to collect data for whatever they feel like, but predicting flus, trends, et cetera in the future is obviously something that we want to encourage as a public good. You can find out where the flu outbreaks will happen earlier. In the case of advertising, though, our inability, sitting here, to predict what the next entrepreneur will create, as long as people use it, should not deter us from creating a situation where they can flourish. A situation where you need opt-in consent just to do something like targeting ads, that is, showing somebody an ad that you think they may be into, is I think probably going to significantly curtail the useless apps of the future: the fun apps, entertaining apps, things that people use to waste time, as well as something that may turn into something valuable like predicting flu trends.

>> AMIE STEPANOVICH: So Berin and I have fought about this issue for so long that I know the flu example from him using it before. The reason I bring that up: the competition argument is not that old. It really only started being used regularly when the GDPR was just about to go into force. We started hearing companies say this just entrenches the major companies. We weren't hearing that when the GDPR was being debated at all.

>> STEVE DelBIANCO: Your point is?

>> AMIE STEPANOVICH: Because now we have privacy, and we have the threat of privacy coming in the U.S., there now becomes this idea, and competition is a big issue right now, there are a lot of headlines on competition and entrenchment. When competition was brought up, I wrote down and showed to Joe, what competition? Because where are you going when you get off Facebook? There aren't major competitors out there. The funny thing is, when there is regulation in place, there are growing pains. We talked about how larger companies could handle those better; the Commissioner did. I definitely read a story this morning about how Facebook, in discussing their precipitous drop in valuation recently, said it was because they're putting so many efforts toward privacy and security right now that they're having to divert resources away from other measures. Even the large companies seem to be feeling this, if not at the same level, then close to the same level as small companies. What is more concerning to me on competition, where I come into this, is that competition tends to preference the first out of the gate. The very first company that comes out of the gate offering a service, people will flock to; once you get a number of people on the service, they become entrenched in the service, it is

>> BERIN SZOKA: Like Myspace was.

>> AMIE STEPANOVICH: The reason companies are able to get out the door quickly is because they ignore privacy and security. The companies that do well on privacy and security, that invest more time, more effort, they're slower. They're not the first out of the gate, they're the second out of the gate, they lose the market share, and those companies fall by the wayside. If we were to level that playing field, give companies that invest in privacy and security the chance to get out of the gate first, we would see more users able to have their privacy protected in the services that we all use.

>> STEVE DelBIANCO: If we deviate from the harms base, going to a fundamental right to privacy, does that mean that is something new the U.S. needs to create? As you know, there is no right to privacy in the U.S. Constitution; I think there is one in California's Constitution. Is it necessary to create a right to privacy in order to be sweeping enough to cover all forms of data, not just harms?

>> AMIE STEPANOVICH: I don't know if it is necessary. I would support it.

>> GRANT NELSON: Increasing the cost to start a business is not the way to go about telling businesses to focus on privacy and security. Myspace didn't succeed because Facebook offered a better product. Facebook's statement about privacy on the earnings call, personally, I think is a bit of a red herring; they ran out of users on Earth, they ran out of user growth numbers to support it. I don't think it was as much about privacy as they

>> STEVE DelBIANCO: They had an absolute decline in the U.S. and Europe, their richest users, though only a third of the base. An absolute decline, not just a slowing in growth. The privacy backlash, the techlash, is definitely having an impact.

>> BERIN SZOKA: Just want to respond to a few things quickly. First of all, this competition argument is not new. I encourage you to go back and read the 2010 paper on opt-in

>> AMIE STEPANOVICH: I don't want to say it is brand new, it just is not

>> BERIN SZOKA: I have been making this argument for a long time, and so have a lot of other people. They have been pointing out, again, something basic and very intuitive: bigger companies are better able to deal with compliance costs. More importantly, they're better positioned to get consent, because they're the ones with the existing relationship and they're the killer apps, if you will; they're the ones to whom individuals will say, sure, sure, I'll sign on to that. Which leads to the second point raised: desensitization and the excessive scope problem. Those are well documented problems in any opt-in regime, and we have to be realistic about them. I also want to push back on something you have been saying, Steve. The U.S. approach is not just a harms-based approach. Section 5, yes, in part, but it is also about making sure that consumers get the benefit of the bargain. So deception, for example. A big misconception about the FTC is that you have to prove harm in a deception case. You don't. All you have to show is that consumers were promised something and didn't get it, or that there was a material omission, something somebody would reasonably have expected to know about.

The FTC approach right now is more flexible than people give it credit for, and it is able to go after situations where there is no clear proof of harm. I think the U.S. approach is better described as a comprehensive vehicle that addresses what I mentioned. It really grows out of advertising law, and marketing more generally, and it deals with harm in that respect as well as other concerns.

As to this last point about startups: I think we really, genuinely do want startups to be able to succeed. Of course we also want them to be responsible, but you can't expect every company out there to have a data compliance operation like Microsoft's ready on day one. There is a certain set of players right now that have that mentality.

I often use this example: how many people here have ever tried to implement or use Microsoft SharePoint? If you know anything about it, it is a great tool for a really big enterprise, and it is really complicated. I made the mistake of trying to get my smaller organization, TechFreedom, onto it years ago, only to discover that it is insane for us to try that; we're better off using Google Docs, which does 90% of the same things in a much simpler way. My point: big companies can handle an internal set of data compliance controls, basically akin to SharePoint. That's fine for Microsoft, Google, Facebook, and Amazon. And when we talk about who else is affected, it is not just startups.

But the GDPR applies to everyone: essentially every company in Europe, and any other company in the world that holds data about Europeans. We're talking about butcher shops. This is the entire ecosystem. We need a system of regulation that works for everyone, not just the biggest companies.

The final point: we don't know why Facebook just lost 19% of its valuation. We know that the earnings are down; we don't know what exactly is driving that. My hypothesis is that it could be both things. GDPR can hurt Facebook and Google but still, in relative terms, favor them. So we could actually see all of these companies suffer, but other companies suffer more.

Now a final example, which is Spotify. I looked at the earnings report that came out a few days ago; they, too, are saying that they're losing users because of GDPR and that they're having to devote a huge amount of internal resources to GDPR. And small companies may spend thousands of dollars to get an online video game to comply with GDPR, a situation a colleague of mine dealt with recently. That's real money and real engineering time being diverted, and those have real effects.

>> AMIE STEPANOVICH: Don't you think that cost will go down as we figure out GDPR? There are a lot of costs right now.

>> BERIN SZOKA: It may or may not. Some costs, yes, upfront costs. You're quite right.

Other costs are totally dependent on the approach taken by each of the DPAs. We just don't know.

>> STEVE DelBIANCO: We'll open up for audience questions in 5 minutes. It would be great to get the panelists to respond to something other than GDPR, but still something that Europe's approach to privacy is likely to bring to our shores.

This notion of another right, the Right to be Forgotten: the notion that if you find information about you on the Internet that is, in your opinion, irrelevant, you can petition to have the search engines never show it to people who search your name. This notion of the Right to be Forgotten is a potential application here. Does it smash headlong into U.S. First Amendment protections if a government is doing it? Does it begin to affect the ability of U.S. consumers to know who they're dealing with, employers to know who they're hiring, businesses to know with whom they're partnering? And do you believe that the Right to be Forgotten is on the list of European rights you want to import to the U.S.?

>> JOSEPH JEROME: First, I have to correct something; we all have these privacy discussions and we have our talking points. On the conversation about Google Flu Trends, it is worth acknowledging that Flu Trends didn't work: it was broken, it was inaccurate. Before we hold it up as the innovative benefit, I have to say that. Now, the Right to be Forgotten.

I don't think anyone on this panel is a fan of how the Europeans handled the Right to be Forgotten. That said, users should have rights to delete and move the information they provide to platforms and services. But it is a really big problem when we're asking a company to make really important free expression determinations on behalf of society. That's where I think deletion rights become really problematic in data protection.

That's worth acknowledging. The tone and tenor of your question seems to suggest that my concern about how a Right to be Forgotten could be implemented, in terms of the ability of people to access information via search results, sort of gets bootstrapped onto opaque data ecosystems. The question is also trying to get at what our thoughts are on regulating data brokers and other folks like that. There, I think, it's not necessarily a Right to be Forgotten issue; it is, again, an incredibly opaque ecosystem that wants to use the Right to be Forgotten and the First Amendment as a shield against either cleaning up its act or even basically self-regulating.

>> STEVE DelBIANCO: Opaque may not be good, but the transparency, you said, was a complete waste of time.

>> AMIE STEPANOVICH: A 60-second, hopefully helpful lecture on the Right to be Forgotten: it is actually two different rights. There is the right to erasure: if I no longer want to be a Facebook customer, a user, whatever, I can go to Facebook and say, delete all my data. I'm leaving your service, I don't want to be a customer of yours anymore, delete all of my information. That's half of the Right to be Forgotten.

The other half that people focus on is the right to delist. I don't want Google to have certain search results come up when Amie Stepanovich is typed into Google. That's only half of it. It is really important to separate the two things out.

>> STEVE DelBIANCO: Do you like either?

>> AMIE STEPANOVICH: From our perspective, the right to erasure is vitally important in a data protection regime. You should be able to say, I am no longer interested in you having my information, I don't want to be here anymore. Once you opt in initially, even if I only use an app on my phone for two seconds, they have the data from my phone up until that point. If I use it for two seconds and take myself off, I can keep them from getting more data, but I should be able to take the data they already have away from them as well.

>> STEVE DelBIANCO: How would Access Now feel about the other half?

>> AMIE STEPANOVICH: The right to delist: we have an entire report on this. It is about 10 pages long, it is really pretty, bright colors, an amazing designer, really beautiful. The right to delist is not something that we support. We give a laundry list of protections that would need to be in place if it is enacted, and we say it is not something that can be done without abuse. There are too many issues. It is not something that should be imported here.

It is important to have that entire list of safeguards as well, and it is important to take those two rights apart and unpack them. I highly recommend, if you're interested in the Right to be Forgotten, reading the report. There is a lot of information packed into slogans like "don't take search engine information off of the Internet," and that conversation tends to be really shallow.

>> STEVE DelBIANCO: Right to be Forgotten. Let's quickly cover that.

>> BERIN SZOKA: First a question: do you think that the right allows me to delete my comments in a public Facebook thread or page, or private messages with you?

>> AMIE STEPANOVICH: Private messages, potentially, yeah. Public postings.

>> BERIN SZOKA: What if you shared something with friends and I commented on it? Should I be able to delete my comments in that context?

>> AMIE STEPANOVICH: Yeah. I think you are. I can go to Facebook and delete my public comments now; I should be able to delete Facebook's copy of those comments too!

>> STEVE DelBIANCO: If I tagged you in my photo on my Facebook page, it is my photo. You can take the tag away, but it is my photo.

>> AMIE STEPANOVICH: That's the right to delist.

>> STEVE DelBIANCO: Still my photo.

>> AMIE STEPANOVICH: You can't take the content out, you can only take out the search result. The content is all still there.

>> STEVE DelBIANCO: That right exists today: if you take yourself off Facebook, the tags disappear.

>> AMIE STEPANOVICH: The right doesn't exist, the ability exists.

>> GRANT NELSON: I have very little to say.

As far as the NAI is concerned, we don't have a position on the Right to be Forgotten. We do privacy in the United States and self-regulation for digital advertising.

>> BERIN SZOKA: I have a real problem with any system that allows people to sanitize the historical record. That happens all the time; as we have seen, especially recently, people may reveal their true selves and then try to hide it. We should have a conversation about how we adjust as a society, realizing that people make mistakes and deserve an opportunity for redemption; that's a good conversation to have. But we shouldn't allow people to delete parts of the public record.

A different point, on the other things that you mentioned: I don't want the government to deal with this. I would prefer this be dealt with by the platforms; as you'll note, they'll strike their own balance. But as a practical matter, this right will not fly in the United States.

>> STEVE DelBIANCO: We shall see.

Audience questions, we have 15 minutes.

>> AUDIENCE: Pretty colors distract me; I was looking for the report that you were talking about, and Access Now actually just tweeted about how important it is in healthcare to be able to stay connected. One of the interesting effects of GDPR has been a lot of real challenges for security. If there were an attack on a system, the place you would go, with a long history, is WHOIS. So one of the things we haven't discussed today, alongside the importance of privacy, is that when we're blocking information about who is a natural person or a legal person, what do we do about the information we need so that, if we go to 5G and healthcare, where we need constant service, we can find the person that's causing the trouble? I realize I'm just pointing at you because I was still looking for the pretty report, but to the panel: what do we do about this problem of not knowing who you are dealing with?

>> STEVE DelBIANCO: Does the law limit the ability to get the information we need to protect consumers, for instance? Anyone want to respond?

>> AMIE STEPANOVICH: This is just a really cheap answer: our European office houses our GDPR experts, to the extent that there are experts, as you told me earlier, Berin. If we take this approach in the U.S., we have to have a U.S.-based approach, and we need to take this into consideration, because we have learned on the back end of GDPR that there are some negative consequences we're going to have to grapple with.

I don't think that counsels against putting rights into U.S. law. I think it's just an impact we have seen and will have to deal with when we design a U.S. approach.

>> BERIN SZOKA: To me, this is an important example. It illustrates how ill-conceived the GDPR really is. People sit down and say, oh, the European Parliament and the Commission did so much work on this, it was well vetted, but it produced this insane result for the basic mechanism by which cybersecurity is handled today: when you see traffic coming to you from a particular site or domain, you want to see who is behind it, so you use the WHOIS database to look them up. GDPR had the absurd result of essentially making that information private. Right? That's nuts.

So, as I said earlier, the GDPR envisions that these sorts of unworkable, crazy outcomes will just be dealt with by the DPAs exercising discretion to make policy and smooth out the edges. That's the European approach to writing laws: you don't worry too much about how the law applies in the real world, because it won't be strictly applied. In the United States, however, laws do get strictly applied. If you take anything away from this example, it is that this is probably the tip of a larger iceberg, an approach to writing legislation that will not work in the United States and needs to be much better thought through.
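
For context on the mechanism the panel is describing: WHOIS is a plain-text lookup protocol (RFC 3912) in which a client opens a TCP connection to a registry's WHOIS server on port 43, sends a domain name, and reads back the registration record. A minimal sketch in Python follows; whois.verisign-grs.com is the registry server for .com and .net domains.

    # Minimal WHOIS lookup (RFC 3912): connect to a registry WHOIS
    # server on TCP port 43, send the domain name followed by CRLF,
    # and read the plain-text reply until the server closes.
    import socket

    def whois_query(domain, server="whois.verisign-grs.com"):
        """Return the raw WHOIS record for a domain as text."""
        with socket.create_connection((server, 43), timeout=10) as sock:
            sock.sendall((domain + "\r\n").encode("ascii"))
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("utf-8", errors="replace")

    print(whois_query("example.com"))

The dispute here is about what remains in that reply: after GDPR, registrant contact fields are typically redacted or replaced with a proxy contact.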

>> STEVE DelBIANCO: If Amie is right, we'll learn from experience.

>> AMIE STEPANOVICH: Quick to respond: there was a lot of disingenuous opposition to the GDPR. Had somebody brought this issue up in advance, I think it would have been well considered.

>> STEVE DelBIANCO: There is a balancing act. The GDPR says that the fundamental right to privacy has to be balanced with the needs and purposes of other requirements of law. It is sort of in there, but it is impossible to know precisely how that balancing act has to be done.

>> AMIE STEPANOVICH: I think there is a lesson here for doing this in the United States: if we all come to the table honestly, and we're honestly flagging issues, we can make sure that those are considered in the law.

>> STEVE DelBIANCO: I honestly hope you're right.

>> AMIE STEPANOVICH: People in opposition to data protection came to the table with disingenuous, hyperbolic points. That happened in the California data protection debates, long before the provision that ultimately passed was even drafted.

>> STEVE DelBIANCO: It was on both sides, right, you would agree?

>> BERIN SZOKA: This is the important thing we're discussing today. There are two modes, if you will, of making law. One is the legislative process, where you say we have to make sure that concerns are addressed and be specific and clear about them in the law. Fair, but everyone knows the legislative process is not the end of lawmaking. Laws continue to get made and shaped after that, and my point is that if you simply leave everything to the discretion of the regulator and say, don't worry, they'll figure it out, they'll work this through, that raises an entirely separate set of problems: no clear guidance, and favoring bigger companies that can navigate that uncertainty and that can manipulate, lobby, and capture the regulators. And there is a final dimension: the more discretion the regulators have, the more room there is for politicization. Imagine, just for a moment, a data protection authority in the United States that was not an independent agency with a body of case law to constrain it. I guarantee you, its head would be very carefully chosen by the White House, and the question asked of that person would be: you're going to be loyal to me, and when I call on you and tell you to screw Jeff Bezos, you'll do that, right? Yes, absolutely. This is not a fake concern. This is the kind of concern we have to deal with now that these platforms are subject to enormous political pressure. We have to recognize that discretion in the enforcement of privacy regulation will be used as a political cudgel, so we have to be careful about stacking the deck and about how much power we give government to wield against politically disfavored companies.

>> AMIE STEPANOVICH: That could happen today.

>> STEVE DelBIANCO: We don't even have to pass a law for that: unfair and deceptive trade practices.

>> BERIN SZOKA: But the safeguards are important. The FTC has to prove its case. The defaults matter: if the default is that the agency has to justify what it is doing, it is a lot harder to politicize the agency.

>> STEVE DelBIANCO: More audience questions.

>> AUDIENCE: I'm with the George Washington University. Thank you for the afternoon fireworks first and foremost.

At the end of the day, it does sound like everyone on that stage kind of agrees on a middle ground. If we put the current U.S. approach at a 1 on a 1-to-10 scale and GDPR at a 10, do you agree we should be at about a 5?

>> AMIE STEPANOVICH: No. I don't believe in a middle ground; I believe in a not-GDPR ground. It is not a one-dimensional scale; it is more three-dimensional.

>> STEVE DelBIANCO: Another question.

>> AUDIENCE: I have a different approach to a similar question, and a great thank you for keeping us awake on a Friday afternoon; this is a surprisingly passionate topic.

There is an effort to formulate core high-level principles on data privacy; the administration is looking into this. The GDPR exists, the California privacy law exists, and we're talking about something at the federal level. Assuming all of the people in this room and on stage are going to talk to the Commerce Department as it formulates those high-level principles on data privacy, what is it that you're going to tell the administration they should be considering, thinking about, recommending?

>> BERIN SZOKA: Well, the Obama administration struggled with this. It took them three years between putting out their set of principles and their legislation, which, of course, never went anywhere. That reflected the work of a lot of really bright people who really had thought this through. I don't necessarily agree with all of them, but that approach reflected a well-developed bench. Unfortunately, the bench on the other side is not well developed, in part because the U.S. approach has not been systematic. That's a deficit. What we really need to do is abstract, just as the Obama administration did, but then try to be more concrete about what the operational realities will look like. The short answer: I'm going to be very focused on concrete examples, and I'll give you one. To me the most striking and disappointing thing about the LabMD case, the FTC's first data security case litigated on the merits, was that when it went to trial before an administrative law judge, the FTC put up an expert witness to say, here is what the data security should have been with regard to the installation of peer-to-peer file sharing software, and that witness could only speak to the data security practices of Fortune 500 companies. They had nothing to say about a company like LabMD, with 25 employees and 10 million in revenue. My point: that level of granularity really matters. How do we make sure that implementation takes local circumstances into account? Could there have been a framework that LabMD could have signed on to? That's what I'm going to focus on.

>> GRANT NELSON: First, a carve-out safe harbor for self-regulatory programs with some sort of teeth and backing. For example, a strict requirement for how some types of data are and are not allowed to be used, such as sensitive health information; those are things we can generally agree on. For example, the targeting of ads based on perceived sexual orientation without opt-in consent is prohibited. We receive calls from companies with legitimate interests in wanting to advertise to people they perceive to be interested in their gay cruise, et cetera, who are upset and feel they're being discriminated against, but our position has been that no one should be outed by an ad. Things like that, built into intelligent, proactive self-regulatory safe harbors depending on the industry, the scope of the problem, et cetera, are smart. The NAI recently announced and released our TV guidance, so we're up to speed on how data collection is happening on smart TVs. That is another small example of how a self-regulatory safe harbor lets people react quickly. It is the best of both worlds.

>> AMIE STEPANOVICH: We're out of time; I promise to buy you all a drink at the reception afterwards! Stick around for a couple more minutes! This is actually a really manipulative question. I'm leaving for vacation, and we work together, so I'm giving him advice on what to do when I'm out of the office and the Department of Commerce comes calling.

>> STEVE DelBIANCO: Nathan is nodding.

>> AMIE STEPANOVICH: There are two things here that I'll emphasize.

One, principles, from our perspective, will not be enough. If there are going to be principles, they need to look at what more needs to be in place and where we should have affirmative rights.

The second thing: Berin brought up LabMD. Interestingly, you brought up a 25-person operation, and in my experience a 25-person operation or a 200-person operation can equally impact the information of 2 billion people. There is a need to look at not only the size of the company but the size of the database they're bringing in, and to make sure the protections in place can deal with that.

I don't necessarily want a company to handle my electronic medical records, which came up earlier, just because they may compete with some other company that could do it a lot better.

>> STEVE DelBIANCO: Joe.

>> JOSEPH JEROME: Three quick things. Steve, you were digging at me for my statements about transparency, and I guess what we have conveyed is that transparency for regulators should not be equated with transparency for users. The way that we operationalize things through privacy policies ensures that the two are conflated in a way they shouldn't be.

When we look at things like Facebook and Cambridge Analytica, the Strava heat map, or a financial social network that by default puts out public information about your financial transactions, our thought is that transparency to users really needs to incorporate elements of design and user experience, because increasingly companies deploy these sorts of things in a manipulative way, designed to get as much data as possible. Second, the FTC's rulemaking authority. We may disagree here, but the procedures don't work. If we're going to expect the Federal Trade Commission to write data security rules and data protection rules, it shouldn't take 18 procedural steps and 5 to 10 years to get them into place.

I mentioned this earlier as well: enforcement. Since the beginning of the FTC's privacy work, they have taken what was a pretty detailed set of requirements, the fair information practice principles we mentioned on the panel, and condensed them to notice and choice, and ultimately just to notice. The one thing that was really missing the entire time is enforcement. By enforcement, we're talking about actual penalties, the ability to actually hit companies with a stick.

>> STEVE DelBIANCO: I can tell he's upset! We have time for one last question and then we're out of here! Make it quick, please.

>> AUDIENCE: Innovator Network. A comment: we have talked about the GDPR being an impetus, that now it is time for us to do this too, and yet when we talk about something like the Right to be Forgotten, it's not time for us to do that. I want to pause on the notion that because something is happening in Europe, it is time for it to happen here. It may be time to watch how they end up handling it, and how European companies end up handling the regime being put in place there, before we leap on board.

The other thing to address, on the competition issue: it is not just size but also business model that comes into play. One example: a settlement was reached with Google after a finding that Google had misbehaved and misused data, and the settlement dealt with the sharing of data with third parties. The result, the precedent set, was that you shouldn't be able to share certain types of data with third parties. But because Google is vertically integrated, there were no third parties; only smaller businesses, which by the nature of their business model have to share data with other parties, were affected. Google, by behaving badly, produced a settlement that did not really apply to itself but in fact anti-competitively affected smaller rivals.

>> STEVE DelBIANCO: In fairness to the panel, you have 15 seconds each for a final word. What can you do? Should the U.S. emulate some of the European privacy approaches?

>> JOSEPH JEROME: It already is. Companies will be moving toward that model anyway. For us to continue to say that the tech space in the United States should go unregulated is just untenable. If you want to be a global company, and presumably all of the innovators are moving that way anyway, you have to do that.

>> STEVE DelBIANCO: Already is your answer.

>> AMIE STEPANOVICH: The time was 20 years ago. GDPR means it will happen; I think now it is just a matter of what's in it.

>> STEVE DelBIANCO: We're already late.

>> GRANT NELSON: If we adopt something, adopt a scaled approach based on the sensitivity of the data and how it is used.

>> BERIN SZOKA: There are 140 laws in the United States that regulate data in one respect or another. The data ecosystem is already heavily regulated in a variety of ways; it is a mistake to call it unregulated. The European fixation on comprehensive systems biases this debate, because everybody likes to talk about a simple, across-the-board solution. The United States has a lot to build on. It is an iterative approach, and we should continue to iterate on it.

>> STEVE DelBIANCO: Fantastic. Thank you.



This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.