Hello, is this thing on? Cool. I hadn't tested it before. Thank you all for coming to today's event on privacy legislation, how it could or should affect online business models, and what effects that legislation would have on businesses and users. My name is Eric Null. I am Senior Policy Counsel at the Open Technology Institute here at New America, where I have the pleasure of working on lots of consumer privacy issues. Today's panel will discuss key questions in the privacy debate. Should privacy legislation affect or curtail certain business models, and if so, how would such legislation affect internet users and the online businesses they frequent? It may be that as a society we want Congress to affect certain business models for the sake of protecting online privacy, and we think that's a debate worth having. But first, I want to introduce our opening speaker. Nathalie Maréchal is a senior research analyst at Ranking Digital Rights, an organization that ranks all of your favorite companies on their privacy, expression, and governance practices. Their 2019 Corporate Accountability Index was just released in May, and I highly recommend you read it if you haven't. Nathalie has spent a lot of time with privacy policies, so much so that she probably dreams about them. But as a result she has a deep understanding and knowledge of the business practices of a lot of online businesses, and she has graciously agreed to share some of that wisdom with you all today. Without further ado, Nathalie. Thank you for the kind introduction, Eric. So before we get started on our panel, which, as Eric said, will look at what business practices, what business models, we should either be aiming to regulate as part of federal privacy legislation or, conversely, what business models we should be careful not to interfere with in the process of passing privacy legislation, I want to spend a little bit of time going back to the late 90s, not only for the terrific fashion and the music, although that is making a comeback these days from what I can tell, but because that is where the roots of this business model really come from. So in the late 1990s, of course, we had first a tech boom and then a tech bust when the bubble exploded, leaving a lot of different companies, including emerging ones that are very much still alive and with us today, to search for a way to generate revenue. And one of those companies was Google, which at the time consisted of a search service. They had a superior search product that unfortunately lacked an immediate way of making money, and investors were happy to pour money into the start-up for some time but eventually said, you know, you've got to find a way to become a grown-up company and make some money. And so, you know, the people at Google at the time kind of looked around to see what their assets were and how they could turn those assets into money, and realized that as part of operating Google Search they were incidentally collecting all kinds of information about their users: what they typed into the search box, what they did before and after searching different terms, what they clicked on, what they didn't click on, how much time they spent visiting different pages that they clicked on. And they realized that they could actually use this data surplus to figure out which ads were most likely to be clicked on, and by what types of people, and therefore be able to charge more money for online advertising than had previously been possible.
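To make the logic Nathalie describes here a bit more concrete, the following is a minimal, hypothetical sketch of how behavioral data can raise the predicted click-through rate of an ad, and therefore its price per impression, compared with matching on page or query context alone. All of the names, weights, and numbers are invented for illustration; no real ad platform works exactly this way.

```python
from dataclasses import dataclass


@dataclass
class Impression:
    page_keywords: set[str]   # contextual signal: what the page or query is about
    user_history: set[str]    # behavioral signal: what this user has done before


def contextual_ctr(ad_keywords: set[str], imp: Impression) -> float:
    """Estimate click-through rate from page/query context alone."""
    overlap = len(ad_keywords & imp.page_keywords)
    return min(0.01 + 0.02 * overlap, 0.20)        # made-up baseline and lift


def behavioral_ctr(ad_keywords: set[str], imp: Impression) -> float:
    """Estimate click-through rate using the user's tracked history as well."""
    base = contextual_ctr(ad_keywords, imp)
    overlap = len(ad_keywords & imp.user_history)
    return min(base + 0.05 * overlap, 0.50)        # history adds predictive lift


def price_per_impression(ctr: float, value_per_click: float) -> float:
    """An advertiser will pay roughly its expected value for the impression."""
    return ctr * value_per_click


if __name__ == "__main__":
    imp = Impression(page_keywords={"vacuum", "cleaning"},
                     user_history={"vacuum", "pets", "allergies"})
    ad = {"vacuum"}
    for label, ctr in (("contextual", contextual_ctr(ad, imp)),
                       ("behavioral", behavioral_ctr(ad, imp))):
        print(label, round(price_per_impression(ctr, 2.00), 4))
```

The point of the sketch is simply that an ad matched with the help of a user profile is predicted to be clicked more often, so the same impression can be sold for more, which is the revenue logic being described.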
And so this really was the birth of surveillance capitalism, the term that's been popularized by Harvard scholar Shoshana Zuboff in the past few years. And so from Google, this business model, the idea that you collect a lot of data about your users and then you figure out ways to turn that into money, spreads around Silicon Valley and beyond, bringing us to the status quo where the predominant business model for companies in many industries, not just the tech sector, is to offer a free or cheap service, collect user data, monetize it, chiefly by serving ads but also through acquisition by larger companies, and rinse and repeat. And while there were some critiques from the very beginning of this arrangement, for a while to most observers it seemed like a win-win. Consumers get new, cool, useful services, typically for free. Companies could then largely focus on developing the products and services that they were interested in building and leave much of the work of revenue generation to automated advertising exchanges. Advertisers in turn can figure out which half of their advertising budget is wasted, right? I'm sure at least some of you have heard the old trope that half of all advertising money is wasted; the problem is we don't know which half. Well, this innovation allowed advertisers to figure out which half was the wasted one. And in turn, that meant that the ad networks that place the ads could charge more per impression. And finally, or so the argument goes, consumers see ads that are more relevant to them and to their interests, and therefore everyone wins. Or at least that's the argument that many of us by and large bought into until very recently. So what's the problem? The first problem is that the cornerstone of this business model is non-consensual data collection at a massive scale. That in and of itself is a privacy violation, and it's really a testament to how embedded these business models are in our society and how readily we've accepted Silicon Valley's narrative that many people don't see that as a harm. But even so, there's mounting evidence that targeted advertising business models, and the company choices and practices that they incentivize, contribute to a host of other harms. For example, well, first and foremost, maybe not foremost, but certainly first, there's the real risk and even the reality of data breaches. Data that is collected is vulnerable to breaches, end of story. Data breaches are going to happen, and that poses risks for consumers. Second, as we learned from the Snowden leaks six years ago, these troves of user information that were collected for commercial reasons can be and are accessed by governments, not always with due process and appropriate legal safeguards, and that poses a significant risk, again, to people around the world. Third, there seems to be a consensus that these platforms, when they optimize for engagement, and the reason that they optimize for engagement is to hold users' attention as long as possible, show you more ads, and collect more data about you, tend to reward emotional, inflammatory, and otherwise click-baity content, for lack of a better term. And that can lead to, you know, rage spirals, piling on about certain issues, certain people, certain ethnic groups; it can easily lead to groupthink in terms of hateful dispositions towards certain groups, even incitement to violence. These are all things that we have seen happen on social media platforms that are optimized for engagement.
Beyond that, at a certain point, personalization, because this is really all about personalization, becomes discrimination. I expect many of you are familiar with Eli Pariser's work on filter bubbles, and my question is, what happens to society when we no longer inhabit a common reality, when the perception of reality that you have, based on what you see online, based on what is targeted to you, is so vastly different from what I see that it's impossible to find common ground and to have a conversation about the problems facing society? And I would argue that we are really seeing this happening already, with all this talk of a post-truth era in the United States and beyond. And finally, targeted advertising enables political manipulation in unprecedented ways. That's not to say that political manipulation didn't happen before, it always has, and I'm certain that it always will. But it's qualitatively different today than it was in the pre-targeted-advertising era. And many of these pathways to political manipulation are all but impossible to detect, precisely because no one is able to know what all the messages are that are circulating out there. It used to be that when everybody watched the same nightly news together, if there was a blatant falsehood on the nightly news, an immediate response of counter-speech would be possible. It's much more difficult when the misleading information is shown to a small subset of people who are already primed to believe it and there's no opportunity for counter-speech. So for these and other reasons, governments are looking to regulate the collection and use of personal information. The most prominent and wide-reaching of these efforts, of course, is the European Union's General Data Protection Regulation, which went into effect in May of last year and has been closely watched and even copied in other jurisdictions, including several U.S. states such as California. So here we are in D.C. on this hot July day. Industry groups and their allies in Congress are pushing for federal privacy legislation that would preempt stronger state laws like the CCPA, while privacy advocates and other public interest groups favor stronger limitations on data collection, processing, and use, legislation that would allow states to implement stronger standards than the national standard, strengthen civil rights protections against discrimination, and provide a private right of action for individuals who believe their rights have been violated. Of course, this already contentious debate is playing out in a rather volatile national political context, and it remains to be seen whether anything will pass this Congress. But nonetheless, at some point Congress will act, and the better informed Congress, and those of us who also work on these issues, are, the better. And so we're here to consider the effects that federal privacy legislation could have on various companies' business models. Can legislation create opportunities for business model innovation? How would Congress do that? What would be the effects of the legislation? Could companies adapt, and what would that look like? So without further ado, I turn it over to our moderator and to the excellent panel that OTI has put together for us today. Thank you. Thank you for setting us up for this conversation, and thank you to New America for making it happen and for not having handheld mics. I feel very free to just circulate as much as I want. So that's nice.
We've been talking, you know, the panel amongst ourselves for a while, and I think you guys are in for a real treat. These are very smart, awesome people. So I'm going to go ahead and let them introduce themselves. I'll introduce myself first. I'm Natasha Duarte. I work on privacy and automated decision systems at the Center for Democracy and Technology. I've asked our panelists, when they introduce themselves, to answer a little icebreaker question, because at the last OTI panel I attended, Brandy Collins did that and I thought it was a fun way to start. So I've come up with a question that is related to the topic at hand but also lighthearted. So I'll have each speaker introduce themselves and then answer this question: if you received VC funding today to start up any online business you want, what would it be and how would you monetize it? Sure. So hey, I'm Keir Lamont. I'm a policy counsel with the Computer and Communications Industry Association. Thanks to Eric and everyone at OTI for putting this on and putting this panel together. We're a trade group that focuses on open markets, open networks, and open competition, and we're one of many, many groups that has been supporting the passage of federal baseline privacy legislation, hopefully in this Congress. So if I were to start an online company, I would say kind of a service that uses keyword searching and AI to find and prevent spoilers. So if I couldn't watch Sunday's episode of Big Little Lies and had to wait until Wednesday, I could kind of enter that in: hey, whenever I go on Twitter, find anything that says Renata or any other keywords and block it out, so I can still browse as usual and won't receive any spoilers about sporting events, TV, movies, what have you. And I'd say I would monetize it with a subscription model, or maybe a free model where a spoiler would be blocked out and maybe an advertisement would appear in its place. Yeah. So I'm Lee Tien. I manage and run our legislative work at the Electronic Frontier Foundation. We are a public interest group that's based out in San Francisco, and we work in all sorts of areas, from free speech to consumer privacy, government surveillance, fair use, transparency, and so on. So we're sort of in the crosshairs right now because, being in California, we are dealing with a lot of really interesting battles over the contours of the CCPA, the California Consumer Privacy Act. And we've really been sort of buffeted both by a lot of industry amendments that are trying to weaken the legislation, while also, on behalf of a number of privacy groups, trying to push legislation that we believe will actually make the world even better for consumers. I will give, as my answer to Natasha's question, actually a very concrete answer, in that my wife has a startup. So my natural answer is I would want the VC money to go to her startup. And her startup is actually, I think, sort of interesting because it would not be monetizing data per se. So what she does is, her software is basically designed to work with a lot of digital photography front ends. People are digitizing a lot of photographs. And in particular, they're doing it for ancestry and genealogical purposes, right?
And so what her software does is enable these people, who want to build family stories, family histories, once you have fed in a lot of digitized photographs, to tag the people in the photographs and to annotate those pieces of the photographs that hold historical or emotional or family significance, so that you can really do a better job of sort of tracing your family history. And being married to somebody who does privacy at EFF carries with it sort of an automatic obligation not to exploit the data in any way that I would ever get in trouble for. So she takes that very seriously. And so it really is on a service model, right? They have no intentions whatsoever in this business of monetizing PI, other than that you pay for the ability to do these annotations and create these kinds of albums or any kind of work out of it with your own personal information. But they are not going to be monetizing that at all. Thanks. Good afternoon. My name is Gabrielle Rejouis. I'm a law fellow at the Center on Privacy and Technology at Georgetown Law. It's a think tank focused on privacy and surveillance law and policy as it affects marginalized communities. And I want to echo the thanks to Eric and to OTI for convening this panel. So the startup, or the business, that I would launch would be a platform that helps people write books. I have a lot of friends who are interested in writing books. So something that would collect all the information that they have, like the characters and the different world that they're building, a place to store all that information. I would monetize it through like a freemium model, so kind of like a Spotify way of monetizing, where you have contextual ads if you have the free version, and if you pay for a subscription, there are additional resources and there are no ads. I'm Megan Gray. Thank you, Eric, for convening this panel. I'm the general counsel and policy advocate for DuckDuckGo, which is most widely known for its search engine product. It also has apps and extensions with additional features. For the search engine, it doesn't track you. So we make tons of money, but we don't use your data. And for my idea for VC money, I actually had this idea just a couple of days before you sent around the icebreaker, so I'm very excited, not so much for the business idea, but because in a prior life I was predominantly a trademark lawyer, and I came up with a great name. So it's a web index that would crawl the web and then license the index for money to whoever would like it, for whatever purposes. And the name would be more or less. Thank you for humoring me in that icebreaker. So we are going to dive right into talking about some of the proposals or ways that privacy is actually being regulated, the potential impacts on business models, and how we feel about that. So I'm gonna start with a question that has to do with the California Consumer Privacy Act. And I'm gonna start with you, Lee, since you are in California and you are following this very closely. There is a proposal in California called the Privacy for All Act, which would amend the California Consumer Privacy Act in several ways, including by prohibiting businesses from discriminating against a customer who exercises their right to opt in or out of the sharing of their data.
So that means you couldn't refuse or degrade service, or charge them a higher price, or offer a discount in exchange for someone agreeing to turn over more of their data or not exercise their privacy rights, which is commonly known as pay for privacy. So Lee, how would a non-discrimination provision like this work for companies whose business models are based on offering free services in exchange for the ability to monetize customer data, such as a social media platform or a news website supported by behaviorally targeted advertising? Would they be required to serve users whose data cannot be used for personalization and advertising? So, excellent question. One that we have been wrestling with in the actual legislature. The funny thing about our version of the California privacy law is that the existing CCPA already contains a provision that is sort of, or ostensibly, aimed at the pay-for-privacy problem, but it does it in a very confusing and unclear way, where it's almost sort of self-negating. That is, it allows for some reasonable value, but it's really unclear how that is calculated. So our goal in AB 1760, which is the Privacy for All Act, was to try to clarify the way that the general price discrimination rules would work. As a practical matter, our bill was killed, it became a two-year bill, pretty much within two or three months, so the proposal is not on the table right now. But nevertheless, our general answer is yes. We think that it should be possible for consumers to enjoy these kinds of services without having to be subjected to that level of profiling, surveillance, and targeting. And if you're familiar with DuckDuckGo, you know that the sort of solution that we think is likely to work is advertising that is not behaviorally targeted. We think that there are lots of ways that businesses can take advantage of their platforms and still be free without completely monetizing all of the highly granular data that they have. One of the other issues that comes up in this area is loyalty clubs, and we think that loyalty clubs are something that can work, but in a fairly restricted way. One of the main concerns that we have about loyalty clubs, even done on an opt-in basis, is that within the context of a business there's one set of uses, but then when the data leaves the organization, that's where the greatest problems are. So just last week, one of the amendments, or proposed amendments, from Assemblywoman Burke attempts to sort of clarify the rules about loyalty clubs. We have always actually believed that loyalty clubs were okay under the CCPA, but they wanted the language to be clarified. So there's a bill that's going through to do that, and one of the signal points of the current version of that bill is that it would not allow sale of information by a business that's doing a loyalty program. And the normal loyalty program is one where you are basically exchanging your PI in return for a discount, but then the second level of the loyalty program is when the data is shared outside of the organization. And I invite anyone else on the panel to jump in on this. I wanna talk a little bit more about pay for privacy, meaning the idea of offering people a discount in exchange for, for example, allowing the company to use their data for behavioral advertising or for something else.
This has come up with broadband internet providers or cell phone providers providing a discount on service in exchange for some ability to use a person's data for things other than just providing the service. So my question is, we know that in some circumstances these can be coercive to users, who often don't know what they're really opting into, who may not feel like they can make the trade-off in cost and pay a higher price or forego a discount. But are there any circumstances in which getting this kind of discount in exchange for more access to one's data could be beneficial to consumers, by giving people an option to access services in a way that's more affordable, or is it always detrimental? And so I guess the logical expansion of that question is, should we be looking to outright ban pay-for-privacy or discount offers, or should we try to be somewhere in the middle? Should we try to draw some lines around what is a coercive offer versus what is an offer that is reasonably okay for consumers? And everyone, please jump in and discuss, whoever wants to. On the idea of discounts: the way that you framed the question is, what would be the advantages or disadvantages for the individual? And I look at it as much broader. It's like environmental issues. It's like cars. So you may have a more green car, right? And there are trade-offs for a consumer who's deciding whether or not they wanna pay more for an environmentally friendly car. But that individual's choice is not limited to that individual. It affects me, it affects you, it affects everybody else. So there are externalities to that choice. I think there can be circumstances in which an individual wants to not make pro-privacy choices for whatever reason, and I wouldn't wanna constrain them from doing that. But I think the way that we as a society deal with the externalities from those choices is to set a baseline and a default, so that even if somebody is going to go in a different direction, at least most people are not going to go out of their way to share more data in ways that impact the rest of us negatively. To sort of bounce off of that: we have to acknowledge that we enjoy getting services without cost to the user. But what we're seeing is that this data-driven ad economy has grown way beyond what a normal person would expect their information to be used for, where you have whole profiles being created, collecting information on all of your internet use. And so there need to be some restraints put in place, even if we do keep a free-to-user or low-cost-to-user system. I mean, I would say in the broadband privacy area we had the benefit of a basic fact: the services where we were talking about the pay-for-privacy issue were services that you were already paying to use. They were not free. The classic AT&T U-verse situation where, oh, if you pay this, then we will not mine your traffic. And that's leaving aside the sort of bias question, where it always feels worse to lose something versus not gaining it. Leaving that aside, you are already paying for your ISP, and there is no real argument that your ISP is going to go out of business or have a problem with its business model. Comcast will make money either way. It is a different set of equities, we recognize, for businesses that are based on free in the first place.
But that is where many of the issues that Megan just raised really kick in, because we have seen now the larger social costs of that kind of way of treating data. And so between those two different buckets of pay for privacy and price discrimination, I think it's actually pretty clear that it needs to be controlled in some way. And even the CCPA, which is, as I said, relatively friendly on pay for privacy in its base version, the version that's currently in the law, does maintain this sort of out-there limitation on usurious, unreasonable, unjust practices, et cetera. But as you pointed out, we don't have any good metrics for that right now, and there isn't even really a good theoretical metric for figuring out how you would do that. So in many of these areas we're watching innovations in the law that are gonna be followed by innovations in enforcement and compliance, where folks like the California AG are gonna have to figure out, well, how are they going to exercise their enforcement discretion in terms of adjudging something as coercive or not. We just don't have that, and I mean, I'm not very optimistic about it, given that the current FTC has done such a sort of weak job with a lot of infomercials and a lot of sort of free offers. I mean, Chris Hoofnagle and others have written fairly extensively about some of the behavioral economics problems with free models. But that's still relatively new terrain, and it's not something that a Chicago-school-style agency like the FTC really takes seriously. So if I could chime in, I would say that while there should be some baseline protections for data that apply to it no matter where it's collected or where it moves throughout the economy, for the most part a user should be able to choose whether they would want to access a free, ad-supported service or potentially pay a subscription for that service. And bringing it back to this kind of anti-discrimination provision, it's probably appropriate in circumstances where a user exercises their right to access or correct or potentially port their information. It gets more difficult in areas like exercising a right to deletion or a right to object to processing, because in many cases these online services have developed to be responsive and tailored to that individual and the information they produce or create using the service. And then I would also say that in many cases, in many studies, in much research, users kind of place a value of hundreds if not thousands of dollars on being able to access and use many of these free online services such as search, mapping, storage, communications, and I would be hesitant to say that the government should be able to intervene there and potentially disrupt those services that so many people find valuable. When you say that people value those services very highly, in specific dollar amounts, can you talk more about what you mean by that? Is that how much people would be willing to pay for the services, or what is that? There are kind of many studies in this space, and there are lots of different kinds of approaches you could take to trying to discern how much they value this. In some cases it might be asking how much you would have to be paid to give up a certain service over a year, or something to that effect. So we hear a lot of rhetoric about the potential for privacy regulations to negatively impact the digital economy, in part of course by cutting into ad revenue or advertising-based business models.
One example that comes up a lot is the fact that some US publishers pulled out of Europe after the GDPR took effect, but on the other hand the New York Times has said that it has actually grown its ad revenue by cutting off third-party ad exchanges in Europe and focusing on contextual advertising. Megan, could you talk a little bit about the difference between behavioral and contextual advertising, and can we really expect companies that offer free services to survive without selling behaviorally targeted ads? What really is the added value here of behavioral versus contextual advertising? So, the difference between behavioral ads and contextual ads. Behavioral is personalized to the person, so it's based on you and what you have done online and offline, a profile that has been created about you, and then the advertisement that is displayed to you is based on an AI predictive algorithm that decides that you will be most emotionally triggered by seeing a certain ad for a certain type of product. That's personalized behavioral advertising. On the other side are contextual ads. Contextual ads are what we had before there was the internet. It's just context. It is the ad that you see in the newspaper that you pick up from the newsstand, back when there were newspapers. It is based on the context. It is based on the page. It's based on the content of what you are looking at, not on you. DuckDuckGo's business model relies entirely on contextual ads. We don't do any behavioral advertising. We're a privately held company, so I can't tell you how much money we make, but let's just say it is a lot. We definitely fall under it; the CCPA has a revenue floor for whether or not you are subject to its provisions of, I think, $25 million. We are well above that, so we are definitely subject to the CCPA. We are certainly, I would assume, subject to the GDPR. The GDPR, which is the General Data Protection Regulation in Europe; all these companies that you mentioned that pulled out of Europe because of the GDPR, I don't believe that. I think that was a handy excuse. Our business model is such that our compliance efforts for the GDPR were really easy. We didn't really have to do anything, because we already are set up as a privacy-respecting company. So the business model that we have is an advertising model. It is just contextual ads, so that when you type in vacuum cleaner, you get a vacuum cleaner ad. If you then type in Volkswagen, you're going to get a Volkswagen ad. You're not going to get some ad that is a result of things that you've done on other sites or on our site. From our systems, we don't even know that the person who searched for vacuum cleaner is the same person who searched for Volkswagen. Every search is entirely new as far as we can tell. So we show ads. We're not anti-advertising. We are pro-privacy and pro-advertising. So I hope I answered your question. And so what is the value that consumers are getting from behavioral targeting? In privacy notices and in hearings, and Keir here was just making the point, we hear that people value personalized services. Gabrielle, do we think that this sort of trade-off that we're presuming, between getting free or highly personalized services in exchange for giving access to a lot of our data, is really in consumers' best interests? And should privacy legislation just sort of try to get at the really bad actors and leave most of this sort of business model or bargain intact? Or do we need to curtail some of the prevailing business models to protect people's rights?
And then should we be concerned about the impact of legislation on the availability of free services, particularly for low-income people? Yeah, so I guess, working backwards, I think that there are, like Megan said, options available to continue providing these free-to-user services where you do place contextual ads, rather than collecting everything that I've looked at for the past 30 days and then using an algorithm to spit out what I may be thinking about wanting to buy from Amazon today. So we need to also acknowledge that the exchange isn't equal, for those reasons, the fact that there are these algorithms that are collecting all this information in ways that less sophisticated users are not aware of. So oftentimes you hear people talk about, I was just telling Megan that I'm planning on going on this trip and suddenly my phone popped up an advertisement for an activity in the same location; I think my phone is listening to me. When in reality, there are these algorithms that are collecting all this information on you to place these advertisements. So it's clear that consumers... It's these breadcrumbs that you don't even know you're leaving. That you're leaving, yeah. And these algorithms are so sophisticated, they're using proxies to determine you went to this high school and so we're gonna predict that this is your race, and there are all these ways that our data is being used that we as consumers are not aware of, and so there need to be protections that are in place, but without removing these free-to-user services. Yeah, it's interesting you bring up the issue of people thinking their phones are listening to them. When that first started popping up in reporting, as a privacy advocate who wants people to be informed, I think I and others sort of rushed to say, well, no, that's not what's happening. And I still say that, but I also say, I guess, does it matter if that's what's happening? If that information is being collected in ways that are not transparent at all and used in ways to target personalization or ads or make decisions, recommend things, does it matter if your phone is listening to you or if your phone is just acting as if it were listening to your conversations? One thing that I like to point out with personalization: I think personalization has a positive connotation, so I try to stay away from it, because we all want personalization, right? I want my clothes to fit, but I don't want to be behaviorally targeted, and that's what actually is happening. So in the vacuum cleaner ad example, consumers want relevant advertising, right? So relevant advertising is personalized, but it doesn't have to be behaviorally targeted, so you can see an advertisement for a vacuum cleaner, but it doesn't necessarily have to be a pink vacuum cleaner because somehow it's listed in some system that that's your favorite color. So. I wanted to piggyback off that as well. I know that Miranda from Upturn will say that the other side of personalization is discrimination, right? So you think about, you're searching for job applications and the algorithm determines that you're a woman. What jobs are you then being shown, because it's determined that other women don't look for a C-suite position, versus if the algorithm determines that you're a man, it may present those job opportunities to you. And so that's another thing that consumers aren't aware of; they're thinking this is an equal exchange.
I have a law degree, so I'm looking for law jobs, so you're expecting that exchange, but in reality you may not be receiving the jobs that you're adequately qualified for. Yeah, I just wanted to hit the business side, or the website side, of it, because a lot of years ago there was a real push for the Do Not Track system, the Do Not Track protocol, to be built into browsers and for that to be respected by companies, and that was thought to be, we hoped that it would be, a good way for people to opt out of tracking consistently across the web. One of the things that was really interesting in the dialogue with companies at the time was that they were very concerned about Do Not Track, and they were afraid that folks would buy into the idea of not being tracked, but they did not care, at the time at least, about ad blockers. It's like, yeah, we know there are some people who don't like ads and they're going to block ads, and that's okay. Our research shows that the number of people who block ads is relatively small, and a lot of times all they're doing is blocking the ads. They're not blocking the sucking up of clickstream data. They are not changing the ability of a company to track you via Like buttons or other widgets across the web. So they were happy that consumers had been deceived, essentially, by the association of advertising and surveillance, to then reach the false conclusion that if I don't see an ad, then I'm not being tracked. Today I think there's a little more sophistication in the ad blocker world. EFF itself, we have a product called Privacy Badger that really focuses on the tracking side of it, and I think that's changing, that's changed some of the ways that the companies are tracking people. They're relying much more on browser fingerprinting to be able to figure out what your consistent identity is across the web without necessarily using a widget. In some places I suspect that they're now already basically saying, if you're using an ad blocker, you can't use our service. And that's actually sort of the flip side of what we've been talking about with pay for privacy: this is the way that, without law, businesses can essentially enforce that if you want to be able to do X, then you're going to have to pay us with data. So a lot of the pay-for-privacy discussion really only makes sense when you think about it in terms of what's been the default baseline for the last 10 years and how we are going to restore some level of balance, so that the consumer can actually not be sort of coerced into giving up their data. So I think that's a really important point about the advertisers and the ad blockers. Because it used to be the ad blockers just blocked the ad; they didn't really block trackers and all the hidden code that would suck up your data. It just literally would put something up so you did not see the ad, but the ad's still there and it's still sucking up all your data. And as the ad blockers have become more sophisticated, the ad blockers are blocking the tracking. And that is what is ticking off a lot of the large tech companies, because they need that data. They need that data more for their current business models than they need to show you the ad.
Because what the data does is it is then used, maybe not to show you an ad, but to show so many millions of other people ads; every pipeline they can get feeds into the big data machine that helps them determine what is going to be the most manipulative content to show you so that you click on the ad. They need that a lot more than they need to display the ad to one particular person. So I think one of the catalysts for this conversation, to maybe pull back the curtain a bit, is that we're having a lot of conversations on the Hill about what U.S. privacy legislation should look like, and on the surface there tends to be agreement between advocates, companies, and others, or at least it looks like there is, that there should be federal privacy legislation. When you get a little bit deeper, sometimes that breaks down, and it sometimes breaks down when we talk about specific provisions that would start to affect certain business models that are incumbent. So we hear about this tension between innovation and privacy legislation, but what are some of the ways that privacy legislation might actually encourage different types of innovation, might positively shift investment toward other types of business models? And by overemphasizing the framing of privacy legislation versus innovation, is there a risk of blessing or over-prioritizing the prevailing incumbent business models? And are we missing out on what innovation could look like if we had some ground rules around privacy and it wasn't all about how much data can you get? Sure, so I'll say that privacy law versus innovation is not the correct framing here. Talking about privacy laws generally, they do impact kind of a series of different interests. You have privacy itself, which is notoriously hard to define and implicates a series of different values. You have potential impacts on freedom of speech and the right to know, and you also have business interests involving kind of competition and involving innovation, the ability to use data in new and socially beneficial ways. So I don't think there's anyone out there who views innovation as this kind of magic talisman that you can invoke to shut down all conversation about privacy protection or rights, but it is true that an ill-drafted or ill-considered privacy law could negatively impact innovation. You can imagine a privacy law that might be so prescriptive that small, innovative, disruptive players would have a more difficult time entering the market. You could imagine a privacy law where the standards are so ill-defined or enforcement is so out of proportion to the risk of harm that it would chill businesses from pursuing new and innovative uses of data. You could also imagine several ways that a privacy law could promote competition and promote innovation on privacy practices, and I think we should pursue those in federal privacy legislation. One is transparency. The law should require that companies be upfront about what data they collect, how they process it, and under what circumstances it could be transferred to a third party. That will allow consumers to think about what the business model is of the company they're signing up to do business with, and potentially switch. Another way to make that more probable, and to promote competition on privacy-preserving interests and values, is some form of a right of data portability, where consumers would have an easier time moving their existing information and transferring it between different services.
You could also see a federal privacy law that promotes research, promotes investment and knowledge sharing on privacy-enhancing technologies, and there are a lot of promising types of technologies out there where you could do privacy-protective data processing, homomorphic encryption and secure multi-party computation among a series of them. On the innovation point, what I would love to see is innovation in contextual ads. There has been none. We've had contextual online ads for what, since 1998? And nobody has spent any time trying to innovate on that. So it's a very blunt instrument. You search for vacuum cleaner, you get a vacuum cleaner ad. What the research hasn't done is, well, if somebody is searching for a vacuum cleaner, that also indicates that they would be interested in cleaning products. So it would be interesting when you go to read an article in an online news source that the subject of that news article could then be aligned with the ads that you see, and there hasn't been that kind of innovation. There's been all sorts of innovation on behavioral ads. Way too much money has gone into that, and it would be wonderful to see folks give that up and try to focus more on advertising innovation that is healthier for us as a society and also provides more relevant ads. One of the other issues here, I think, that's confounding is that we historically or traditionally associate innovation with the smaller startups and that sort of thing. But we also, I think, tend to assume that the smaller company can't do as much harm in the first place. And yet one of the truths about the app economy is that you could have a very, very popular app and collect an enormous amount of data, and if you are careless with that data and you have a data breach, you might actually have greater security problems and privacy breach issues, which is a different kind of issue from some of the other privacy harms, but associated with very small players. Some of the pushback that we've gotten with respect to the structure of the CCPA is that the CCPA only applies to businesses, and it only applies to businesses of a certain size. And while many of the companies have complained that, oh, it reaches too many businesses, we've heard from federal policy makers, when we've talked to them about the CCPA, and they go, we really don't like the fact that small companies aren't being covered. And I'd say for myself, it doesn't cover us, given that we're a nonprofit, but we have people's personal information too; you don't have to be a business to abuse personal information. And so I also think it's problematic to say that you wouldn't also require that nonprofits observe, you know, be beholden to, privacy rules, because it's super important. The other thing that I think is interesting, when we consider the change in computing, is the shift to mobile, right? So much more of what we do online is on mobile devices, and mobile devices have small screens, and that just makes it harder to display ads with any kind of, well, it just makes it harder, because any ad is a real cost to what you're doing. You cannot see half of your screen, and I haven't seen the research, I would assume there is research, that shows that as the number of sort of person-hours people spend on these smaller devices increases, it has actually correlated in some way with complaints about advertising, because it is...
It's close, it's close fine too. Yeah. Yeah. Just a sort of aside, while we're on the topic of other types of business models: I have a bit of a background in journalism and have been noticing lately a lot more models where articles, or journalism, are sort of recommending products and then they get a cut if you click the link. And I think it would be interesting for a future conversation to talk to people in that field, because journalism is a field that has gone through this huge shift in sort of losing the model that they were used to getting their revenue from and having to find new, creative ways to fund their businesses. So not for this conversation, but I think for a future conversation it might be interesting to talk to that industry about what they think about the changes that have happened. And in general, you know, should we be looking more to other sectors or areas of law for historical examples of where either regulation or economic forces have caused changes in the ways that businesses have to work, you know, regulations around automobiles requiring changes in how you have to build cars. These are just sort of... We're going to have to start wrapping up, but with privacy, you know, it's like we had the regulation of cars, and everybody claimed the complete automobile industry was going to be devastated by having emission limits. And then you had the invention of the catalytic converter, right? And the automobile industry is still with us. So... Right. But what you were saying, it's interesting, because I think that, you know, one part of the FTC's jurisdiction is over deceptively formatted ads. And one of the things that they've been concerned about, even on the platforms, on the Facebooks in particular, is how the advertising seems to be appearing in native format, and when it's native format, then it falls within their jurisdiction as commercial speech, whereas when it's not actually an ad, it doesn't. But many, many people can't tell the difference. And we saw that, you know, going back to children's TV in the 70s, where basically there were program-length advertisements. So I think actually this FTC, but maybe not this FTC, but I know some of the commissioners, are actively thinking about those questions. Yeah, and I think that outlets themselves have gotten more transparent about native advertising, some of them at least, over time, as there have been more enforcement actions. But I'd also be curious to see what the industry itself thinks about this sort of replacing other models, like more traditional advertising models or subscription models. Anyway, we don't have time to get into all that today. Thank you to our panelists. I'm gonna open it now to questions from the audience. And I believe we have someone with a mic. And there are materials on the tables outside, so be sure to grab what you need. Hi, great panel. And I'm attracted to the idea that you can make a lot of money using just contextual advertising. And so I was just gonna ask the panelists to imagine a scenario where suddenly, overnight, all the, say, American tech companies who do targeted behavioral advertising just switched to contextual advertising, and what would happen economically? I feel like it would be safer, but can anybody imagine what would happen if it just...? It's very doable. This is not some la-la land. You could pass a law that says do not track outside of your website. And that would go a long way to solving the problem.
And they would still be hugely profitable. They may not be obscenely profitable, and I'm okay with that. But they would still be able to do their moonshots and have all their businesses. There would not be mass layoffs. And anybody who says the contrary is just talking hogwash. There would still be ads, it's just not ads that are generated from, like, a user's information. I know there's sometimes the distinction of, like, private information, but proxies can be used to generate something that's more private. Or even, like, your credit card information is being used to determine whether or not you have cancer, for instance. So things like that, where we just need to draw a line. I think if you saw a complete abandonment of targeted advertisements, you'd see displacement very quickly from new industry coming into the space that would use targeted advertising. And I think for the existing services, you would see worse services. And I think you would see far more paywalls across the internet. And I think it would be really, really interesting to see what it would do to political campaigns. Because we talk a lot about the companies and the advertisers as consumers of this online surveillance economy, but it isn't just them. It's used for gerrymandering. It's used for campaigns. We've seen how the Obama folks basically said, yeah, we were able to pull out some victories precisely because we were better at data mining than Romney. So there will be, I think, a lot of impacts, but we're not interested, and I think most of us aren't interested, in the specific health of any specific company. Rather we're asking the question whether, on balance, for the internet economy, for our society, with the loss of this ability to target so precisely, would we still have a panoply of services? Would we still have technical innovation? Would we still be able to express ourselves online freely? I think we would. Hi, I have two questions. The first one is, I read an article not too long ago in the Wall Street Journal where they said that if consumers were actually paid for the data that they give, the average American household would get about $20,000 a year. So the question is, should consumers be paid for their data? That's my first question. And then the second question goes to transparency. For example, outside of the Beltway, I don't think people really understand how much of their data is being taken. So should there be an approach sort of like with your credit score, where you have access to everything that's in that file, so that you know what they've collected but also how they've collected it, and so you're not just signing away rights that you have no idea about? And also, shouldn't the privacy notices be written, and I'm a lawyer, but in non-legalese, so that people actually understand what they're giving out, specifically? And what things would you include in a privacy notification for consumers? So the privacy policy one is the easiest one, right? I don't think that can be fixed, because you can explain, but there's just so much to explain, and people are not going to read that. So I don't think that can be fixed. Can we talk about paying people for their data? Sure, so I would say data valuation is very difficult, and I haven't, I think, seen a model where that would be clear and make sense.
And I would also be concerned that if you've been paid for your data and then it's gone, then maybe you lose all rights and it can be used in any way. So I would be more hopeful to see a model where certain rights attach to that information and stay attached to it no matter how it is used or where it goes. Yeah, we wrote a blog post that might have gone up yesterday, which, I didn't bring my computer, only my phone, so I haven't even seen it yet. But EFF has a problem with the general sort of direction of these valuation approaches. It's not that we don't think it would be really useful to know what your data is worth. It's just that, as you said, it's difficult to figure out, and it's not just difficult to figure out because it's hard math; it's because the way that a company values data, in terms of, oh, I have this much information that enables me to get this much revenue from advertising, is a very different thing than saying, oh wow, because a particular bit of information about me was made available, my employer didn't give me a promotion, or, because college admissions now is staring at all sorts of stuff on social media in addition to looking at your scores, I didn't get into the college I wanted, et cetera, et cetera. You can go on, but you see my point. This monetary valuation has got this commodification effect on data, whereas if you look at data the way we do, your privacy, your personal information, is a fundamental human right under the California Constitution, it's article one, section one. That's not something that we really want, that we think should be thrown into the economy as a completely sort of alienable asset. And indeed, one of the problems is that so much of how things work in the larger data economy is at scale, and so the value of it gets that way because you have so many people in it. If you're looking at it from a me perspective, the $20,000 figure that you quoted, I don't recall that, I didn't read the article, but I've had many other people say, geez, if you do the division it's like $10 a person. So now, all of that is to say that we do have an issue with the so-called data dividend. That is something that I keep hearing more and more, but we think it's highly problematic, and we think, or I think, that it makes it very difficult to really address the privacy issues in a meaningful way. And also, I mean, it gets back to the externalities too, right? When you're building a railroad, the first parcel of land that the railroad company purchases they may get at a very low rate, because nobody's been able to figure out that it's for a railroad, and as they continue buying parcels of land, then the person with the last bit, the holdout, gets a ton of money, right? But all of this also has effects where, if Lee is willing to sell his data for $20,000, well, it's not just his data, I mean, I have interactions with him, it affects me, and now I can't sell mine for as much because he's already sold his. Right, well, the genetic stuff is where you really see that, right? Because people talk about, oh, consent, DNA, et cetera. It's like, well, wait a minute, anything in my genome has very, very clear factual implications with regard to my kids and to all of my relatives. Is that really something where we think, from a human agency perspective, that I should be able to foreclose all of their choices by the choice I make? I think that's really problematic.
Thank you for taking the time today to speak with us. I just wanted to make a quick point about the auto industry. I mean, they weren't necessarily going to be with us if it wasn't for a bailout, and it was actually Japanese innovation that pushed the American auto companies to change a lot. So there's that, and this comes down to a question about the market then. It seems like a company like DuckDuckGo is so popular because the market decided there needed to be a service that provided a way that didn't track, and so now DuckDuckGo's popular because other companies failed to respond to the consumer and what they wanted, so it actually filled a spot that was required by the market. And so we need to be extremely aware that we cannot predict the future, so when we lay down any regulation we don't know; it may start with behavioral advertising, but we don't know. Now my phone tells me, hey, you spent 45 minutes on Instagram today, and I let it track me and tell me what to do, and maybe there will be some service that's like, hey, you need to get your stuff together and stop being on social media so much, so it may be beneficial, and I don't know what it would need to know about me to do that for me. But what is in your mind of ways to be cautious with regulation? Because right now we've talked about, like, let's do it, let's regulate, but I just wanna say, let's not rush into it, and what things could be affected besides just advertising? Can I just say that I'm not worried, frankly, that we're going to rush into it. I mean, I think what we're seeing here, and I think what you're seeing on the Hill generally, even though that is probably as much a product of hyper-partisan politics as anything else, is that nevertheless, we have time. I don't think anyone expects federal privacy legislation within the next three years. I hear people saying maybe we can do it in five, and granted that's not a hugely long time, but it's also not like we're just going to do something right away. And there actually are a lot of people thinking about these things. So while I take your point, I'm also not really worried, as a practical matter, that we're going to be rushing, and we've got experiments running right now. GDPR is a real-world sort of experiment for some of these things. California is an experiment for some of these things. We're gonna see more innovation in the states, as long as the feds don't somehow get a pure preemption bill out of Congress right away. So yeah, we're gonna be working on it. And when that bill does get passed, I know that there are certain provisions that can be in it that will allow regulation to be a lot more nimble, so you don't need to wait on Congress every time something new comes up. Some of that would be giving the FTC or another federal agency rulemaking authority. They'll be able to see that, okay, there's this new app that's coming out, or there's this new data harm that's coming out, and so they'll be able to do something. There's a way that you have a federal floor that allows states to innovate, and so you have California that'll come out with the CCPA, you'll have things like the Vermont data broker bill, where other states can experiment and then, on the federal stage, we can adapt to that. You could also have state attorneys general who will then be able to go after these harms, and on the litigation side be able to also come in and say, okay, this is a harm that's happening, we have X many cases coming out.
I think we should have a state law that comes out on that. So there are levers in place that will allow us to be a lot more nimble. So I don't think we should just say, oh, it's going to be too difficult, or something will come up that we can't anticipate, and therefore do nothing; we can't leave it the Wild West that it is right now.
The only thing I would add: DuckDuckGo has been around since 2008, right? More than ten years, and we're now at 1% of the search market. And you're still making money? And we're still making tons of money, right? So I'm not so worried about Congress, one, passing a law anytime soon, or two, passing a law that is going to make it more difficult for search engines to enter the market, because it is already so difficult. And I think the competitive advantage that has gotten us to that 1% is privacy. I would love to see other search engines; it used to be that there were tons of different search engines.
Yeah, well, we should turn that around, because in many ways the non-privacy that I would argue we enjoy today has led to significant concentration and significant incumbency advantages. One of the byproducts, unintended or intended, is that it is much harder to enter a lot of those markets. And of course, now we're getting into merger and acquisition behavior, right? Because we're talking about the tendency of the large companies to buy up the small companies that might come in, because they can just afford to buy them: Instagram, whatever, all these companies that belong to Facebook that might otherwise have been challengers and innovators in a different way. I think we have to count that as part of the cost of how we've been doing business so far.
Good afternoon. Is this on? Yes, there we go. This is such a complicated subject. And we know that there's a huge divide in our country between people like me, who are 60 and above, and people like most of you in the room, who look like you're 12 going on 13 to me, but that's just because I'm old. And I find, even when I teach cybersecurity, that the 18-year-olds don't have a clue how any of this functions. That's what my question is about. Regulation and political advocacy arise from the populace calling for something, or supporting something that is suggested by experts. So what is the sound bite, for each of you, that you would use to gain the support not of those sitting in this room, because we're here because we're fascinated by the subject and have something to say, but of, you know, Hatsi Rosenblutzi on the street, who really doesn't have a clue how any of this works, yet receives both the benefit and the harm, and doesn't know which way to go: I read this, I read that. So what's your sound bite to get people to wake up and say, privacy is important, this pay-for-play is important, but we need to do it right?
There's a tip sheet that they did, I think last year or the year before. In the summer, a lot of people go home to their parents, and their parents are saying, you know, clean up my computer, I've got all these viruses, but how do I figure all this out? And this tip sheet gives you a kind of roadmap for explaining why privacy is important, why you need to have this setting, and so forth.
I thought it was just fantastic. The sound bite I use kind of depends on the person and what I think is going to appeal to them, but most people don't value privacy because their reaction is, I don't have anything to hide, which is a bunch of bunk. And rather than trying to counter that, because it's very hard to convince somebody that actually you do have something to hide, I try to talk about the externalities: it's not what you have to hide, it's that your son is probably doing something that is going to cause him problems later on, and you need to think about the world that he's going to be living in.
If I can take a little moderator's prerogative, I'm going to take a stab at this. I actually disagree that people don't value privacy because they think they have nothing to hide. I know sometimes people say that, but what I see with most of the people I know is that they have some inkling that they probably shouldn't just be opting in to all of the privacy notices they get. They want to use these services, but somewhere in the back of their mind they have this concern about what's happening to their data, and no real ability to actually see what's happening to it and build a mental map that allows them to do anything about it. It is daunting and overwhelming, and so what I say to them is: that's okay, I feel the same way. It's okay that you feel guilty that you never do anything about the privacy concerns in the back of your mind, the ones you push down because you need to use this app. That's where we all are; let's just be honest about that. No one has time to self-manage their privacy, and the burden should not be on us to understand what the business model is and what's happening to our data in this black box, and then try to make choices that align perfectly with our personal privacy preferences, in part because the market isn't just an array of all the potential choices; the market determines what options we have and pushes us toward the ones it wants us to make. So this shouldn't be a burden on people, and it's not surprising that people don't know what to ask for in terms of privacy. The burden should be on industry to meet a baseline standard for protecting the information people entrust to them in order to access the services they need to participate in the digital world.
So my dilemma with that is, and I think that's a great sound bite and perspective, and it's all true, but since we're not going to have that law anytime soon, what do you tell people, give up for now? I mean, what do they do now? Yeah, I mean, if more people start asking for it... It's not an uncommon refrain to hear from people on the Hill that my constituents aren't asking for this, and I think that's the disconnect the sound bite would address.
Right, speaking up, speaking to elected representatives, actually is a big deal. That's going to make a difference. Now, how much of a difference, and is it enough? Here, Cambridge Analytica was a very, very big deal, and in California it's what led Alastair Mactaggart to really push this idea of a consumer privacy initiative over the hurdles, because it polled well. Privacy always polls well, but until you get some extra juice, it's a widely held but ultimately sort of shallow preference. It's like, yeah, that's important to me, but convenience is more important, or X is more important.
But then at a certain point in California, I think it became much more of an outrage sort of thing, more like the network neutrality issue. Network neutrality was a great issue to campaign on because everyone hates Comcast, right? I mean, seriously, I'm not just saying that to try to be funny; everyone really does hate Comcast. And so our messaging, pointing out that this is what Comcast wants and you would be taking something away from them, was incredibly effective. Now, the fact that we have super high levels of support at the grassroots does not automatically translate into support in Sacramento or in DC. You have to link those things, and you really do have to make those views known. And the person who is distressed and disgruntled but doesn't say anything to their representative, doesn't write a letter to the editor, anything, if you're silent about it, that's when your views don't count. There are also other ways. In our Do Not Track legislative model, and you could contribute money to EFF, but I'm not shilling, yes, thank you, one of the points is that about a third of Americans have the Do Not Track signal set on their phones and in their computer browsers. That should be enough; I don't need all of them to write to their legislators. I can just say, look, this is a fact, it's measured and it's reported. The Do Not Track setting does nothing; folks don't understand that part, and it's unfortunate, but it's like putting up a sign in your front yard: I support privacy, and I would really like this, for anyone who would like to do something about it. The other thing I'll point out real quickly is that there was a study done maybe ten years ago showing that more than half of Californians believed that if a website had a privacy policy, then it did not share your data. That is false, right? But people believed it. And so it's natural: you can't expect people who think that the mere existence of a privacy policy means a site isn't doing anything with their data to get all upset about giving information to that website. They are just really, really sadly misinformed. But there are a lot of things like that that we discover in the activism and advocacy world, things we don't expect and have to try to overcome in order to make something happen. That's reality, and reality is messy and full of those sorts of things. So yeah, it's going to take a lot of sound bites.
Okay, we're over time now, and there's a reception waiting for you all. Let's thank our panelists. Thank you. Thank you again to OTI, and especially to Eric, for making this happen.