So my name is Letitia James, but everyone refers to me as Tish. Please call me Tish. We're the first digital generation in history, suddenly connected in ways that past generations could never have imagined. But as President Obama once said, if we're going to be connected, then we need to be protected. We were just reminded of that by Representative Welch, and we want to thank him for that, and to thank him again for recognizing that we are the custodians of the rule of law. It is critically important that all of us understand that no one is above the law, and that we have stood together to save and to protect our democracy. That really is our role as we go forward.

I want to thank the National Association of Attorneys General for all that you do. I want to thank T.J. Donovan, the Attorney General of Vermont, for this wonderful conference. Let's give him a round of applause. I know he's not in this room, but tell him it was an honor and a privilege to meet his lovely bride Jessica yesterday at the reception. And of course, to my very good friends A.G. Komatsu and A.G. Racine: thank you for being here as well.

This week we've watched CNN and MSNBC and all of the television shows covering the hearings on Capitol Hill about privacy and security in social media, and we've heard a lot at this conference over these last two days with respect to privacy, data collection, and of course social impacts. For all the tech representatives in this room, I want you to know that I think being connected has its positives. We have immediate access to information. We can stay in touch with our loved ones. During the pandemic, obviously, we all relied upon social media. So big tech has a role to play in our society.
We recognize that. But we also know that there are issues and there are concerns. There's evidence that some big tech companies use their wealth and their information to stifle competition, and at times they use behavioral advertising models, which is terrible for our personal privacy: it rewards them for collecting as much information on individuals as they can. We talked a bit about AI yesterday; we talked about my experience. And so, as attorneys general, with business and tech leaders, we need to work together. Let me say that again: we need to work together. Public-private partnerships. We need to have discussions and try to resolve some of the issues that our society is facing.

I'm proud that attorneys general have taken on these issues across party lines and across regions, and I want to thank the 48 of my fellow AGs for joining with my office in our shared antitrust lawsuit against Facebook, and all of us who are part of the two similar lawsuits against Google. Our antitrust laws were created specifically to prevent monopolies from using their power to harm competition and consumers. And when it comes to privacy, we need better laws that we can enforce to protect consumers, because most current laws focus on requiring companies to disclose their data collection and give consumers the ability to opt out. I want to see laws that provide clear limitations on data collection and on what companies can do with that data. Let's be honest: our laws, unfortunately, have not kept pace with current society, and we need to change that.

So I am going to introduce the members of our panel, allow them to say a few words to you, then pose some questions to them, and then open it up to the audience. Our first panelist is Pam Dixon. Ms. Dixon, are you in the room? Okay, yes, I am, and I'll bring up my video right now. Okay. Oh, you should be seeing me. Let me just give them a proper introduction, if you don't mind. Would you like to have a seat?
You could have a seat, sure. First, Ms. Pam Dixon is the founder and the executive director of the World Privacy Forum, a respected public interest research group, and an author and researcher. She has written influential studies on privacy and identity, biometrics, AI, health, and complex data ecosystems and their governance.

Tom Galvin, thank you so much for being here. He is the director of the Digital Citizens Alliance. Mr. Galvin has been active on internet security and safety issues for over a decade. As executive director of Digital Citizens, Tom is focused on bringing a voice to consumers, including those who have been victimized online. By putting a face on the victims of online crime, Digital Citizens serves our fellow citizens and issues a wake-up call to policymakers and internet companies that they must do more to protect us.

Is Tim Sparapani here? Sparapani, the founder of SPQR Strategies, is he online? I am, I'm here, General. Thank you so much. Okay. Tim is a legislative, legal, and strategic consultant who helps companies, consumers, and consumer and privacy advocates understand and respond to the pressures created for businesses, consumers, and governments by emerging technologies. Tim's specialties are privacy, cybersecurity, technology, antitrust, and constitutional law. Prior to joining Facebook, Tim was senior legislative counsel at the ACLU, where he helped advance the constitutional principle of the right to privacy, representing the ACLU before Congress, the executive branch, and the media.

Jennifer Urban, are you on?
I am. Jennifer is a clinical professor of law at the University of California, Berkeley, School of Law, where she is director of policy initiatives at the Samuelson Law, Technology and Public Policy Clinic. She was appointed by California Governor Gavin Newsom in March of 2021 to be the inaugural chair of the California Privacy Protection Agency board. The California Privacy Protection Agency was created by the California Privacy Rights Act of 2020 and is charged with implementing and enforcing the California Consumer Privacy Act of 2018.

Let's begin with Pam. Pam, can you give us some opening remarks? Yes. First, I am absolutely delighted to be here, and thank you to NAAG and to the Vermont Attorney General for your kind invitation. Vermont has done so much for privacy, and I'm deeply honored to be here. I have three brief points to make, which I'll elaborate on in our discussion. The first is that in order to really focus privacy legislation effectively, we need to be concrete and focus on practical solutions that actually address problems, and we need to ensure that our solutions are evidence-based, not just rhetoric. That's incredibly important today. Secondly, we've got to pay attention to the context of the legislation. We need to make sure that legislation is inclusive of all stakeholders, including people of color, including people who live under the threat of poverty or in poverty, and we need to make sure we're including other vulnerable groups such as older Americans. These are very important constructs. And finally, we need to think about our resources. What resources does government need to effectuate enforcement of the law? What resources do businesses need to comply with the law? And what resources do consumers need to be able to effectuate their rights under the law? In evidence-based studies, we've seen that not all consumers can effectuate their rights easily. We need to take care of these issues. Thank you so much. Thank you.
Can we get a microphone for Tom? Tom, some opening remarks. Is that working? Great. As you can see by my lack of photo, I take my privacy very seriously. Someone take his picture and see if you can find it. Yeah, exactly. First off, sincere thanks for the invitation to be here today. As was mentioned, Digital Citizens has been working for about a decade trying to put a spotlight on online harms and risks: everything from the sale of illicit drugs, to working with state AGs on the proper disposal of opioids, to how hackers bait and trick consumers into giving up their personal information, to the role of platforms and, unfortunately, how some of them have been used to facilitate or enable (facilitate is too strong a word) bad actors, both at home and abroad, to disrupt American life.

I just want to make three quick points, and I know we're going to have a robust discussion. The first is the parable about the frog in the boiling water: if you put a frog in boiling water, it'll immediately jump out, but if you put it in tepid water and slowly raise the temperature, it will be boiled to death, because it didn't realize what was happening.
I would argue that over the last 20 years, as digital platforms and a digital world have been created, Americans have been the frogs, and there wasn't a significant amount of forethought to what was happening. By the way, that's true not just for Americans but for the companies themselves, which oftentimes were overwhelmed by what was happening. These were largely engineering-driven companies, making decisions largely about whether something could be done, as opposed to whether it should be. There weren't a lot of sociologists in Silicon Valley, and I know that because I lived and worked in Silicon Valley for technology companies, and Tim knows this as well. So we haven't had that forethought.

The third point I'd make is that we need to bring consumers really deeply into this conversation, because as frogs they haven't had that chance for the last 20 years, and we need to make sure they are part of this conversation moving forward. To that end, I will offer you a few quick numbers. First, 55% say their privacy has been breached in the last couple of years; that's a significant number. Second, only 29% say that the fact that they largely get free access or services in exchange for their data is a good deal, and 48% say it's a bad deal. Third, 64% say that they should be the owners of their digital data. And finally, fourth, 61% said they should be compensated for that data in some form or fashion. I throw those numbers out because we're at a point in society where we have to decide, along with digital platforms (which, you know, we're obviously going to work very closely with), consumers, and public officials, how to make sure we get this right. Because we have to get this right. This is our future; it is not going to change. Everyone in this room relies on digital platforms, and if we don't do a good job of making a better future, it's bad for everyone. I'll leave it there. Thank you. Well, I know everyone in this room refuses to allow our democracy to die like that frog, and so we thank you for your comments.
So, Tim, opening remarks. Yeah, General James, thank you so very much for the invitation and for leading this panel. My apologies for not being with you; I have a young daughter not yet able to be vaccinated, or else I would be with you all. I regret that I am not. I am one of the few individuals who has had the privilege, not only this day but every day for the last 20 years, to work with both consumer and privacy advocates and with innovators. Very few people get to do both. And I'm here to share with the generals and their staff that there has been a false dichotomy set up: that we can either have innovation or privacy. You've been lied to. These are false narratives. Because of this set of false narratives, we have had neither privacy nor innovation, and it is time, I hope, for us to raise our sights higher. I think the generals can be a leading force to drive a better discussion going forward. We can have better, truer privacy laws to protect the real rights and the real privileges of Americans, in many of the ways that Pam has begun to touch upon already this morning, and we can begin to force companies to provide innovation that actually provides real value, which, in the way that Tom was suggesting just a moment ago, consumers feel is lacking from what they've been provided heretofore. I'll be happy to talk as we go forward in the panel about some ideas about how legislation can be written in better ways to drive that forward. I think we've had a market failure; I certainly believe we've had a regulatory failure because of that. And I believe the generals can be and should be the force to change the narrative going forward for consumers in America. Thank you.

Thank you. And lastly, Jennifer, opening remarks. Thank you, General James. It's wonderful to meet some of you, and Pam,
it's wonderful to see you, and to be here with all of you. I will first say, obviously, I'm not Ashkan Soltani, and I apologize for that. We at the California Privacy Protection Agency hired Ashkan to be our inaugural executive director; on Monday the announcement went out. So I am filling in for him while he gets his feet under him, and I'm really honored and delighted to be here. Secondly, because of my role as the chair of the board of the California Privacy Protection Agency, and also my role at the University of California, Berkeley, I need to be clear that I will be speaking for myself and not for the agency, the board, my university, or any of my clinic clients. I know everyone in this audience understands that.

I'd like to follow up on the other panelists a little bit with my opening thoughts. One of the things that is most striking to me about the new California law that created the agency is that it came from the people. It was a proposition, a ballot initiative, voted on by the people, and I think that this is really a reflection of the clamor Americans have had, growing for at least the last 20 years, for more control over their data and their destiny. In my own research, I looked at Californians' and Americans' privacy attitudes and desires, and they did not match up with business models in any way, shape, or form. We, I think, are really seeing that: people are asking for better solutions, and I'm really glad to see that there is more focus on that at a policy level at this point.

In terms of what we need, I think other panelists have made points that cover a lot of the main things, but I want to emphasize that whatever we create needs to be usable for consumers, and it needs to reflect what is realistic. That means we need an understanding of technology, which Ashkan brings, and it means that we need an understanding of people. As Pam said, that means all people. That doesn't mean just the people who are looking at user experience in academic centers and companies; it means people from all communities, all groups, and we really need to be thinking carefully about that, about equity and intersectionality. Third, we need resources, which is a point that's already been made. We need to put substantial and sufficient resources into this problem. If you're thinking about what you might want to do in your state, that is an important question. And then finally, we need to avoid tunnel vision. Privacy does not exist in a vacuum. Privacy exists in a world that involves a marketplace, that involves people's lived lives, that involves the way they interact with one another, with technology, and with businesses across the board. So when we're thinking about privacy, we need to not forget about things like security, and we need to not forget about things like automated decision-making, because all of those things are interconnected and they all need to be taken into account. Again, I'm really delighted to be here, and I look forward to the conversation.

Thank you so much. So, my first question to all of the panelists. Representative Welch just made a recommendation with respect to a digital authority. A digital authority to some extent would be responsible for rulemaking, some degree of regulation, and recommendations to Congress. What are your thoughts? Anyone can take that question. I'm happy to jump in, although I obviously am a little biased, because we just created a digital authority in California. But I think this goes to some of the discussion that we've already had in our introductory remarks, which is both resources and thinking holistically about the problem. Privacy has been sort of parceled off into different departments and different laws for a very long time, and privacy doesn't just involve that topic; it involves all kinds of digital issues. So thinking carefully about how that would be constructed, I think, is important. But as for having a dedicated digital authority:
This is, again, my personal opinion, but it makes a lot of sense to me, and we do have models to look at. You know, California: you can watch us and learn from our mistakes and, hopefully, from our successes. Obviously, there are data protection authorities in Europe that can offer a lot of experience. Also, in the Americas there are examples that we can look to in order to think about how to build this. Anyone else?

Yeah, I would just mention two things. One, I know part of this discussion is whether or not we have this patchwork of state laws or whether we have something that's more broad. Having worked in companies, I know that having certainty really helps in terms of planning. So I think anything that covers a broader range and isn't a patchwork is frankly better for consumers, and I think it's better for society and better for companies, because at least they have some certainty in what they're dealing with. I think that's a fair request on their part, or a fair expectation, that they're not going state by state or locality by locality. So the answer is yes, I think there is value to that, but I think it has to be done in a very smart way. We know that after 9/11 we created DHS, which we thought somehow was going to solve it all by putting it all together; frankly, mixed results on that. So whatever we do, the answer is yes, but we have to be very smart and make sure it's targeted and doesn't become a bureaucratic morass in itself, which is not good for consumers, not good for the companies who are providing services, and ultimately not good for society.

I'll jump in. Thank you for the question.
It's a very good question, and an important question to really think through. There are now 146 jurisdictions in the world that have comprehensive national-level privacy regulation. Of these, about 145 have remarkably similar characteristics and policies. Among them is the creation of what's called a data protection authority office. The data protection authority offices throughout these jurisdictions are responsible for analog and digital privacy. It's super important that we understand that not all privacy is digital; we can't just focus on online privacy. The data ecosystems of our world are not just digital: they are analog and digital and merged and all sorts of other things. When we go into a store and use a debit or credit card, we are purchasing something in person, but that information is also digital. So there is a real blending of these ecosystems in very complex ways. I think we've got to really focus on the people, on the governance, on the enforcement, and also on auditing and making sure we're in a continual process of improvement. Those really basic administrative tasks, seen as part of the larger picture, are something we need to be very careful to keep in focus. And I'm going to emphasize again that whatever is created, we must ensure that it is fully inclusive, and that we are not creating an elite form of privacy for elites. We've got to think more broadly, and we need to learn the lessons we've seen in the pandemic in particular, and really open up the gates and give people who have not had a voice a voice in these kinds of decisions. Thank you.

Thanks. If I may add to my fellow panelists' remarks: there's an urgency that needs action.
I think we all feel it in the marketplace right now. Put me down as someone who has seen the AGs act with great authority, and on a number of important cases, over the last two and a half decades in the privacy arena. You have written the body of law that we call privacy in the United States, largely through your actions and enforcement actions. So, for me, I would not want to eschew your valuable expertise. I'd rather see greater resources, much greater resources, brought to bear in your offices immediately, in the interim, while we're considering whether or not to have other data protection authorities set up in addition to, or in conjunction with, your offices. I would not want to see the common law authority that you've attained over, you know, literally 800 years of English legal tradition, and the state statutory authority that you've obtained, or your constitutional authority, which may allow you to take particular actions to protect consumers, be set aside while we're creating a separate data protection authority in each of your states. Those powers are so important to allow you to enforce against particularly pernicious acts that may be committed by bad-acting companies or bad-acting individuals in the privacy space. So if we can have an office that has your authorities and has your resources and then some, I'm all for it. If not, let's give your offices what you need so that you can go do the work that we need to do on behalf of all consumers everywhere. Thank you.
And we obviously would welcome more resources; all of the attorneys general who are here today, and those who are not, would welcome that. But we do know that there is no comprehensive federal legislation governing data privacy in the United States, and, as was mentioned, there's a patchwork of some states that have passed comprehensive privacy legislation. They include California, Virginia, Colorado, and Vermont, and if you don't mind, I'm just going to briefly go over that legislation. California has passed several privacy statutes: first, the California Consumer Privacy Act, which was passed in 2018 and went into effect in 2020, and then the California Privacy Rights Act, which was passed as a ballot initiative in November 2020 and will go into effect in 2023. I'm sure Jennifer can tell us more about that. The California Consumer Privacy Act established the right to know about the personal information a business collects, the right to delete personal information, the right to opt out, and the right to non-discrimination, and the ballot initiative basically strengthened the California Consumer Privacy Act. Vermont in 2018 passed the nation's first law to regulate data brokers. What is a data broker, you ask?
Well, I'm going to tell you. Data brokers are entities that knowingly collect, or sell to third parties, the personal information of consumers with whom they do not have a direct relationship. The Vermont law, which went into effect in 2019, requires data brokers to register annually with T.J. Donovan, the Vermont OAG, and pay an annual $100 registration fee. Virginia: the Virginia Consumer Data Protection Act was signed into law earlier this year and makes Virginia the second state, excuse me, the second state to enact a comprehensive privacy and data security law, after California. It provides consumers with six main rights: the right to access, the right to correct, the right to delete, the right to data portability, the right to opt out, and the right to appeal. Colorado: Colorado is the third state to pass a comprehensive data privacy and security law, with the Colorado Privacy Act. It gives Colorado residents the following rights: the right to opt out, the right of access, the right to correction, the right to deletion, and the right to data portability. And as for my own state, New York, there is some legislation that has been introduced, but unfortunately it has not passed the Assembly and/or the Senate.

So my question to the panelists: do you think the remaining states and territories should be adopting some variant of these laws, or should they do something entirely new? What sort of alternative approach would you consider and/or recommend? Is there language, or are there concepts, that should be deleted or that should be included? Or, again, should we just pass what the congressmember indicated, and that is a digital authority? I'm going to begin with Pam. Oh, thank you. Not a difficult question at all. All right. So, that's no problem. I'm going to propose that there is a very important role for the states.
I know that many bemoan that the United States does not have comprehensive legislation, but there are folks who have been working in privacy for even longer than I have, and their consensus is that, you know, maybe there's a reason for that. Maybe there needs to be more time here in the U.S. for the consensus to truly emerge about what's right. Each state has a different context, and in privacy studies that we've done, which are very data-focused, we've understood clearly that there are significant differences in privacy attitudes between urban and rural communities. So more rural states will have different needs and will need slightly tweaked regulations that will be more effective for them in ensuring that privacy can reach all the way through, from the leaves of the tree all the way down to the deepest root, and I think that we've got to consider that. So, having said this, I like the idea of each state looking at its context, doing its own analysis based on its stakeholders, and really having a robust conversation, not of everyone in the United States, but of that region. I'm based in Portland, Oregon, and here the Attorney General has been convening what they call a Central Table task force.
It's comprised of 10 individuals that are meant to represent, you know, broader stakeholder groups. I do sit on the Central Table, and I represent consumers and privacy interests along with other consumer and privacy groups in Oregon. But the thing that's very unique about this is that it's really Oregonians, and we're talking about what fits for Oregon. I think that if states are going to look at this, that's probably the most important thing that can be done: to really conduct local stakeholder meetings and gather input, really from all stakeholders, from business to consumers to the various sectors, health, education, etc. We really need that, and I think that there's a lot of room for a robust conversation to be had there. I would love to see that, as opposed to a replication of what has worked for other states, which may be much more populous or may have different population characteristics and economic characteristics. Thank you.

Thank you, Pam. So Oregon believes that one size should not fit all, led by their fabulous Attorney General, Elisabeth. What about the others? Patchwork or one authority? I'll jump in. I prefer Brandeisian laboratories of democracy to patchwork. And while I do think that having some consolidated expertise, as I mentioned earlier, can be beneficial, I agree with Pam 100 percent that states have their own context, and what makes sense for individual states should at this point be paramount, for a few reasons. One, of course, is our Brandeisian tradition, but also, the issues continue to evolve, and they evolve quickly.
And so they combine with the different contexts of different states in a way that I think makes it especially valuable for states to be thinking about what works for their own populations. That said, I do think it is not as though it has to be, you know, a complete free-for-all with things being very different. As Pam said, naturally, the data protection authorities in Europe, and the laws that have developed there, tend to have some commonalities that are very well conserved across jurisdictions, and in our own developing set of options here in the United States, a lot of those have been conserved: the right to correct, the right to delete, the right to know, the right to understand what is happening with your data and to have some control over it. Having those kinds of touchpoints is something that I think will naturally happen, because states will be comparing what's available to their context and to how things are developing. I will say I think there are some emerging things that are particularly promising. The first is not even emerging anymore.
Thankfully, we seem to have had a decisive move away from list-based, really constrained definitions of what personal information is, toward more flexible and dynamic definitions that actually represent how people interact with the world and how their data interacts with the world. Automated profiling and decision-making is starting to be recognized as a key area, and I think that is a really important thing for states to continue to look at, as well as remembering the connection between privacy and security. And there are also more wonky, detailed things that I think are getting traction, and I think that that is valuable. For example, the concept of cross-context behavioral advertising, and the concept of businesses that a consumer intends to interact with (rather than, you know, just hovering your mouse over something indicating some sort of intent). I think these are concepts that are valuable, and it's good that states are thinking about a set of norms. But those sets of norms and common practices should always be filtered through the lens of what is appropriate for that state.

Tim, I saw you shaking your head. It was mostly in approval of Jennifer's remarks, and those of Pam. There's a lot of commonality of opinion here, and so let me just maybe extend the remarks, if I may, of Jennifer and Pam, rather than disagreeing with them. To my mind, the best privacy statute out there is the one that hasn't been enacted yet. I would point you to legislation pending before the Massachusetts state legislature: it's S.46, the Massachusetts Information Privacy Act. I would suggest to all of you that if you're going to model state legislation, this is the model that you want to work from, and let me tell you why I think it's superior to what's been passed heretofore. I commend those three states that have taken the leadership role that they have and enacted those three comprehensive bills. They are a good start.
I hope we can do more. I also want to point out that the Vermont data broker bill is groundbreaking as a transparency effort, really shining a bright light on what I think is the most important privacy problem that doesn't get any attention: the data broker industry, which buys and sells the data of every single American, all day, every day, and literally turns each of us into a commodity, without our ability to do anything about it. That is the problem set that needs the most attention but gets the least attention in the privacy world right now.

Back to the Massachusetts bill for a moment, if I may. What I think is important about the Massachusetts bill, and what makes it different from what's been passed heretofore, is that it sets up bright-line rules, and it does it on important things. It sets up bright-line rules that work both for innovators and for consumers, by pointing out a series of thou-shalt-nots. It tells businesses: you can't do the following things with data that you've gotten, however you've gotten it. And it does that at the most important moments in consumers' lives. It says you can't use data, or misuse data, with respect to people's education, their housing, their credit, their jobs, their health, etc., etc.
These are the most important moments in their lives that we've identified. Pam and I have worked on these questions for a number of years, trying to identify some of these critical moments, and so this bill has taken these and added to them. It says: you businesses, if you get this data, you can't use it to redline against people. You can't use it to discriminate against groups of people, either intentionally or accidentally. It points out that algorithms, which are simply math, need to be fool-proofed to make sure they aren't used to either intentionally or accidentally discriminate against groups within society. And I think that kind of bright-line guidance, which takes off the table the worst things that can happen with data, the misuses, the accidental uses that can be truly harmful, is the next and most important thing we can do to really protect consumers. Because from my vantage point as a consumer advocate and as a lawyer, the bills that have been enacted so far into law give us rights to understand what data has been collected about us, to be able to access it, and to correct it, but they don't yet give us a way to stop the misuse of data that is happening right here, right now, misuse that really can change and alter the direction of our lives. And so I'd point you all to that Massachusetts bill and say that really is where a lot of good work can be done, and it can be done in concert with the Vermont bill, which actually points out which data brokers are buying and selling the data to the businesses that are then misusing or intentionally abusing the data of consumers.

So, Tim, you mentioned that these data brokers are part of the privacy ecosystem. If states were to adopt the Vermont model, is there anything that you would add or delete with respect to any legislation? Absolutely, and thank you so much for the question. I certainly would love to hear my co-panelists' views
on this. The registration statute is a good first step. It forces companies which have been in the shadows to come out into the light. What it does is make them register with the state if they meet a series of criteria: companies that are collecting data from consumers, from all the moments in our lives, from what we buy at various stores, from what subscriptions we're using online, from the people we're interacting with, from the online interactions we're having. All this data is being bought and sold by literally thousands of companies, and then resold to every other company to market against us, and frankly to make decisions about us, about what goods and services we're offered and at what prices. And this kind of price discrimination, decisions about housing, decisions about education, decisions about health, these are being made without our consent and without our knowledge. So to your question, General, what could be added to these registration statutes, the two that are out there, Vermont's and California's, which have led the way, is a series of these thou-shalt-nots that prevent these data brokers from selling data, across a series of categories, to a whole bunch of businesses.
I would try to take out this model of commodifying Americans, especially the way it happens in the third-party context, where businesses that none of us even know are collecting this data and selling it to other parties, when we as consumers can never even control it. Tom talked at the beginning of his remarks about consumers not feeling like they have any ownership right in their data. The data broker industry is the entity that takes away that ownership right and then gives our rights to other businesses, because if our data is being bought and sold without our consent, without our knowledge, without our input or control, that really disempowers each and every one of us. That's where I would begin, General.
Tom, what are your thoughts?
Yeah, I'll be quick here. It seemed like the question fundamentally comes down to accountability, which is the reason for registration: so they can be accountable. We have plenty of constructs where we have accountability both to a federal entity and to the states for bad acts or behavior. So I don't see them as necessarily mutually exclusive. I think a lot of this plays out in the way we approach it, so I don't know that we have to make that choice necessarily. I don't think they're mutually exclusive, and I'll leave it there.
Any other panelists want to comment?
Oh, definitely. First I want to commend Vermont in particular, because they did something amazing and brave. They were the first state that passed a data broker anything, and I have to tell you, if I wake up and I'm having a difficult day, I still remember that, and I'm pleased. It was meaningful.
I want to talk in very practical and concrete terms here. First, there are a couple of things that Vermont did that data broker statutes in other states have not done. Number one, the Vermont definitions of data broker and other definitions were very sharp and clean. Other states have not followed through with this clarity of definition, and it's caused a lot of what I would call slippery issues in terms of who gets registered and what information is displayed and how. I think that could be improved; I like the Vermont model. The second thing that Vermont did was absolutely groundbreaking, and no one really knew it at the time. My understanding is that someone in the AG's office thoughtfully added this during the negotiations, and that is to require information about the knowing collection of the information of children. This is extraordinarily important to include in any data broker registration system, and I'll tell you why. We monitor the Vermont data broker registry very closely, and this is why, in January of 2020, we found an entry from a little company called Clearview AI, which willingly admitted that it was scraping the information of minors without consent. So the registry provided extremely important light on that activity. But there have been other entries which have been extraordinary and just a bit mind-boggling. For example, a very large national-level data broker registered in Vermont, and dug way down in its registration one could find that it knowingly utilized the data of minors to create an inference score for parents. That's not okay. This is the use of AI and machine learning to create this type of scoring; we have an entire report about this called The Scoring of America. This is a big deal. But I do think that Vermont could go another step, in very concrete, practical terms: let's explore an opt-out, because now we know about what's being collected.
Can we please be able to do something about it? I recognize that this would be a very big fight, but I do think that the definitional parameters of what Vermont has done are narrow enough to allow for this. And let me raise a very significant issue that we are researching right now, and it is this: during the pandemic, because of the length of time the pandemic has gone on, we have a public health data set that is emerging into public view. As I think everyone in this room knows, HIPAA does not cover public health data; state regulation covers public health data. So we have a bit of a mess, in that a lot of people are now having their private data, which is public health data, circulated in all manner of vaccine credentialing systems and elsewhere as well. We need to take a look at this. This is a very important data set and it's getting larger. Whether a person is vaccinated or not is not what I'm bringing into question here; what I'm bringing into question is the emergence of huge and unprecedented volumes of public health data on individuals moving into the wild. What do we do with this? Because we're already seeing that the data, in aggregate, mind you, in aggregate, not microdata or personally identifiable data, is being utilized for public health purposes, but also now for other purposes as well. So I think this is something that we're going to be grappling with for a long time, but it needs attention.
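The registry disclosures discussed above can be pictured as a simple record. The sketch below is purely illustrative: the field names and the review rule are assumptions for discussion, not the actual fields required by Vermont's statute.

```python
from dataclasses import dataclass

@dataclass
class BrokerRegistration:
    """Illustrative data-broker registry entry (hypothetical fields)."""
    broker_name: str
    collects_minors_data: bool  # the Vermont-style child-data disclosure
    opt_out_offered: bool       # whether consumers can opt out at all
    opt_out_method: str         # how a consumer submits an opt-out request

def flag_for_review(entry: BrokerRegistration) -> bool:
    # Surface entries like the ones described on the panel: brokers that
    # knowingly collect minors' data, or that offer no opt-out path.
    return entry.collects_minors_data or not entry.opt_out_offered
```

A registry structured this way makes the kind of monitoring Pam describes mechanical: filtering registrations on the child-data disclosure is one line, rather than digging "way down" in each filing.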
Thank you, Pam. That raises another issue for another segment, but we really do need to talk about those data sets; that's very troubling, as is the fact that they might be out in the ecosystem. But you did bring up a really important issue, and that is allowing consumers to opt out, and whether businesses should instead be required to obtain an affirmative opt-in before collecting data. And another hot-topic issue: a private right of action. What are your thoughts? Let's begin with Tom.
Sure. So we are at the heart of the business model of digital platforms with this question. I actually wrote it down in advance because I wanted to think about it. Users plus traffic equals data; data equals monetization; monetization equals revenue; revenue equals a stock price; and a stock price offers the ability to attract talent and acquire companies. Fundamentally, I'm leaving some stuff out for simplicity, but that is the business model. If you don't pay for a service, well, there's an old adage: if you don't pay for a product, you are the product. That is fundamentally the business model right there. And so in an opt-out situation we are disrupting that business model. That doesn't mean we shouldn't do it, but we're getting at a fundamental issue there. So I look at that and say this is what we need to get to: as consumers, as authorities such as yourselves, and as the companies who are part of this, we need to get to this question of the business model. What does it mean for consumers? What are the implications for these companies? If there's an opt-out, it will disrupt this business model. That's not a bad thing; I'd actually probably support it. But that's what we're getting at here. This is a really significant question that we have to answer moving forward.
So California, Virginia, Vermont, and Colorado all have opt-out versions in their legislation. Jennifer,
Do you want to speak to that issue? Should more states have opt-outs? And, I know nobody wants to mention it, what about a private right of action?
I think those are both really hot topics, and I'll actually start with what Pam mentioned about the data broker registry. Vermont really is the leading light here. California did amend the CCPA in 2019 to add a data broker registry; it's run by the California attorney general, and it does include a right to opt out, so you can look at the disclosures, and each disclosure explains how a consumer can opt out. There are things that Vermont has that I personally, again speaking for myself, wish we had: disclosures about collecting data from children, as Pam mentioned, and also disclosures about data breaches. But on the question of the opt-out, which is not just the data broker question but the broader question that General James brought up: one of the most important things, I think, for everyone to think about is what you mean by opt-out, or opt-in. The model so far has been opt-out. What do you mean by opt-out? What is the mechanism by which people can opt out? Opting out has to be realistic for both consumers and companies in order for it to work. So in California we have two things, and we can see how they develop. One is that people can designate others to opt out for them; that could be a company or it could be NGOs, and both NGOs and companies have stepped up to provide this service. Secondly, California has what is sort of a global opt-out.
It doesn't operate in the offline world, but in the online world the attorney general's initial regulations refer to this Global Privacy Control; the CPRA refers to it as an opt-out preference signal. It's an automated way for consumers to express their rights and for businesses to respond automatically, and this is something that policymakers could look at carefully as well, in order to create an opt-out regime that provides certainty for businesses and also provides a realistic way for consumers to express their rights.
Pam or Tim, anyone want to weigh in?
I'll jump into the blazing fire of the private-right-of-action issue. I love a good discussion, and this is just about as good as it gets. So, the position of the World Privacy Forum is that we do not take a position on the private right of action, and please let me explain why. The private right of action has been a sword that all the parties typically present in the privacy debate fall on and die on. It has stopped meaningful dialogue about other aspects of privacy. So I believe that the private right of action should be left to the states and to their practices; most states have a very significant set of legislation which will either include or exclude it, or they'll have their own policies. That's where I sit with that. Having said that, I think there's something very important here. In the privacy world there's a certain subtle lack of maturity in working with people, in working with people who disagree with the positions
we may have. It's really important to take a cue from the environmental movement, which is older than the privacy movement, and to understand that the way we can make progress is through cooperative dialogue. We need to be able to listen to other people and to other stakeholders and their points of view, and the private right of action is the same. There are many nuances in how a private right of action can be effectuated. So instead of simply falling on a yes-or-no sword question, yes PRA, no PRA, death, I think we would really gain a lot if we learned how to trust each other more, worked together better across all the sectors and stakeholders, all of them, and learned to have a more inclusive and more compassionate dialogue. And I mean that: we need to be able to regard the views of others without instantly putting up our shields and instantly blocking those views. I think if we can learn how to communicate better, and not to sound airy-fairy, but this is just strictly how more mature sectors do business: if you look at environmental regulation, you see that very challenging questions, such as the PRA, have been acknowledged and discussed in a collaborative way, while still understanding that there are differences. I think we've got to get there, and I think a good first step is simply to be able to have discussions without falling on our swords.
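The "global opt-out preference signal" Jennifer described a moment ago corresponds to the Global Privacy Control, which participating browsers transmit as a `Sec-GPC: 1` request header. A minimal server-side sketch of honoring that signal follows; the function names and the opt-back-in flag are assumptions for illustration, not any statute's required implementation.

```python
def honors_gpc(request_headers: dict) -> bool:
    """Return True when the request carries a Global Privacy Control
    opt-out signal, i.e. the Sec-GPC header set to "1"."""
    # HTTP header names are case-insensitive; normalize before lookup.
    headers = {k.lower(): v for k, v in request_headers.items()}
    return headers.get("sec-gpc") == "1"

def may_sell_or_share(request_headers: dict, user_opted_in: bool) -> bool:
    # Treat the GPC signal as a do-not-sell/share request unless the
    # user has separately and explicitly opted back in.
    return user_opted_in or not honors_gpc(request_headers)
```

The point of an automated signal like this is exactly the certainty discussed above: the consumer expresses the preference once, in the browser, and a business can evaluate it mechanically on every request.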
Thank you. But if I may: the abuse or misuse of data to engage in deeply discriminatory practices against individuals and groups within society, particularly protected groups, which are not in fact being protected from these abuses of data, is so significant right now and so unaddressed that we do need to bring greater attention to it. So put me down as somebody who is open to the idea of a private right of action reserved specifically for these kinds of redlining actions, these discriminatory abuses which otherwise would go unaddressed. Until and unless the AGs are fully resourced, until and unless the FTC is fully resourced to address these particular questions, I would see a private right of action as something that could be really meaningful. Now, what I would not do is what I think most businesses are really deeply afraid of, which is give a wide-open private right of action for every ticky-tack foul out there, because I think that could be really harmful. It would really draw attention away from the most meaningful harms that are not being addressed. So I would truly, carefully craft a private right of action to address these moments in our lives, in the lives of groups of individuals, where data is used to alter the course or direction of people's financial moments and their families' histories. If we're careful how we draw that kind of statute, I think a private right of action could be very helpful, again, until we've properly resourced your offices.
So, Jennifer, let's step into the fire a little more. In addition to breaches, what about giving consumers a property right in their data? Would that be technically workable? What do you say to that?
Yeah, here I'll have to lean on my academic background.
This is definitely not anything to do with California policy directly. This has come up many times in my academic career, all the way back to when I was in law school, at least 20 years ago, probably longer. Periodically, someone will say, hey, why don't we have a property right in data? It is a beguiling idea. We like property in the United States. It is an obvious sort of move; it signifies a locus of control, and it appears, perhaps, to provide some way for individuals to recoup the profits that are being made off of data about them. I have put into the CLE materials an article by my colleague Pam Samuelson from 2000, entitled "Privacy as Intellectual Property?", in which, in her way, Professor Samuelson worked out all of the issues before most people thought of them. My fundamental thought about this, and again, this is my academic opinion, is that it is beguiling but fundamentally unworkable, for a few reasons, the main two being these. First, data doesn't always neatly connect to just one person. DNA, for example, connects families, and it's very hard to think through how that might or should work. That's a practical, intellectual-property-nerd sort of question. The sharper issue, which can be counterintuitive, is that property equals alienation. If there is a property right in data, then that property right can be transferred, and we know from the way that click-wrap agreements work, and the way that decisions are presented to individuals and how they react to them, that it might be very likely for people to actually alienate that property right in their data and no longer have it. And so for that reason I do think it's really not a practical solution. Others have different views, but that's my view. On the private right of action: California has a private right of action, and it is limited to data breaches above a
certain threshold. What I would say is that I would encourage people, as Pam does, to think through the purposes of a private right of action and how it might need to be targeted. Tim gave some thoughts about how he thought it should be targeted. What is the best basket of things for the attorney general to enforce, and is there a basket of things that it would make sense for individuals to enforce? Again, that's my very personal thought, and just a little bit of information about what California does.
I'd like to add on to Jennifer's comments, because our views are quite similar on a data property right. I'm just going to put forward a couple of data points here. The first is that it's really not practical, because of the extraordinary complexity and interconnectedness of data ecosystems now. Most major data ecosystems today operate in either real time or near real time, and as they do all of this they also cross borders into other jurisdictions which have other laws which would control this data. I would point out, for example, the FINRA system, which is a financial-sector system that crosses borders. It's a system run by essentially a regulated NGO.
It's an SEC system, but run by an NGO, and FINRA is extraordinary because you're dealing with more than one billion data points per day, and it's all for enforcement of SEC rules. What FINRA has had to do is move to a very complex AI and machine-learning system to be able to parse all of the reports they're getting. And this is not the only real-time data system: if you think of the CDC, the World Health Organization, the European equivalent of the CDC, and the African and Indian equivalents, you have to realize that all of this data is rumbling and rolling and being utilized through these systems, and it happens just that quickly. So because of that, a property right in data becomes an enormously challenging right to effectuate. It is probably very unlikely to be practical.
Thank you. Tim, you mentioned that the FTC was under-resourced. If the FTC were properly funded, would it be the singular agency that would be, or could be, responsible for protecting our privacy?
At last count, General, and by my count as somebody who has practiced before the FTC for many years, the FTC is currently enforcing 65-odd statutes, of which a few are privacy statutes. But I don't think it's possible for the FTC alone to be the single cop on the beat, if you will. We need you and each of your colleagues to have fully funded offices. I've hit this drum many times and I'm going to keep hitting it: you need more resources. I hope your state legislatures are listening, and I hope they fund you fully, because I think it needs to be a companionship where you are working hand in glove. There are going to be things that happen in your states, and in municipalities within your states, where you're going to be the first entity that sees what's happening. And the FTC is fully occupied.
I think the FTC is currently reviewing something like 200 consent decrees in the privacy space, and just the management of those consent decrees, most of which are now on a 20-year time horizon, absorbs virtually all the staff time that's available. So just managing the existing consent decrees from settlements that have been reached on privacy issues absorbs the entire staff's time. We could quintuple the approximately 40-person staff and we would not be close, I think, to giving the FTC what it needs to begin to bring new and meaningful cases. So no, I'm sorry, we can't get there with the FTC alone, from my perspective.
Yeah, I go back to something you started this conversation with, that our laws have not kept up. And it's not just our laws, it's our organizations. Thinking about what Tim and then Jennifer said about whether it's practical to have a right to this data: Tim's point is that we are not yet at a point where we have organizations that are resourced to do this. And from a practical standpoint, look at the music industry. When someone sings a song, there's an engineer, and there are labels, and there are background singers and musicians, and the process of making sure that that right is actually monetized and allocated the right way is really difficult in 2021. So is it practical in 2021? Probably not. Will it be practical in 2030 or 2040, when our ability to manage data might be different? That might change. So right now there's a question of practicality, because we're not modernized; our systems are not modernized enough to be able to address the issues Jennifer raised. And Tim's point is we don't have an FTC
that's actually able to be effective, in the ways we're talking about, in 2021, for the 21st century. So I don't think they're actually different: what you started with is that we are not prepared for this world right now.
This is my last point before we open it up to the AGs and to everyone in the audience. Many have argued that the Fair Credit Reporting Act contains all of the protections, rights, and procedures that a privacy law could require. It also has the benefit of being a law that we've had for over 40 years to work with and to develop jurisprudence around. So what do you think of using the Fair Credit Reporting Act as a basis for comprehensive privacy law, especially expanding its scope to cover all data, not just credit reporting data? Anyone can take that on before we ask our AGs if they have any questions.
Okay, thank you so much. It's a great question. I love the Fair Credit Reporting Act; it's a fabulous piece of legislation. What a lot of people don't know is that when the Senate Banking Committee started to draft it, they called in all the stakeholders. They called in all the credit bureaus, they called in all the financial-sector regulators, they called in people who had been harmed. They had an extraordinarily noisy and difficult conversation; you can find it all in the congressional history on that statute. It's an extraordinary piece of legislation, and what makes it extraordinary is its balance: it effectuates rights for consumers and ensures that they can get them effectuated. However, the Fair Credit Reporting Act does have some gray areas, and I do think there is a good place to expand those areas.
Tim has basically been talking about eligibility today: eligibility situations such as acceptance to educational institutions, and other eligibility situations such as getting a job, et cetera. The Fair Credit Reporting Act doesn't cover all modern eligibility situations, and that could be expanded. I don't know if it would do everything, but it's definitely worth looking at as a model.
Thank you. Any questions from our attorneys general or their representatives? Any questions from the general audience? Yes, sir.
Thank you. You spoke early on in the panel about a data protection agency. Any agency like that would need experts who understand these issues, and the technology, and everything. How do we ensure that such an agency would remain independent of influence from its regulated entities, while still ensuring its officials and staff have the knowledge and tools required to perform the day-to-day regulatory work?
Anyone want to take that on?
Well, I can say how California is set up, in case that is helpful. It was the proposition that created our agency. Our agency will ultimately share enforcement authority with the California AG: the California AG will be responsible for civil enforcement, and we will be responsible for administrative enforcement. So there is a chunk of enforcement, a very important part of enforcement, that remains with the AG, which has its own constitutional protections. For the agency itself, there are requirements in the law. First of all, the agency is independent, or quasi-independent.
It is not under the control of the other branches of government, although they do appoint the board. The requirements in the law are for the board to be scrupulously independent and to take no direction from anyone else. And then there are time limitations after leaving the board during which somebody would not be able to come back and influence or work on a case that is before the board: there's a one-year limit for work, and a two-year limit for advocacy, and then of course the regular conflict-of-interest limitations would apply. We're in the middle of preparing our incompatible-activities policy, which relates to the kinds of things that employees can do. So you can talk through, and think about, whether you think that is sufficient, but that is the model that California is set up under. There is a challenge, and it's always a challenge, isn't it, for all of us, with being able to attract talent given the pay that we can offer compared to the pay that others can offer; that is a practical question. But so far I feel really delighted, and we have been successful in that we managed to hire Ashkan Soltani, and then we'll see how it goes. So thank you for the question.
It was a great discussion. Let's give our panelists a round of applause. And let me conclude by saying that it has been up to government to safeguard privacy since the days when the concept of privacy first evolved, and it's really critically important that we all play a role, and that we all work together to find a way to stay connected and protected. Thank you all.