All right, good afternoon. We will start the one o'clock panel, The Surveillance Economy, moderated by Connecticut Attorney General William Tong. General Tong, the floor is yours.

Thank you, General Donovan. Welcome to Story Hour with William Tong. It will be my job to put you all to sleep over the next hour and 15 minutes. Because of the COVID economy, I will be your only live guest today, but we have an august panel joining us for another discussion about the surveillance economy. And we've really got some great thinkers on these issues. Let me first thank General Donovan again and his entire team here in Vermont for welcoming us. It is a beautiful fall weekend in New England, and we hope that our guests from other parts of the country get to enjoy our part of the country. I'm going to try to get out and enjoy the lake for a run a little bit later. But it is a wonderful time to be here in Vermont.

Let me start by introducing our panelists. Do we have our panelists on screen yet? So we're joined by a number of folks. Let me first introduce our friend from the South, Sarah Cable, who is the Chief of the Data Privacy and Security Division for the Massachusetts Attorney General's office. She is an Assistant Attorney General in the office and leads the office's cybersecurity and consumer privacy protection efforts, overseeing investigation and enforcement matters under the Massachusetts consumer protection and data protection laws. Prior to her current role, Sarah was an Assistant Attorney General with the Consumer Protection Division of the Mass AG's office, where she investigated and prosecuted violations of the Massachusetts Consumer Protection Act. So can we give her a round of applause and welcome Sarah, please?

We also have Sean Davis, who is the Director of Digital Forensics at Edelson PC. Sean leads a technical team in investigating claims involving privacy violations and tech-related abuse.
Sean is also an adjunct industry professor at the Illinois Institute of Technology, where he has taught and designed the curriculum for courses on cybersecurity at the undergraduate and graduate levels, and he additionally serves as a committee member for the Federal Advisory Committee on Evidence Building at the US Department of Commerce. Welcome, Sean.

Claire Garvey is a senior associate at the Center on Privacy and Technology at Georgetown Law School. She was the lead author on three of the Center's reports on face recognition. Her commentary has appeared in The New York Times, The Washington Post, and The Wall Street Journal, and she serves as an expert resource to both Democrats and Republicans in Congress and in state legislatures. Her current research focuses on the use of face recognition-derived evidence in criminal cases and the ways activists, public defenders, and policymakers can ensure that the technology is under control. So please welcome Claire Garvey.

And finally, but not least, we have Maureen Mahoney, who is a senior policy analyst at Consumer Reports. Her areas of focus include state data privacy, security, and data breach notification legislation, and state right to repair legislation. At Consumer Reports, she authored the study The California Consumer Privacy Act: Are Consumers' Digital Rights Protected?, and she co-authored The State of Authorized Agent Opt-Outs Under the California Consumer Privacy Act. She also authored Consumer Reports' Model State Privacy Act and a right to repair model state law. So please join me in welcoming Maureen.

So we've had a long discussion today about the surveillance economy, and I think for many of us, we're still trying to figure out what that means. And part of the exercise today is to get a better understanding and to get our arms around what it means to us personally, and what it could mean to our constituents and our families and our communities in our states.
So I've given it a lot of thought in preparing for this panel. And to me, it just means that somebody's watching us all the time, and they're observing who we are, what we do, what we buy, what we like, what we read, and they're making money off of it. And as I think about that and all the questions and concerns that arise from it, there's obviously an ocean of legal issues and policy considerations that pertain to that general understanding of the surveillance economy. But there is a much better and more eloquent framing of what the surveillance economy is in a book, The Age of Surveillance Capitalism, by Shoshana Zuboff. The New York Times review of her book summarized Ms. Zuboff's thesis. It says: according to Zuboff, surveillance capitalism distinguishes itself from its industrial forebear as a new economic order that claims human experience as a free source of raw material. We are the resource to be mined. The billion-dollar profits of Facebook and Google are built on a general accounting of our lives and everyday behavior. And if you want to read her book, again, it's The Age of Surveillance Capitalism by Shoshana Zuboff. I just thought that, for me, pretty well encapsulated this big issue that we're trying to tackle today.

So before I move to our panelists, it seems to me that there are some obvious questions that we're trying to answer, right? We have this quote-unquote surveillance economy, gig economy, new economy, information age. How do we regulate this economy in order to protect consumers and to protect our constituents in our states? And that question is a federal and state question. It's an industry question. What role does industry have in self-governing and helping all of us as a society, not just governments, but as a community and society, build an infrastructure to keep each other safe while advancing innovation and technologies that enhance our lives?
As part of that, another obvious question is what do we do about our own personal information and protecting it? What rights do we have to our own personal information? And we think about all of that under the general idea or rubric of privacy. And I'm very fortunate that Attorney General George Jepsen, my predecessor, created what I think was the first privacy department in any AG's office anywhere. But that also tells you how new these issues are, and how many of the issues we are confronting, if they're not matters of first impression, are certainly of recent impression. And then of course, over the last 24 hours, I've talked to a bunch of folks about how we look at privacy and our personal right to privacy and our personal information, and how our friends in Europe, for example, look at it, or our friends in Asia. The difference is that maybe in the US, we see personal information as a commodity more often than not, while in Europe, they're conceiving of it as a personal property right. And then of course, we have to ask ourselves, what is the personal information to which we have a right to privacy, right? Is it just the basic information, who we are, where we live, height, weight, gender, sexual orientation, race, religion? I think we all know by now it's much more than that.

And for me, it's most concerning and really comes into focus when I think about my kids. And I don't have to tell you, there's been a lot of discussion over the past week about social media platforms and their role in our lives and in our kids' lives. And I have a 15-, a 13- and a 10-year-old, two girls and a boy. So I am the proud, somewhat challenged father of two teenage girls right now. And every day I worry about their lives, not just with their friends and their peers in sports and school, but their lives online and the risks that they encounter.
And obviously, I'm very concerned because I see for myself the messages that they're getting on social media. And I like to think that we've tried really hard to raise two confident young women, but they get messages every day that show them idealized images of young women and behaviors that may make you more popular or more cool. And I can see how it can lead to young people, young women in particular, questioning their own self-image, their body image, their behavior, what they should do and what they should not do to be popular with their peers. And I can see how that can lead to depression, mental health challenges, self-harm and worse. So this is a really difficult stew of new technology and impact on all of us that we're all trying to get our arms around and figure out how to not just navigate, but to manage, and to protect not just my kids, but all of our kids and our families in this new environment.

So with that opening, let me just pose a question or a prompt to each of our guests here today. I will start with Sarah, but here's my question. Sarah, when you get up in the morning, and obviously this is your job, right, and you think about the surveillance economy writ large, you think about privacy, you think about some of the considerations that I talked about just now. What's the biggest question that you feel like you spend your time trying to answer? Or what question challenges you the most, or eludes you, or troubles you? You know, what is it that you chew on every single day when you're thinking about this area of the law and law enforcement?

Sure, that's an easy question, Attorney General. I'm kidding, thank you for having me.

I'm glad I gave you a softball.

Yes. Hello, everyone, and thank you. I apologize for not being able to be there in person, but I see many of my colleagues from other states are attending virtually or in person, so it's nice to see some familiar names.
You know, I think for me, I cut my teeth on really coming at this from a lens of consumer privacy and protection of data, and I gotta say that no longer feels like a very satisfactory approach for me. Because I think, you know, privacy is important, but not just because privacy is important. I think privacy is important because it's about control and it's about power, and someone who knows more about you has power over you. That's part of why privacy is important: it's to create a zone where you are the king or queen of your own domain. And so I think for me, the nagging problem that I chew on is how do we more fairly balance the economy between these tech companies and consumers? And, you know, you think about it, especially with the disclosures of the last week involving harms to consumers and especially to teenagers. You have the computing power of some of the smartest minds and the most powerful companies in the world, who have some of the most granular, detailed information about us, on the one side, and you have a consumer on the other. There is no way that can be a fair fight. And I think we can give consumers all of the power in the world to consent, to delete their data, to correct their data. Maybe we'll equalize the scales a bit, but I really don't feel it's anywhere near enough. What the surveillance economy means to me is, you know, there's a fundamental imbalance in the market between consumer and business, and correcting that imbalance is something that I feel a tremendous urgency around. And I know my AG, who I saw in an earlier panel furiously taking notes, and I'm expecting a phone call from her, but I know she also feels that sense of urgency. So, you know, my job as a lawyer who does consumer protection is to protect consumers. And I'm looking at a marketplace where the deck is so heavily stacked against them.
You know, that's the problem that I want to take on. And privacy is one way to do it, but I don't think at this point it's the only way to do it, or that it's sufficient by itself.

Thank you, Sarah. Let's come back to that idea in a few minutes, but let me go to Sean and pose the same question, and then I'll go to Claire and Maureen for the same question. You know, what question do you wake up with every day, and I assume go to bed with every night, that continues to challenge you in this space?

Yeah, thank you, General Tong. I think what scares me the most, and you know, from my background, I'm a technologist. For the last eight years at Edelson, I've been taking apart mobile apps, looking at the source code. I'm seeing all the data that's being sent in the background, looking at connected cars, IoT devices. So what bothers me is just the lack of transparency and understanding. Even consumers, consumers will say, well, this is so pervasive, we know these companies already have our data, is this really a big deal? But they truly don't understand the amount of data that's being collected and what its uses are. I give a lot of presentations where I go into very detailed slide decks, and I show videos, and I show the actual network traffic being sent, what's being collected and what it's being used for. So it just bothers me that consumers aren't aware, and there needs to be more transparency and education. You can't just throw in a privacy policy and say that we're collecting your data and that it might be used for marketing purposes. Instead of your just being opted in to everything by default, the consumer needs a choice and really needs to understand what's truly being collected. And I guess just one quick example of what bothers me the most is geolocation.
Companies collect a lot of geolocation. They say it's because they wanna know if someone looks at an ad online, and they wanna track the conversion if the person actually goes into a physical store. But is that really worth the risk of the privacy ramifications? What if that company sells all the geolocation, or if there's a data breach? The New York Times ran a spread where they analyzed a database of millions and millions of geolocation points, and through re-identification they were able to figure out who those points belonged to, where they went to school, where they went to church. So all of that seems very dangerous to me. I think it's very important to find a balance where companies can have innovation, but we're not risking privacy, and we're not just letting them do anything they want with the data.

Claire?

Thank you, very happy to be here. It should perhaps surprise nobody, based on my introduction, that what keeps me up at night, and what I wake up thinking about, is face recognition. I've basically worked on this one topic for the last six years and will continue to do so. But I do wanna frame it within this idea of what consent is, both in the online world and the offline world. Something that really bothers me in a lot of discussions around privacy, and it's something that I think we all bridle against, is this idea that privacy sort of doesn't really exist online, or that the consent mechanism is choosing to be on social media, that that is the one moment of consent. And a related point: this idea that children, that younger generations, fundamentally don't care about privacy because they live their lives online, the way that older folks, or Luddites who choose to avoid being in the online world, do not. Companies certainly act as if this is the case, and I think regulations don't necessarily contradict that position taken by companies, which is really troublesome to me, because I think it's fundamentally flawed.
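The re-identification technique Sean describes, linking "anonymous" location pings back to an individual, often works by simply finding where a device spends its nights. A minimal sketch of that idea (the coordinates, the hour ranges, and the roughly 100-meter rounding grid are illustrative assumptions, not from any real dataset):

```python
from collections import Counter

def infer_home(points, night_hours=range(0, 6)):
    """Guess a device's likely home location from pseudonymous GPS pings.

    points: list of (hour_of_day, lat, lon) tuples. Rounding coordinates
    to 3 decimal places (roughly 100 m) clusters nearby pings together;
    the most common night-time cell is a strong home candidate.
    Returns that cell as (lat, lon), or None if there are no night pings.
    """
    cells = Counter(
        (round(lat, 3), round(lon, 3))
        for hour, lat, lon in points
        if hour in night_hours
    )
    if not cells:
        return None
    return cells.most_common(1)[0][0]

# A handful of pings: daytime at an "office", night-time at a "home".
pings = [
    (9, 41.8781, -87.6298),   # day: downtown
    (14, 41.8781, -87.6298),
    (2, 41.9200, -87.7000),   # night: residential block
    (3, 41.9201, -87.7001),
    (4, 41.9200, -87.7000),
]
print(infer_home(pings))  # → (41.92, -87.7), the night-time cluster
```

Once a home cell is known, cross-referencing it with public records is usually enough to attach a name, which is why "we only sell anonymized location data" offers so little protection.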
Face recognition comes into play because an example of this, I think, is very clear in Clearview AI: this idea that because the information exists in the online world, the individual depicted in a photo has fundamentally consented to the collection of that photo and its conversion into a biometric. But what this threatens to do is fundamentally destroy a bastion of privacy that does exist in the online space. And that is the ability, as we have in the offline world, to segment our identity, to have different audiences for different aspects of our identity. I think about this a lot. I really enjoy the British show Sex Education, and I think about it with one of the characters who is gay and from Nigeria. He has a persona for his Nigerian friends. He has a persona for his family. He has a persona for his boyfriend and for his friends at school. And they're fundamentally pretty different. What if he keeps those segmented in the online world, and a technology comes along that is able to connect them all back to him, back to his offline persona? That's what face recognition can do. And I think it's fundamentally, incredibly troubling, and it stems from this presumption either that privacy somehow doesn't exist beyond that moment of consent of introducing yourself into the online world, or that the younger generation in particular doesn't really care about privacy. They very much do. Ask anyone 15 and younger whether or not they have multiple social media accounts. Maybe they won't tell their parents, but I guarantee you they are actually picking and choosing who they are to different audiences online, and that is their right. So definitely something that keeps me awake at night. And I have a hard time fundamentally thinking what the solution is. But to Sarah's point, privacy is the angle at which I come to this, and I very much would love to hear the other mechanisms we have to help.

Maureen?
Hi, so first of all, thanks so much for the opportunity to speak today. The question that keeps me up is how do we actually get privacy protections in place that are workable and meaningful for consumers? I think a lot of consumers probably assume that there is a comprehensive federal privacy law that protects the collection, use and sharing of their data and requires companies to keep that data secure, but that's just not the case. I do appreciate that some states are beginning to step into that gap and set out some baseline privacy and security protections for consumers, starting with the California Consumer Privacy Act, which went into effect just recently, in 2020. But that was compromise legislation, worked out hastily to avoid a stronger ballot measure, and like a lot of compromise legislation, it hasn't really made consumer and privacy advocates happy, and it hasn't really made industry happy either. They have had to put a lot of money into compliance. So we've seen continual struggles in California over the outlines of that law. We've also seen measures introduced in other states, and I think a lot of the bills that we're seeing now don't do enough to protect consumers. They're based on an opt-out-of-sale model that would require consumers to opt out at hundreds if not thousands of different companies. So I'd love to see a model that's more workable for consumers, one that puts limits on what companies can collect, use and share in the first place, so the onus isn't always on the consumer to manage their privacy. Because there are real issues of scale here, especially with all these new platforms and avenues through which information is being collected, such as through facial recognition.
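The "opt out at hundreds of companies" problem Maureen describes is what a global, browser-level opt-out signal tries to fix: under the Global Privacy Control specification, the browser sends a single `Sec-GPC: 1` request header to every site, and a site honoring it treats that as a standing do-not-sell/share request. A minimal sketch of server-side handling, with hypothetical partner names:

```python
def gpc_enabled(headers):
    """True if the request carries the Global Privacy Control signal.

    Per the GPC specification, a user agent expressing the preference
    sends the header `Sec-GPC: 1` on each request.
    """
    return headers.get("Sec-GPC") == "1"

def third_party_calls(headers):
    # Hypothetical site logic: when GPC is present, treat it as a
    # global opt-out of sale/sharing and skip data-sharing partners.
    if gpc_enabled(headers):
        return []
    return ["ads.example", "broker.example"]  # illustrative names only

print(third_party_calls({"Sec-GPC": "1"}))  # → []
print(third_party_calls({}))                # → ['ads.example', 'broker.example']
```

The design point is that the consumer states the preference once, in the browser, instead of once per company; the CCPA regulations already require businesses to honor such a signal.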
So at Consumer Reports we're working to develop tools to make some of these existing laws more workable for consumers, helping to develop the Global Privacy Control (Ashkan Soltani, who I'm stepping in for today, is the leader of that project) to give opt-outs a global option. But I think my big focus is how to get privacy laws on the books and how to make them useful for consumers.

So getting back to Sarah's original point, if it's not a fair fight, you know, and you have these powerful forces bearing down on individual consumers, then is it even fair to expect consumers to bear the burden of protecting themselves? And is, I guess I'll call it a point-of-sale consent model, not just not workable but not even realistic, frankly, to consider as a reasonable response? So that raises the bigger questions, which I'd be interested to hear your thoughts on: if it's not a consumer-based model or an immediate consent-based model, then it's something bigger. And obviously, when we talk about fair fights and powerful corporate forces versus consumers, we often look not just to consumer protection laws but to antitrust laws. And that's when we get to a much larger conversation about leveling the playing field between powerful forces and consumers. So I guess I'll go back to you, Sarah, since you brought up the fair fight idea, and then I'll invite anyone else to jump in after that to work through that question.

Another softball, thank you. I used to do a little antitrust law, but I'm not an antitrust lawyer, so I can't say from a fully informed basis whether there is an antitrust solution here. I mean, I certainly think there's work that can be done. I think there's a structural problem with the market that needs to be addressed. But looking at the problem from the perspective of a more core consumer protection model, one thing structurally that I think could be, should be, looked at is the financial incentives at play here.
And for that, I mean, there's just a tremendous amount of money flowing into this industry. And getting back to Zuboff's book, which I highly recommend, I'm about halfway through it, but it is just absolutely enlightening and, to be candid, frightening. What makes that really hit home for me is the venture fund investments, the funding of some of these new business models. No longer is it sufficient just to collect data and serve targeted ads. That is no longer the competitive edge that investors are looking for. What she suggests they're looking for is the ability not just to predict human behavior, such that you can show an ad to someone at the exact right moment and therefore charge more for the ads, but to manipulate human behavior on a subconscious level. And it's that, that to me is so frightening and so unfair. Forget consent; this is an algorithm or some other technology or tactic acting beneath our awareness to plant a course of action in our minds that we then carry out, and then being able to sell ads on it, right? So that's where the investors are looking, and those are the technologies that are being rewarded with investment. And from my perspective, I don't think that's okay. That model has not heretofore been declared to be illegal, but there ought to be some risk in pursuing it. There ought to be some hard responsibilities that the companies involved in those technologies, and who are profiting off them, need to take on. What those core responsibilities are, I don't have a good answer. But this idea of, you know, privatizing all of the profits while you're manipulating human beings without their consent or knowledge, and foisting the harms back on people, there is no way I can accept that. You can talk about innovation all day long; that is not acceptable, I don't think.
And so highlighting that problem and injecting some risk and responsibility into that business model, I think, is a place where folks ought to be looking. I don't know how, but I think that's something that should be examined.

Anyone else want to weigh in there?

Just building off of what Sarah said, and going in a little different direction, I want to emphasize the importance of strong enforcement and of resources for enforcement agencies. Because when you have such a huge power imbalance between the companies and the public servants who are tasked with enforcing the law, a lot of times it's not a fair fight. You're up against companies that have enormous revenues and enormous legal teams, with all the resources in the world to develop creative interpretations of legislation. So, you know, we can spend a lot of time coming up with the perfect legislation, and we do, and it's important to get that right. But as we're also seeing in Europe, unless there are enough resources devoted to enforcement, it's just not a fair fight, and you're not going to be able to get companies to comply.

And just two quick points. To your original question and this idea that it isn't a fair fight, I think an interesting analogy is the relationship between the state and the individual. We don't actually ask the individual to be responsible for protecting one's own privacy or other rights; it's the responsibility of the state not to infringe. So it's an interesting model that we have in the state-citizen relationship, and why is it so different in the company-citizen relationship? It's a point I keep coming back to: what is the responsibility of the citizen? With face recognition, for example, protesters ask, how do I protect myself against being surveilled by police or, you know, being tracked?
The answer is: you shouldn't have to. And at least within public or criminal law, that's not an expectation we have.

The other point, and this is at risk of being very, very bleak, building off of what Sarah was saying: I keep coming back to this company, Alfi, which is partnering with ride-sharing companies. They're pairing face characterization with the tablets in the back of ride-sharing vehicles to serve targeted ads. It seems that they classify people by their demographics, race, sex, age, whatnot, but also try to infer mental state and emotion by reading individuals' faces. It's beyond the pale; most people who read the articles about this find it deeply concerning. But beyond that creep factor, I think the fundamental question is: what happens when not just the online world, but the offline world, what we experience and receive from the offline world, isn't the world itself but a reflection of what a company or an algorithm thinks we are, who they think we are? How hyper-segmented does our reality become when, literally in the offline space, what we engage with on a day-to-day basis is actually defined by that, as opposed to being presented with the myriad choices the offline world offers?

We have a virtual attendee CLE code, by the way. I just wanna note, if you aren't getting CLE, please note the code. But I think this question, Claire, that you raised again about the responsibility of the individual consumer, right? And consent. As we're talking about consent, it strikes me that I guess I consent to being marketed to, right? Watching television or looking at my iPad, or even driving down the street and seeing billboards. I guess that's something that I don't have much choice about, but I don't really object to those messages being pushed toward me. But I definitely, if I think further about it, don't consent to being manipulated, right? And I certainly don't consent to my children being manipulated.
And so I guess the question is, is there something in that? Is there some line we can draw where the behavior isn't just, you can buy a set of tires here at this price, at this place, versus you should think this about yourself and need these products, or in many cases, you should think this about our elections or about election integrity and about our country and about your community, you should think this about vaccines or not, or masks or not? Is there something in the nature of what we're consenting to, and in that manipulation, that we can focus on?

I can, oh, go ahead, Sean.

Yeah, no problem. I just wanted to bring up one point on the current question and the last question. One other area that has a lot of challenges: we think about first-party companies, and thinking about antitrust, is Facebook too large or is Google too large? But an area that becomes trickier is all the third-party providers. Most mobile apps will use a variety of software development kits; they're called SDKs. Those could be collecting geolocation. They could be used for personality insights. IBM, for example, has a Watson personality insights program used within student education. Students will download a course module, but they might not realize that it actually has a third-party library in it for adaptive learning. And even though the purpose of adaptive learning is supposed to be, if I'm having a problem with a certain type of geometry problem, let's say, it'll help me by giving me more of that same type of problem, what students don't realize, and their parents don't realize, is that it's actually collecting millions and millions of data points and putting them into profiles. So it could be that it's figured out that this student has a certain intellectual disability, or maybe they are just slower in general than other students.
There have been a couple of videos that some of the providers of these platforms have made where they've actually shown: here are some students, and we put them in this poor category. And I just think about, what if that company was actually sold? And then all of a sudden your profile says you're a poor-performing student, and maybe that will affect you getting into college or getting a job. So we do have to look at the very large companies, but we also have to make sure to look at the third-party providers as well. And that's a big thing too, because consent-wise, a lot of these third-party companies will pass the consent obligation on to the app developer or the first-party company. And then the first-party company just doesn't tell the consumer, or the student, or the parents at all, and they have no knowledge that this is actually happening. So that's a hard thing to figure out how to deal with, but I think that's an area we need to address going forward: how to regulate these third parties, and how to make sure that app developers actually understand what these third-party SDKs do, so that they can get adequate consent and show exactly what's happening.

I was just gonna add, on the point regarding advertising and manipulation: advertising is manipulation, right? Like, that is the whole point of it. That's certainly legal and accepted, but there is a line. Where exactly it is, I don't know, but there is a line between market persuasiveness and unfair, covert persuasiveness that takes advantage of weaknesses. And I think, just old school, I've got a book on my bookcase with all of the old Massachusetts regulations governing door-to-door sales, right? There's a regulation because people used to go door to door and use these high-pressure tactics to get people to buy things. Our regulations regulate that; they give you a three-day right of rescission if you come to your senses and decide you didn't want that product.
So I think there's a long history of states regulating undue pressure by advertisers taking advantage of people in particularly vulnerable circumstances, boiler-room tactics. I mean, that is the bread and butter of consumer protection lawyers. And I think the principles are exactly the same. It's not happening in person, but I do think some of the techniques that might be used by some of these companies are getting into a realm where you're not acting willfully. You know, neuroscience could tell us a lot about that. But this isn't just, oh, this is a new form of advertising. There is a manipulative element to the use of our data back against us that I think is leading to the same harms, or raises the same concerns, that a lot of our old-school regulations were also trying to address. This is just the newest tactic. That's one frame you could put on this.

Go ahead, Maureen.

Just briefly, and this isn't a perfect rule of thumb, but one thing we try to do at Consumer Reports, and the other panelists might disagree, is that in terms of data regulation, we often think about what the consumer might reasonably expect. A consumer might reasonably expect that a website they're directly interacting with is collecting some information about them, but they wouldn't expect that everything they're doing across the internet is being tracked and sold and used to target advertising. So we think those latter practices, the ones consumers wouldn't reasonably expect, need laws to rein them in, possibly even prohibit them, depending on how harmful they can be.

It seems to me also, Sean, you make a really good point about what I think you called first-party or frontline providers.
You know, it's not just the big technology platforms. And, you know, at the most basic level, there's obviously a huge debate going on about Section 230, and I think that generally refers to large technology-enabled platforms, but there's a whole universe, right, of other actors out there, and other secondary and tertiary technologies, that all play a role in this and probably aren't well addressed by Section 230. And so we're talking about data brokers, right, we're talking about app developers, and it seems to me, and I guess I would ask all the panelists for their thoughts, it seems to me that we don't really have any infrastructure to address law enforcement challenges and hold all of those other different actors accountable, right, for their role, not just in keeping data secure, right, but for the impacts on their customers. Anyone want to jump in on that? Yeah, I could start by just giving one example. You know, I've talked about data breaches and information being sold, but just thinking of a more practical use where there is kind of a lack of protection: the National Association of Insurance Commissioners had me do a study and look into accelerated underwriting. So normally when you apply for life insurance, or at least in the past, you would do medical underwriting, where, you know, they would take blood and do labs. But now companies are basically looking into doing accelerated underwriting, where they can look at your data from prior web history, they could look at facial recognition, they could see if maybe you have a health issue based off of a template that's taken from your face, tracking data from Fitbits, all of this data can be pulled into there.
Now normally a consumer would be protected by the FCRA, in terms of certain data that can't be used to make a decision without the consumer being made aware, but a lot of this data that's being used, or at least potentially being used, for accelerated underwriting, and this is a new area, is not under the FCRA because it's not provided by a consumer reporting agency. So just thinking, you know, again, I think a lot of the insurance companies, they don't necessarily wanna use it to make a decision, because it is a gray area, but they most certainly use it to determine if they wanna retain people or if they wanna market to them or not market to them. But yeah, I think it's just kind of one practical example where there is kind of a gray area, and I think that anytime a technology company or a third party wants to, they try to find these little gray areas, and that is hard, you know, trying to have laws that will address every potential area where there could be a means to get through. I think another example of that, there was a recent article by Caroline Haskins about the relationship between Google, Amazon, and Microsoft and Immigration and Customs Enforcement, which brings up a whole host of issues, including the relationship between private companies providing data and AI capabilities. But in addition, what it was highlighting was these third-party contracts, where there's been a lot of public pressure on these public-facing companies that consumers do feel like they have a certain degree of influence over, because they have a personal relationship with an end product from Google, from Microsoft, from Amazon. And there's also been a lot of employee mobilizing and pushing back on Amazon's contracts with ICE, for example.
Another avenue, which sidesteps that, is these third-party contracts that no one necessarily knows about, which allow these relationships to continue completely removed and isolated from any sort of public pressure or collective bargaining, that type of thing. I think it's another interesting example, and a plus-one on looking not just at the avenues for legislation but also at the harms, as opposed to individual companies or individual technologies. That approach can often be undone by massive loopholes as simple as a third-party contract, so we have to actually face the harm and ask what protection we're seeking to create, as opposed to what company we're seeking to penalize or limit or restrict, or what technology we're seeking to constrain. And yeah, I'd just like to add briefly that data brokers, again, raise this issue of the limits of consent. By their very nature, they live in the shadows; they're companies the consumers don't have a direct relationship with, and there are just so many of them. So I really appreciate the work that has been done in Vermont and California in creating these data broker registries that data brokers are required to register with, and I think the AGs are in charge of managing these databases, so that consumers, on a basic level, can know what these entities are. That being said, there are hundreds of data brokers on these registries, and it can be really difficult, even in California where you have the right to opt out of the sale of your information, to actually opt out at these companies. We've had consumers try to submit opt-out requests to companies on the data broker registry, and companies are asking for a picture of their ID or a selfie, and consumers are really uncomfortable with providing more information. So, bringing these companies out of the shadows is an important first step, but we also need to do more to, yeah, control the harmful practices that these entities are engaging in.
It seems, though, that even a data broker registration act or something is just scratching the surface, right? Because what are we really consenting to? And as all of you are speaking, it strikes me that, maybe it's obvious, but it's not just the act of using our personal information to target us and sell us products, right? The personal information is used for a variety of reasons. You talked about the insurance context. There's law enforcement, there's employment, and so with the use of our data without our knowledge or consent or participation, it seems to me that the possibilities are maybe endless in the ways that data can be used, on an individualized and an aggregate basis, right? In ways that may ultimately prejudice us in our everyday lives. I mean, isn't it much more than just selling products? Sean, have you ever thought about that? Yeah, I was just going to say, again, going back to not necessarily knowing if a company is even involved in all of this. So I mean, you have data brokers, companies that consumers have no idea their data is going to, and if they do, they have no idea how many data points. I mean, you look at Acxiom. Acxiom releases a report every few years where they talk briefly about how many data points they have, and I believe it was maybe four or five years ago they had collected over 2,000 data points on every single American in the US. And then a few years later that doubled to 4,000 data points. And again, as you had mentioned with a data broker list, it's great to at least have an awareness that there are data brokers, but there still needs to be more information about what types of information the data broker is actually collecting. If I was a consumer, I would wanna know that Acxiom had 4,000 data points on me, and that some of those data points are about my religion, or how much alcohol I may or may not drink, or whether I'm eating fast food or not, all of this information.
So I think if there was greater transparency about the actual data types that are being collected and what they're being used for, that would greatly help. And I did just wanna bring up one other example that I think people are really not aware of, which is financial data aggregation. So General Tong had mentioned aggregation. So you look at platforms like Mint, where you can basically pull all of your credit cards into one single pane of glass. At least as a consumer, I mean, you understand that you're using this Mint platform, even if you might not understand what they're gonna do with it. But then there are platforms called Yodlee and Plaid, and there have been lawsuits against them recently. And basically what they do is, a financial services company will embed their software, and then the consumer, when they're connecting their bank or their credit card, they'll think that they're actually connecting to the bank. It looks the same. It's got the Chase logo. It says username and password. They're actually sending that information to the aggregator. The aggregator holds onto that password. They wait until you log into your bank, and then they start screen scraping all of your information. And they're using that for the financial provider, but then it goes into the whole thing of, they say, Yodlee for example said, we can sell all of this data about your financial transactions, but we're gonna de-identify it. But it's very hard to de-identify that. If you sell timestamps and you have a unique identifier for someone, then let's say I'm Target, and I go and I buy all of that data, and then I find the one time that they purchased something from me at Target that matches the timestamp and the purchase amount. Now I know every single thing that that person purchased. Again, consumers just don't have knowledge of that. So I think as more states get data broker registries, it would be great to have more awareness of the actual data points and what companies are involved.
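To make the re-identification risk Sean describes concrete, here is a minimal sketch; all names, pseudonymous IDs, timestamps, and amounts are invented for illustration. A merchant who buys the "de-identified" feed can join it against its own point-of-sale records on timestamp and purchase amount, tying a real identity to the pseudonymous ID and thereby unmasking that person's entire transaction history.

```python
# Hypothetical sketch of re-identifying a "de-identified" transaction feed
# by joining on (timestamp, amount). All data here is invented.

deidentified_feed = [
    # (pseudonymous_id, timestamp, amount) as a broker might sell it
    ("user-7f3a", "2020-10-03T14:21:05", 42.17),
    ("user-7f3a", "2020-10-04T09:02:44", 8.99),
    ("user-b216", "2020-10-03T14:21:05", 19.50),
]

merchant_records = [
    # (customer_name, timestamp, amount) from the merchant's own POS data
    ("Jane Doe", "2020-10-03T14:21:05", 42.17),
]

# Build a lookup keyed on (timestamp, amount): one matching purchase is
# enough to tie a real identity to the pseudonymous ID.
lookup = {(ts, amt): pid for pid, ts, amt in deidentified_feed}
for name, ts, amt in merchant_records:
    pid = lookup.get((ts, amt))
    if pid:
        # Once the ID is unmasked, the whole feed for that ID is exposed.
        history = [(t, a) for p, t, a in deidentified_feed if p == pid]
        print(f"{name} is {pid}; full purchase history: {history}")
```

A single overlapping record suffices: the timestamp-plus-amount pair acts as a near-unique fingerprint, which is why selling timestamped transactions under a persistent pseudonym is so hard to truly de-identify.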
Sarah, did you wanna add something? Very quickly, I agree with all the points that were made, but transparency, while incredibly important, is important because it gets us to further protections, right? And so it's important because, again, it is such an unfair fight. We don't even know the practices that are super problematic, the bad actors that are involved, what data is getting taken about us, and how it's being used against us. I mean, at a very basic level we at least need to know that. I still sort of look forward to a future where we have some transparency. Okay, now we know the full scope of the problem. Now we can talk about, are there rules of the road we can put in place to give some predictability to these business practices, to take the burden off of consumers trying to police their data and chase it all over the internet. So transparency is important, but I don't think it's an end in itself. I think it's a necessary predicate to some more meaningful protections down the road. And just one other point: perhaps one place to start is to look at how much state data, or state-collected data, ends up in data brokers' repositories, like DMV data, or, I don't know, public utilities maybe. How much of this information is actually scraped, and whether it's relatively, I don't know whether it's low-hanging fruit, but in my head it is, straightforward to cut off the information that actually originates in state databases and ends up in these larger repositories. So let me switch gears and give a competing or contrary view. And that is, for every technology that we've discussed today, I'm looking at Sean again, every cool new idea, every piece of software or middleware or some new application.
There's an entrepreneur and a company and a venture capitalist and a private equity fund and a company about to go public, and growth and economic development for states. Do we need to think about that, and how do we balance regulation, which I think to many people in our country is often seen as a bad thing, that's how the conversation starts, against putting more quote-unquote red tape on industries that are growing so fast and change every seven seconds? I mean, what do we do about that, Sarah? I mean, you know, on Route 128 in your state there's a ton of innovation going on every single day, and are we gonna put the brakes on that? Yeah, no, I mean, it's a tricky question. I mean, look, no, you can't put the brakes on that. You shouldn't put the brakes on that. But I don't think some minimal regulation is going to really put the brakes on it. And the reason I say that is, I think just some rules of the road, just basic rules of the road that companies need to abide by, levels the playing field for large and small companies, right? I mean, that's another important element of unfairness here: some of those entrepreneurs cannot break through. They don't have the troves of data or the IP that they need to compete with some of the bigger players. So, some basic rules of the road that all companies need to abide by, where consumers understand what is allowed to happen with their data, and companies know, here's the line, and I can't go below that line. I have to think, and I'm not an economist, I'm not an expert in this, but I have to think that would be helpful to some extent. Obviously there are problems, you don't want too much red tape, there can be unintended consequences. But right now, I think for better or worse, it's a wild west. And I personally believe that some rules that everyone understands and buys into make competition more robust and healthier. And so I think there's a role for it.
I think, given the disclosures of the last few weeks, there are harms that are staring us in the face that cannot be ignored. And so I think the harms of not regulating are becoming more concrete, and some regulation will be needed. And I think there's a robust discussion to be had around what those regulations are and how much, but I think you can regulate and innovate. I really do. I think that we need to change the incentive structure somewhat. I mean, right now, since there's next to no regulation, companies are incentivized to collect as much data as they can and not really give a whole lot of thought to how they're gonna store it and protect it, in the hopes that there could be some use for it in the future. Even though the California Consumer Privacy Act puts next to no limits on what companies can collect in the first place, we did hear stories after the CCPA went into effect that, now that there's some regulation, companies have to, for the first time, be engaged in data mapping and actually figure out what they're holding onto, and it incentivized them to get rid of a lot of old data that could pose a security risk or a liability, things like that. So I think if you put some regulation in place, it'll incentivize companies to move towards innovations that are more privacy protective, finding ways to make money that don't necessarily harm consumers. We've seen companies like DuckDuckGo rely on contextual advertising, where advertisements are placed based on the content of the page rather than targeted based on consumers' personal information, and that can be just as lucrative as other forms. So I think companies should take on the challenge of trying to be profitable and innovative while also protecting consumers. Yeah, I guess my thought on that, I mean, if we look at the federal level, eventually we probably will have a federal privacy bill.
Unfortunately, a lot of the tech lobby is trying to put something in place where it will be the most minimal it can be, and then they ultimately want it to preempt state law, and that would be a terrible thing. We don't want that. So, how it is right now, I understand that there are a lot of different states and it might be hard to keep track of all the different laws in the different states, but the great thing about it is, we're really finding out within each state what the major issues are. You look at my state, Illinois: we have BIPA. We don't say you can't do any sort of facial recognition. We just say that you have to actually tell people about it. And even though I think transparency is important, I ultimately think if it's something that's sensitive enough, it should ultimately be an opt-in. So it could be a thing where, basically, companies can collect data, but if you're going to collect biometrics, if you're going to collect genetic information, we figure out what the most sensitive areas are and say you absolutely have to get express consent for that, and you have to actually explain what it's gonna be used for and all of the purposes and all of that. It'll be hard to outright restrict collection, but I think just having an opt-in for that would help. And there are various states that have genetic privacy laws. I like the fact that we have all these state laws; I just don't want it to be the case that they get preempted. I think if we had a federal privacy bill, we could set good minimum standards, but I still think the states should be able to figure out what's important to them. And that also lets other states realize what the issues are. After Illinois had BIPA, Texas passed a similar law, for example, and other states have been looking at facial recognition laws. So that's kind of my thought on that. Well, thank you all for your thoughts on all of these issues and questions. We have a few minutes left, so I thought I might open it up for some questions. I see General Camacho here.
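Sean's opt-in model above, where collection of ordinary data is allowed but sensitive categories like biometrics or genetics require express, purpose-specific consent, can be sketched roughly as follows. This is a minimal illustration; the category names and consent-record fields are invented, not drawn from BIPA or any other statute.

```python
# Hypothetical sketch of an opt-in gate for sensitive data categories,
# as described above. Categories and record fields are invented examples.

SENSITIVE = {"biometric", "genetic"}

def may_collect(category, consents):
    """Ordinary data may be collected; sensitive data requires an express,
    opt-in consent record that states its purposes."""
    if category not in SENSITIVE:
        return True
    grant = consents.get(category)
    # Express consent must exist, be opt-in, and disclose its purposes.
    return bool(grant and grant.get("opt_in") and grant.get("purposes"))

consents = {
    "biometric": {"opt_in": True, "purposes": ["timeclock verification"]},
}
print(may_collect("browsing", consents))   # ordinary data: allowed
print(may_collect("biometric", consents))  # express consent on file: allowed
print(may_collect("genetic", consents))    # no consent record: blocked
```

The design choice mirrors the policy point: the default is permissive for ordinary data, but for enumerated sensitive categories the burden flips to the collector to hold an affirmative, purpose-stating consent before any collection happens.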
I don't know if, well, I might put you on the spot. Are any of my other colleagues in the room at the moment, or are they at the coffee stand? All right, General Camacho, stepping up. I've got to show them it's worth it to come out to these things. Sean, and correct me if I'm wrong, but were you on a panel with General Tong in January of 2020 on COPPA litigation? I think you warned us all about Google accounts for kids and Google education. So I guess my first question will just be, this last year we've moved a lot to remote and online learning, and anything you would imagine, this generation is inevitably gonna have some kind of digital footprint. Because, as you pointed out, you pulled up the source code and these trackers, and how you move from being underage to becoming an adult. And now they have a six-year-old who's on Google for their school, and I'm thinking they're gonna have a decent amount of data by virtue of the fact that a school has adopted Google as an educational platform. And second question, just for all of you: I'm about halfway through Surveillance Capitalism. TJ actually recommended the book at that same conference that you were at, Sean and General Tong. I haven't studied sociology, so I'm kind of falling asleep at points in my Audible. But I'm at the section where she talks about products themselves. And there's an expectation, if I'm online, sure, you're gonna get some of my activity. But when your headphones are asking for access to your location, when your lights, your smart devices, your refrigerators are, and if you disable them, they basically say, well, now you lose functionality of things that are pretty core to an application or a product that you've purchased. So wearing our consumer protection hats, I'm automatically thinking, and she points out in the book, you've degraded a product that you've said would do certain things in exchange.
And it's inevitable, but like, okay, do I wanna not be able to use this very important function? Or do I wanna just say, okay, go ahead and take a look at my location when I'm using it? So just from the consumer perspective, not in the digital realm, but more on the smart products. Yeah, for sure. And I was on that panel, and I appreciated that General Tong is moderating today, because I knew we would be in good hands after that last panel. And yeah, I've been working on the Google education case. We represent a variety of municipalities and attorneys general. And what's challenging with the Google aspect, it kind of goes back to what we talked about, again, passing on consent. Google is essentially trying to make the school get consent, instead of Google trying to get consent themselves from the parents. I was on a panel with the Illinois legislature, and there was a parent group, and they said, one day our students just came home with Chromebooks. Like, they didn't remember signing anything or having any knowledge about that. And what the danger is, for one, with the Google example, students are logged into their Google education account. So once you log in, you're logged in; you have a session cookie that's going to last for a long time. But what people didn't realize is, when the students go home, on their home computer, and they log into their Google account to check their email, or they're still using their Chromebook, that identifier follows them everywhere. So now you have students that are underage, and basically all of their browser history is being tracked. If YouTube is turned on, their YouTube history is being tracked. And again, we talked about other products that could be put into these educational platforms like Pearson or like Google.
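The long-lived login session Sean describes can be illustrated with a minimal sketch; the cookie name, identifier, and domain below are invented, not Google's actual values. A cookie scoped to a parent domain with a far-future lifetime is sent on every request to any subdomain, which is why the same identifier travels with the student between school and home devices.

```python
# Minimal sketch (hypothetical names and values) of why a long-lived
# login session lets one identifier follow a student across devices.
from http import cookies

c = cookies.SimpleCookie()
c["SID"] = "student-account-identifier"   # invented session identifier
c["SID"]["domain"] = ".example-edu.com"   # sent to every subdomain
c["SID"]["max-age"] = 60 * 60 * 24 * 365  # persists for roughly a year

header = c.output(header="Set-Cookie:")
print(header)
```

Because the `Domain` attribute covers the whole parent domain and `Max-Age` far outlives any single browsing session, the browser attaches the same identifier to mail, video, search, and classroom requests alike until the cookie is cleared or expires.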
So it's hard enough for parents actually knowing what's happening with the main platform, but then you're putting in modules, and those have adaptive learning and all of these various things. So yeah, it definitely does present much more complexity. I think what concerns me with the increase of these internet-connected devices is certainly the privacy aspect, where there's this data being collected that's often very sensitive, that consumers aren't aware of, and there are few restrictions on what companies can do with it. But also the need for more data security protections, to protect that data from hackers. About half the states, I think, have data security requirements, but it's not clear that they would apply to these internet-connected devices. California and Oregon do have laws on the books specifically applying to these devices, but I'd love to see more laws like that. And then to your point, I think another great element to put in those types of security laws is a requirement that companies keep that software updated through the reasonable lifetime of the device, so that once support expires, it doesn't mean you can't use that product anymore. And also the option to use it as a dumb device, if you don't wanna have it be a smart device. So as consumer advocates, those are issues we're really interested in. That's an excellent point, and thank you, General Camacho, for bringing that up. And the internet of things, which I think is what we're talking about right now, also opens up its own whole set of issues. And I remember our dryer failed last year, and I went online and researched and bought a dryer on sale, and it has Wi-Fi, and I'm like, why the hell do I need Wi-Fi on my dryer?
And I spent like two hours because I'm the IT guy at home trying to figure out whether I needed the Wi-Fi functionality and whether it was beneficial to us or not and how it worked and it's not anything that I could consent to, it was basically the base model and it came with Wi-Fi and I had no say in the matter and so that opens up a whole bunch of other issues around privacy and innovation. Any other questions from the general audience or the audience online? Well, please join me in thanking our wonderful panelists. Thank you so much for joining us for this discussion. Thank you.