Okay, good morning everyone. This is the Vermont House Committee on Commerce and Economic Development. It is Wednesday, March 16th at 9:05 in the morning. We are here to look at 570, just for everyone's knowledge. 570 is a short form bill, so there's really no language in it. We're here to discuss the topic and the report that the Attorney General's Office did for us. And the thought here is to really not do too much with any of this. It's more for committee knowledge, to better understand the pieces concerning data and biodata and all of that, so that we get ourselves acclimated to what we may be looking at next year as a consumer protection bill. So Charity, good morning. Thank you for joining us. Can I sit where I usually would sit? Yes, ma'am. This is the first time in my life. After times, I guess we would hope. After times. Do you want us to take the flowers somewhere? Okay, just here. You can put them there. So for the record, I'm Charity Clark. I'm Chief of Staff at the Attorney General's Office. I'm also an assistant attorney general, which means I'm a lawyer who's been sworn in. I came from the Consumer Division before I was Chief of Staff, and probably for that reason I'm here a lot on consumer issues and have worked closely with Ryan and others for the past four years on data privacy issues. So I just want to orient everybody with a little history. But before I do, I have the bittersweet news to share that Ryan is actually leaving our office to go to the Federal Trade Commission, which is incredibly cool for all of us because the FTC is really important for the marketplace and for consumers, but very sad for us. And that is one of the reasons why I wanted to make sure that we came in and testified before he leaves.
Ryan, of course, famously teaches the class on privacy at the University of Vermont. So he is a wealth of information and has worked closely on this for years. So let's go back to the last time I testified in this committee. It was March of 2020, and we, the Attorney General's Office, had just sued Clearview AI. That lawsuit is ongoing. Clearview AI is a facial recognition software company. They screen-scrape the Internet for images that are public. So for example, if my Facebook profile is public, they might take that picture. I don't think they're supposed to under Facebook's terms, but they might. Just to give you an example, one of the challenges, of course, is that the bots that do the screen scraping don't discriminate at all. They just take whatever they see. So if I'm eating a sandwich on Church Street on a bench and I'm in the background of some tourist's photo, there I am getting scraped into the Clearview AI database, not even realizing someone took my picture. And that's only the beginning of it. I'm not going to go more deeply, but just keep in mind where this came from at that time. Having learned a lot about what Clearview AI had done, and about biometric data generally because of the work that we've done on data privacy, we had proposed that the committee adopt a Biometric Information Privacy Act, a BIPA. A BIPA was first passed in the state of Illinois, and other BIPAs have passed since then. What we did is we met with a lot of the stakeholders, and everyone there basically said, you know, this is going really quickly. Why don't we slow down and have some more conversations. And so we committed the AG's office to holding forums to talk about a BIPA and other things that might come up. And then Ryan and I, in consultation with the chair, decided that 2020 was maybe not the best time for those forums, and so we postponed till 2021.
And after those three forums that we held across the state, Ryan and I put together this report and attached all the written feedback that we got. And I'm happy to say we got a lot. A lot of people participated, and it was great to see their feedback, which we could consider and incorporate into the report that we presented. Now, having gotten us to the report, what I want to do is walk through it, give you the roadmap of the report. Ryan has spent time putting together actual language for a bill. The language is based on his own smarts, but also on other bills that exist around the country. And keep in mind, states have layered onto one another's bills, and if we pass a bill that incorporates some of these data privacy protections, we would be the next evolution. So Ryan has tried to incorporate all those different pieces, to benefit from all the work that other states have done, and of course all the input that we received from the stakeholders. So with that, most of the overview, if you have the report, which I know we submitted to the committee, is toward the conclusion, where I tried to summarize the main points. The very last page has them. The first is the BIPA, which would provide protections related to biometric information. And what we would advocate for is that those protections come with not just the state being able to bring an action, but the individual as well, so a private right of action. Not every state has that, and the Attorney General, Ryan, and I feel that's an important element of a BIPA. So we would advocate for that, and I think that can be a controversial topic, so I'll flag for you that not everyone agrees with us on that one. If there is a BIPA, having clear and specific damages, we've heard from stakeholders, is very important.
Some of the other BIPAs that exist apparently don't have that quite so clearly, so that's something we'll want to be mindful of. The second is, and this might ring a bell, the California Consumer Privacy Act has data minimization provisions. Our Data Breach Notification Act brings data breach notifications to our office. Ryan and a paralegal in our office handle those, and we publish them on our website. When I first started at the AG's office, eight years ago, there was maybe a breach every couple of days, and now we get multiple a day. So when we're thinking about how common data breaches are, just not having the data can save you a lot of heartache. It's not just best practice for businesses, because they don't want to find themselves accidentally, unwittingly letting data out; it helps them, and it creates an ethos that we don't keep data we don't need. And so we want to incorporate that here in Vermont as well. Ryan will slow down and provide more depth; I just want you to know where we're going. The third thing is, and we talk a lot about data brokers, you might have a first-party relationship. So if you're getting into your bank and you're using voice recognition or something like that, that's biometric information, and you have this relationship with your bank. But what if the bank does something with that? There's another person involved, and the data broker is the concept where there are other people involved. So you might have a one-to-one relationship with a company and you're comfortable, but with a data broker, they're buying the information from someone else. It gets more complicated. And this would address that dynamic, because I think a lot of people don't realize that data brokers even exist.
They don't know that when they get a free app on their phone, the reason why it's free is because it's not free: they're paying with their data. And this would address that. The fourth thing, and this was Colorado, is do not track. So there is a state that has put this out there. They haven't rounded out what it's actually going to look like at this point in time, but it's the concept that you have a right to say, I don't want to be tracked. Don't follow my info. Don't collect my info. We tried to model what other states are doing; we're not trying to create anything new if there was another state that had already acted in the space. Then we have several provisions to expand the data broker law. I think maybe the most significant would be to allow for an opt-out. Right now, we have the ability to say whether an opt-out is a choice; this would require that people be able to opt out from having their data collected and sold. There are other elements as well that are more housekeeping, but that would be a good one to include. And then the last one on our list: several components of the California Consumer Privacy Act. This corner is sort of a grab bag, so I struggled to summarize them here, but there are three that I wanted to flag for you: parental consent, a personal information component, and then the right to be forgotten. So when you see that old picture of yourself on the internet and you're thinking, what was I thinking with that haircut, on MySpace or Friendster or whatever, you have a right to be forgotten. So that is my roadmap. Again, it's just on the back of the report; that's what I'm following. There's so much detail that I had to create a roadmap for my own self, but those are the items. Each one of these could essentially be its own little bill.
I mean, BIPA wouldn't be little, but each one of them could be addressed separately. This is a comprehensive package. It was sort of like, once we got started, we couldn't ignore these other things that seemed worthwhile and that we could see were being done in other places, for the most part. So I think I won't take questions; I'm going to allow Ryan to dig in a little more deeply now that you have that roadmap. Feel free to ask him all the questions that you need, because as you know, we only have him for another three weeks or so, unless you want to take Ryan's privacy class at UVM, which he will be continuing to teach, I think. So with that, I'll pass it over to Ryan. Thank you. And the report that Charity cited is on the reports page, under the Office of the Attorney General. It's dated January 5th. Where do we find it? Under the reports and resources tab; go down to the Office of the Attorney General, and there's one report there, dated January 5th. I do have a copy if someone wants one. Please stand by for technical improvements. I'm going to be telecommuting, actually. I know, right. Yeah. Exactly. Well, that was a requirement; I wouldn't have gone otherwise. And welcome to the committee. So why don't we get started as members identify where the report is. Sure. Thank you. My name is Ryan Krieger. I am with the Public Protection Division of the Vermont Office of the Attorney General. Before I get started, is it possible to get the language up on the screen that I submitted? That would probably make it easier for me to walk through what we've done. If not, I could just describe it. I'm not sure if I have the document.
You'd be able to find the email. Yes. Okay. Well, while she quickly does that, I want to say that it makes me very glad to be able to be here and describe this comprehensive privacy bill. This is something that Vermont has needed for a long time, that everyone has needed for a long time. The states have been stepping in and creating laws like this while everyone kind of looks up at Congress and asks, is something going to happen at the federal level? Every session they say this may be the session it happens, and then it hasn't happened yet. So it's great that the states, the laboratories of democracy, have been trying things to provide this critical protection for Vermonters. It's also, of course, bittersweet, because this is probably going to be the last time that I'm testifying here. Walking in here, if you could humor me for a second, I was thinking back 10 years ago to the first time I testified. I came in here with Wendy Morgan, I think it was this room, and I had recently come from a firm in New York City. I was wearing, I think, $350 leather shoes, which were very uncomfortable, and a big black wool overcoat that came down to here and didn't provide any protection whatsoever, and I had my briefcase. And I was very excited to be working for the state of Vermont and working in the legislature, and I recall someone sitting on Wendy's other side leaned into her and said, who's the New Yorker? Exactly. So I've learned my lesson since then, and I think about everything that's changed in the last 10 years. I'm sure that I'm going to walk into my first meeting at the FTC at some point wearing my boots and my vest and everything, and they're going to say, who's the Vermonter? Okay, so I'm going to get started on the bill. I guess it's not up yet, but that's okay; I'm going to talk to the general elements of it.
First I want to note something that Charity talked about with Clearview and public data, how they collected public data that people posted on social media. Now, in most privacy laws you see, including data breach notification acts, you'll often see some exemption for public data, and it has become pretty much a standard thing in privacy laws. It was not always that way. And it is not necessarily the best policy to include that exemption, because public data has been interpreted very, very broadly. Anything anyone puts on the internet is public data. Anything that comes from government records is public data, even if that government record, before the internet, probably would have been sitting in a filing cabinet somewhere and no one would ever have seen it. Now it can be collected, and it is collected and merged with a lot of extra data, and we're talking about property records, voting records, court documents. So this public data exemption has in some ways come to swallow the rule. And the last section includes a recommendation that perhaps we study this public data issue more closely, because I'm not suggesting we do away with it entirely; I'm suggesting that it really does require a deeper look. I think you would hear from some folks who support the public data exemption that there is a First Amendment right and that you can't do anything about that. I think that the law is not nearly as settled as some would have you believe on that issue, but it requires further study. I didn't want to take on that fight in this bill, but I think it's worth looking at in the future. The order of the bill starts with some new definitions. Then it goes to general requirements and collection, and then biometrics, even though biometrics is our main priority; it didn't seem to make sense to start with biometrics before the general provisions. A few definitions are included here: biometric identifier is a key definition in any biometric law.
We took the biometric identifier definition that we had negotiated in the data broker bill, so it tracks that language identically. We discussed it in the hearings, and I don't think it makes sense to reopen that can of worms, because a lot of effort went into getting that definition down, and we still think it's a strong definition and appropriate for this law. The definition that's introduced is called personal information. Now, there is a gentleman who has appeared at many of these hearings, he may be viewing right now, he's shown up at almost all of the hearings, named Thomas Weiss, who has written extensive reports on this bill. He has analyzed this bill, and privacy law in Vermont, more extensively than I have, and his insight has been very, very valuable. One of the points that he makes is that we have a lot of definitions of personal information in chapter 62. We have personally identifiable information, PII, brokered personal information, covered information, login credentials. It seems like every time we create a new bill, we create a new definition of personal information to go with it. I don't think that really helps anyone, especially the business community, because the definition tells businesses how they're supposed to treat certain data. And it's like a Venn diagram: Social Security numbers are in all of them, but an address might be in only one of them. So I think it makes sense to try to coordinate these definitions. It might be that certain laws require very specific definitions, so I was hesitant to introduce a new one. But for the purposes of these general requirements, we're suggesting a definition of personal information that is very broad and expressly includes, but is not limited to, personally identifiable information, brokered personal information, login credentials, and covered information, just to say, okay, whatever the other ones cover is within this one as well.
Okay, so of all the existing definitions, brokered personal information is currently the broadest one and might be the one to go with, but this is an attempt to coordinate all of those, and Leg Council probably has some thoughts on how to coordinate them as well. Since our law is going to deal with selling information, I think that is a definition that you're going to want to think about. The proposed language brings in the definition of sell from California, where they did have a lot of debate over that. Personally, I think that selling information is selling information, and there is something to be said for broad language. The main privacy law that we enforce right now is the Consumer Protection Act, and it was modeled after a law that was passed in 1910, and it says unfair and deceptive acts and practices are illegal. That's basically the law, and that has worked for over a century. It has applied to a lot of different things, and it bans unfairness, and the courts have figured out what that means, and the agencies have figured out what that means. The current trend is to negotiate language down to the comma, which seems like it would give specificity, but then when you actually try to apply it in the real world, someone's ready to jump out and go, aha, no, you put that clause there instead of there, and therefore this doesn't apply at all. One area where I found that shows up a lot is the definition of robocalls at the federal level. It was defined so specifically that a lot of what is currently going on does not fall under the definition of robocall, which is one of the reasons why it's become so difficult to stop these things. So that's a concept to keep in mind. We do have it up behind you now. Great. I'm on the first page; that was the definition of sell right there. Yep. Okay, so let's move on to the first section, general requirements.
We repeat these terms a lot in section (a): owns, licenses, maintains, or possesses. Those are terms that show up in the Data Breach Notification Act. Those are the four different ways that you can interact with data: it's either your data, you've licensed it from someone else, you maintain it on behalf of someone else, or you just possess it in some way. That's probably the broadest framing. The first big requirement is (b), data minimization. This language came out of the CCPA; Consumer Reports also has a model bill which has data minimization language. And I think Charity pretty much summed up the reasons why data minimization is so important. We should think about data minimization in two directions. One is, don't collect data that you don't need in the first place. And the second is, don't keep data longer than you need it. It really addresses two separate practices. We know that there are apps on the App Store or on Android phones, it might be a calculator app, that are collecting your geolocation information. Why is it collecting your geolocation information? Because it wants to sell your geolocation information to someone else. It doesn't really care about calculating numbers; it cares about harvesting your data. A data minimization requirement would pretty much wipe out that entire practice of surreptitiously collecting your data, unless of course someone consented to it or you were very, very clear. The other practice is one that we see in data breaches a lot, which is a company that does have data retention policies, that does try to collect only the data it needs, but then you see a data breach and you see 15-year-old data has been lost. Data of customers who haven't been customers for a decade was lost. And you're asking, why did you still have this data? I think that's also critically important, because we've been trying to fight data breaches for a decade now, and we still have lots and lots of data breaches.
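The two directions of data minimization described here, collect only what you need and keep it only as long as you need it, can be sketched as a small policy check. Everything in this sketch (the purpose labels, field names, and retention window) is invented for illustration; it is not language from the bill:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical purpose-to-fields mapping: a business declares what each
# purpose actually requires, and anything beyond that is refused up front.
ALLOWED_FIELDS = {"mapping_app": {"coarse_location"}}

@dataclass
class Record:
    purpose: str
    fields: dict
    collected_at: datetime

def collect(purpose: str, fields: dict, now: datetime) -> Record:
    """Direction one: refuse to store any field not needed for the purpose."""
    extra = set(fields) - ALLOWED_FIELDS.get(purpose, set())
    if extra:
        raise ValueError(f"fields not needed for {purpose!r}: {sorted(extra)}")
    return Record(purpose, fields, now)

def purge_expired(records: list[Record], now: datetime,
                  retention: timedelta) -> list[Record]:
    """Direction two: drop anything older than the retention schedule allows."""
    return [r for r in records if now - r.collected_at <= retention]
```

Under a rule like this, the calculator app harvesting geolocation simply has no declared purpose that covers the field, and decade-old customer data ages out before a breach can expose it.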
And you will hear companies say, and it is true, that you can have reasonable data security and still have a data breach. If China or Russia or North Korea wants to get into your systems, there's not much even a very sophisticated company would be able to do about it. But the one thing they do have control over is what data they collect. That is something we can encourage businesses to do. And I'm informed that within the business community there is a trend towards data minimization already; that is the best practice which is being pushed out. But I think a stronger push would probably be necessary. So the next part goes to secondary uses of data. Secondary uses would be when you have sold or shared that data on to the next party. And the key thing it says is that if you have this data that you've gotten from someone else, you can't use it for a purpose inconsistent with the purpose for which it was collected, or with the consent with which it was originally collected. Now, every... Ryan, are you expecting someone else to also come in, a Jenny Blair? That's someone I know who, yeah, wanted to just... But we're not expecting her to testify. Yeah, okay. And also, Ryan, should we scroll? What number are you on? We're on (c)(1). If you can scroll up, yeah. Sorry, I may have forwarded on the testifying link instead of the... Okay, so it basically says that if data was collected for a certain purpose and it's shared on to someone else, they have to use it for that purpose. We are told that this is a requirement which is violated frequently. Information is collected for geolocation purposes in order to do mapping for some third party, and then it is sold on and on and on and on until eventually it's in some sort of spyware app, or maybe sold to law enforcement, federal or state, to track people.
And you will likely hear from both data collectors and third parties that they always respect the consent with which data was collected and would never think of sharing it on for some other purpose. I think there's a lot of reason to believe that that is not the case. Most importantly, right now, if a company shares data with another company, there's a contractual obligation between them about how to share it. But if that contract is breached, the only remedy is for the company that shared it to sue for breach of contract for using it for a different purpose, which the original company probably doesn't have a huge economic incentive to do. And the company that breaches the contract hasn't broken any laws; they can actually use it for whatever they want. So basically there aren't really consequences if a company goes on and does that. So I think secondary use is an important thing to enshrine in the law there. And then the second part, subparagraph (2) at the very bottom of page one, says that if a data collector is unable to determine the original reason a consumer's personal information was collected, then they can't use the data and they can't retain it. So there's your incentive: you use it for the reason it was originally collected, and you have an incentive to document the reason it was originally collected and make sure that is included with the data. Because if you can't do that, then you should not be able to use the data. You shouldn't be able to say, oh, we don't know where we got this data from, this non-public data, so we're just going to use it for whatever purpose we want. And in terms of data brokers, as well as a lot of companies, you will see that the justification for holding these huge tranches of data is either that it was public or that there was consent originally to get it.
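The secondary-use rule and subparagraph (2) together amount to a simple invariant: the collection purpose travels with the data, and data whose original purpose cannot be determined may be neither used nor retained. A minimal sketch of that invariant, with an invented record format and invented purpose labels:

```python
# Provenance check sketched from the testimony's description of the rule.
# The "collected_for" field is a hypothetical stand-in for whatever
# documentation a business actually keeps about original consent/purpose.

def may_use(record: dict, proposed_purpose: str) -> bool:
    """Allow use only when the documented original purpose is known
    and the proposed use is consistent with it."""
    original = record.get("collected_for")  # provenance travels with the data
    if original is None:
        return False                        # can't document it -> can't use it
    return proposed_purpose == original

def retain_or_discard(records: list[dict]) -> list[dict]:
    """Records with no documented purpose may not even be retained."""
    return [r for r in records if r.get("collected_for") is not None]
```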
So the question is, if you go to them and say, okay, what was the consent, and where was the consent, and do you have that documented, that's the part where this will be enforceable. And I think it will cause businesses to tighten their bookkeeping to make sure that consent is well tracked. Part (d) is a bit of a placeholder. I wasn't sure whether or not the committee would have time to look at this this session; I knew that if it did, it would have to move quickly. The CCPA and the CPRA, the follow-up law in California, are very long laws with very long regulations attached to them. If we had included all of that, it would have swamped everything else. I think that's a further conversation. I included some of the language, but there are lots of other things in those laws that the committee might think are useful to put in. So that might be something else to consider. Lastly in this section is do not track. Over a decade ago, there was a movement to create this notion of do not track. If you go into your web browser, in the options somewhere, if there is a do-not-track option, you can check a box. Originally, the notion was that anyone receiving that signal, and I think it would have been a self-regulatory idea, would not track you. But it didn't go anywhere; it got bogged down, and it kind of ended up just being a recommended suggestion not to track. So the technology is already there on the consumer side to say do not track, but it's not enforced. This would require some sort of enforcement of that do-not-track signal. The next part, and I'm on page 2, § 2445, is safe destruction of documents. It has a definition of personal information, and I suggest replacing that with PII just to normalize everything, so that we don't have two different definitions of personal information in the same chapter, which of course would be confusing. Next we get on to biometric information.
Now, for this law, I looked closely at the Illinois BIPA law, which is a fairly short law, and then the Washington State biometric law, which a lot of folks suggested was the template to go off of. I spoke to some folks in Washington, looked closely at the law, and realized that there are a lot of weaknesses in it. So if we're going to do a BIPA law, I think we should try to do it right and close some of those gaps. The first section is collection, use, and retention of biometric information. It says that if you want to collect and use biometric information, you should provide clear and conspicuous notice, obtain consent to use it, and provide a mechanism to prevent the subsequent use of a biometric identifier. The next part is that you should have a retention schedule for how long you're going to keep it; this comes from the Illinois BIPA. Now, before we go further in this bill, I'm going to jump down to the bottom of page 3, number 8. This is not in either the Washington or the Illinois bill. This is a carve-out. One of the things that we learned when we were looking at biometrics is that there are two broad reasons why biometrics are collected. One is for identification, and the other is for authentication. Identification is, who is this random person? We want to know who they are. Authentication is, we know who you claim to be; we want to make sure you are who you say you are. The technology of matching is different. Authentication is a one-to-one match: we already have your photo, we already have your voice print, we want to make sure you are who you say you are, and we're just comparing those two things. Identification is one-to-many: we have this massive database of photos or whatever we've collected, and we want to identify you. The general sense seems to be that authentication is not that controversial. People like using biometrics for it. I don't want to say all people; I'm sure there are a lot of people who object.
But I think as a general matter, you pick up your iPhone and it recognizes your face or your thumbprint; that can be a very useful thing, as long as they're protecting it properly and not selling it on for other uses. That can be a very valuable thing. It's identification, which is where surveillance comes in, which is where not knowing what it's being used for comes in, that people really have issues with. In this bill, I'm recommending that we basically carve out the authentication part. One of the objections that we hear from a lot of folks to a BIPA is that it is going to harm people's ability to prevent fraud and to stop crime, and all these other parades of horribles. I noticed that the other laws don't really distinguish between these two functionalities. And I'm mentioning this because everything else you read has to be against the background of the fact that there's this big carve-out. What this says is: nothing in this section requires an entity to provide notice and obtain consent to collect, use, or retain, no notice or consent, though there still has to be a method to opt out, perhaps, if (A) the biometric identifier will be used solely to authenticate the consumer for the purpose of securing the goods and services provided by the business, and that's probably the biggest use; (B) the biometric identifier will not be leased or sold to a third party; (C) the biometric identifier will only be disclosed to a third party for the purpose of effectuating (A); and (D) the third party is contractually obligated to maintain the confidentiality of the biometric identifier and not further disclose it. And of course, that contractual obligation would create a separate backstop stopping them from doing that. This is the idea of trying to make a biometric law that works for businesses, that allows them to keep doing that. Now, what this law does not have is a general fraud prevention exception or a general law enforcement exception.
And part of the issue there is, as Charity pointed out, the initial impetus to start looking at this was people being very upset about Clearview AI. Clearview AI now has a 10 billion photograph database and continues to collect all of this information. And if you go on their website and ask them what they're using it for, they will say, well, it's for law enforcement and fraud prevention. So if you put a broad law enforcement and fraud prevention exemption in a law, then everything Clearview AI is doing may end up being legal, which kind of seems to defeat the notion of what we're trying to get at. Obviously, this is not a law in which we're specifically targeting Clearview AI, not at all, but it was that sort of behavior that caused us to start thinking along those lines. So going back to page two, one of the concerns we heard was that notice was unclear, and just saying you need to provide notice didn't give the level of specificity necessary. So we tried to describe what should go into the notice. Then we heard consent was unclear: how is consent supposed to be collected? That's under 5A on page three. It says it must be opt-in, and it may be accomplished in writing, by indicating assent through an electronic form, through a recording of verbal assent, or in any other way that is reasonably calculated to collect informed, confirmable consent. Now, if the business community comes in and says, okay, there's another way that we collect consent that we need to make sure is in there, that's great. The goal here was to create a broad description of the things that constitute consent. Now, consent is not, we put it in our privacy disclosure on page 15 and you didn't uncheck a box and therefore you've consented. The goal here is to create a law that provides the clarity which we have heard some of the other laws do not have.
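The opt-in consent methods described in 5A, written assent, assent through an electronic form, or a recording of verbal assent, each confirmable after the fact, could be modeled as a small validity check. The record format and field names here are invented for illustration, not taken from the bill text:

```python
from datetime import datetime

# Recognized consent methods per the testimony's reading of 5A; "other
# reasonably calculated" methods would be additions to this set.
VALID_METHODS = {"written", "electronic_form", "verbal_recording"}

def is_valid_consent(record: dict) -> bool:
    """Opt-in only: consent must be affirmatively given, by a recognized
    method, and traceable to evidence (the writing, the form submission,
    or the recording) so it can be confirmed later."""
    return (
        record.get("opt_in") is True              # never inferred from silence
        and record.get("method") in VALID_METHODS
        and bool(record.get("evidence_ref"))      # pointer to the proof
        and isinstance(record.get("timestamp"), datetime)
    )
```

Note that a pre-checked box the consumer failed to uncheck would fail this check: nothing was affirmatively given, so `opt_in` is never set to `True` by the consumer's own act.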
So when we talk about a recording of verbal assent, I know there have been issues with fraud, with telemarketers calling, and you answer the phone and they ask, is this so-and-so, and you say yes, and then they pass that off as assent. I agree, I had that same thought, that that is something we would need to think about. If the committee thinks that is creating too much of a risk, then it may choose to strike that language, or maybe there's a way to address that concern. But you're right, any consent can be fraudulently created, exactly, and that is something we should be wary of. So section four basically says that a biometric identifier may not be used, sold, leased, or otherwise disclosed unless one of the following applies (these are "or's" underneath this). And by the way, one of the differences between this law and the Washington biometric law is that the Washington law prohibits selling and leasing in all of its sections, but not using. So if a company collects biometric information without anyone's consent, it can use it; it just can't sell it, which seemed like a big exemption we wanted to make sure was addressed. And basically it says that if you want to use it, sell it, or lease it, you have to either have obtained consent, or it is necessary to provide a product or service the consumer subscribed to, requested, or expressly authorized, and the business has notified the consumer of the purpose and of any third party receiving the identifier. It doesn't say "clearly and conspicuously"; the consumer just has to have been notified. So this is an attempt to say, look, you can use this data if it is directly tied to providing the product. This is a law that is trying to restrict the worst acts without getting in the way of doing business, or of consumers being able to benefit from the many conveniences that are provided by biometrics.
The third is that it is necessary to effect, administer, enforce, or complete a financial transaction that the consumer requested, initiated, or authorized, and there are disclosures and things like that. The idea behind that is we know that biometrics are used in the banking and financial industries. I was talking to someone at my retirement company that has my financials, and they told me that one of their backups is that they actually use voice pattern recognition. If I call in and say I am who I say I am, but my voice doesn't match, they'll know that. And I thought, well, that's neat. I had no idea you were doing that. You didn't disclose it to me, but I appreciate that you don't want to let a fraudster come in and empty my retirement account. So we don't want to stop banks and financial institutions from protecting people. That exemption is there to try to address that, and it also gets to that fraud prevention notion. But we do want to have the protections in place. John? A follow-up question, for example, on calling the bank. So you call, like, your insurance company, and they say, here are all your choices, because you get a recording that says, please tell us what you want. Is that more voice recognition data too, or is that just part of the recording system? I mean, I can't speak to how anyone operates, but I would imagine any time they're collecting your voice, there's the potential to apply voice recognition to it. Yeah. It's just those simple things we're all kind of engaged in these days. Yeah. So I can't say that anyone's specifically doing that with "tell us which part of the company you want to contact," but potentially, yeah. All right, thank you. And then finally, required by a federal statute, which is pretty standard. Okay. Five describes what consent means. Six is basically data security and retention of biometric identifiers.
And seven basically says you can't share it in certain ways. If you do have consent to share it, you're allowed to share it, but it has to be for the reason it was originally collected. You can't get a general consent and then say, okay, now we're going to sell your facial recognition information to law enforcement agencies, if that is not something you specifically consented to in the first place. Eight is that authentication carve-out. Then under enforcement, a few things to point out. The first section under enforcement, I'm on page four, is standard attorney general enforcement. B is something which I kind of wish we had in a lot of other sections of chapter 63, honestly. When you say the attorney general may enforce, what it means is the attorney general can issue penalties of up to $10,000 per violation. Now, from an enforcement perspective, that is both a blessing and a curse. The blessing is that it is a very strong penalty, and it allows us to investigate cases, including against very well-resourced large businesses, and try to get protections for consumers, because they know that the penalty could be very, very large. The curse is that when it comes to actually calculating the penalty, it can be very difficult. You say, okay, well, you hurt 200 Vermonters, so that is somewhere between a dollar and $2 million. Or you hurt 2,000, or 20,000, so now we're up to $20 million or $200 million. How do you find a number between those bounds? Especially if we've never enforced on this particular conduct before. So the idea here is to try to give some guidance to the enforcers of the law as to what they should be thinking about when calculating a penalty under this law. There are other statutes, I think even the CCPA, that have some similar language. And the suggestions in here, and again, these are ones that could be open to discussion: the seriousness of the violation; the size and sophistication of the business violating the subchapter.
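The per-violation arithmetic described in that testimony can be sketched as follows. This is a minimal illustration, not statutory language: it assumes, hypothetically, that each affected Vermonter counts as one violation, that the statutory maximum is $10,000 per violation, and that the floor is a nominal $1 per violation.

```python
# Hypothetical sketch of the penalty-range arithmetic from the testimony.
# Assumes one violation per affected consumer and an up-to-$10,000
# statutory maximum per violation; a nominal $1 floor is assumed here.
MAX_PER_VIOLATION = 10_000

def penalty_range(consumers_harmed: int) -> tuple[int, int]:
    """Return the (low, high) dollar bounds an enforcer must pick between."""
    return consumers_harmed * 1, consumers_harmed * MAX_PER_VIOLATION

for n in (200, 2_000, 20_000):
    low, high = penalty_range(n)
    print(f"{n:>6,} Vermonters harmed: ${low:,} to ${high:,}")
```

Even at the small end, 200 consumers yields a spread from $200 to $2,000,000, which illustrates why the bill's guidance factors (seriousness, size and sophistication, history) matter to the enforcer picking a number inside that range.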
So if a local business violates one of these privacy laws and a few-hundred-dollar penalty is assessed, that doesn't mean a giant tech company can say, well, your precedent is a few hundred dollars. We say no, no, look at the size and sophistication, because the point of the penalty is to deter businesses from violating the law. If they think that violating the law is just a cost of doing business, and it's more profitable to violate than not to, then we're kind of wasting our time here. So if you have, like, the $10,000 per day, but they were making $100,000 per day, then they'd just continue doing business. Well, exactly. I mean, that would be the analysis. Would there be any assessment of what they're actually able to make off of that? I mean, the guy with 10 billion photographs could be making a substantial daily income. Certainly. The profitability could certainly be something that could be considered there. I would also note that there is the opportunity for disgorgement: basically, you have to give up all your profits. That can be very difficult to calculate and to get agreement on, so it's not often implemented, at least in Vermont. Thank you. I remember our conversation from, like, a year ago, when we actually started dancing around this topic. So I have other, larger questions, but specifically on enforcement, I'm wondering if there are other examples in law, or this could be an example of it. We're talking about so many companies that often also do business with the actual State of Vermont or other public entities within Vermont. And I'm just wondering if, as part of this enforcement, there could also be something that would preclude a business from being able to do business with the State of Vermont. Like a bank, for example: if they really violate this, they don't get a large state contract, or don't get to continue with us, either forever or for some period of time. I don't know if debarment is the right word for contracting or not.
But it just seems like so many companies collect data that there could be an extra piece beyond, you know, whatever the AG's office might be able to assess as a penalty, that would be an added incentive to comply. I think that could be something for the committee certainly to discuss. I will suggest a couple of unexpected consequences of something like that. One, if a company's major contract is with the State of Vermont and the sanction says you must stop doing business with the state, then violating this law could end up being a death penalty for the business, which means that we might be hesitant to enforce the law. One of the examples I always go back to is FERPA, the law that protects education privacy. There's only one penalty for violating FERPA, which is complete removal of federal funding from the school that violated it. It's never been enforced. So we want to be careful about that. The other concern might be that if there is an entity in Vermont state government that is reliant on a specific technology, then that's going to bring a political element into whether or not to enforce, right? If you enforce this thing, then we're not going to be able to use this tool anymore. And so that could complicate factors further. I'm not suggesting yes or no, but those are two things that occur to me to think about. Mr. Chairman. What happens if the breach or the violation of this is by a state agency? Well, the Security Breach Notice Act applies to state agencies. Yeah. So a decision should be made as to whether or not this should apply to state agencies. I would suggest that at the federal level there is the Privacy Act from the 1970s; the federal government has its own separate regulatory structure on how its agencies treat people's data. Vermont actually doesn't have a state version of that, and it may make sense to have something like it.
Maybe it could be done just by incorporating state agencies into this. Maybe there are reasons why that would require something different, and that would probably require bringing in folks from law enforcement and the Secretary of State's office and all the other stakeholders who are collecting data about Vermonters, just to make sure. So it's certainly something that I think should be considered. I don't think the current language as drafted would bring in the state, but that's a policy decision for the committee. Okay. So: size and sophistication. Another one I put in was the business's history of respecting or failing to respect the privacy of consumers. There are certain businesses out there that are just known bad actors. And if they're known bad actors, it's usually because they feel it's more profitable to do it than not, and that they're going to keep getting away with it. In those circumstances, I think it makes sense for the penalty to say, look, even though it wasn't we who enforced against you, it was seven other states and the FTC, and Europe and Australia, we can look at those and say you need a bigger penalty. And then it expressly says, with maximum penalties imposed where appropriate. We very rarely seek maximum penalties in these cases, and there's almost no precedent for doing so. I think that if we can point to a legislative edict and say, look, we can tick all the boxes, this is what's at stake here, that would be fairly effective. And at the same time, I think it would stop some fictional overzealous AG ten years from now from trying to bring maximum penalties down on a small business that's struggling to survive. We don't want that happening either. So, C. This one recognizes that as soon as the law takes effect, there are going to be a lot of companies that are not in compliance with it.
So it suggests that you have 180 days to come into compliance with the law. Instead of saying let's push the effective date out a year, let's say: put it into effect whenever you want, and then you have 180 days. And after those 180 days, it's $10,000 per day if you haven't come into compliance. So there is a generous amount of time to come into compliance, but a strong penalty if you don't. And I think that's important, because there may be some companies who come back and say, due to the nature of the way we collect our data, there's no way to come into compliance. We can't get this consent; it's not going to happen. And that creates a strange problem for enforcers: well, we don't necessarily want to shut this company down, but they are in violation. What do we do? This gives the answer: $10,000 per day. You don't want to comply with the law? $10,000 per day. If you think there are some giant companies for which even $10,000 a day wouldn't be a sufficient incentive, then you might want to change that, but that's what is suggested to start. Yeah, that's interesting. I think for a lot of companies, 180 days might be enough, but there may be data that's 30 years old that a company or a business has had. They just can't get the consent; they don't even know where these people are. Right. How do you deal with that? I mean, hopefully we have businesses that have been around 30, 40 years or longer. So what do they do if they lack consent? And maybe not just for biometrics, but what about other data? Sure. I think this is a fundamental question that we're going to have to ask ourselves. There might be some basic business models that this law says you can't do anymore. Yeah. And I guess if you wanted to, you could grandfather in certain businesses and say, you can continue these practices because it wasn't illegal when you started.
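The grace-period mechanics just described can be sketched as follows. This is a minimal illustration under stated assumptions: the 180-day window runs from the law's effective date, and the $10,000 accrues for each further day of continued noncompliance.

```python
# Hypothetical sketch of the compliance grace period from the testimony:
# no penalty for the first 180 days after the law takes effect, then
# $10,000 for each further day out of compliance.
GRACE_DAYS = 180
PER_DAY_PENALTY = 10_000

def accrued_penalty(days_since_effective: int) -> int:
    """Dollars accrued by a business still out of compliance."""
    days_overdue = max(0, days_since_effective - GRACE_DAYS)
    return days_overdue * PER_DAY_PENALTY

print(accrued_penalty(180))  # still within the grace period: 0
print(accrued_penalty(210))  # 30 days overdue: 300000
```

The design choice the witness describes is visible in the math: the penalty is zero for any business that gets into compliance within the window, but grows without bound for one that refuses, which is what gives enforcers an answer short of shutting the company down.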
You could grandfather them for five years or something. Or you could say: this wasn't okay when you did it, you just didn't know you were doing it, and congratulations, you were able to be profitable doing it for 30 years, but you've been violating the privacy of Vermonters for 30 years, and now you have to stop. I'm not suggesting that; it's kind of a draconian way to go about it. But that's the decision, because you're right, there are certain businesses this would affect, including, frankly, I think, probably Clearview AI. They've said in court that they have no way of knowing where people are from. Therefore, because they collected this information with no regard for anyone's privacy, they can't remove it. So this might address that sort of thing as well. Emma. Are there any examples of more progressive penalties, progressive in the amounts of the violation? Because in today's world we have mega-corporations, and then we have minor corporations. I'm thinking of Facebook, aka Meta or whatever it's called now, and Amazon. I mean, $10,000 is just a fee for them, or whatever the comparable is, and $10,000 for, like, a Vermont business is massive, most likely. Right. So I'm just wondering if there are examples of a more progressive penalty structure, if that's even legal. Because if we really want to go here, it's about getting people's attention, especially those like Amazon and Meta or whatever it's called; they make so much profit off of this stuff. So the one that occurs to me: the first thing I would suggest is that some of these big companies might have violated the rights of 200,000 Vermonters. In fact, I think that's how many the Equifax breach affected. 200,000 times $10,000: that's a pretty big number, even for a massive corporation.
So we do have a pretty high upper limit. But in terms of really trying to get at the sort of issue you raised, I think the GDPR, the European privacy law, tried to get at that. And I hope I don't misstate this: I believe the maximum penalty under the GDPR for a willful or reckless violation is 40 million euros, or up to 4% of the company's previous year's global revenue. And not profits: revenue, globally, even if the violation was in one country. That's a lot. And I would assume that they were thinking of exactly those companies when they created that. So that's a model. I don't know if any law in Vermont, or in the United States, has tried doing something like that, but that could be how that sort of thing is done. I do think, though, that our $10,000 times a large number probably gets to that, or could get to that, in an extreme case. John? So there may be a business where, in their primary business, they don't realize they're violating the kinds of things you're sharing here today. So unless there's a complaint with your office, and then they're notified by your office, that, you know, attracts their attention, so to speak. Maybe you get into this a little bit later in your proposal, but how does a business find out how they ought to be operating in terms of staying in compliance with this law? I'm just thinking down the road: at some point, all businesses, in Vermont or whatever state, need to know about this, versus just assuming they know about it. Sure, that makes sense. Yeah. There are a couple of ways to address that. When I first started working with the Security Breach Notice Act, it was probably about four or five years old, and we were seeing companies having breaches and not complying with the law. And we were much more gentle with businesses at the time, right? We said, we understand you probably never heard of this.
I went out and spoke to anybody, any organization that would listen, giving presentations. I gave presentations to the Fuel Dealers Association and to the bar, including the real estate bar, about how to comply with the law. So that would be one solution, at least in Vermont: let's really do some public education and get the awareness out there. On the national level, there are law firms and business associations who monitor these things, and they push out that knowledge to their membership. Often they may do it as, you know, a sales pitch: here's a new law you need to comply with, so let us tell you all about it. There are even, and we've seen this, you may have noticed that about two years ago these little banners started showing up at the bottom of websites asking you which cookies you wanted to opt in and out of. That was, I believe, a response to the GDPR. The GDPR basically made this kind of vast cookie collection suddenly problematic without consent. So tech addressed the tech issue: businesses showed up and said, put our plug-in on your website and it will manage your cookie consents for you. So that's another solution; sometimes the market will provide a market solution to these sorts of things. Can I ask one quick follow-up question? Where might a chamber of commerce fall into this? Well, certainly I think that could also be... Yeah, I'm just thinking about the conduit here for... Can they hear me if I sit here? Sure. I just wanted to add to that question. Yesterday I did speak with someone from the Vermont Chamber specifically about this. I don't know if she's watching on YouTube, but we've been in conversation with them already. And thank you. I'd be remiss if I didn't mention that another policy way to address that is a right to cure.
And I believe that the California CCPA has a right to cure. Now, the consumer protection side of things generally doesn't like a right to cure, because it basically means you can violate the law until someone notices that you're violating it, and then you have to stop. On the other hand, it addresses the issue that if someone inadvertently violated the law, you have to tell them, and then they have X amount of time to come back into compliance. I was speaking to someone in California about how that was working out, and I think there can be a benefit to the right to cure. I doubt a lot of my colleagues in the consumer community would agree with me on this, but one of the benefits is that if you pass a law like this and the AG's office is enforcing it, there are going to be a lot of people violating the law, and there are limited resources applied to a broad area. If there's a right to cure and we get a complaint that someone's violating the law, we'd say, hey, stop violating it. If they stop, we move on; there are plenty of others to move on to. Otherwise, we have to launch an investigation, collect information, negotiate a settlement, and do the whole thing, and that's a very long, resource-intensive process. Meanwhile, there will be some companies that don't stop once they get the right-to-cure notice, and that signals to us: oh, this is a company that really doesn't want to comply with the law, and it makes it easier to decide, okay, this is someone we should bring an action against. It sounds like I'm advocating for a right to cure; I'm just kind of putting out the issues back and forth. I think there are a lot of things to consider there, but that would be another way of addressing it. Thank you. I have sort of the flip side.
Given your overarching understanding of the magnitude and diversity of folks doing things like this, can we have some assurance that there isn't something flying under the radar that people don't even recognize as negatively impacting them, so they don't have the opportunity to complain? And there's probably no cure for that, other than, after the fact, if it went on for a certain amount of time, finally you found out. Sure. I think that a lot of the practices going on are fairly opaque, and people aren't going to know they're happening. A lot of the ways we find out about this is through journalists. A lot of the worst privacy practices that even the AGs' offices have discovered came to light because journalists uncovered them; they've been doing a really good job of that. Also academics and some other agencies, and even whistleblowers. Sometimes you see people come forward and say, my company is doing something wrong and I can't stay quiet about it. So that would be another way we would discover these sorts of things. And we're possibly more likely to get them from those sources than from individual consumers. Lastly, we have the private right of action in B2. This private right of action was taken from the Illinois BIPA law, it was modeled after it... no, I'm sorry, it was based on Vermont law. Our local Fair Credit Reporting Act in chapter 63 provides $100 per violation. So this language is modeled on what we already have for our local Fair Credit Reporting Act, but it imposes the penalties from the Illinois BIPA, which are $1,000 for a negligent violation and $5,000 for a willful or reckless violation. Where are you, Ryan? Oh, I'm sorry, this is on page four. Move up a little bit. Yeah, two. B2. Do you see it now? Right above exclusions. So you're going to hear a lot of concerns about a private right of action.
You're going to hear a lot of criticism of plaintiffs' lawyers, and, full disclosure, I worked on the plaintiffs' side a little bit before I came here. I'll just observe that you'll hear that they're only doing it for the money, and all these other things. When I worked on the plaintiffs' side, there were usually a lot of defense lawyers on the other side who I don't think were doing it out of the goodness of their hearts either, and they were usually paid a lot more than the plaintiffs' lawyers, to be honest. So what it really comes down to is this: a private right of action means that if a business hurts you as an individual, you can do something about it. Without a private right of action, your only option is to complain to the Consumer Assistance Program, hope that others have too, and hope that because of those complaints the Attorney General's Office is going to look into it. And if they look into it, you won't know, because we don't make our investigations public. And you will hope that after the investigation finishes, in a year or two or three, there may be some restitution that comes to you. That's what it looks like if you don't have a private right of action. And the fact of the matter is, unless the legislature is willing to budget much more for the enforcement of this bill, I think a private right of action is an important supplement to the ability to make sure people comply with the law. If there are concerns about rogue class action lawyers or shakedown litigation or nuisance litigation or things like that, I think there are ways those can be addressed and maybe incorporated into the bill. But I do think that if someone is aggrieved by a violation of this law, they should have the ability to do something about it.
And the Attorney General feels strongly on that issue. Okay, there is an exclusion which addresses law enforcement, and that comes out of one of the other laws. And one thing I didn't put in there, but I'll flag: it may be appropriate in some cases to put in a HIPAA exclusion or a Gramm-Leach-Bliley exclusion. The thing is, neither of those laws specifically addresses the collection of biometric information. So to say, if you're complying with HIPAA, you don't have to do X, Y, and Z: a lot of companies comply with HIPAA but are doing this for totally other reasons. But there may be some things that can be done to address that. So that's before I move on to the data broker section, and I don't know if I'm going too long; if you want me to speed up, I can do that. Any other questions or comments on the biometric section? This actually goes back a little bit, but you talked a couple of times about notice and consent in privacy. I feel like for a lot of apps and a lot of websites, it's a long, legalese terms of use and you check a box. I tend to feel very few people read those. And if there were a simple presentation, like your example of the calculator that tracks where you're physically located, I think very few people would approve of that. Is there any thought around how to make consent a more accessible idea for the average citizen? You know, it's a really good question, and the consumer protection ethos is based around the notion of disclosure and consent, notice and consent. Basically, if a consumer wants to climb a rock wall or ski down a mountain, and they consent to the risks, then that's great, and we respect that. The problem is it really does break down in the privacy sphere, because it's hard to provide the notice to people in a way that they really appreciate and that actually tells them something.
And even then, it's hard for consumers to really fully appreciate what's going on. I gave an example in class the other day. FERPA requires a notice to go to all students at the beginning of the year, and it allows you to opt out of the sale of directory information, which is information the school internally defines as information it can share. I asked my class of 20 students, you guys are probably more aware of privacy issues than most people: did any of you sign the FERPA opt-out? No hands up. Did any of you know there was a FERPA opt-out? Right. There's a federal law that requires notice specifically to be sent to people, and UVM complies with the law, but of course nobody even knows that it exists. So there's a huge limitation in this ability to notify, which is one of the reasons for opt-in consent: you have to check the box yourself. Now, the CCPA has a section that basically says you can't refuse to do business with someone if they don't give you consent. So you can't say consent to tracking your geolocation is a requirement of downloading this calculator app. Then it goes on to say that, however, you can give individuals financial incentives to allow collection of their information. Now, I'm not sure what the difference is between saying you can't refuse service if they don't consent, and can't charge more if they don't consent, but you can give them incentives. There are regulations that are trying to deal with that. It's a hard one, especially if every app in an industry says, well, we're going to need your biometrics and you have to consent, and there's no place else you can go. It's a sticky problem. I think that the data minimization section addresses some of that, right?
If there's no reason for you to need this in the first place, you definitely can't collect it. But, yeah, at the very least the notice will make people aware that it's happening. It will mean that they can't sell it on to other people who are going to use it for other reasons. Even if no one really reads the consent, the consent isn't going to say, we're collecting your location so that we can sell it to Homeland Security. It's probably not going to say that. And if they don't put that in, that means they can't sell your geolocation to Homeland Security, or to someone else who's going to sell it to Homeland Security, just to give an example. Okay, we have 45 minutes left; just want to make sure everybody realizes that. We go until 11. Okay, be concise, that's what you want. Just a very quick follow-up on that, building off the consent piece: I'm wondering about a regular, like annual, check-in. Often we download the app and they get that initial consent, but then life goes on. Some of us have children who occupy all of our mental bandwidth, and we forget that we've consented to that, but we might want to change it if we actually had a second to really think about it, or things change, right? Like there's a data breach; I don't know, all the things that could happen. So does anybody do either an annual check-in or some sort of easy way for people to go back and change their consent? So, banks are required to send an annual privacy notice. I'm pretty sure that's under Gramm-Leach-Bliley; it shows you what they're sharing and what they aren't, and how you can go about opting out. So there is a model for saying an annual notice has to go out. That would be one type of thing to model after. Okay. Moving on to data brokers. I just want to add a different thought first.
The law enforcement exclusion piece, I know, references, as you said quickly, another law. Yes. And I know that's a committee-overlap issue, but I do think that's another area I would flag: I think there's some more exploration to do around what law enforcement actually needs to collect about folks. We don't need to talk about it now, but it would be an interesting area. And to note, there is already a prohibition on Vermont law enforcement collecting facial recognition data, with a narrow carve-out that was negotiated last year. So at least that part has already been addressed. Okay. So, as you all know, we introduced the nation's first data broker registry back in 2018. Since then, data broker registries have been adopted in California and, I believe, Virginia, and there is a bill in Delaware that seems to be making some progress, which might go through. When we passed this law, we wanted to do something very light touch. We knew we were wading into a brand-new industry. We did not want to disrupt the industry, and we certainly didn't want to do anything that might cause someone to bring a lawsuit that would invalidate the law. And we were successful in all of those goals. Now that we have had four years of experience with the data broker registry and have heard a lot of input, we think it makes sense to make some updates to the law. The first big one: under the current law, we have a definition of a data broker data breach. The background is that, under the general breach law, there's only a data breach if a very specific subset of data is involved, PII, which means your name, or your first initial and last name, plus one of these enumerated types of data. What that means is that if you want to collect 10,000 pieces of data on someone, but not their name, then you can't have a data breach. You might have their Social Security number, their date of birth, and their address.
You might have enough information that anyone could match the data back to an identity, and there are data brokers out there that specialize in re-identifying data like that. So it's not hard to get around the data breach notification act. In addition, although the data under PII is very sensitive, it doesn't include things like address, phone number, date of birth, or demographic information. There's a lot of information, in short, that could be lost about somebody, and there would not be any duty to report. People might find that information very sensitive, and data brokers specialize in collecting exactly this kind of information. To give you an example, one of the enforcement actions I worked on with a lot of other states was the Ashley Madison breach. That was a dating site directed at people who were cheating on their spouses. Their motto was, life is short, have an affair. And then they had a data breach, and all of these people who were on this affair website suddenly had it become public that they were on this affair website. Had credit card data not been involved in that breach, I don't think there would have been an obligation to notify about the breach, even though obviously this would be something that people would want to know about. So the idea behind the data broker breach was a much broader definition of PII, but limited to data brokers. It was basically defined the same as our data breach law, except take out data collector and put in data broker, and take out PII and put in what we call brokered personal information, which is very broad. And what the law required was basically that in the registry every year, data brokers had to list the number of data breaches they had in the previous year. I know of instances where companies had data breaches that we knew about and still put zero in the list. This goes to another issue with the current law, which is penalties for filing false information. There actually aren't any in the current law.
But I think it makes sense that if a data broker has a breach, they should have to notify the people, just like anybody else. So that's the first kind of common-sense change here: requiring data brokers who have a data breach to notify. And the notification process is similar to the process we have for other entities. One thing that was removed from this was the opportunity to notify through substitute notice. Substitute notice is when a company has a breach and either the cost to notify by any method would be above $10,000, or they don't have people's addresses, which means they can notify public media and put something on their website and notify people that way. And I think it generally makes sense to have a substitute notice. However, if we're talking about data brokers, if XYZCo has a data breach and loses your data and then just puts out a notice that XYZCo had a breach, no one knows that XYZCo had their data in the first place. So that notice isn't really useful. So this version would take away the idea of a substitute notice. And if a broker comes in and says, well, wait, how would we know their addresses? We don't have that. I think the response would be, well, you're a data broker. You know how to get people's addresses. That's literally the line of business that you're in. And if someone said that this is too burdensome, I think the response would be, look, you went into the business of collecting all these people's information, you had an obligation to protect it, and you lost it. So who should be bearing the burden of that? Should it be the people who had nothing to do with you, whose information you lost to identity thieves and fraudsters or foreign governments, or should it be you? So that's the motivation behind that. That's one big change. Then there's a lot of language that basically carries that out; we can jump to page seven now, because the rest of it is all just the notification act. Okay. So, oh, let's get something here. No, I don't need it. Okay.
So then under annual registration, one new requirement that we want to put in here is that instead of telling people the method for opt-out if you offer an opt-out, this law would now require data brokers to provide a method for opt-out. We have heard, we have been told, that the best practice for a data broker is to provide people with a method for opt-out. We think that should simply be required across the board. And because of that, a lot of the language in the registration could come out, because that language was keyed to whether you offered an opt-out; now that it's a requirement, you just have to say what the method for opt-out is. Another change is taking out the actual knowledge requirement for having brokered personal information of minors. If a data broker is collecting information, it should be their obligation to know whether they have minors' information and to provide proper protections on that. So the next part, at the bottom of page 7, B1: we thought that if you're going to increase the obligations under this act, it probably makes sense to increase the penalties, because now there's a much stronger incentive not to register. If someone is not registered, it's much harder to find out that they've been violating some of these laws. So one change is changing the civil penalty from $50 per day, not to exceed $10,000 per year, to a flat $100 per day penalty per violation. Ryan, you may have touched on this when I wasn't in the room, but have we been able to actually apply a civil penalty to any data broker that we've identified that hadn't registered with the state? We have not been doing that. We have been suggesting that they register, and then they've been registering, where we found that was the case. So the ones that you've found, you notified them, and they've registered; none have refused? To my knowledge, yes. So I suspect that if you've found one, notified them, and then they still won't register, that's the point in time where you'd go after them. Right.
And that could even be included in the law. If you see under C, another problem that we have is data brokers omitting information, or putting in incorrect information. For example, we asked for very specific information on how to do an opt-out, and we said you shouldn't put just a link to a website, and a lot of companies didn't follow that anyway. So C says, at the top of page eight, a data broker that omits required information from its registration must file an amendment to include the omitted information within five business days of notification of the omission, and is liable for $1,000 per day for each day thereafter. So there's a right to cure: you didn't include this, you need to include it, otherwise there's a penalty. Then, for filing false information, we put in "materially incorrect information," so I'm not trying to go after people because they made a typo or anything like that. That carries a single civil penalty of $25,000, and then $1,000 per day if they fail to correct the information. So if they omitted something they should have included, they get a warning, and then $1,000 per day after they don't correct it. If they put in materially incorrect information, it's $25,000. And one of the things to really consider in terms of these penalties, and I think this may be part of the issue with the current penalty, is that the office has limited resources and has to pick and choose where to apply them. So if one matter would carry something like a $450 penalty, and someone else is doing something that could be $20 million in penalties and restitution, you kind of have to align the penalties with where you want the enforcement to be, and that's something I was thinking about when I put this in here. Not that that's ever the sole reason why anyone brings an action, but I think it is a factor that might influence it. Next is a new section called data broker additional duties. So, individual opt-out.
So this is a consumer or their agent. That's important, because some data brokers may say you have to notify us directly, which means that if I want my 95-year-old father opted out, I have to tell him to file the opt-out himself, which is probably not going to happen. A consumer or their agent may do any of the following: request that the broker stop collecting the consumer's data, delete all of the consumer's data, or stop selling the consumer's data. Data brokers have to have a simple procedure to submit such a request, must comply within 10 days, and must clearly and conspicuously describe the opt-out procedure on their website. Another reason to make sure agents are included in there is because there are market solutions to market problems. There are companies out there that will opt you out on your behalf. But if you can't assign this right to an agent, then they're not going to be able to opt you out of data brokers in Vermont. I think it's important. They request? Yes. So okay, I'd request you to do that, but what actually tells them that they have to do it? So that's A2: they must comply with the request within 10 days. Now, that's helpful, but to be perfectly honest, it's problematic, because there are hundreds of data brokers out there, and the notion that anyone is going to take the time to individually contact all of these data brokers and tell them to opt out is unlikely, unless someone maybe doesn't have a day job or really, really cares about privacy. And it also raises the question, who should the onus be on here? Should people have to spend hours or days of their lives going through all these hoops to tell someone to stop surveilling them, basically? So that's where the general opt-out comes in, and I think this would be a very strong protection if we enact it: a consumer or their agent may request that all data brokers registered with the State of Vermont honor an opt-out request by filing their request with the Secretary of State.
So, a one-stop-shop opt-out from all data brokers. And this is another reason why, if you're going to put that in, it's important to up the penalties for not registering, because the general opt-out only reaches brokers that are in the registry. Now, where this gets a little bit tricky is that what you'll very likely hear from data brokers is that if I'm John Smith at 100 Main Street and I contact a data broker and say, delete all your John Smith records, then they're going to be very concerned about deleting the right John Smith. There may be another John Smith who wants their records to be on that mailing list for whatever reason and doesn't want to be deleted. So then the question becomes how you provide enough information to make sure they're deleting the right person, and this creates this privacy paradox, where sometimes you have to give more information in order to have your information removed. So the idea here, and this is a little bit of a punt, is to ask the Secretary of State to develop a method to facilitate this general opt-out. The idea is that you would provide the personal information to the Secretary of State's office, but not to each of these hundreds of data brokers, and there would be a way to facilitate the opt-out between the Secretary of State's office and the data brokers. This is loosely based on the National Do Not Call Registry. You put your phone number in a list and they can't call that number; that's easy. The problem is you don't want a list of everyone's name and Social Security number which anyone can look at. So how to make that happen is going to be a little bit tricky, but I think it makes sense for the state to have that information and create a secure way that it can be communicated to the data brokers. Now, I put the Secretary of State in here because they're managing the registry, but there are other ways this could be done. The Department of Taxes has all the sensitive information about you that would be needed to do an opt-out.
You could have a square on a tax form that says, you know, by the way, opt me out of all data brokers. Now, I imagine that would have to go through a bunch of other committees and might overcomplicate things, and the Department of Taxes may have no interest in doing something like that. So, you know, that's just one idea. Another one might be the DMV. When you renew your license, there could be a checkbox, you know, that says opt me out of all of the data brokers. Those are just two areas where we interact with the state on a regular basis, and they have enough data to facilitate this without us having to provide more data. They already have our data in their files, and they know who we are; we have to identify ourselves to the DMV or the tax department. So that could be a different way to do it. Just to remind me, where are these data brokers registering now? Is it with the Secretary of State's office? Yeah, it's the same database as corporations. We'd have to figure out whether or not the fees that are being charged would actually help pay for this system, for the Secretary of State to stand this up. Yeah, yeah. And the idea here is, and I think this is the same way that the Do Not Call registry works, every 31 days the data brokers would have to check that list, because the issue with data brokers, and it's the same with Do Not Call, is that you might flush your list of all of these numbers and then collect them again. So you need to kind of keep going back and making sure that your list is current. The alternative is to require each broker to maintain something like a blacklist or a whitelist of people, but then they're still maintaining your information, even though they're not supposed to, just to make sure that they don't accidentally collect your information again.
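To make the mechanics concrete, the general opt-out described here, a central list that brokers must recheck every 31 days without the state publishing raw identities, could be sketched with keyed hashing. This is a minimal illustration only: the field names, the key handling, and the matching rule are all assumptions, not anything in the bill, and a real deployment would need a stronger scheme (a broker holding the key could test guessed identities against the list), closer to a private set intersection protocol.

```python
import hashlib
import hmac

# Hypothetical key held by the state registry; brokers see only hashed
# tokens, never the raw identifiers consumers filed with the state.
REGISTRY_KEY = b"example-registry-key"

def opt_out_token(name: str, dob: str) -> str:
    """Derive a keyed hash from illustrative identifying fields."""
    ident = f"{name.strip().lower()}|{dob}".encode()
    return hmac.new(REGISTRY_KEY, ident, hashlib.sha256).hexdigest()

# State side: a consumer (or their agent) files one opt-out request;
# only the token is stored, not the name and date of birth themselves.
opt_out_list = {opt_out_token("John Smith", "1950-01-02")}

# Broker side, run at least every 31 days: hash each record the same way
# and suppress exactly the records whose tokens appear on the state list,
# so a different John Smith stays on the mailing list.
def suppress(records: list[dict]) -> list[dict]:
    return [r for r in records
            if opt_out_token(r["name"], r["dob"]) not in opt_out_list]

records = [
    {"name": "John Smith", "dob": "1950-01-02"},  # opted out
    {"name": "John Smith", "dob": "1971-07-04"},  # a different John Smith
]
remaining = suppress(records)
```

Note how this addresses the privacy paradox raised above: the broker never receives a plaintext roster of opted-out Vermonters, yet can still distinguish the two John Smiths, and re-running the same check each cycle handles records the broker collects again later.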
So the exact technology of how to make that work is something that would need to be worked out, but I think it would be a very strong protection if you're able to implement it here in Vermont. This is similar to what we did a few years ago with insurance companies. There's a death master file out there, I think the federal government, Social Security, puts it out, and insurers scour it to see who's passed away. We found that with life insurance they were doing that and paying claims right away, but with annuities they weren't paying out until somebody requested it. So we made some changes there, and they have to scour that file every so often as well. So I think that's the similarity in the thought here. I mean, this makes a lot of sense, but without knowing a lot about the data industry, would this general opt-out, I mean, if this caught on among states, wouldn't this more than anything really drastically change the entire data industry? I think it would have a huge impact, yes. I think that this could be a game changer in terms of letting people opt out. Another way that this could potentially end up evolving is, I believe there are market solutions there too. With regard to the Do Not Call registry, there are companies that serve mailing list companies and say, we'll make sure that you don't have anybody on the Do Not Call list, so you don't have to check every 31 days; we'll do it for you. So there could be companies that do the same thing here, basically a data broker that helps data brokers honor opt-outs. What's that? New businesses. Yeah, I know. Economic development. Very important. Next, we have the credentialing requirement, and this credentialing requirement, again, we've been told, is a best practice. Credentialing means that if you're going to sell information to someone, you have to make sure that they are who they say they are.
There have been instances of data brokers selling or licensing information to fraudsters. And the big companies say that they have lots of systems in place to make sure that doesn't happen. Under the Fair Credit Reporting Act, companies that provide credit reports have to credential whoever they give the information to. So this incorporates that language as a credentialing requirement for data brokers as well. Now, and this comes out of the hearings and what we heard from a lot of folks, the individual opt-out, general opt-out, and credentialing would not apply to brokered information that's regulated as a consumer report pursuant to the Fair Credit Reporting Act. The Fair Credit Reporting Act already has an opt-out, a right to correct, a credit freeze, and credentialing. So basically, credit reporting agencies, with regard to their credit reports, would not have to worry about this stuff. Now, if they're also data brokers, which I believe a number of them are, they would have to comply with regard to that data. But this would not be on top of the Fair Credit Reporting Act requirements. And then the last section is what I mentioned before, the suggestion that we study the public information exemption. That is a walkthrough of the bill, and I'm happy to take any questions. Any other questions for Ryan? This gives especially new committee members an idea of all the other things we do in commerce. Besides economic development work, there's banking and insurance. But this is, I mean, a really interesting element. Consumer protection has been our strong point. We've kind of taken a break over the last couple of years because of COVID, but I think next year it's time for us to get back into that realm again to make sure that Vermonters are protected.
So we certainly will be getting something drafted, presenting it, and having all the people who are interested in this bill come in to discuss it, and hopefully come out at a good place and maybe get this through next year. So without any further questions, Ryan, thank you very much. Thank you for your service. Thank you. We appreciate it. You've been in to the committee over all these years, and really, your knowledge of this subject has really helped the state. I appreciate that. And I'll say that really my biggest regret about leaving the office and going to the FTC is that I will not be able to help this committee and your colleagues on the Senate side work on this next year. And I hope that you end up putting something through that really creates those critical protections for Vermonters, and we can again be a leader in the country in terms of privacy protection. So thank you all. Thank you very much. We wish you all the best. Best wishes. Thank you. Charity, thank you for coming in as well, and we'll stay in touch. I don't know if this is the exact language that you want to put in for next year, but we can talk about that. Maybe get a jump on drafting. Yes. Yeah, that would be great. Legislative Counsel didn't draft this, you know; this was all Ryan putting it together. So that would be wonderful. Okay. Good. Thank you. Thank you. Thanks again, Ryan. Good luck.