So it's 5:35, 5:38, and we'll convene the third and final privacy meeting. What's the official title of these meetings? Sure. Privacy? Privacy hearings. Privacy hearings. And do you want to... Thank you, Senator Clarkson. Why don't we ask those on the phone just to introduce yourselves? Yeah, this is Jerry Moritz. I'm Richard Moritz. We're with the lobbying firm Moritz and the MAC. And we're very interested to hear how the inspector is doing tonight. How are you, Jerry? I'm good. He's left the conference. Anybody else on the phone? Okay. Hi, this is Tim. How are you doing, Tim? Excellent. And I'm sorry I'm not up there. It's okay. I'm afraid I wouldn't get out tomorrow. You made a good call. Jerry Moritz, have you rejoined the conference? I got cut off. Sorry. Okay. Can we get you napkins? Yes. Why don't we just go around really quickly, because this isn't that big of a room, and introduce ourselves: who you are and where you're from. T.J. Donovan, Attorney General. I'm Charity Clark, the Attorney General's Chief of Staff. Ryan Kriger, Assistant Attorney General in the Public Protection Division. Chris Curtis, Chief of the Public Protection Division. Clay Purvis, Department of Public Service. Lauren Yandell, Small Business Advocate of Vermont, Attorney General's Office. A former IT person for federal and county officials, now with the Chief of Staff of the State of Vermont. Zach Tomanelli, VPIRG. Thomas Weiss, resident of Montpelier. Andrew Kamen, Counsel, State Privacy and Security Coalition. Kate Davey, nobody in particular. Well, you are Kate Davey. I am Kate Davey. Allison Clarkson. Brian Monk, Waker Johnson. Chris Dore, from Edelson, P.C. David Bevelle, also from Edelson, P.C. And R. Sharrig, also from Edelson. Ted Fisher, from the Agency of Education. And McCrassan, McCrassan Group. Governmental affairs. Dylan's right here. Chris Rice, MMR. Tracy Canino, concerned citizen.
Jen Kennedy, Kennedy Associates. And I'm here for C.D.I.A. and... Great. Well, thanks to everybody for coming out tonight. As I said, this is the third and final privacy hearing. At the conclusion of this, we will issue a joint report about these hearings and some of the questions that we were charged with investigating, researching, and attempting to answer. And I think there's consensus, at least from the AG's office, as to two issues. Number one, and I believe Senator Sorokin has already put in a request to draft this bill, is student privacy. It's something that we certainly support. The second issue would be a chief privacy officer for the State of Vermont. We view that position as an internal position, looking at the state's policies and procedures, and certainly trying to answer the question about what the state does with its own data and whether there should be some regulations as to whether we sell it, or should sell it, and to whom. The third question was: should we seek to modify or amend the data broker bill, now law, from last session? I think the consensus was no. I think everybody's comfortable with that and wants to see how it develops. And the fourth question was whether Vermont should adopt the FCC standards. I don't think we have an answer to that yet, although I think there is growing consensus that we should probably wait for California; my understanding is that they're in the process of developing those rules for 2020. Does that sound right? And that has really been the last two meetings. Those were the specific questions, I believe, that we were charged by the legislature to address, and we'll produce a report sometime in December answering them. So why don't I just open it up and ask for questions, concerns, other ideas that should be given consideration, or things that we haven't yet addressed or should address? Why don't I do that first?
I wonder, General, just by way of context for folks who may be watching on cable access, or who may be joining us for the first time as local citizens and haven't participated in any of the prior hearings. One of the reasons I think we've found some consensus on some of these ideas is because we've spent two public fora already, several hours of discussion with stakeholders and other concerned citizens. And the issues of student privacy are issues that other states have dealt with already; there are some states that have those laws on the books. So as Vermont looks at how we can be in the vanguard of protecting our citizens, and particularly students, that was an area where pretty quickly it seemed like there was an emerging consensus: it's a model that other states have already adopted, that industry has already adapted to, and that provides a certain threshold of protection for those students. So that just appeared to be a common-sense approach, and we felt pretty comfortable that a recommendation in that area makes sense for Vermont. If you're just coming at this for the first time, that's something we thought was a good step in the right direction. And on the chief privacy officer: I just think we've had a general sense that there's a discussion around what's happening with protecting the privacy and data of consumers generally. And if that is a question we take seriously in the commercial context, should there not be a person who is charged with looking at that issue in the context of citizen data? State government, of course, has a huge trove of data and knows a lot about its citizens, who have to interact with the government every day in a myriad of different ways. So, a chief privacy officer who's really tasked with looking at that: again, I think there's a small handful of other states that have created that position, maybe half a dozen that have already acted in that area.
But that just seems to be a very practical, common-sense thing to do. Obviously it would require an allocation of resources by the legislature, because it would be a new position in state government. It might be a position that would work closely with existing entities but would have a very specific charge. So it would be working with the Secretary of State's office, for example, and with ADS, for example. There's a lot of ground to cover there, and it could look a lot of different ways depending on how the legislature might construct it. But we think it's an interesting idea, and it's one we had a lot of discussion about in prior public hearings. It's come up, and the legislature has already indicated some interest in it in the past. So we just thought revisiting that and flagging it as an area that's ripe for discussion makes sense for this report. So that's a little bit of context on those issues. To give further context, actually: we've thrown out the term student privacy a number of times, and what do we mean by student privacy? What does that look like? There's already a federal law, FERPA, that basically controls, or puts some restrictions on, what schools can do with their student data. The proposals that have been put forth, under the law which started in California called SOPIPA, the Student Online Personal Information Protection Act, are not so much targeted at the schools as at the education technology industry, or ed tech. And I don't have any kids in school, but I gather there's a lot of technology in school: cloud-based technology, companies that provide apps. Google has a product. And students, generally speaking, don't really have a choice as to whether or not to use these apps. The teacher or the administration decides on this app or website or whatever is being used, and then the students use it. There may be consent or not; I'm not sure.
And so what SOPIPA does, basically (and the devil is in the details, and that might be something that gets argued about) is put restrictions on what an ed tech company can do with the data that it collects. In the absence of any restrictions, presumably, one of these companies can collect student data, package it up, create a profile, sell it, use it for targeted advertising, use it to rank a student or grade them, and basically create a permanent record of that student which will travel with them forever. That would be in the absence of any regulation. And so this might say: if you're collecting a profile on a student, you can only use it for the educational purpose that you were implemented to serve, and you can't sell it on, or you can't use it for targeted advertising. There are about 20 states that already have laws like this in place, and they're all a little bit different from each other. So that is what we're talking about when we talk about student privacy in this context: the education technology industry and what it is doing with student data. So let me open it up to the audience for questions, concerns, thoughts, suggestions. Yeah, I mean, we want to hear from you about what's important to you as we have this discussion and move things forward. So, is that coming from VPIRG? We've already submitted written comments on a lot of our interests here, and I think several of our members have emailed the Attorney General's office, mostly today, about their thoughts on these issues. I want to just touch briefly on three. So, in terms of the FCC rules you mentioned, the FCC broadband privacy rules, and waiting for California: I don't want to reopen that; we discussed the California law a lot over two of these hearings. I think people on both sides of this, consumer or business side, all agree that a lot of the California law needs to be worked out.
However, the broadband privacy rules, the FCC rules, while not specifically included in that California law, are, I would argue, referenced as part of some of the broader stuff there. That's a very specific item: the idea of whether or not internet service providers specifically need to get opt-in permission from their customers in order to monetize their data. So we would argue, and I'm not saying that there's consensus on this, but I just want to make sure it's out there, that we believe Vermont does not need to wait for California on that sort of thing. We would be pushing and arguing for Vermont to move forward with enacting the FCC broadband privacy rules. I just want to make that distinction: that's not all of California, which is much more far-reaching. And then, just in terms of getting things on the table, this was brought up last time, but it's worth putting out there again: I don't think it's a bad idea to revisit the state's Data Breach Notification Act. We don't have a specific policy proposal, but as we know, data breach notification right now refers specifically to personally identifiable information. That's a narrow subset: name plus Social Security number, name plus credit card, that sort of thing. And it would be worth taking a look and deciding: does that really represent how data is contained and managed today? For instance, if somebody were to lose an email address and a credit card number, I don't believe, and you can correct me if I'm wrong, that that combination is triggering for data breach notification, right? So the question is, with a lot of these databases now separating out different pieces of information, names being kept here but other sensitive information being kept there, do we need to revisit what that looks like? So that's the second item.
And then this one is really not fully baked, but since you said you're interested in getting things out on the table: similar to what we were talking about with student privacy, the idea of medical data or health data. We have HIPAA, obviously, but one of the issues with the sectoral approach to data in this country is that it does leave gaps and holes, and HIPAA really deals with actors and not the data itself. So there could be non-medical institutions, or whatever, holding on to medical data, which is some of the most sensitive and important data out there. I'm admitting that this is half-baked. But whether or not some of the smart people in the Attorney General's office could take a look at what could be done about medical data, especially non-HIPAA-covered entities holding medical data, and what are some of the things we could be doing there to better protect it. So those are just three things to undertake. Thanks, Zach. Chris. Chris Dore, from Edelson, P.C. To expand on that last point, just raising the question of what is medical data and what can be done to protect it: I may have raised this a little bit before, but there's the question of things like biometrics and genetics. HIPAA is a terribly outdated statute at this point and doesn't have the effectiveness that it should, and those are two big areas that are not well covered. So I would just raise that as a question. It's a good point. I'll include that type of data, which is being held, like you're saying, by non-traditional actors in the medical space: genetic testing companies like 23andMe, or biometrics held by numerous consumer companies, social media companies, and the like. Tom. I've been concerned about, I guess, the fractionalization of categories, somewhat what he mentioned. And I think we should be concerned about the data and not the category. If it's medical data, it should apply to students and to medical patients alike.
If it's financial data, it should apply to everybody equally. First, determine what data merits protection, and then determine what level of protection various groupings of data should have. And once we have that, we say: okay, for students, it's these categories, and this is how they get protected. And if it's something that students and medical patients have in common, it gets protected the same way. And it gets protected the same way whether it's in the hands of an internet service provider, in the hands of somebody we call a data broker, or in the hands of somebody we all have a direct relationship with as a consumer. I think those categories are not irrelevant, but they're not the categories we ought to be focusing on. First we ought to be focusing on the data that needs protecting and how it gets protected, and then dividing it up: if you're in one of these various categories, these are the data elements that get protected, and everybody handles them the same way. Tom, I tend to agree with you. It's the how, which, going back to Zach's point, I don't think is a question of whether you wait or not. It's a question of: can you pass a law that will withstand legal challenge? And I go back to my fundamental question, the one I started the first hearing with: in a rapidly changing world, can the law keep up with technology? Are we going to be passing laws every year to try to keep up? Because my sense is we're going to be doing these hearings on an annual basis. And so I go back to the how, also looking at the federal landscape to see if there will be action there or not, but certainly to lead on this, and to lead on it in a way that is effective, in that we pass legislation that withstands challenge. And we can all agree that this is a landscape that's changing rapidly. I do think it's important to look at the national landscape, the federal landscape, and to look at what states are doing as well.
I mean, perhaps somebody can answer this question. On the net neutrality stuff, I thought that was a reasonable act for a state to take; obviously we're now in litigation, which is fine. But do we want to be in a constant state of litigation? Maybe we do. So I think there are a lot of factors in this, and it goes to Tom's point: it's really how do we do this most effectively in a way that protects Vermonters but also, I think, sends a message that Vermont is part of the global economy. Vermont is part of the digital economy, and we want to be part of that economy, but that is not to abdicate our responsibility to our community members to protect their privacy. And so I struggle with the how, looking at really two different landscapes. I think, second, to the point there about the difference between the reasoning around student data protection and the reasoning around citizen data protection, because we're obligated to interact with the state, and that's also different. And I would argue that the how is the key, and it's what we don't have to answer yet: the work that we can do, the resources that we apply to solving that problem in one instance on a technical basis, will provide us the same tools to protect data in any other context. And I think the biggest key is not so much defining and trying to regulate how we hold data, but focusing on how we interact with data, and pushing the data down to the user. What that reduces is our overall risk exposure. So instead of having these data silos, where, yes, there may be a high-value target at times, but that can be known and addressed, the cost of hacking 145 million individual people one at a time is a high cost of doing business, and that really mitigates the issue of data security, which is the other component of this: security. Good point. Well, that's an interesting point, right? We're talking about privacy writ large, but a lot of what we're talking about is security.
But privacy and security are, in some ways, different things, right? So that's a really important distinction to make, and we need to be clear about what exactly it is that we're trying to accomplish through these efforts. But I do know security is constantly on people's minds, because they read in the news or hear about a security breach and they wonder: am I exposed somehow? Somebody could steal my identity; what do I do about that? So I think that's a great point, and we need to be able to clearly distinguish between the two, figure out where they overlap, and then be conscious about what we're trying to accomplish. What I would present as a suggestion is that if we can establish the role of a privacy individual, a director or chief privacy officer, their objective may be to identify areas of resource expenditure in different departments and different areas of the state that can be collaborated on, specifically toward security, data management, and developing those technologies to interface. And if we can do that in open technologies, open-source technology, we can also find a community of other contributors from around the world who are building similar things, and hopefully leverage those resources for our long term. Tanya Marshall, Chief Records Officer for the State of Vermont. Just for full disclosure, my background is actually in information sciences; that's my degree. So a lot of the things here talk about the how.
That's actually one of the things that newer legislation, which just passed this year, is getting at. You know, we talk about records and public records management, and that's the only scope I will talk about, because I can't speak outside of it. We talked at the last privacy hearing about how the state is pretty immature in terms of governing its own information, and the struggle, even for myself in this position (I've been with the state for 15 years) of watching people's minds go beyond a box of paper records, and not really understanding. So the how part of it is actually structured currently in the legislation passed this past year about information governance, which is really where you get all these different factors. Information governance really runs on a set of principles. One is accountability: who's responsible? The second is transparency: how well is it documented? Not transparency in the open-government sense, but really documentation, so everyone understands what they have. Then integrity, the accuracy of information. Although we're talking about privacy, obviously when we're collecting information by any means, for the state or for individuals, it's to provide a service, so there are integrity aspects to it. Protection is a huge principle within that, so that's the next one. We also have compliance: any time there is a law, how do you actually comply with it, and how do you have the relationship to advise or recommend when the legislation isn't quite accurately aligning with the other parts? Availability: obviously, for any entity, you want to have your information when you need it to do your job as required. And then retention and disposition. The Vermont State Archives and Records Administration is oftentimes classified by people as just retention and disposition: when can I get rid of something? When can I destroy it? That is something that we do.
But we do all the other factors as well. And so as you look at the chief privacy officer, and I know we haven't submitted written comments yet, I really see the collaboration aspect, because the how is something I've spent 15 years on in the State of Vermont. It's kind of a social science experiment, in some ways, with me watching how these things come about; but really, the how is not what we've always done very well. It doesn't mean that we have breaches or things happening, but we definitely have a disconnect when it comes to: here's the legislation, here are the individuals working, here's the kind of information that's written and recorded. How do you actually manage that in a responsible way, in compliance with the laws that are required by the federal and state governments? It'll help my job, but I also don't want a silo, because what will happen is that something gets passed in a silo so focused on privacy that the how never happens. And that's really what the information governance framework is about. I think that's a great point, and, you know, whatever we do, we want it to be effective. Yeah. If we don't build the framework, the foundation, first... I mean, I think with this, sometimes we get caught up in, I don't want to call it a headline, but we forget the foundational work in state government: if we pass something, you had better be able to enforce it. Well, if we don't have the resources to enforce it, then what are we doing? And if we're coming back every year because there's a new issue and we haven't done the foundational work with the state, I think that's a really fair point, and that's what I keep coming back to on this. Look, our job is to protect people, and how best to do that, I think, in a world that's changing rapidly, is going to require state government to look internally first, as opposed to just responding to each crisis.
I think we really need to reflect. I appreciate your remarks. And I do believe, just to add, we have a chief data officer as well, and we have a weekly meeting. Those are the things we're trying to combine, because really, data is just a subset of public records by definition. So when it comes to protection there's a technology focus, but again, it's about those foundations, those pillars. I always describe it, when we do trainings, as kind of like building a porch: if you have one pillar that's short, you don't actually have a full foundation. So you have these eight pillars, the state's principles for information governance, and when they're at different levels, there are ways to rate them. So even when we look at why a breach might have happened, if it happened in state government, the first thing we do is analyze where that agency's level is in the different areas of information governance, and oftentimes we can drill down to what the problem was and what caused it. Versus a reactive response, which might be legislation: what was the real cause of the problem? Distilling that down to understand it and fix it. Are there other members of the general public that... oh, John, you got it. This is probably a detail, and I admit to not recalling whether it came up in the legislative committees, House Commerce or Economic Development in the Senate, but I know they talked about looking at a chief privacy officer. Is there any decision yet on your part of where this would reside in state government? For those on the phone, if you couldn't quite hear it, the question was: is there any indication of where the chief privacy officer might reside within state government, if the state should elect to go down that road? I'm trying to get a really good argument for the Secretary of State's office. Yeah.
You know, I'm not advocating in either direction; mine is more collaboration. Yeah. It does require a certain effort in that, together. I don't know, is the quick answer. I really don't. I mean, it should be wherever it's going to be most effective. I just wonder: I'm seeing a lot of familiar faces in the audience, folks who have participated actively throughout this process, but I'm also seeing some new faces, members of the general public. So I just thought, you know, if there's anything on your mind, you don't have to be an expert in any of this stuff. I'm curious, within public protection: what are you thinking about these issues? What concerns you? Where do you think the opportunities are? Is there anything in particular that made you want to come out to this meeting tonight? I'd love to hear from new folks that we haven't had an opportunity to hear from before about what's on your mind and what you think is important in terms of protecting the privacy of Vermonters. I wanted to direct your attention to a very interesting book that I came across called The Closing of the Net. It's by Monica Horten. I have a copy I'm going to leave with you, and I'm also going to leave you copies of the first chapter, which is called Power and the Internet. Now, I haven't watched the first two hearings, but I have been involved in IT for over 25 years, and I've been very concerned about the lack of understanding in the public of what this whole data mining, data profiling is really about. It's not just that they're going to put a different advertisement in front of you than the next person. It's really, really filtering people's access to reality and the information, the knowledge, upon which we all operate as a civic democracy. And I just happened to stumble on this book when I was in London. It comes from a person who's from London; Monica Horten is from the London School of Economics. And she makes a lot of points.
And in this first chapter, you're going to find a lot of it. I mean, from her perspective, data is the new oil in the world economy; that's really one of her important points. And the internet service providers and the applications are the ones that actually have the power to shape access to knowledge, which is the basis of all of our civic decision making. And people don't realize how I can do a Google search and get certain results back, and you can do the same one and get something completely different. And both of us feel like we've really figured out the world's reality, what the information is, and the best deal. Or I can go try to purchase an airline ticket, and because they know I paid $835 the last time I went to Paris, they're not going to get too far away from charging me that same price. Oh, maybe you get $50 off. And another person is going to be getting a different quote. There are just so many things about it. And I totally get your concern about how every year we're going to be doing this again and again and again. Because I really feel like we are very ignorant as a population. We don't have any way to educate people about this, and we saw, in the wake of the election, that people get a Facebook post and they immediately pass it on. I asked a couple of high school kids: when you get a Facebook post, do you check the sources before you send it along? No. So education is going to be part of this process. I'm really feeling your pain of how do we make legislation. It's not just protecting data in the silos; it's educating people, and also, I think, dealing with the commercial collection and use of it, which is part of what you're talking about that has to be addressed: that it can't be consolidated and profiled.
To the point where everybody's world view is being manipulated: how can we ever have a civic society when everybody's getting a different view of reality? So I'm going to ask for an approach: give me your thoughts on the how part. I've been struggling with the how, because I think a lot of it is education. There's another book (I love to proselytize the books) called Program or Be Programmed. It's a Douglas Rushkoff book, and he really goes into the ten things you need to think about. You don't need to be a programmer, but you do need to understand the fact that the programmers have completely created your world view. They made it so you only have two choices for gender when you go through one of their apps; that's their decision, and you don't have any choice. So, Program or Be Programmed: education, I think, is going to be a lot of it. Legally, I think the thing is about people being able to opt in: you have to have the default position that the data belongs to you. And there's a lot going on in Europe; Monica Horten, who wrote The Closing of the Net, really focuses a lot on that, because they're ahead of the game with their whole General Data Protection Regulation. So we have a lot to learn from that. I think we have to somehow figure out how to define Vermont citizens' data as owned by them, and then make mechanisms for the business people who really have legitimate needs to use it, because people do appreciate getting a recommendation of a book that just came out that happens to be targeted at the stuff they like.
So I feel like the data is owned by the user, the person, and they need a way to be able to opt in to things that are of interest to them, and they need to be able to correct inaccuracies. But it's very tricky, so I will continue to try to think about it. But you will find this chapter, Power and the Internet, incredible. And it goes beyond there: it goes not only to commercial use but actually to government use as well, governments filtering data. So there's a lot of symbiosis between commercial use of the data and governments who want to be able to filter it for more nefarious purposes, in other countries right now besides us. Great, thank you, that was great. You should go out and talk to everybody in the state, because you make a lot of sense. We have to find a way to educate people, and particularly young people, in school; principal friends of mine are educators, and those are the people that need to be helped. I think this issue is ubiquitous, but it's overwhelming to people, right? It's, okay, to your point: I have to do this if I want to get a flight. And I don't think people know their rights. I agree with you about how do we raise the awareness, and the education part is where the real change comes from. I think that requires us, back to the foundational part, to actually become truly subject matter experts on this in a rapidly changing world, and to create change and make progress where we can, not always getting bogged down on the big stuff. Because it's empowering people, at the end of the day, that's going to make the change here. Well, you have to acknowledge the fact that there are solid business reasons for people to have the data and use it. So you can't put us at opposite ends and say it's either privacy, I want to be private, or I'm the business person, I want it all because I don't want to have to try to get approval from you each time. So this is where the challenge is. You have to recognize
we have people that are incredible entrepreneurs who want to do cool things, and people want free stuff, so they're getting free software and apps and everything. I don't want to shut down the opportunity for people to actually get that and have great ideas for business. I think we just need to make sure it's private for the people, and then you have to think it through: how can you make it so there are opportunities, legitimate ways to access that information, for the people who are doing wonderful things with software and data?

Well said. I'm going to go back to: let's protect people's privacy, but let's also have Vermont be able to compete in a digital economy. That's a big part of this too.

Well, that's what I'm saying, you don't want to shut it down.

Back to your earlier comment: the Secretary of State's office used to run a citizen education program, and there's an effort to bring it back. It had been more focused on elections, but given the last election round, it now includes digital literacy, which I see as essential for people's understanding. I have three high schoolers, so I see this all the time, how finding something is different from how we did research. Digital literacy is understanding the validity of what you're reading, and also understanding how the technology can shape what the consumer sees. So I see a lot of value in that, and I would strongly recommend a recommendation, under any of the three categories or subcategories, about this part of being part of a larger society. One, we have to be better at thinking about what we're reading; that has always been true, but it can be helped by some specific education, especially at a younger age. So maybe as part of the student privacy component: it's not just about the technology students are using, because I see now that a lot of teachers and schools are uninformed about this, where there are all sorts of technologies that are collecting student data as students use them, but the individuals making the purchasing decisions don't know or understand the context of what they're purchasing, and the students don't know either. So there's an opportunity to delve into specific education that includes digital literacy and consumer protection at the same time, and to do that at an earlier level. Maybe that's a component for the student aspect, going earlier.

So there actually is an infrastructure in place in this state that can provide information literacy for every Vermont public school student. The problem is it's kind of under siege right now: school library positions and school library assistant positions are being cut across the state, and those are the folks who know how to teach kids the difference between fake news and real news. They know how to teach kids to do a reverse image search, how to identify dicey websites. They know those things, they're up to date on those things, but they can't impart that information to the students if they're horribly understaffed. But the structure is there: schools can find certified librarians to hire to play the teacher-librarian role the way they know they can. It's already in place.
Following up on Allison's point, I want to push back on this idea that a privacy law would have a negative impact on the state's competitiveness in the digital age. At their core, I guess, privacy laws are just transparency laws. They're not regulating the technologies in really any way; they're just saying these types of data and information are considered important, and you can't take them from people without telling them. And when companies are transparent with people, consumers have the ability to make choices about which businesses they want to do business with. You know, if you hear from small and medium-sized companies that are not Google, that are not Facebook, they are dying for transparency, because they think that's the only way they're going to be able to compete with those folks. If everybody had to put their data practices on the line for consumers to see, a lot of these companies feel that consumers would move away from those Silicon Valley companies and toward the smaller and medium-sized technologies.

I appreciate it. Do we have anyone here representing any local Vermont ISPs? Your hand, sort of?

I want to back that up: not only small tech companies but residents. If we can put the right technology in place, we can flip the way social media monetizes data so that the individual is actually getting compensated for what is truly theirs. And it doesn't matter what we agree to in a service agreement, because nobody reads it. We're in a trust relationship: we're assuming that a certain amount of responsibility is being taken with the data and that we're going to be treated a certain way. It doesn't really matter what the subtext is; it's our intention that the law really needs to be defined on. We passed a law this year in Vermont, Act 205, that creates a fiduciary responsibility around a company registering itself as a personal data protection company. That got changed from the word "trust" at the last minute, which is a key distinction, but what it does is keep it in the Secretary of State's jurisdiction and allow us to work with companies to define a fiduciary responsibility around their holding of data. I believe that is a door-opener to push the market to push that data back into the user's hands. Once you put the data in the user's hands, and you have established protocols and paywalls so that any participant, whether it's a big tech giant or a small startup tech company, has the same access to that data through a market interface, now not only can we attract small tech companies to Vermont, but we can attract more residents, because we've got the laws that get them paid for their social media usage, and we can meet the state's goals of bringing two or three thousand more people here. I would propose that the state look into opportunities to create a public-private type of partnership, to enlist companies to exercise the law we've passed, and to make sure the state is involved in a way that gets as much open-source technology and protocol knowledge out of it as possible, so that we can implement those tools for our state services while business runs with those tools and creates a better ecosystem.

What I like about these meetings is the follow-up, so we've got to make sure we have everybody's information. I do have a consulting firm based in White River Junction, by the way. White River Junction is doing great, looks beautiful, awesome, it's a nice little town. Really? Were you at our meeting?

Just to back that up, I feel like this issue of monetizing private data is something that every Vermonter I know would get behind.
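The "market interface" idea above, where any participant, giant or startup, reaches a user's data only through a consent grant the user controls, and each access is metered so the user is compensated, can be sketched in a few lines. This is purely illustrative: every class, method, and price here is a hypothetical assumption, not an existing system or API.

```python
import time
import uuid

class PersonalDataVault:
    """Hypothetical sketch: user-owned data behind a consent-gated,
    metered interface that treats every requester the same."""

    def __init__(self, owner, data):
        self.owner = owner
        self.data = data      # the user's own records
        self.grants = {}      # token -> (requester, allowed fields, price)
        self.ledger = []      # compensation owed to the owner

    def grant(self, requester, fields, price_per_access):
        # Owner opts in: issue a revocable token scoped to specific fields.
        token = str(uuid.uuid4())
        self.grants[token] = (requester, set(fields), price_per_access)
        return token

    def revoke(self, token):
        # Opt-out is as simple as deleting the grant.
        self.grants.pop(token, None)

    def access(self, token):
        # Any company with a valid grant gets the same interface;
        # each access is logged so the owner can be paid.
        if token not in self.grants:
            raise PermissionError("no consent on file")
        requester, fields, price = self.grants[token]
        self.ledger.append((time.time(), requester, price))
        return {k: v for k, v in self.data.items() if k in fields}

    def owed(self):
        # Total compensation accrued to the data's owner.
        return sum(price for _, _, price in self.ledger)
```

For example, a user could grant a small company access to their ZIP code at a nickel per lookup, and revoke it later; the same call path would serve a tech giant, which is the equal-access point being made above.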
They're not going to complain about spending money on lawsuits or privacy officers if that's the sort of thing it's taking care of. And I think that my friends who would by choice spend a snow day tomorrow deer hunting will feel just as strongly about that as my friends who are going to stay inside with a hipster podcast and a craft brew. I think everybody will agree that's what they want their government to be doing for them.

Did that make anyone else want a craft brew? Other thoughts, suggestions, comments, ideas?

This goes back to a point one of the gentlemen on the side here made about 30 minutes ago, about treating data the same, which got me thinking about student privacy. Student privacy is very important, and I certainly encourage that work, but it raises the question of whether student privacy is only a subset of a much larger question about children's privacy, and how best to serve that. Is there a sufficient framework in place to cover kids no matter where they are, whether they're in school or out of school, using school products or anything of that nature? Shouldn't there be some consistency between how we think of data when it's at school and when kids are just on their own? And there's COPPA, which I was talking about before: COPPA is an outdated statute at the federal level; it only covers a subset of kids, stops at 13, and it really has not kept pace with the way that kids interact with the world around them. So there may be opportunities at the state level to expand that reach, to cover both student privacy and kids' privacy in the broader context.

Well, this is something that actually came up in the context of the data broker working group, and one of the original recommendations of that working group was to take a hard look at children's privacy in the context of that third-party sale of data, and whether or not there was room to have a sort of Vermont-style version of essentially the same COPPA framework, but for under 18, so you'd be capturing the 13- to 17-year-olds who were not contemplated by the federal law. I don't know if there are people in the room who were part of that working group process, but I'm curious. I think at the time there were concerns about, well, wouldn't there be preemption questions, have other states done this? It seems to me that if you're covering a group that's not contemplated by COPPA at all, I was never convinced that preemption is really an issue, because this is a population that wasn't covered by the federal law. So I am curious to get your take on that question, and whether or not there are templates out there for activity in that area.

To my knowledge, I think there are some. I think our home state of Illinois has one, though it's rarely, if ever, been taken off the shelf. I believe it covers up to 16, so 14- to 16-year-olds. But it is also an older law, and there is a need in general to revisit these from time to time; you know the question, every year you have to refresh this. I think when a lot of these laws were written in the first instance, they were not envisioning anything that was coming, and we now have a better handle on the ecosystem we're in, so we could write better, more future-proof protection for people. So I think there are opportunities.
It's a slightly different topic, but we were talking about data security earlier. One big aspect of this whole conversation is that we all entrust businesses with our data on a daily basis, and we assume that those businesses are taking reasonable steps to protect that data. We know that a lot of these businesses are not, because there are constant data breaches; we hear about them every day. And it's not just the collection of our data, but also Internet of Things products. You can go online and buy a $40 security camera or nanny camera that's cloud-based, put it in your home, and then view your kid on your iPhone, but that nanny camera is hackable, and someone else can be watching your kid through that same camera. That's not a theoretical harm; it's happened, multiple times. We know that cameras, speakers, all these surveillance tools we're putting in our homes, heck, refrigerators, televisions, are hacked.

The Attorney General's office has spent a lot of effort enforcing data security under our Consumer Protection Act. We have interpreted the Consumer Protection Act to say you have to have reasonable data security, as has pretty much every other attorney general's office and the Federal Trade Commission. But there are very, very few actual legal decisions: there are hundreds of settlements that bear this theory out, but very few legal decisions that actually uphold the legal theory all the enforcers rely on, and none in Vermont, none in the Second Circuit; we all rely on a Third Circuit decision. So some states have implemented laws that, rather than just relying on "it's an unfair act" or "it's a deceptive act," say you have to have reasonable data security, and if you don't, that's an unfair act. There are probably about a dozen states that have done that, so that's something we could consider doing. I would argue it wouldn't actually change the obligations of businesses; it would just clarify them, and not just for businesses that hold on to our data, but businesses that produce products that could hold on to data, that produce apps that hold on to data, or that provide services to other businesses where it's assumed they will hold on to data. In the medical area, under HIPAA, there's already a security rule that says if you're working with a company that has HIPAA data, you have to have a certain level of security too, but that only applies within that industry. So there's a question as to whether it would be enough to just say everyone has to have reasonable data security, and that alone would raise the bar, or to go one step further and say: and here's what we mean by reasonable data security. I have heard a lot of arguments from the business community that they would like that, that they would like to hear what exactly we mean. I don't know what that would look like if we actually tried to do it, but that's something else to consider that might be helpful in this area.

Yeah, I mean, I think it's a fair point, because in order to have a landscape that treats everybody fairly and equally, to the extent that you can provide certainty to the business community, which I know is generally desirable, they know what the rules of the road are. The Attorney General is always talking about building a culture of compliance in Vermont, so we do a lot of public education and awareness, reaching out to communities and trying to find out where the trouble spots are and what we can do about them. So to the extent that we could remove some uncertainty by saying, here are the bright lines, this is what it means, maybe that would be a help to everybody. Certainly worth having a conversation.

Yes, sir. I'm not sure where this will go, but there's a conversation around right to repair, talking about hardware standards, and that's a key component to all this. Where that intersects is that, you know, the
hardware manufacturers basically write the rules if we're not careful about maintaining certain guidelines on how those things are put together. And, like you said, things are constantly coming to light that are hacked and hackable. There are two things: one, we can try to prevent hacking, and there's a lot to analyze by breaking down the silos; but there's also just awareness, being able to design and define systems that make sure that, at least if something does get hacked, you know that it's been hacked. You know when the data has been corrupted, you know when the microphone has been accessed, and there's no questioning or falsifying that data record: you can see, okay, that camera was accessed three times when I accessed it, and verify that there wasn't a breach. All those things, I think, have to come together, and maybe that's again where there's a place for that chief privacy officer position, or simply a recognition of where collaboration has to happen between these different initiatives.

I'm glad you raised it. There is a right-to-repair working group, for anybody that's interested, chaired by a member of the House and of the Senate, Senator Pearson and Representative Hill. They're actually meeting on Monday, and I'm a participant in that working group as well, so a lot of interesting discussions are coming out of it. One of the questions that frankly has come up, and that I don't think there's been an answer to, is this issue of security. Some of the manufacturers of particular kinds of products will say: one of the things we build into our products is a certain amount of security, so if you eliminate that, we no longer know or control how those systems are protected, because a third party, whether the consumer themselves or a small repair shop, now has access; it's out of our control, so we don't know what's going on with the product and we can't protect it anymore. So there's a question about whether that opens up a can of worms. But also, to your point, could there be more and better security features? There are both sides of that point; it's come up, and I don't think there's been a conclusion, but it's part of the discussion, and an interesting one.

When you're looking for what's really objective, in many cases there are international standards that many technologies or industries build to, and that's what we look for: what standards, even if not actually implemented, are they using to measure compliance? When we're looking at how to build these things, even in legislation, the question is what standard companies try to build to. They don't necessarily have to adhere to it, but if legislation points to a reasonable standard, one backed by an international body, which is really what you're looking for, or a national body, then you have something to measure compliance against, and it puts it in the hands of the person or the company to actually adhere to it. There are standards for security, there are all sorts of standards for trustworthy systems, so for anything that sounds vague, there are international standards that can be looked at. The other problem with security being tied to the manufacturer is that then, when the manufacturer is breached, everything from that manufacturer is breached. And in my opinion, and it's just my opinion, in most cases when you get a technologist or a technology company saying something is too complicated, it's a complete cop-out; they just know it costs something.

Just so you know, we're at the hour mark, and I have a fifteen-minute ride to get
home for bedtime reading, so I'm leaving. I'm going to turn it over to Charity Clark to run this, unless we should conclude. I know Chris and Charity and Ryan and Clay are happy to stay; even if everybody else leaves, we'll be here. Other thoughts, comments, suggestions?

One of the reasons I was interested to come here and learn more about this topic is because of the deep concerns that I have. A lot of consumers, citizens, don't have choices; they have only one company to choose from, or maybe two, so that whole free market thing is not really happening. Or they don't have the time to spend all their time researching, saying, okay, now I'm going to buy a Crock-Pot, let me find out which one is the least hackable. So we really do depend on the government to protect us in that measure, because there is only so much we can do as individuals. I really appreciate the deep thought that is going into this process, and being able to attend this hearing as a citizen, and that Vermont is moving toward as much protection as possible for its citizens, even though this is a difficult subject. Thank you.

I think that is a great example: executive order and legislation, and now we are in litigation, and there is litigation at the federal level and obviously here in Vermont. It's navigating all these different landscapes, with the goal, though, of protecting Vermonters, and how do we do that most effectively? Again, for me, I'm really stuck on this question: how do we keep pace with technology?

I'm not sure I can. I would just argue that the people writing the law need to become technologically literate. That's the point of the foundational part.

We agree with you, but we have a citizen legislature, right?

My point being that the term "code of law" is a multi-thousand-year-old term. Code is code. We're really doing the same thing we've done for over a thousand years; we just have to hire a high school kid to type it out for us. So, to the point of implementing technology: I think sometimes we're so caught up trying to implement regulation and legislation, because that is the code we know, but if we can put resources toward getting the people employed by the state to dig into the technology and the solutions that are out there, then this body, the law, can be a leader in implementing solutions. Things like net neutrality: great, litigate, don't litigate, but if we decentralize the name systems, then we de-identify the traffic, and now they can't pick and choose what traffic they're going to control, and net neutrality is actually obsolete in that context.

So the consensus I'm hearing on the "how" is to develop the subject matter expertise within state government.

A hundred percent, because if we don't, we're doing this, which is fine, and I'm happy to do it, but we're also caught up in lawsuits. On lawsuits, we'll do our job. Thank you.

You're already doing way better than the Senate did when they were interviewing Mark Zuckerberg, so you've got a good head start.

Thank you. You made a good point about developing the expertise within government, and maybe part of the solution is this: we have a lot of the expertise in government now, spread out through different parts, and a lot of the effort is basically those people seeking each other out and meeting with each other. Maybe the solution, or part of it, might be a reorganization. For all of these problems, we have DFR, which says we need expertise to look at finances and insurance and to focus on those; we have people who are looking at utilities. Maybe the answer is that we need to rejigger how we're looking at this and say we need a commission or an agency or something that focuses on this. I mean, that's a chief privacy officer, but that's more inward-looking, not outward-looking. Maybe that's longer term, maybe not a factor.

I can give you an example, just in the middle of this: we just wrote a white paper with one staff member
related to GDPR. It's national in scope, so when we pushed it out, finding a collaborator in state government who understood it as well as we did, and knew all the factors, was difficult. In many ways we're set up in a certain structure, and it's about collaboration, really about willingness to collaborate, when you're talking about technology in government. And since I was talking about standards: the one area that doesn't get much use out of standards, but has really good technology, is our own internal operations. I'm not talking about the Secretary of State's office per se, but we're so third-party now; internally, sometimes the expertise on that kind of technology is no longer here. We have switched as a state very much away from having developers and computer scientists, to basically just buying applications and third-party services. So I think there's a focus here: being small enough to have an idea, to collaborate, and to understand where to fill the expertise. Even negotiating those contracts with the vendors can be a challenge, because we don't have the expertise to know what we should be asking for.

That's true, but it's been a shift, you know. In many ways we've lost some of those skill sets over the last decade or so in our approaches to state technology and data.

I don't see any more hands. The snow is supposed to start coming at 7 p.m. Somebody already called it: it'll be a snow day tomorrow.

We're always over, we're always over. I'm going to take my leave and conclude the hearing. Feel free to stay and chat; this auditorium is ours until 7 o'clock. So thank you all, thanks for coming, everybody.
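The tamper-evident access record raised in the device-security discussion, being able to see that a camera was accessed three times and to verify the record itself wasn't falsified, is commonly built as a hash chain: each log entry folds the previous entry's hash into its own, so an after-the-fact edit anywhere breaks every later hash. A minimal sketch with hypothetical names (a real device would also sign entries so the chain can't simply be rebuilt):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(log, event):
    """Append an access event, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    body = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    log.append({
        "event": event,
        "prev": prev_hash,
        "hash": hashlib.sha256(body.encode()).hexdigest(),
    })
    return log

def verify(log):
    """Recompute every hash; any edited or reordered entry breaks the chain."""
    prev_hash = GENESIS
    for entry in log:
        body = json.dumps({"event": entry["event"], "prev": prev_hash},
                          sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True
```

With this structure, an owner (or auditor) can count the logged accesses and check `verify(log)`; silently rewriting "camera accessed" to "nothing happened" invalidates the whole record from that point on.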