Good morning. This is Friday, April 9, 2021. We're here to deal with H.195, which is a bill having to do with facial recognition technology. It seems like a simple little bill, so Michelle will walk us through this simple little bill, and we appreciate it. Sure. Good morning, everybody. So I'm going to start out. It is a short bill, but I wanted to give you a little bit of the history as a reminder. Last year, in Act 26, there was a provision that the General Assembly passed establishing a moratorium on the use of facial recognition technology by law enforcement. Can you see that up on the screen? So you'll see there, establishing the moratorium unless the use would be permitted with respect to drones or under an existing provision of law. And you'll see in subsection B the terms that are used with regard to facial recognition technology. So it's up to you. There's currently a moratorium, but you obviously have the ability to specifically authorize any exceptions to that moratorium or to lift that moratorium. So what you have in H.195 is the creation of a narrow exception to the moratorium on this type of technology for law enforcement, specifically when it involves cases pertaining to sexual exploitation of children. You have been dealing with H.18, so you're familiar with this chapter. When we talk about sexual exploitation of children under chapter 64, that is primarily the chapter that deals with the production, possession, viewing, etc., of child sexual abuse materials. There is also the luring statute in that chapter. So you see here in the bill as passed by the House, in section one, subsection A: notwithstanding the language that I just showed you that was enacted last year, the General Assembly is authorizing the use of facial recognition technology by law enforcement for criminal investigations into sexual exploitation of children. Subsection B is where the limitations come in. Can you, Michelle, can you go down? Is it not showing?
I think, well, it's stuck on... Well, this is the moratorium from last year. This isn't the bill this year. It's not showing it on my screen. Sometimes when I flip from one to the other, I have to close out of sharing. I feel like it won't flip. Sorry about that. That's all right. So I'm always unsure if what I'm seeing is the same as what you're seeing. Okay. No, that's... All right. I think we got the picture. Okay. Sorry. Hold on. I'm trying. Yep. No, no. I think we understand where we're going. I know, but you need to see... Don't you want to see the bill? I do. It's 195. Yeah. Yeah. I'm sorry. I didn't mean it that way. I meant... Okay. Can you see it now? Yep. Yeah. All right. So subsection A is just creating the exception. And then subsection B here is creating the parameters under which that technology can be used in these circumstances. You'll see that the use of the technology authorized by the exception in A shall be utilized only where law enforcement is in possession of an image of an individual they believe to be a victim, a potential victim, or an identified subject in the investigation, and the search is solely confined to locating images, including videos, of that individual within electronic media legally seized by law enforcement in relation to the specific investigation. And so I think I can anticipate the questions, and I could go through all this discussion, but I don't know if Matt Raymond is testifying today, or is on there, or if David's going to do it. They can talk to you about how this is being utilized as a more efficient way to do something that they can already do, essentially, and they can take you through the process of how they would actually do that. And then we can circle back around to the language. And it looks like Michelle has a question. Sure. So what I have in my hard copy, what was sent to me, was not as passed, but as introduced.
So they took out all the other offenses, because as introduced, it also related to sexual assault, homicide, and kidnapping. They took all that out. Yes. And I think the witnesses can talk to you about why they had requested it be broader initially. There was a particular case, and perhaps others, where during their investigation of a child sexual exploitation case, they uncovered other crimes like homicide. And they said, well, we discovered that through using this technology in such a case. So if something was originally discovered through an investigation into child sexual exploitation, they wanted to still be able to pursue those other crimes with this use. And the House Committee preferred to limit it specifically to these particular cases. Michelle, could you scroll down just a bit? Okay. So I do have a question about B, right toward the end. It says it's confined to locating images of that individual within electronic media, legally seized. I understand what it's trying to say, that they're not going to be free to go out into the broader internet and scope around. But it seems to me that there's a little bit of slippage into what is on somebody's computer that might contain that broader internet. So if the person, let's say, has the cloud, and they have material that's in the cloud, is that considered within electronic media legally seized? I don't know the answer to that question. Okay. Perhaps the AG's office would. All right. It seems like something we should clarify, because one tendency is to look at a computer and say, okay, only what's in there is what's in there. But there are enduring connections, even when you're not on the internet, to broader databases. And I'm just wondering if it opens things up. If the person had Facebook, let's say, even if they're not connected to the internet, I'm wondering if it's possible to search within whatever was in the computer when it's shut down.
So it's going to be dictated by the scope of the search warrant. So I think in talking to the practitioners, you can ask them about what, when they're investigating these types of cases, they are usually requesting in terms of the search warrant and able to search through; the level of practice of what they're doing out in the field. Okay. I can ask those questions when the witnesses step up. Sure. Okay. Well, are you ready to move on? I'm ready. I'm all done. I'm sorry. How could you be all done? It's that short. I'm still trying to take care of A8 and F7 and you're all done. The multitasking. I'll be honest with the world on YouTube: I'm trying to multitask here. That's okay. Okay. Our next witness is Marshall Paul, who is the appellate defender general. No, actually, Marshall, we know you well. But I think the listing has been wrong for a while and I've never bothered to correct it, mainly because, as it lists me as a juvenile appellate attorney, I usually get scheduled behind commissioners and deputies, and I kind of like going after them. But I'm the deputy defender general and chief juvenile defender for the state. And I am here with some very brief testimony, because we do not oppose this bill in the form that it passed the House. We had some concerns about the scope of the bill when it was in the as-introduced form and much broader; as it's been narrowed down, we don't have concerns about it, and I'll explain that. When the moratorium on the use of facial recognition technology by law enforcement was an issue last year, we supported that moratorium because of our concerns specifically about the use of these large facial recognition databases, some of which have been in the news; you might have heard of the databases run by Palantir and some other giant corporations. And there are some real concerns about the use of those, because facial recognition technology is really in its infancy.
Like, three years ago, if you asked people about facial recognition technology, there was some software available, there was technology available, but it was not very powerful. It was not generally usable by law enforcement to do huge searches for an individual image to try to identify a person. There's really been an explosion of technology in the last couple of years. And frankly, it still does not have all the kinks worked out, so there are still very high rates of misidentification. And one of the biggest problems with it isn't so much the rate of misidentification, it's the disparity in rates of misidentification. The software now is fairly good at accurately identifying white people and fairly bad at accurately identifying people who are not white. And that is when you are comparing an unknown image against databases of known images. The reason why we don't have a problem with this bill is because this bill only permits the opposite. It allows law enforcement to compare an image of a known victim against an unknown pool of images, for example, on a suspect's hard drive or on a hard drive full of seized media, however that happens to come into their possession. They're comparing a known image against an unknown image, and there are not the same problems with misidentification when it goes that way. Excuse me. And so really, as far as we're concerned, this does not introduce those same problems that we were concerned about last year with racial disparities in the rates of misidentification. Instead, what this does is really simply automate a process that law enforcement is doing anyway. So with that in mind, as long as this bill stays limited in that way, we don't have a problem with it, and so we don't oppose it. Great. Any questions for Marshall? Falco Shilling is our next witness. Thank you, Marshall. Good morning. Good morning.
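The distinction Marshall Paul draws, comparing one known reference image against a pool of unknown seized images rather than searching a database of known identities, is essentially a one-to-many filter with a human examiner verifying the results. A minimal sketch of that idea, assuming a hypothetical face-embedding model has already turned each image into a numeric vector (the `embed` step is not shown; the vectors below are toy stand-ins, and the threshold is illustrative):

```python
import math

def cosine_distance(a, b):
    """1 minus cosine similarity; 0.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def filter_candidates(reference, gallery, threshold=0.4):
    """Return (image_id, distance) pairs close enough to the known reference
    embedding to merit human review, best matches first. The tool only
    narrows the gallery; it never asserts an identity on its own."""
    hits = [(img_id, cosine_distance(reference, emb))
            for img_id, emb in gallery.items()]
    return sorted((h for h in hits if h[1] <= threshold), key=lambda h: h[1])

# Toy embeddings standing in for a real face-embedding model's output.
reference = [0.9, 0.1, 0.2]                  # embedding of the known victim photo
gallery = {
    "IMG_0001.jpg": [0.88, 0.12, 0.21],      # near-duplicate of the reference
    "IMG_0002.jpg": [0.1, 0.9, 0.05],        # a different face
    "icon_settings.png": [0.0, 0.1, 0.95],   # UI artwork, not a face at all
}
for img_id, dist in filter_candidates(reference, gallery):
    print(img_id, round(dist, 3))            # only IMG_0001 survives the filter
```

The one-to-many direction matters: the reference identity is already known, so a miss or a false candidate is caught by the examiner rather than producing a misidentified suspect.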
For the record, Falco Shilling, advocacy director for the ACLU of Vermont. Thank you very much for having me here this morning. I think my testimony is also going to be quite brief and echoes what you heard from Marshall Paul. We appreciate the work that folks put into this before it came over, to create the version that has passed the House. We also don't oppose what passed the House. We appreciate the fact that this was more limited than what was originally introduced, and specifically for the reasons that Marshall just spoke to: that this would be looking for an identified individual within a set of images, and then those images would be verified by a person, which alleviates some of the concerns that we have about automated systems, you know, trying to identify people from a set of images and then possibly arresting them. That's where we see this technology as most problematic. And we also think that this shows that the moratorium that you put in place last year is working as intended. The moratorium was put in place, law enforcement came forward, and they said, we have this use for this technology; it's very limited, it's very specific. And we were able to discuss that, and I think we came to a place where we understand why this technology would be used and don't have privacy concerns or concerns about the inaccuracy of what's going to be coming out of it. So just to echo what Marshall said, we don't oppose what passed the House and appreciate, you know, the work that went into getting the language to where it is right now. Great. David Scherr is here from the Attorney General's office. Thank you, Mr. Chair. Thank you, committee. For the record, David Scherr with the Attorney General's office. A couple of the witnesses already have given us some nice summaries about the import of the bill.
I'll just say a couple of things about how our office approaches this issue and a couple of the specific protections, and then open myself to any questions. If there are operational questions, Commander Raymond is on the line here too, and he'll be the person to answer some of those, I think. So the Attorney General's office approaches the concept of facial recognition technology with considerable skepticism. The office has a lot of concern around the privacy implications of facial recognition technology, as well as the disparity issues that Marshall Paul spoke about. To that end, the office has sued Clearview AI, which is one of the major purveyors of facial recognition technology in the country. I believe that's the Palantir Corporation that Marshall was referencing as well, same entity. Before we sued them, Clearview AI actually approached our Internet Crimes Against Children Task Force, which Commander Raymond heads up, and offered its product, and we turned it down. We refused it. We don't approach these things with eagerness. We approach them with skepticism, and that is how we designed the bill and how we were happy to help redraft it, to ensure that it stayed narrow and these uses did not spill over in ways that were unintended or not desired. In order to make it a really narrow usage and ensure the protections that a couple of the other witnesses spoke about, there are really three protections in here. If you look at it, and I'll actually go from bottom to top, the first one, which is at the bottom here, is that these are only searches that will happen within electronic media, computer hard drives, thumb drives, whatever that might be, that are legally seized by law enforcement in relation to some specific investigation. Legally seized is a way of saying seized pursuant to a search warrant. It's theoretically possible that law enforcement could gain one of these by consent.
I don't think that ever happens as an operational matter at the current time, and Commander Raymond can talk about that a little more. These are things that are seized pursuant to a search warrant, which means that a judge has already decided that there is probable cause that evidence of a specific crime is likely to be found in the place to be searched. So we're already talking about a narrow universe where some legal process has actually already happened in order to ensure that limitation, that limited view. The other protection, which folks already talked about, is that this involves images of known individuals. This is not searching databases to try to find people. These are searches only for known individuals: a victim or potential victim, which is or would be the child in this case, or an identified suspect. Again, not looking for random people. There has to be an image that they have of an identified suspect in the investigation, and they search the seized images in order to see if there might be activity going on that requires intervention, rapid protection of a child, and so forth. And Commander Raymond again can talk a little bit more about those operational aspects. And then the final protection is just the limitation: these are only going to be allowed to be used during criminal investigations into the sexual exploitation of children. And again, these are investigations that are only carried out by the Internet Crimes Against Children Task Force in Vermont, which means Commander Raymond and his staff. So those are the three levels of protection, and rather than repeating what others have pointed out, I'll just talk a little bit about how that happens.
Right now, the Internet Crimes Against Children Task Force looks through every image that it seizes on these devices. They do that to assess what the criminal liability is, but also to see whether a victim might be being exploited, somebody who's in Vermont that they need to protect, and to see if the suspect they are investigating has themselves perpetrated some of these acts, as opposed to just possessing images. So these are searches that are happening now, lawfully. They are searches that will continue to happen lawfully whether or not this bill passes. This does not expand police power in any way. All it does is allow these lawful searches to happen more quickly than they happen now. As Commander Raymond will testify, there's a considerable backlog in investigations into these offenses in Vermont. There's been a dramatic rise in referrals for these crimes, and the inability to use this technology to do these searches has exacerbated that backlog, which is the genesis of this request. But again, this is a limited request. It does not expand police power. It only allows officers to do more quickly what they are lawfully allowed to do now. Okay. Mr. Chair? David, I'm looking at the language, and the phrase that I am wondering about is "electronic media." So it says electronic media legally seized, and in talking about it, you and the other witnesses seem to be talking about a device: a hard drive or a phone or something like that. And I'm wondering, first of all, why the word "devices" wasn't used there, because electronic media also includes online streaming and other facets of the internet. For me, the slippage here is this: if we're talking about looking at somebody's phone that's in the possession of the police, not connected to the internet, it's pretty understandable what is in the device. But if we're talking about electronic media, then "seized" opens up a lot of possibilities.
What does "seized" mean in terms of streaming video or the cloud or something like that? It suddenly expands out of the range of the immediate physical into the digital world. So I'm wondering your thoughts on that, and was there discussion of that phrasing in the House? I don't believe that has been discussed yet, and I think, Senator, you're correct in terms of the intention here, which is to be limited to devices. And I don't actually remember the genesis of the term "electronic media" at this point. My read of the statute is that it accomplishes that intention. If law enforcement is looking at streaming images or videos, that is not material that they have seized; that's stuff they're accessing on the internet. They'd have to go to, you know, the warehouse, or whatever the place is where the media companies have stored that material and people are accessing it from. That would not be material that's been seized in any way by law enforcement. So I don't think that this lawfully permits that in its current iteration, because again, it wouldn't have been seized; it's stuff that's out there somewhere else. What about the cloud, though? If you have the person's device, and you've legally got that, and their cloud account is tied to the device, have you then legally seized what they've got stored in the cloud? I do not believe you would have, but I would also defer to Commander Raymond in terms of how they deal with those sorts of issues where somebody is accessing things from the cloud. I think he might be, well, I know he will be the better person to ask some of those operational questions. I do think that this statute really is limited to the physical devices that have been seized pursuant to a warrant, and if you are accessing stuff that is stored elsewhere, that has not been seized; I do not read the statute to include it.
Okay, in that case, my open question to you and the other witnesses would be how you would feel about swapping in the word "devices" for "media." Certainly we're open to clarifying the intention, but whenever we do that, I just have to check with Commander Raymond and others to make sure we're not inadvertently changing something. But we'd be happy to clarify. Luckily, he's the next witness. Just so I understand, Senator Bruce, you're specifically talking about the word "media" as opposed to a device, like a thumb drive, or the actual phone, a hard drive, the hard drive on my computer, the iPad, and everything else, and not what's stored elsewhere or what's been streamed on that device? Yeah, or databases where it's shared between other users and that user, but that might be downloaded. Actually, I think you may have uncovered a problem with the House-passed version of the bill, because "media" does mean something different. "Media," to me... I don't know what the definition of media is. Do we have a definition of media in statute, Michelle? I can read you what I pulled up; this is not a legal definition. Electronic media is media that uses electricity, including television, radio, the internet, fax, CD-ROMs, DVDs, and online video streaming. It includes any medium that uses digital or electronic encoding of information, which is pretty expansive. If you change that to "devices," you can see the difference immediately. I'm just curious what the witnesses feel about that. Any other questions for David? I'll move on to Commander Raymond. Commander, welcome again. Please provide your prepared testimony, and then perhaps answer the question about media versus devices. Matt Raymond, commander of the Internet Crimes Against Children Task Force, and I'll jump right to answering the question. Electronic media was chosen for a reason, but just to back up, I'll first address the question I heard earlier.
If we seize someone's phone and they have the Facebook app on there, their Facebook data, people believe, resides on the phone, but it doesn't; it actually resides in the cloud. So when we seize the phone, get a search warrant, and examine it, the phone is not in an online, connected state, so none of that Facebook data, only what's been cached on the phone, which is very, very limited information, is able to be seen from the phone in the state that we examine it. It would be illegal for us to go through the Facebook app, which has the keys to the person's online cloud account, and view their cloud account, because that would be viewing it in real time and would actually require what is federally known as a Title III warrant, and there's no analogous route to get that information in Vermont; we are forbidden from doing that. So we're only looking at it in basically what's called a dead-box state. None of the links to online cloud services are active, and activating and going through those would be illegal. Hopefully that makes it more clear. Well, yes and no. In other words, what you're saying, as I understand you, is that you're just interested in the dead box, the device, and what's in it? Yeah, but I'll get to the second part of electronic media; I haven't gotten there yet. Okay. I just wanted to explain that first, because some people think that when we examine their phone we see all their Facebook data on the phone, but it's not on the phone, it's in the cloud, on Facebook servers. And then the second part is: yes, one part is devices that we're obviously interested in, and that would be, you know, people's phones, thumb drives, wherever they could store images or videos. However, we do get cloud data, but the only way we get that is pursuant to a search warrant. So if somebody has molested a child, taken pictures of it, and stored it, say, in their Google Photos, so that it
may not be saved anywhere on any of their computers. There are different ways you can set up your settings on Google Photos; I'm just using this as one example so it's easier to talk about. You can set it up so none of the pictures actually reside on your device; they all reside in the cloud. And people that are doing something nefarious sometimes choose that option because they think they're protecting themselves better. So the only way that we get that data is by writing a search warrant, getting it approved by a judge, and sending it to Google, and they send us the contents of their Google Photos. There are already protections built into the Privacy Protection Act, I think it was done in 2016, that limit that. When they give us that data, we have to go through it and excise everything that's not relevant to our investigation, and only keep what's relevant to our investigation. So that would be electronic media that we get from them. They don't send us a hard drive, right? They send us a link to download it from their secure servers. So we're getting electronic media from them, but we couldn't go online and just seize data online, because again, that would be a Title III violation federally. Okay, thank you very much, that's much clearer. I'm okay with that, but it is different from what we were discussing before, which seemed limited to devices. So you're saying if you can legally acquire a database from some provider, from Google or from somebody else, and you can get a search warrant, get a judge to okay that, and get that data, then you can use this technology not to search a device, but to search this downloaded information file that you've been sent by Google? Yeah, but again, that's user-specific and case-specific; it has to be a child exploitation case, and then we would have that probable cause that that data has child pornography, child sexual abuse material, in it for our cases. Yep. And then, as I say, I'm all
right. Sorry, I'm just going to say I'm fine with that, but, you know, I think the issue here is wanting to make sure we don't inadvertently open a floodgate to this technology that we're not aware of, so I'm just trying to tease out what's actually included here by intention. Right, and that would be a good explanation: it would be limited to only the data we get. And the reason it says "legally seized" instead of "search warrant" is because there are cases where the victim has the information on their phone. Someone has been extorted sexually, it's called sextortion, and we're getting increasingly younger and younger kids that are victims of that, unfortunately. And the parents will sign over consent for us to get the information, because the suspect has sent information to the child. And then, when we examine these phones or computers... we just had a case the other day that had one computer, one phone, and one tablet, and there were just over a million images, because you have to realize every icon on the computer is an image. So when we just say to our forensic tools, you know, pull all the images into a gallery view, it's going to pull up, in that case it pulled up, just over a million images. And then there's an option: we can manually scroll through each one individually, or, if we have a known victim, we can use the victim's image and say, show us all the images that depict this person. It's just a way to speed it up. As one example, we had a case where there was a girl that was unfortunately molested and had to explain that her picture was taken with a particular phone while it was going on, and it took the examiner six hours to find that image, where we could have found it in less than a minute using a tool that we have. And the other thing, just to explain: our tool is case-specific. It doesn't aggregate data, doesn't save data, doesn't save any biometric information
about the searches that are done, meaning that it's only used as a filter. So basically, we have a million images up in a gallery view, and again, we could flip through them one at a time, or we could use different filters. There are a lot of different filters we can use, and one of them is facial recognition. You can enter a photo that you already have of an identified person and ask: does a picture of this person exist in this mass of data we have? It'll show you likely matches, and then it's the examiner that has to look at them and say, this is the girl, this is the girl. Typically we're looking for, you know, a child being sexually molested; those are the images that we're trying to find in there. And so we pick those out, see the location that they're in, and make a report about where they were found. But the biometric data that's being used just to pop up and filter those images is not being saved, not being put into some database. It's case-specific, and when the case is over, all that information is just saved as part of that, you know, siloed case information. Thank you. Other things you'd like to tell us, or questions for Commander Raymond? Alice? So I'm just wondering, with regard to saving the material that you have: a case happens, but you think it's going to be appealed and all of that, so I'm assuming you would save all of that. But what I'm really wondering is, how does it get used by the court if someone goes to trial?
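The "filter only, nothing persisted" design Commander Raymond describes, where biometric data is computed per case, never aggregated, and only a plain report of match locations leaves the tool, could be sketched like this. Everything here (the class name, the `embed` and `is_match` stand-ins, the paths) is a hypothetical illustration, not the actual forensic tool:

```python
class CaseSearchSession:
    """Hypothetical sketch of a case-scoped face filter: embeddings live
    only in memory for the one case, and the only output is a plain
    report of where matches were found. No biometric data is persisted."""

    def __init__(self, case_id, embed):
        self.case_id = case_id
        self.embed = embed      # assumed face-embedding function
        self._cache = {}        # in-memory only, never written to disk

    def find_matches(self, reference_image, images, is_match):
        """Filter seized images against one known reference image."""
        ref = self.embed(reference_image)
        report = []
        for path, img in images.items():
            self._cache[path] = self.embed(img)
            if is_match(ref, self._cache[path]):
                # Only the location goes in the report, not the embedding.
                report.append({"case": self.case_id, "found_at": path})
        return report

    def close(self):
        self._cache.clear()     # embeddings discarded with the case


# Toy usage: "embedding" is just a lowercased string, "matching" is equality.
session = CaseSearchSession("2021-042", embed=str.lower)
report = session.find_matches(
    "Victim_Ref",
    {"/evidence/DCIM/img_001.jpg": "victim_ref",
     "/evidence/DCIM/img_002.jpg": "unrelated"},
    is_match=lambda a, b: a == b,
)
session.close()
print(report)   # one match, for img_001, and no embeddings survive close()
```

The design choice the testimony emphasizes is that the report (locations, case number) is the durable artifact, while the biometric intermediate values are transient, so the tool cannot grow into an aggregate face database across cases.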
So the images, say they're found on their phone: the forensic report will say this image was found on this device. The forensic tool tracks what you've done in the tool, and then the person charged can have a forensic expert look at that data and see exactly what was done, so there's nothing done, you know, behind the scenes. But at the end of the day, what's really important is that this picture, which is usually a picture of a child being sexually assaulted, was found on his phone in this location, and then the other part of the information is what user attribution was found in that same folder, to show that the defendant is the person that had access to that folder. The defense expert would be able to look at how it was found, but it's less important to see how it was found. It's like saying I found the murder weapon under the bed: did you use your flashlight to illuminate it, or did you just look under and see it? If you had a search warrant to search that location, it really doesn't matter how you saw it; it's where it was found and what the object is. Yeah. So when you get to court, does the image become part of the evidence? If we introduce an image as evidence, which we do if it goes to, say, a jury trial, it's entered under seal, so that a member of the public can't just get a copy of the record and obtain child pornography, child sexual abuse material. But yes, it would be shown to the jury, so that they could make a determination either as to age or whether it depicts sexual conduct, all of which are elements of the crime. Okay, so I didn't realize there was a separate way where the image would be sealed permanently. Yes. You know, thinking about privacy of that person later on. Yeah, we're obviously highly protective of these images. When we possess them, they're on a secure offline server that can't be accessed by other people, and we don't provide
them out anywhere. We bring them, obviously, to court and have to enter them as evidence, you know, in the rare event of a jury trial. But is the court as secure as your system, in terms of what evidence has been presented to it in the courtroom? We've always put them on either a thumb drive or a disc, displayed them, and then removed the removable material, so it's not ever been entered onto an electronic system of the court. It's as secure as a physical piece of evidence would be at court. Okay, thank you. I'm not sure that's that secure. Okay, well, thank you. Anything else you'd like to tell us about the bill or the process, Commander Raymond? I think it's all been pretty well covered, unless people have questions; I don't want to take up people's time for no reason. Peggy? Judge Grierson is our next witness, but I don't see him. Yeah, he had confirmed, and I emailed him about a half an hour ago and have not heard back, so I'm not sure where he is. We are running significantly ahead of schedule; everybody's been very brief. Yeah. So Senator Sears is off now. I'm imagining Judge Grierson might take a few minutes, so unless someone has something about the bill they'd like to discuss, I'm going to suggest that we take a break until 9:30, and hopefully by that time Judge Grierson will be back. I imagine he will be relatively brief, and we'll probably wind up taking another break before our 10:30 segment on 128. But for the moment, unless there's an objection, let's take a break until 9:30. And Peggy, if you could just drop Senator Sears an email, and maybe a text. Yeah, very briefly: I just wanted to apologize to the committee for my lack of memory around the full breadth of electronic media, and to thank Commander Raymond for clarifying that for everybody. As he noted correctly, the search warrant piece is really the key protection there. But thank you, everybody. Understood. Thank you, David. Okay, so we'll take a break.