Okay, so let's turn to Michelle. Good morning, Michelle — it's still morning. We're going to be looking at H.195. Michelle, I know you sent us something; hopefully it's posted, or is it in the email? I'm not sure.

Right — I just wanted to give you the act from last year that H.195 references, so we could take a look at that language. It's in your email. The bill is so short that I wasn't going to share the screen, so I didn't set it up, but hopefully that's okay. And just a little caveat: I don't really have any background on facial recognition technology, so I wanted to let you know that up front. This is a new topic that just came to me, so I'll walk you through it. We're just going to take a glance at this very short bill, and then I'll show you the act that it's making an exception to. So we're looking at the bill as introduced, H.195. What it's essentially doing is creating an exception to an existing moratorium on the use of facial recognition technology by law enforcement. If you go to — I don't know if Evan has posted it yet; either that, or I emailed you all a copy, or people are following along on YouTube — it's Act 166 from last year, and it's Section 14 that has the moratorium. You'll see in the language there: until the use of facial recognition technology by law enforcement officers is authorized by an enactment of the General Assembly, law enforcement officers shall not use facial recognition technology or information acquired through that technology, unless the use would be permitted with the use of drones under a specific statute in Title 20 on law enforcement use of drones. And then it goes through and defines what "facial recognition" and "facial recognition technology" mean.
So what you have in H.195 is doing just that: the General Assembly would be specifically authorizing the use of that technology, but only in certain types of investigations. You'll see the ones that are listed there. We're obviously really familiar with the first one, because we've just been working in that chapter — chapter 64 of Title 13, relating to sexual exploitation of children. Another is the sexual assault chapter, and you also have homicide and kidnapping. So it could only be used in the investigation of those types of crimes, where law enforcement is in possession of an image of an individual they believe to be a victim, a potential victim, or a suspect in the investigation of one of those crimes, and the search is solely confined to locating images, including videos, of that individual within electronic media legally seized by law enforcement as part of that specific investigation. This also came from the Attorney General's office, so I'm happy to answer questions if you have them for me, but I'm guessing most of them are probably most appropriately for the AG's representatives. Thank you.

And this may be for the AG's office, Michelle. I'm just wondering where sexual assault, homicide, and kidnapping came from, because when I did talk to the AG's office a couple of weeks ago, we talked about sexual exploitation of children only. So I didn't know if you had any insight on where this other stuff came from, because to me that's way out of line compared to what I would ever want to do.

I got a proposal, I wrote it up, and there you have it — that would be my guess. Usually I'm a little more involved in the back-and-forth on all this stuff, but on this one I think you'd have to speak to David. Okay, thank you. What else? All right. Great, I'll be here. Thank you. Okay, great.
So then why don't we turn to Matt Raymond and David Scherr from the Attorney General's office. I'm not sure who wants to testify first, but could you help us understand the need for this bill from your office's standpoint?

Sure — I'd invite Commander Raymond to go ahead, if he doesn't mind.

Great, thank you. Yeah, not a problem. So, when this moratorium passed, we had been using facial detection and facial recognition as part of our forensic tools for quite a long time, and this became a problem for us right away. In the child sexual exploitation realm, if we have a known victim, or a suspected victim, and we have seized devices from an individual, and now we want to look for that victim on the devices, an easy way to do that is to use facial detection to run across them. So we're not using it in the standard sense, where you're trying to identify a person by, you know, scraping images from the internet — that's not what we're doing. We have legally seized media, and we're confining our search just to that legally seized media, to try to identify the images of sexual exploitation of this child across the device. Just to back up a little so everybody's on the same page: there's facial detection, which can be done by a program just saying, is there a human face in this data? It's not identifying a particular person; it's just identifying a human face. And then there's facial recognition, and that's the piece that uses biometric data on a particular face to identify a particular human being. In this case, what we're asking permission for — sorry, that's my dog snoring in the background, I apologize for that — is to be able to examine the data that we've already seized legally, and to use the forensic tools that we've always been using for this, to continue to be able to do that by using the facial recognition part of those tools.
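The detection-versus-recognition distinction drawn above can be sketched in a few lines of code. This is a purely illustrative toy, not the forensic tooling discussed in the hearing: the pre-labeled regions, the three-dimensional "encoding" vectors, and the 0.6 distance threshold are all invented for the example. Real recognition tools compute a biometric encoding per detected face and match encodings within some distance threshold.

```python
import math

def euclidean(a, b):
    """Distance between two face-encoding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def detect_faces(image_regions):
    # Facial *detection*: answers only "is a human face present?"
    # (Regions are pre-labeled here; real tools run a detector model.)
    return [r for r in image_regions if r["is_face"]]

def recognize(known_encoding, candidate_encodings, threshold=0.6):
    # Facial *recognition*: compares the biometric encoding of a
    # particular face against candidates, matching within a threshold.
    return [i for i, enc in enumerate(candidate_encodings)
            if euclidean(known_encoding, enc) <= threshold]

# Toy data: "images" with hypothetical encoding vectors.
regions = [{"is_face": True, "enc": (0.1, 0.2, 0.3)},
           {"is_face": False, "enc": None},
           {"is_face": True, "enc": (0.9, 0.8, 0.7)}]
faces = detect_faces(regions)       # detection: two faces present
victim = (0.12, 0.21, 0.29)         # encoding of the known individual
matches = recognize(victim, [f["enc"] for f in faces])
print(len(faces), matches)          # -> 2 [0]
```

Detection only counts faces; recognition narrows to the one face whose encoding sits close to the known individual's, which is the "filter" role described in the testimony.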
And one of the questions that was asked was why this was expanded past child exploitation, so I can give two quick examples from two different cases. In one case, we had originally arrested a person for trading images of sexual exploitation of infants and toddlers on Kik, a messenger service. When we went through his data, we actually found that he had been raping his two-year-old toddler himself. We then used the face of that toddler to search across the data to get the rest of the images, and what that did was change it from sexual assault of a child to aggravated sexual assault of a child — that changed it to a potential life-imprisonment case. So if we limit it to just child exploitation, where do you dice that up? Once we were searching for additional images, it's now really a sexual assault investigation; it's also a child exploitation investigation. In another case, we had a person who had uploaded videos directly, in a proactive case we had done. We went and did a search warrant on his place and through his devices, and in the examination of that information we actually found that he was sending money — thousands of dollars — overseas for the torture of children and for the kidnapping and murder of a person by suffocation. Now we're into a homicide investigation across countries. So limiting it to just child exploitation becomes problematic when we're sifting through this data. And again, we're not asking to do the controversial step — facial recognition going out and trying to identify somebody by, you know, scraping images off the internet. We're not doing any of that; it's just legally seized data. It's really just a filter in our forensic tool, to be able to identify victims in these cases.

Okay, thank you. I see Barbara and then Tom.

Thank you for explaining that technology difference. So when you're scraping —
I don't know, I've been wanting to use a different word — when you're taking this, it's all property that has been voluntarily given to you, or obtained through a warrant, is that right?

Correct, this would be either seized pursuant to a state warrant or obtained by consent from the individual.

So once you have that, you're not, like, looking up who their Facebook friends are, or correspondence they're having to lure other people — it's just data that's confined to the device?

So, an example: if you're using your mobile device to get on Facebook, none of the data in your Facebook app is actually stored locally on your phone. That's all stored on Facebook servers, so we can't even see that information when we're looking at the data that's just stored on the phone. So we're going through the data stored on the phone, looking for human faces. Well, in particular it depends on the case, right? But if we have somebody trading child pornography who has a child of that same age and gender, then obviously searching that device for images of that child matters — it's important to be able to identify if that child's a victim.

And let's say you can't identify who the child is — you don't run it against something else to try to figure it out, or run the parent against something else, like through your own databases?

Yeah, we don't use anything like Clearview AI or anything like that. And clearly that would not be allowed by the way this bill is written either. No, we don't do that. What we do do, nationally, is that all the ICACs work together to collectively try to identify these victims, and that's not running it across any data scrapings or anything like that; it's looking at the images of the unknown victim.
Is there anything in the image that we can use to locate geographically where they are — like trees in the picture, rocks, or skyscrapers? Right, yeah, exactly — something that would identify the place or the people. Then we share that through all 61 ICACs and try to — because there may be a series of images traded, and I may have one piece of the puzzle and Indiana could have the other piece of the puzzle, and if we combine forces then we can locate that child and rescue them. There are thousands of success stories of that nationwide.

That's awesome. And where does the child's image end up? I mean, for the person who ends up getting convicted, I'm sure there's a mugshot, et cetera, anyway — but the child, where would that be stored? Could somebody get a warrant from you to get that child's picture, or might it show up in some database later on? Or, let's say the child is not — maybe this is not an issue now, but maybe the child is not a citizen, and you're required to turn over photos?

So all of our stuff is stored on a secure server; no one has access to it. And for identifying purposes, the only thing we use on images is the hash value of the image. The actual images aren't ever shared anywhere, and no identifying data could be obtained from that. So if we're trying to find out whether an image has been shared and is out kind of in the wild, being shared by people, the only thing we have to share to figure that out is the hash value, which is just the digital signature of the file. It has nothing about the identity of the person involved — no information about that; it can be used to identify any electronic file. No, that would not be obtainable by anybody, and we're very protective of these images, because it's illegal content and for the protection of the victims in these cases.

Thank you. Thank you — Tom? Thank you.
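The hash-value mechanism described here — sharing a file's digital signature rather than the file itself — can be illustrated with Python's standard `hashlib`. The byte strings below are placeholders standing in for real file contents; nothing about the actual ICAC systems is implied.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """SHA-256 digest: a fixed-length fingerprint of the exact bytes.
    Identical files always produce the same digest, and the digest
    reveals nothing about who or what is depicted in the file."""
    return hashlib.sha256(data).hexdigest()

original = b"...raw bytes of a seized image file..."
copy_traded_elsewhere = b"...raw bytes of a seized image file..."
different_file = b"...raw bytes of some other file..."

# Matching hashes show the same file circulated, without sharing the image.
print(file_hash(original) == file_hash(copy_traded_elsewhere))  # -> True
print(file_hash(original) == file_hash(different_file))         # -> False
```

This is why investigators can check whether an image "is out in the wild" by exchanging only digests: the comparison works on any electronic file, and the digest cannot be reversed into the image or the person's identity.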
Good morning, Commander, good to see you. Hi, nice to see you. So there's facial recognition, and you mentioned something else — another type of facial recognition?

It's just a different level — it's facial detection, which I was just explaining. Facial detection is a tool that can just say, is there a human face present — not, is a particular human present, but just: human faces. Like with a lot of forensic tools, you can say, are there animals present, is there movement in any of these videos, are there human faces. And then facial recognition is what we're talking about today, which is about a particular face — it uses biometric data across the face to identify a particular face against all others.

So with the other type, would that be used to distinguish between an actual human and an anime? Because there's some pretty good anime out there — that's what went through my mind.

Yeah — no, I've never heard of it being used that way.

Okay, that's kind of what went through my mind. So, to the layperson — as you probably gathered, I was pretty shocked by the language in this; I wasn't ready for it. I guess I don't have an issue with it now that I've heard your explanation, but I would like some kind of defining language that makes it more understandable — so that if it did get expanded to sexual assault, homicide, or kidnapping, it's a little more understandable, for me anyway, that it's coming from the child exploitation world, I guess. Because the way I read it, it's just sexual assault, homicide, kidnapping, period. And it looks really broad to me.
And again, with the lawyers in the room, I would just like language that is more specific — sexual assault, homicide, and kidnapping as they pertain to the sexual exploitation of children. I would be a lot more comfortable with that. Thank you. I don't know if you want to respond to that, or Michelle?

I agree with Tom. It is very broad, and you don't need the premise of an investigation into sexual exploitation of children in order for it to apply; it's just a broad group. You can narrow it down. I guess maybe a question I would have for David is: if you're doing a criminal investigation into sexual exploitation of children and it leads to the discovery of another crime, is that not still, looking at the language here, related to that specific investigation having to do with sexual exploitation? I might work on the language to narrow it, if that's the committee's desire, to have it focused on that — but it is written very broadly.

Yeah, thank you, Michelle. David?

Yes — David Scherr, Attorney General's office, for the record. I think we can work on something along the lines of what the committee is getting at. What I'm hearing is that you want something that makes it clear that these are investigations that originate with child sexual exploitation, with the understanding that sometimes that opens out into investigations that include other serious offenses. I think we could write that in a way that clarifies that these are investigations originating from investigations related to child sexual exploitation. I certainly don't have specific language now, but I think that concept could be embodied without limiting the effect.

I think that's kind of what I'm looking for. I'd be real interested in seeing the language, because in my mind —
I mean, with this facial recognition, I kind of had blinders on — originally, that this was an ICAC bill. And of course with this language, as I've already said, it does branch out. But anyway, I would be real interested in some language to narrow it up considerably. Thank you.

Yeah, understood. And the intention certainly is that it is an ICAC bill, and we're trying to encompass the things that ICAC does. But I understand — I take the point, and I think we can write the language in a way that encompasses that point.

Great. And I think I would feel comfortable if it went further — because you wouldn't want it to be like, hey, we don't have access to use facial recognition unless it's an ICAC case, so this could be an ICAC case — let's bring these cases over here so they originate that way. I mean, the moratorium was put up for a reason, and I think it's important that we address the issue; it can't just be in limbo for years. So I guess I want to know about some protective guardrails, so that it really isn't just a little way to use it in other venues by framing something as a potential ICAC case.

Yeah, thank you. Sure — and I'd give the same answer. I think we can try to craft language that makes clear the origination issue, which is what I'm hearing — correct me if I'm wrong — from the committee members: it's a question about where the usage is originating. I think we can work on something like that and still allow the investigations to look at everything they need to look at to protect people.

Thank you. Bob?

Yes, thank you. Commander Raymond, how are you doing? Good, thank you. I'm a little familiar with this technology and so on and so forth.
Is it my understanding that you're going to limit this to media within telephones and laptops and so on, and that the same technology would not be used to gather information from, say, cameras or monitors that were placed in a public venue, without an expectation of privacy?

Correct. As it's written, this would be just about media that was legally seized pursuant to the investigation. Currently, if we take a computer, a laptop, a thumb drive — the way the forensic tools work is that we make a copy of all that data, and then that copy is run across the forensic tools, all as one aggregate set of data. Just to give an example: we had a case where a child had disclosed that they were sexually assaulted, and that the person had taken pictures of it. Our examiner took six hours to find that video, which would have taken minutes using the facial detection capabilities built into the forensic tools that we use. And again, we're only limiting it to looking at the data that, by a search warrant, we have been given judicial authorization to look through. That examiner can either look through every video one by one and finally find it, or use the one tool in there, which takes minutes or seconds to run across the data and then show us the video we're looking for. So it's really about saving hundreds of man-hours, because we were going to be able to get to the same result; it was just going to take hundreds or thousands of man-hours to get there. We have the legal authority already to look at that data. We can look at it manually, one by one, or use the tools as they're designed to be used and save time. And, you know, we're already woefully understaffed for this job, and adding this restriction on top makes it nigh on impossible at this point.

Okay, thank you. I just didn't want to open up a carte blanche type of situation here, that's all.
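The workflow just described — image the seized device, then run tools over the copy as one aggregate set of data — might be sketched as a first pass that enumerates every media file in the working copy. The extension list and directory layout here are assumptions for illustration, not the behavior of any actual forensic tool.

```python
from pathlib import Path

# Hypothetical set of media extensions a tool pass might target.
MEDIA_EXTS = {".jpg", ".jpeg", ".png", ".mp4", ".mov", ".avi"}

def enumerate_media(forensic_copy_root: str) -> list:
    """Walk a working copy of seized data and collect every media file,
    so a single tool pass can treat it as one aggregate data set."""
    root = Path(forensic_copy_root)
    return sorted(p for p in root.rglob("*")
                  if p.is_file() and p.suffix.lower() in MEDIA_EXTS)
```

The point of the sketch is only that the search space is the copied data and nothing else — the later detection or recognition filter runs over exactly this enumerated set, which is why the testimony frames the tool as a time-saver over the same material an examiner could review by hand.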
Yeah — no, we're restricting it to just the data that we already have the authority to look at. We just want to be able to use this tool against that data. Again, restricting it just to that data — not searching the web, not branching out or doing any of that stuff.

Okay, thank you. Okay, I'm going to go to Coach, and then Barbara and Tom.

Madam Chair and committee, thank you. And thanks for the indulgence — I had three bills to introduce, so I was at a couple of other committees, and Appropriations. Obviously my concern was very similar to Representative Rachelson's. I also stand alongside my fellow Representative Burditt with a strong, strong commitment to this work. Commander Raymond was very clear and articulate about the usage, the timing, and the technology itself. I would personally be agreeable to hearing language from the AG's office, working with our attorney — I would ask Michelle — to come up with language that clearly gives the strength that Commander Raymond needs and the safeguards that we want to maintain in this work. I just keep going back to that report the Commander gave us at the beginning of COVID, and his statistics showing that the rate of incidences went up 419 percent. That is very, very striking to me, and I think to all of us. So let's get about the work. If David and Michelle can bring something back to the committee that meets those concerns — the guardrails — but also gives Commander Raymond the strength that he needs to protect our children, you know, I'm with you. I just needed to say that. Thank you.

And actually I'm going to go to Michelle, because I see your hand is up, and then Barbara and Tom.

Thanks — I just wanted to ask a follow-up with respect to Bob's question around public cameras, security cameras, things like that.
Um, do you — and I just don't know — do you not get warrants to obtain and review video footage from those types of sources, something like a business security camera or things like that? I'm just trying to understand, because Bob had expressed a concern about something where you'd be capturing a lot of images of people who may not be aware, coming in and out, so I'm just trying to make sure the language reflects that.

Yeah — no, we never have, actually. Obviously the crimes that we're looking at happen in very private areas, because it's usually somebody sexually assaulting a child. And again, we're not using this as facial recognition was traditionally used, to identify people in a crowd or to identify unknown individuals; it's really identifying known individuals across the device — a set of seized data that we have. That's our intention of what we want to do, if that makes sense.

Sure. Do you kind of get what I'm — I'm just wondering if the language is narrow enough. I understand that in practice you may not do that; I'm just wondering about what it says: "media legally seized by law enforcement in relation to the specific investigation." I guess my concern is that if the committee's will is that you don't want, you know, a security camera video at a subway stop or something like that, that we make sure that's clear in here.

Yeah — I can think a little more about that, but my initial response would be that to some degree that concern would be ameliorated by the change we've already contemplated: saying that these are going to be limited to investigations that originated as child exploitation investigations. In other words, because of the nature of those investigations, which Matt just talked about, you're very unlikely to — maybe you never really look at that type of media.
You know, I can see how, if it isn't limited to that type of investigation, you then could get into the broader look — or not broader, but, you know, a look at a camera or something like that that you're talking about. So I think to some degree it would be ameliorated by that change, given the nature of the crimes that are sparking these investigations. And we can think about whether there needs to be additional language, but I think that's one key way that we can limit it. Thanks.

Thank you. Okay. So I think my first question is for David — or actually, I guess it could be for either of you. What's the standard for accepting a case for investigation? Is it that there's probable cause, or what?

Matt, I think you should talk about the practice.

So, as we're talking about it here: when we get a warrant to search for the devices, we actually get it under the authority of chapter 64 — it says right in there, to search devices for evidence of child exploitation under chapter 64 of Title 13.

And if you don't find any, what happens then — you just close the case?

Well, in that case — and obviously that's a probable cause level, just to get back to your original question — we would have to show a judicial officer that we have probable cause to investigate a crime of child exploitation. And obviously, at the end of the day, typically in almost every case, we already have child exploitation material provided to us, usually by an electronic service provider that we've matched up to, by the time we're seizing devices. So we've already found child exploitation at that point.

So this bill would only apply not to the people where you went in to seize, but to the people where you then went to court with your probable cause — meaning the suspects? Yes. Yes.
So if, of the three people you go in on, there is no such information, but you discover that one robbed a bank — would this bill apply to them or not?

That would obviously depend on how it's rewritten, because I understand the will is to rewrite it so that it has to stem from child exploitation. And originally, I'll be honest, that's how the original language was, and I brought up the concern — I'm not an attorney, but I brought up the concern: what happens when — because we do have cases where it branches from the regular images or videos to active sexual assault investigations, or kidnapping or murder investigations — where is the line drawn on when we can use facial recognition? Once it's elevated, now that it could be a possible sexual assault or a murder, can we still use it at that point? That was the question I had, and obviously that will depend on how the language is drafted to fix that.

Right. The other sort of concern that I have — and this isn't going to be in the bill, I hope, I mean as a finding — is that I can think of so many examples where, if we allowed particular technologies, we could save law enforcement tons of hours. But we don't necessarily want to do that, because we don't want to change privacy drastically. So I appreciate that it can save hours and hours of work, but I think that's a dangerous factor for us to consider, because that argument is wide open. And I would hate to have lots of law enforcement coming in here saying, oh, but this will save us 200 hours, just let us fly the drone in somebody's window, or whatever it is we're going to do. I just felt the need to say that.

Well, just a comment on that — this is something that's been —
First of all, it's part of forensic tools that are extremely expensive, and the only ones that have them in Vermont are our ICAC. We purchase and maintain them for Vermont; nobody else in Vermont law enforcement has this capability, to be honest with you. And this is something that's been going on for years — it's the moratorium that changed something and added hours for law enforcement. We're not coming to you and asking to use something new that we've never used before. This has been going on for years and years, and I don't know of one documented case of misuse, and I don't think anybody can point to one. This is, again, just a filter for information. It's not hurting anybody's privacy, because we can literally thumb through each image and video one at a time — and we're going to see much more by doing that than by using the technology, which actually limits our scope of view. Otherwise, I could put somebody in a room hitting "next" on every image and literally look at the same set of data, one by one, manually. So you're not protecting anybody's data by keeping the moratorium in this instance — there's zero protection.

Thank you, very helpful. Thanks. Okay, Tom.

Thank you. Commander, my worry isn't with you folks misusing it so much as it is the reliability of facial recognition and the potential there. I don't worry about the misuse so much, just because of the knowledge that we all have now about ICAC investigations, and what I think I know and what I've seen — that criminal charges aren't brought forward until it's pretty much beyond a reasonable doubt, with the investigations that you people do. So say we do pass this and you can do the facial recognition —
And again, I know how ICAC is — for lack of a better word, the way you people almost double-check everything. If you found, say, photos of an accused, or of a victim, that type of thing, do you have anything in place for, like, a double check — just to make sure that the facial recognition did do the job it's intended to do?

Yes. So at the end of the day, we're really just using it as a filter. Because, you can imagine, on anybody's computer — imagine every icon that you see on a computer is an image; that's exactly what it is when you're looking at a computer forensically. So right out of the gate, every computer has hundreds of thousands of images on it just as it comes to you, and then there's everything that the person loads on it. We're using this as a filter to try to narrow down the information, but at the end of the day, it's a human investigator that's looking at the data. It's really just filtering down to a finite set, and then you're saying: is this a child? Is this a child being sexually assaulted? And is this the same child? That's left to the actual human to decide. There have been horrendous uses of facial recognition, but it's always been in trying to identify a suspect — not what we're talking about here, which is just filtering images in a child exploitation case, trying to figure out who the victim is and where the images are on the device. The horrendous uses are where they have a suspect, they run it against something like Clearview AI, it spits back a person — potentially this person — and then the investigator says, yep, it's that person, and runs with it and does no groundwork to verify: is it really this person? Those are the horrendous uses of it, and that's not what we want to do.
You know, we were approached, like every ICAC, by Clearview AI, and asked, do you want to try our product? And I said no. I didn't think Vermonters would stand for that, and we didn't want to be participating in any of it. So we don't want to go in that direction. It's really just used as a filter, and at the end of the day it's the human investigator that's making the decision — you know, yes, this is this child, and he is sexually assaulting her.

Right, yeah, that was my guess — that you were doing, I guess, what I would call the double check — but it's good to hear you say that. And, oh, let's see — I don't even understand my own note, for the most part. It was talked about earlier, as far as restricting — give me just a second; I'm going to have to come back to that, I guess, and figure out what I meant. But as far as the forensic work you do: is it just the two of you, if I remember right, in your office? Other than — I know you branch out to other departments — but are you certified in the forensics?

Yeah — well, there are four state police detectives that do a lot of the forensic work for us. They're like the forensic wing for us, and then there's myself and Jesse Sawyer from the AG's office. But yes, I've gone through forensic training.

Yeah, okay. Great. That's all I have for now.

Great — really helpful. Selena?

I apologize if my question is a little bit redundant; I've had to be a little in and out of this hearing this morning at times. But from everything I've heard you say, it really sounds like you're talking exclusively about victim identification, and not suspect identification, in your workflows here.

Yeah — typically we've already identified the suspect. And again, we're looking through the suspect's devices, their information. But sometimes we would need to search using facial recognition on the suspect's image as well.
So, say we search using the suspect's image: he's sexually assaulted a child and videoed himself doing it, and his face is visible in the video. Then it's reasonable to assume there could be more. So you would also search by the suspect's image across the device, because if it's a different child, obviously it wouldn't come up by searching by the first child's image. But again, it's not using it to identify the suspect; you're just trying to identify the videos. Typically nowadays it's videos, not images, of the sexual assault of a child. I'm just thinking a little bit about how we could narrow things, and listening to you talk, I wonder if there's some language about acknowledging an identified suspect or something like that. I really appreciated Representative Rachelson's remarks about how there are a lot of technologies and techniques we could use in law enforcement that save time, and the understanding that that is a slippery slope. I would also note that I think childhood sexual abuse and exploitation is an extreme level of crime where time is of the essence. So I'm wondering if you could talk a little bit more about some of the practical ways the moratorium has affected your work, and whether there have been instances where it really did result in a child being left in a potentially harmful situation for a significantly longer period of time. Yeah, it's really hard to answer that question, because we have such a backlog now, and I'm not saying it's all caused by this, but we have such a backlog, exacerbated by this, that there are a lot of unknowns. So take that one case I already explained.
We had arrested the guy for trading, on Kik, child sexual abuse videos involving infants and toddlers, and by the time we examined his devices, we found out he was actually sexually assaulting his own two-year-old daughter, and we were then able to remove the daughter from that situation. Compare that case to our cases now, when there's a backlog of getting material examined. We don't know, because some of these cases haven't gotten to the full exam yet, so there could be victims in there that we have not identified. The sooner a victim is identified, the sooner that victim gets services, and the better the outcomes for their life. So time is of the essence in this. And again, we're not asking to use a new technology; a lot of people are saying this is something new we want to be able to do, but this is something we have traditionally been doing time and time again. That's how we did every case, and it was the moratorium that changed things. If you look at the thousands and thousands of cases we've done, there has not been one complaint or one issue with us using it this way. We're not using it to identify people, or misidentify people; in this context we're using it really just as a filter on data we have legally seized, and again, we could page through every image one at a time and eventually get to the same place. So I'd just like to point out that this is something we've done for years and years with no issues. Thank you. That was really helpful. Thank you, Commander. I think I can pose my question now. It's not really a question for you, but maybe for Michelle and David and the committee: it was brought up a couple of times about other information, other videos, security cameras, that type of thing.
It does say in the bill that you are restricted to electronic media legally seized by law enforcement, and to me that says only the material you got by warrant, but, thinking out loud, I just didn't know if maybe it needed to be a little clearer, and maybe even say that other videos, security cameras, and that type of thing can't be used. But anyway, I'm thinking out loud. Are there any other questions for Commander Raymond? Great. Thank you very much. Thank you. Yeah, thanks, Tom. Okay, David is up next. Okay, great. Thank you. All right, thank you. Thank you very much, and thanks to the committee for taking this up. Again, David Scherr with the Attorney General's Office, for the record. A lot of the key points have actually been covered in this first discussion, so I don't want to take up too much time unnecessarily, but I did want to make a few key points: one overarching point about how we approached this legislation, and then a couple of specific points, just to emphasize some of what's there and to acknowledge one of the edits we might make going forward. The overarching point is that the Attorney General's Office approached this issue, and this bill, from a place of great skepticism and concern about the overuse of facial recognition technology. Our office sued Clearview AI to stop some of their practices of scraping data from the internet and then trying to resell it to anybody who might buy it. Obviously they approached ICAC; we don't know who else they may have approached, whether for law enforcement or commercial uses of various kinds.
So we are in an active legal fight against that type of facial recognition, which is, unknowing, without people's consent, without even their knowledge, scraping the internet for images to use essentially for very broad-based, unlimited surveillance. That's something we are against and are actively working against. In drafting this legislation, we actually consulted with one of the lawyers who's working on that case, to make sure that we were not accidentally edging over into the type of use of this technology that we don't want to engage in and that we are actively fighting against. With respect to this particular legislation, I wanted to emphasize that there are three protections built into it. As a way of thinking about it conceptually, the most important protection, I think, is the one at the end, which is limiting it to electronic media legally seized by law enforcement. That means either through a warrant or by consent: if somebody were to be approached and asked to voluntarily turn over data or devices, obviously they don't have to do that, and then we would have to get a warrant to obtain that data. This by itself is a tremendous limitation, and really pulls this out of that world of unlimited views of whatever information might be out there that we could apply this technology to. It's important to emphasize that anything we obtain through either of those means, consent or warrant — and Commander Raymond, correct me if I'm wrong, but my sense is we're usually talking about warrants here — is information, images, videos, that law enforcement is going to have whether or not this bill passes. It's information that law enforcement is going to look through whether or not this bill passes.
This is not an expansion; this is not using technology to expand what law enforcement can look at or will look at. Law enforcement can and will look at these images in order to do the things that Commander Raymond talked about: protecting victims, finding whether there are other victims involved with a particular perpetrator. In these investigations, these images are going to be looked at anyway; this is not an expansion of what would happen regardless of whether this is passed. The second protection that's built in is limiting it to an image of an individual whom law enforcement believes to be a victim, potential victim, or suspect. That is, we're talking about individuals already known to law enforcement: if you have somebody who you believe fits one of those three categories, then you can look for that person in the images, but it has to be an individual about whom law enforcement has already developed a belief. The third protection is the category of offenses we're talking about. I will acknowledge plainly that that is probably, at this point, the least protective of the protections, the least restrictive of the restrictions, and I think we can redraft it slightly to make it a bit more restrictive, and to make clear that we're talking about investigations that originate with the ICAC Task Force. One of the points I wanted to make about this particular bill is that we really are talking about a very limited set of cases, a very limited use case, very limited instances in which this would be used. And again, on the time-saving argument: I understand the time-saving argument could be used in a plethora of ways that we would not actually want it to be used, and that could violate civil liberties.
That's not the case here. We're really talking about images that are going to be possessed by law enforcement by virtue of these investigations and that will be looked at by law enforcement anyway. So it's a really limited use case that does not expand law enforcement's purview or its ability to review information or images. Those are my three points. I think some of those were covered a little bit before, and I don't mean to be repetitive, but I wanted to emphasize the construct we're working with here and the general attitude we're approaching this with, which is: we're trying to be limited here, and we're happy to work with the committee to make sure that's the case. Thank you, I appreciate that, and we certainly look forward to new language. Tom. Thank you. That's exactly what I was going to say: I'm looking forward to new language that makes it clear, even to a layperson, that it's narrower, much narrower. And one thing I'm really glad to hear, David, as far as your office goes, and maybe I've heard it before, is your skepticism about facial recognition and your hesitation about the use of it. So, and maybe this was discussed more than I heard earlier: say the bill passes, everybody likes it, it's narrow, that type of thing. Was there some talk about possibly some language for if another crime was found? Just as an example, there's an investigation on the exploitation of children, and for whatever reason a murder is seen in it. What would happen in a case like that? Or, and if this was asked already I apologize, if there was obvious evidence that a bank robbery was done, that type of thing. How would, or could, an investigation expand into that? So I can answer that one.
As part of all of our warrants, we have a section that says if we uncover evidence of something else — we're looking for evidence of child exploitation, but inadvertently we could uncover something else — we stop the exam and apply for a second warrant. Obviously that second warrant wouldn't be under child exploitation at that point. We would then have two searches: the child exploitation search, and then, if we had uncovered, say, someone who videoed themselves doing an armed robbery — we've never actually uncovered that, I'm just throwing it out — I understand — we would stop the exam and apply for a secondary warrant to examine the devices for the armed robbery. And in that armed robbery case, under the new language, which we don't have yet but which is to come, we wouldn't be able to use facial recognition in that investigation. Okay, that's great. So if you did apply for the other warrant, then you could search the devices for that information, potentially, and I'm going to guess maybe that warrant might even cover the residence, to see if you could find evidence there too. Yeah, we would do the exam and then hand the information off to somebody else, because we don't have enough resources even for the child exploitation work, so we're not going to follow up on bank robberies. We would provide that information to somebody else and they would run with it. But right, that would show the protection around using facial recognition: in that case, the second warrant wouldn't be about child exploitation. Right, and say that did happen and you handed it off to whatever division, or other detectives; from there, they wouldn't be able to use facial recognition? Correct. Right. Okay. Great. Thank you. Let me just add to that briefly.
The practice that Commander Raymond talks about is also grounded in legal restrictions around warrants. And although I didn't come here today fully prepared to give a lecture on Fourth Amendment search warrant law, it is the case that warrants have to name the evidence they intend to find and the place they are looking for it. That practice comes out of making sure that they're not trying to later use in court evidence that could be knocked out because they didn't have a proper warrant to seize it. So we go back and say, okay, now we have probable cause to believe that this other evidence, which we didn't originally get a warrant for, may be found in the place we are looking to search, and so we're going back to get this other warrant, to say, now we want to look for this other evidence, and we have probable cause to believe this other evidence about another offense may be found there. So anyway, that was just a brief overview of the legal protections that lead to the practice the Commander was talking about. Thank you. Coach, then Tom. I think David just clarified the question I was going to ask, and Tom's question led into it, because I was looking back at some of the original discussions we had had over the years around the reliability of facial recognition, especially for people of color, and it hasn't really been resolved. The national institute responsible for that technology at the federal level is where that work resides, and they're not comfortable yet with the technology. So I think your point is well taken, especially around the warrants. They're very clear, very specific, and that helps me a lot. Thank you. Great. Good. Thank you. Tom. Last one, I promise. So, David, you kind of piqued my interest as far as warrants go.
How detailed is a warrant — say it's the bank robbery case discovered through an ICAC investigation, and you needed a warrant to see if you could find evidence of a bank robbery, I guess that would be the terminology — how detailed would it be as far as what evidence you're looking for? And maybe the Commander would be better placed to answer that, I don't know. I'm going to defer to the Commander on how he writes the warrants in these cases. I can tell you a little bit more about the general standards, and I can say that sometimes these become litigated points: was the evidence that ultimately was seized sufficiently described in the warrant? There is a legal standard, which I'm forgetting right now, about just how specific it needs to be, but the Commander will have an answer for you in terms of the specific warrants. Right. So, usually we're getting our search warrants to search a residence for devices. It's basically a two-part warrant: one, you can search this location, being the residence, for this specific evidence, and in that case we're searching for electronic devices capable of storing electronic media; and two, you can search those devices for evidence of the sexual exploitation of children, being specifically videos or images depicting child sexual abuse material. So it's very, very specific, and has to be specifically tailored to the investigation. In other cases, it may be that we have to search for more: if the original video that led us somewhere was of a sexual assault, and we think that person made it, and there's other material depicted in the video, like unique clothing or something that would be identifying, we would have to add that to the warrant. So you have to be very specific about what you're searching for.
So no matter what the department, then, warrants are pretty specific? Yes, they have to be, or they'd be legally challenged and the evidence wouldn't be usable as part of the proceeding. Okay, great. Thank you. Okay, not seeing any other hands. This was really helpful, thank you. I really did want to just get the bill out there, more of an introduction, and when we do take it up again, which I hope will be soon once we get some more language together, I will ask the ACLU to join and to testify. We appreciate it. So we are going to adjourn until 1:15, after lunch.