From your perspective, has AI helped at all? Or technology as a whole? I mean, one of the things we've been covering on my side is looking at AI and its inherent racism that we've seen as of late, to the point where I think Google Vision shut down their entire AI division and stopped selling, stopped helping cities with crime data, because inherently it's, yeah. How do you think about AI? How do you think about, you know, is there any technology that can help in this effort?

Well, I mean, are you talking specifically about the death penalty or are we talking about other crime in general?

Yeah, well, look, I mean, you know, there are certainly good applications of artificial intelligence. I'll give you one. In San Francisco, we worked with the Stanford Computational Policy Lab to actually take race, and other proxies for race, away from police reports, so that prosecutors on the general crimes filing team wouldn't know what the race of the person was when they made the first decision. So for instance, let's say that a robbery occurred, and I'm gonna use an obviously fictitious type of setting: you have a green person that committed the robbery, but the prosecutor doesn't know that it's a green person. The prosecutor is looking at the evidence, and all they can see is that, you know, witnesses describe the assailant in this robbery as a person of unknown race that was 200 pounds, six feet tall, wearing a white t-shirt and blue jeans. Another witness describes a person of a slightly different tonality of skin that weighs about 160 pounds, had long hair, and wore black pants and a beige shirt or a beige t-shirt. And the prosecutor looks at this and says, well, I really can't make a decision here, because I'm not gonna be able to prove beyond a reasonable doubt that the person in custody did this, right? So they mark in their system, they say, this case cannot be filed for insufficient evidence.
Then they unmask the report, and the report now shows clearly a photo of a green person, and a video of the green person that is in custody committing the robbery. Now the prosecutor says, uh-huh, I have enough evidence now to move forward with the case. So they move forward with the case, and that would be an appropriate way of altering your original decision. But let's take the same set of circumstances, and now the prosecutor unmasks, and you have a really grainy video that shows what looks like a green person with no facial detail, which is very common, because especially a lot of those business cameras in some places are antiquated, you know, the shot may be bad, the person is wearing a hat, whatever. But the prosecutor says, wait a minute, this is an area where green people commonly commit crimes, and green people, in his or her mind, are more likely to commit crimes than others. So he moves forward, and obviously they don't say that in their rationale, but they say, well, you know, we have a video that looks very similar to the person being described, and this is an area where green people regularly commit crimes, so therefore we want to move forward with the case. That would be an inappropriate way of proceeding with a case; it would require supervisory approval, and that would be denied. So we use artificial intelligence in that case to, first of all, do what humans would have a very hard time doing, which is completely decoupling race and all the proxies of race, right? So neighborhoods, sometimes names, I mean, a name like Juan, most of us would think of as Latinx, right? Or Leroy, we may think of as African-American, right?
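The masking workflow described here can be sketched in miniature. This is only a toy illustration under stated assumptions: the term lists, the `[REDACTED]` placeholder, and the `redact` function are all hypothetical, and the actual Stanford Computational Policy Lab system uses far more sophisticated language processing to catch indirect proxies for race.

```python
import re

# Hypothetical word lists; a real system would need much richer models
# to catch names, neighborhoods, and other indirect proxies for race.
RACE_TERMS = ["white", "black", "asian", "latino", "latina", "hispanic"]
PROXY_TERMS = ["bayview", "mission district"]  # example neighborhood proxies (assumed)

def redact(report: str) -> str:
    """Replace explicit race terms and known proxies with a neutral placeholder."""
    pattern = re.compile(
        r"\b(?:" + "|".join(RACE_TERMS + PROXY_TERMS) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub("[REDACTED]", report)

masked = redact("Witnesses describe a white male, 200 lbs, 6 ft, near Bayview.")
print(masked)  # -> Witnesses describe a [REDACTED] male, 200 lbs, 6 ft, near [REDACTED].
```

The prosecutor would see only the masked text for the first charging decision; the original report is kept intact so it can be unmasked at the later review step.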
So all that gets taken out of the mix, and that first decision cannot be removed; it stays in the system. And actually, my commitment was that we would make that technology available to other DAs, and that we would share the results of our process over the years with researchers and policymakers to see how we can get rid of implicit bias. So that's a good way of using artificial intelligence.

Another thing that I did: when Prop 64 passed, that was the legalization of marijuana, the initiative also actually said that a person that had been convicted could go through the application process to have their record expunged or reduced, depending on the conditions of their conviction, but they had to do it on their own. And we looked at it, and we learned that less than 6 or 7% of all the people that qualified for record expungement actually applied for it, because it's cumbersome, it takes time, it takes money, and poor people don't get to do that, right? But we also know that a criminal conviction is gonna keep you from getting employment, housing, a whole bunch of things. So in looking at that proposition very closely, we said, it says that the person has to do it themselves, but it doesn't prohibit the DA from doing it en masse, right? So we decided we would do this en masse, and we started it, and it was very laborious, right? And we realized it was gonna take a long time, and I was trying to push other DAs to do it, and they said, hey, look, I would like to do it too, but I can't because I don't have the resources. So we went to Code for America, and we developed artificial intelligence to actually review criminal records, determine the people that qualified, and complete all the paperwork necessary to get the relief.

Hey everyone, thanks for checking out that clip. If you enjoyed it, be sure to hit the like button down below. And if you're interested in hearing the full episode, it's out right now on our YouTube channel.
We've had a lot of great guests come on this show before and we've got a lot of great guests coming up in the future. So hit subscribe so that you don't miss a single episode. And one final note, we're always looking for new ideas and new companies to feature on the show. So if you know of someone or know of a company, write us a comment down below letting us know who they are and what they do. We'd be happy to have them on the show. Till then, I'll just be here waiting for your comments. So, see you later.