So I think we can get started. Welcome to Law and Order, circa 2050. I regret to inform you that this is not produced by Dick Wolf and is not a spinoff of the television series. It is instead a Future Tense event. Future Tense is a partnership between New America, Arizona State University, and Slate. It explores emerging technologies and their transformative effects on society and public policy. Key to that is a series of events like this one here in Washington, DC, and in New York City, as well as content on the Future Tense blog at Slate. I'm Jacob Brogan, and I'm here to read you your rights for this event. You have the right to not have your neighbor's cell phone go off next to you. Please silence your cell phones. You have the right to ask questions after some of the sessions, but we are live streaming the event, so please do wait for the microphone. When you ask your question, please also make your comment in the form of a question that could be written with a question mark on the end of it. If you do not do that, I will make fun of you behind your back and possibly on Twitter. Probably not on Twitter. But if you are tweeting about the event, as long as your phone is silenced, you can follow the discussion online using the hashtag #LawAndOrder, which hopefully will not get confused with any breaking news about the television show. And you can follow Future Tense on Twitter as well at @FutureTenseNow. I would like to invite the speakers of our first conversation to the stage. And to introduce them is Leon Neyfakh of Slate magazine. Hi, everybody. I'm Leon. I write for Slate about criminal justice. Very happy to be here. We have our panel today: sitting on the left here is Logan Koepke. He is an analyst at Upturn, a group that studies how new technology affects civil rights. Logan recently published a report on predictive policing and civil rights. To his right is Kami Chavis.
She is a professor of law and associate dean for research and public engagement at Wake Forest University, as well as the director of the criminal justice program there. Her research focuses on criminal justice reform. She writes widely about police and prosecutorial accountability, law enforcement, and racial profiling. And all the way on the right, we have Ralph Clark, who is the president and CEO of ShotSpotter, which is a company that creates and licenses gunfire detection and location technology, meaning that when a gun goes off in range of a ShotSpotter microphone, the microphone picks it up and alerts the authorities. So with that, we will get started. The name of this panel is Will Technology Make Crime Obsolete? The answer is no. Usually, if there's a question in the headline, the answer is no. But I think a more pragmatic way to approach this is: are there technologies already in place or on the horizon that promise to reduce the crime rate? And the answer to that might very well be yes. There's a lot of things we can talk about. We can talk about aerial surveillance, like they had in Baltimore and in Compton. We could talk about text-based 911 systems. But since we have the expertise here that we have, we're going to focus on predictive policing, which is the process of identifying places and people who are likely to commit crime in the future, as well as gunfire detection, which, as I said, is Ralph's bailiwick. So I thought we could start with ShotSpotter. And if you don't mind, maybe you could just walk us through the use case for the technology. Someone fires a gun within range of a ShotSpotter microphone. What happens next? Sure. So we like to describe them as sensors. And the reason we describe them as sensors is there's a lot of intelligence at that sensor that can basically ignore ambient noise and trigger a time stamp on impulsive noises, booms or bangs. The way our technology works is we spread our sensors out.
And these arrays of 15 to 20 sensors per square mile, we like to deploy them up high on top of rooftops. They don't require a line of sight, so they can be in the middle of rooftops. And when a gun is fired, of course, that creates a very unique acoustic event. And that acoustic event will basically spread out. And because the sensors are located in different areas, that impulse of noise will hit each sensor at a slightly different time. And we're able to take that time differential and triangulate the exact location of that particular event. And then we go through a couple of classification techniques, both machine classification and human-augmented classification, before pushing that alert out to an agency if, in fact, it's a gunshot. And what does the agency do with that data point? Do they dispatch police officers to the spot and look for who shot the gun? Or what's the best use for this information? So the first thing we have to understand is that in many of our underserved and challenged communities, they're not likely to call 911 when they hear gunshots. So 80% to 90% of the time in certain communities, guns can be fired and no one bothers to call 911. And so the first thing that our technology does is give much better awareness to police departments about when and where guns are being fired, very quickly and very precisely. Our hope and expectation, of course, is that officers will be dispatched immediately to that particular scene. Oftentimes, they won't encounter a perpetrator. And our expectation is that they're not going to be capturing perpetrators all the time. But it's more likely that sometimes they will be able to aid victims, and certainly be able to recover physical forensic evidence in the form of shell casings, which then can lead to an investigation around who these very few shooters are that are causing the vast majority of the problem.
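The time-differential localization Clark describes can be sketched in a few lines of code. This is a toy illustration only, not ShotSpotter's actual algorithm: the sensor coordinates, the speed of sound, and the brute-force grid search are all assumptions made for demonstration.

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # metres per second in air (approximate)

def locate(sensors, times, bounds, step=2.0):
    """Grid-search the source position whose predicted pairwise
    arrival-time differences best match the observed ones."""
    (xmin, xmax), (ymin, ymax) = bounds
    pairs = list(itertools.combinations(range(len(sensors)), 2))
    best, best_err = None, float("inf")
    for i in range(int((xmax - xmin) / step) + 1):
        for j in range(int((ymax - ymin) / step) + 1):
            x, y = xmin + i * step, ymin + j * step
            # distance from this candidate point to each sensor
            d = [math.hypot(x - sx, y - sy) for sx, sy in sensors]
            # squared mismatch between predicted and observed time differentials
            err = sum(((d[a] - d[b]) / SPEED_OF_SOUND - (times[a] - times[b])) ** 2
                      for a, b in pairs)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# four hypothetical rooftop sensors on a 400 m square (coordinates in metres)
sensors = [(0, 0), (400, 0), (0, 400), (400, 400)]
shot = (130, 250)
# synthesize the arrival time at each sensor for a shot at `shot`
times = [math.hypot(shot[0] - sx, shot[1] - sy) / SPEED_OF_SOUND
         for sx, sy in sensors]
print(locate(sensors, times, bounds=((0, 400), (0, 400))))
```

The key point of the panel's description survives even in this sketch: only the *differences* between arrival times matter, so the sensors need synchronized clocks but no knowledge of when the gun was fired.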
The biggest thing, though, that they do, quite honestly, and it's very difficult to measure, is they're denormalizing gun violence. So typically in these communities, guns are fired a lot, people don't call, police don't show up, it becomes accepted. And by having police be able to show up very quickly and precisely to that particular scene, get out of their patrol car, engage the community even if they don't encounter a perpetrator or aid a victim, they're able to ask questions about what's going on. Most importantly, we've seen this happen in some cities where they're knocking on doors to check on people, saying, hey, are you OK? The exact way they would in an affluent community. So the analogy I'd like to use is: I was born and raised in Oakland, California, and live in Oakland now. The part of Oakland I grew up in is a challenged area, and so there's a presumption that gunfire is OK, where in the part of Oakland I live in now, of course, if a gun is fired, people are going to respond. They're going to show up at my door and say, Mr. Clark, are you OK? Do you find that small percentage of the population that is firing guns, do you find that there is awareness among that group of people that there are these sensors around that will help police basically catch them? Have you found, I guess, have there been? A deterrent effect. Yeah, is there a deterrent effect? And I'm wondering if there have been studies that you guys have done, or that outside researchers have done, that show that, indeed, this reduces the amount of gun violence that happens in some of these cities. Yeah, so very fair question. So we have done studies looking at our data across the 90 or so deployments where we are. We're in about 90 cities, and we looked at gunfire incidents in our coverage area on a year-over-year basis, because that's the way you have to measure it.
And we've seen reductions up to 35% in several cities that are not only using our technology, but are typically self-motivated to do other things as well. So we don't ever believe that it's just our technology that's responsible for driving this deterrence. We're typically a part of a more comprehensive gun violence abatement strategy. But you're absolutely right. What you're trying to do, to the extent you're not able to intercept and interdict those very few shooters, is deter or change their behavior. And one of the biggest levers that you have in changing behavior is what the community norms are. And if you can change the community norms to not buy into the idea that gun violence is OK or should happen, and to be more collaborative with police, you can change that behavior. There's no question about that. I think you said 35% in some cities. That's a big number. What do you guys think? Logan, any response? So I definitely think one thing that ShotSpotter provides, which I think a lot of technologies are aiming for, is a more real-time adaptation for law enforcement. I believe ShotSpotter just pushed out an application where officers can get data on their phones or tablets. So that's not necessarily a thing that would be predictive in the sense of a future deterrence element, but it's empowering police officers to have more real-time data to potentially follow up on leads. I know there are some internal studies that ShotSpotter has done. I don't believe there's been an external validation study, and it'd be interesting to see any results from those sorts of studies. But yeah, that's my initial reaction. What do you guys think?
I mean, based on what you know about how police departments work, how they set their priorities, is it realistic to imagine that your average police department, maybe not your big city police department, but a sort of mid-level police department, do they have the will and energy to respond whenever there's a gunshot, even if no one's killed, for example? Well, let me just jump in and say it's unacceptable for them to say that they don't have the will or resources or energy to do it. What else are they doing? If you're not responding to gun violence, please tell me, what is it that you're doing? And that's the thing that we have to change. I think recently there was a city, I won't name the city, but there was a shooting in a commercial district that involved a couple of deaths, sadly. And there was an official response to that particular event. And the response was interesting. It was like, that's not supposed to happen here. One mile away from this commercial district in a large southern city, which I won't name, there's gun violence happening all the time. And the presumption is that it's okay for it to happen over there, but it can't happen over here, where the potential victims are perhaps not male persons of color in underserved communities. That's just a notion we just gotta get rid of. And that's the challenge, is to say you have an obligation and a sworn duty to provide high levels of service to any community, whether it's affluent or non-affluent. That's my opinion. Kami, what do you think? So I agree with that. And I think that police departments indeed have an obligation to use technology, anything they can, to keep communities safe. At the same time, though, we have the very delicate balance of protecting the civil liberties of folks who live in those communities as well.
And so it's not, so should they respond, I think absolutely they should respond, but I would be more concerned with how they respond, and particularly how they respond in those communities of color. One thing that we know about predictive policing, and I know that we're gonna talk a little bit. Let's make that transition. Yeah, talk a little bit more about that. But in terms of, if you're able to anticipate crimes based on algorithms, then you're also able to pinpoint who might become potential victims of those crimes. And so I know in Chicago there's work where officers are intervening beforehand, saying, based on our assessments, based on this information, we believe that you might be a potential victim. Right, and that's like an individual-based predictive policing model, right? As I understand it, there's geographic models and there's, you could say, personal models. So I know Chicago kind of got in some hot water for having all these people on their heat list. And the problem with it, I think, was that they were essentially targeting a certain population. Why is that sort of a built-in risk with predictive policing? Logan, do you wanna... Yeah, so the way most predictive policing systems work is, at a minimum, they're relying on historical crime data, and historical crime data, usually for most departments, is referring to calls for service and crimes that the officers themselves are observing. So say I witness a burglary and I call the police: that record will go into a system, the records management system, and then you can use three to five years of data to pull from, then test out against the most recent 30 days or so and see, based off the predictive tool we've developed, how good is it at forecasting the most recent 28 days of crime?
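The train-on-history, test-on-the-most-recent-weeks evaluation Koepke describes can be sketched roughly like this. Everything here is a hypothetical toy: incidents are reduced to grid cells, the "model" is just a ranking by historical counts, and all the numbers are invented for illustration.

```python
from collections import Counter

def top_cells(history, k):
    """Naive hotspot forecast: rank grid cells by how many
    historical incidents they contain and keep the top k."""
    return [cell for cell, _ in Counter(history).most_common(k)]

def hit_rate(predicted, holdout):
    """Fraction of holdout-period incidents that landed in a predicted cell."""
    hot = set(predicted)
    return sum(1 for cell in holdout if cell in hot) / len(holdout)

# three to five years of incident records, reduced to (row, col) grid cells
history = [(2, 3)] * 40 + [(5, 1)] * 25 + [(0, 0)] * 10 + [(4, 4)] * 5
# the most recent 28 days, held out for evaluation
holdout = [(2, 3)] * 6 + [(5, 1)] * 3 + [(7, 7)]

forecast = top_cells(history, k=2)
print(hit_rate(forecast, holdout))  # 9 of the 10 holdout incidents fall in forecast cells
```

Real vendors use far more sophisticated models, but the evaluation loop is the same shape, which is exactly why any bias baked into the historical records propagates straight into what counts as a "good" forecast.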
And if you're using historical crime data where, with certain communities, there's a pattern or practice of biased enforcement, well, there's a long history of criminologists saying that historical crime data is not necessarily a record of true crime, but a record of enforcement against crime. So acknowledging that needs to be a level-setter when it comes to predictive policing; that's where the problems spin out from. Maybe one of you could explain briefly, how is this information supposed to be used? As I understand it, a police department that uses a predictive policing model or algorithm gets a percentage for every square mile of the city, for example, roughly. What do they do with that percentage? Do they send people there? Do they put up crime tape? What do they do? So it depends. It depends both on the vendors and the police departments themselves. So it's very variable. Some departments don't really suggest any new tactics to take once you're in the box, so to speak. The box would be the area that's forecasted as having, say, a 43% likelihood of a robbery occurring from one to two o'clock. It's not more specific than that; some systems do an hour-resolution time frame. And other departments, if you're trying to really pilot a program and see its efficacy, ideally you might wanna see, okay, we're gonna have a targeted intervention in a certain area where we get out of the car and we walk around and we talk to people, or another tactic could be you just sit in your car and you're a visible presence. There was one study by the RAND Corporation in Shreveport where there were basically two different divisions of the police department: one that actually enforced a specific tactic of getting out there and talking to people, and one where the officers weren't given any sort of strategy to enforce.
And in the latter group, the officers stopped paying attention to the predictions, generally didn't use the tool, and just kept going about their day. So, from the police department's perspective, probably not the best use of resources. So it's really variable, which is probably in itself problematic: if it's up in the air as to what interventions occur, they'll probably default to previous interventions, which we know are not the right interventions. I remember reading somewhere, I think it was in Albuquerque, they would take that 42% chance of a burglary occurring between one and two and they would put a dummy car there and just wait for someone to come. Like bait, basically? Right. I don't know, I don't know if I like that. But Kami, what do you think? I mean, are these percentages making these communities safer? So what I was thinking of is, that's one model, right? To have that increased police presence. Because again, I kind of go back to the idea of how our officers, or how our police departments, when they use this data, how are they going to respond? And so one thing that concerns me, in terms of some of the unintended consequences of using technological advances, is the impact on the community if you live somewhere that's been designated as a hot spot, or perhaps somewhere that, using the ShotSpotter technology, there's lots of gunshots in this area. As a police officer, when you respond, are you going to respond in a more violent manner? Could this be a situation where we see violent encounters between police and citizens increase? So I think that, again, when we're thinking about the technologies and the benefits, we really have to be careful to study these. That's another area of concern: this technology is not brand new, it's been around for many years, but we need to have better studies of it. Do you think we're moving too fast on this stuff?
I know there's like 20 out of the 50 biggest cities, I think, that are already using predictive policing, and there's like 11 more, I think this is from a Guardian story I read, that are coming on board. Is the adoption kind of moving faster than the research that might validate this technology? I think that when you have police departments in this day and age, and many municipalities are struggling with budget deficits and the like, and you still have high crime problems, you want to use technologies that can efficiently help you investigate and prevent crime. I do understand the temptation, but I do think that we should always kind of be careful, back up. And another aspect of this is involving the community, involving the folks who are going to be the subject of some of these technologies, involving the community and community organizations in kind of setting guidelines. And as long as we can have important guidelines in place for their use, and also education in the communities. Communities should know, I think, if ShotSpotter technology is being used. Ralph, do you think it's possible to, so I mean, your technology is sort of premised on the efficient deployment of these sensors. You can't just put them all over a city; it's gonna cost too much money. So you gotta put them in specific places where you think they're gonna do the most good. I assume some of the same risks apply, where you're gonna be basing that on historical data, and that will have sort of built-in bias, perhaps, because of where enforcement happens. Do you think there's a way to reconcile that? I mean, is there a solution to the bias problem? Yes. I see it as less of an issue in our particular use case because, again, our notification is only happening when there is a felony in commission, which is a very serious event, I think, when someone fires a gun.
So I think you're absolutely right, we're probably not going to want to deploy ShotSpotter in broadly affluent communities, but I think we do want to be able to deploy this technology as a technology assist to help a police department be in service to underserved communities. And I think the points that we're making here about being in service to communities and getting their buy-in, cooperation, and engagement are really, really important in our particular case. And that gets underplayed a lot, I think, with what we do, because at the end of the day, if our technology is deployed and you don't have the community buying into the way the police are responding now, hopefully more precisely, quickly, and respectfully, then you've lost the energy behind it. So our technology by itself doesn't reduce crime. Police departments by themselves can't reduce crime. What you really have to have is a very broad engagement and collaboration with the communities that you're serving to reduce crime. They have the ultimate lever on this thing. And our tool is merely a tool to be used by police departments to kind of get that buy-in. Do you think, let me know when we should move to questions. But do you think there is an obligation on the part of the vendors who sell or license the technology to police departments to train the people who are actually gonna be acting on the information that's being produced? Is there a moral obligation for them to go in and say, this is how you actually use this technology to make communities safer while also not making them essentially overpoliced? Absolutely. So I come from a technology background and I love to geek out on the technology. We're a company that has 33 issued patents, a history of technological success, I mean, $75 million of investment capital. What we do technology-wise is really exciting and cool.
But frankly, my passion really is around trying to educate departments in their respective cities on how to use the technology to drive positive outcomes. And it's interesting for me to do that because I don't come from a law enforcement background. So I'm kind of coming at it in a very consultative way, being an outsider observing these things and watching business transformation. Because at the end of the day, I was doing an interview with someone, and they really got it. They said, oh, what you're about is helping transform how agencies view and respond to gun violence. And that's exactly what we're about. And it's hard sometimes because, and hopefully in the later panels we'll have discussions about this, it's difficult, around change management, in cultures that have a way of doing things and are measuring things in a certain way. So we talk about UCR data. UCR data is a very limited thing as far as I'm concerned. And sometimes the things that are most impactful are the things that are most difficult to measure, like community engagement and collaboration. I mean, that's a lot more important than any UCR data that probably gets tweaked and, you know, messed around with. When you've looked at the police departments that have adopted predictive policing technologies, do you find that they're using it as if it's magic, or are they using it as a starting point and then, you know, changing their behavior, changing their tactics in a way that'll actually make it useful? Yeah, so it's hard to say. There's one article where the headline is like, better than a crystal ball, and it's referring to Spillman Analytics, which is a company that does predictive policing. And like three paragraphs down, there's a captain in a California department where he says, this is not a crystal ball. And it's like, okay, well, if there's a recognition within the police department that... Was there a question mark on that one? Right.
If there's a recognition within the police department, which it seems there is in most public statements by police departments, that this is another tool to help augment our existing efforts. Though there has been this odd veneer of press coverage adding a sort of merchants-of-the-future quality to predictive policing systems. So I think there's definitely buy-in amongst the law enforcement community that there are some limitations. One thing that is concerning is that some vendors have contractual arrangements with police departments where they license the software and, for a lesser fee, the department will make themselves available to promote the software to news people and things like that. So that's where it gets hard to draw the line: is this a department vocalizing their actual affinity for the software, or is this a contractually obligated necessity? Yeah. And on that point Ralph was making about community collaboration, you just can't underscore that enough, because technologies can assist, but there are things they can't do. You could have maybe two rival gangs who are about to have a turf war, right? Technology can't tell you that that's about to happen, but a local resident can. And so it has to be technology augmenting what police officers and police departments are already doing. Should we take some questions? Anyone got a question? Right there? Should we get a mic? I had a question actually related to a few of the last comments. So for ShotSpotter, when you are actually putting together contracts or agreements with police departments, are you putting something in there about a requirement for community collaboration, for the ways that the police department actually engages with the community? Because we've seen cases where the vendor says, well, we really told them that they should do this community stuff, and it just doesn't happen.
And then the second part of that question: are there times when you have talked to a law enforcement agency, recognized that they weren't doing this as a big picture project, and decided, I don't think we can work together on this? Two very good questions, really good questions. So to the question of can we contractually enforce or compel our agency clients to implement best practices: the short answer is no. We encourage. As a part of our contract, we make resources available. Frankly, a lot of the best practices onboarding we do is allowing, or not allowing, but helping agencies get connected with other agencies that are implementing these best practices. We're not the inventor of these things; we're basically borrowing, repurposing things that we've seen work in other places and trying to make them efficiently available for our agency clients. And a lot of it's a work in process, a learning in process. On the question of do we see agencies that don't do best practices: sadly, yes. We have certain agencies that do an amazing job with one aspect of the technology, other agencies do an amazing job with another aspect of the technology, and some kind of do it very broadly. Fortunately, we've only had a very few customers that haven't done very much of anything, and in one particular case, they self-selected out. We've had two other situations where we've chosen not to work with the agency, because what's really important for us is to work with agencies that are committed to leaning in so they can drive good outcomes, because then it makes our job of selling a lot easier. And no, we don't subsidize our pricing for people saying nice things. They should want to say nice things, and that's our position. Just one quick follow-up, though: the vendors may not be able to do that with the agencies, but
I think that the federal government, or municipalities that are receiving funds, can, or should be, conditioning the receipt of those funds on collaboration with communities when those funds will be used to purchase technologies. So that's one way that we could see that. Yeah, just real quick. I know that a number of police departments, for ShotSpotter I believe, either get heavily subsidized or largely subsidized through federal grants, like from the Bureau of Justice Assistance, the JAG grant. We saw the same thing when we were studying predictive policing: it's pretty widespread for federal grants to at least help start police departments off in acquiring the systems. I think where things kind of get difficult is, once the provisioning of a grant is done, where those resources continue to come from. But it's definitely currently the model where BJA is not necessarily tying certain conditions to the funds, which they would be able to do. Say you receive a $50,000 grant for a predictive policing system. You could say, as a condition of this, develop a policy, or communicate with your city council. Currently you only need to advise your city council that the adoption of the predictive policing system is occurring. They don't actually need to get city council approval or go through hearings about how the system will be used. So there are a number of different interventions at that policy level before we would get to thinking about vendor recommendations. Right, right, right. Another question? Back there? Yes, I have a question about, you raised it earlier, the biases that could exist. How do you address the inherent biases in the historical data that are then used to inform the algorithm? Right, so it's definitely a hard problem. I think when you're creating the algorithm and training algorithms, you have to be closely guarding against pernicious feedback loops.
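That feedback-loop concern can be made concrete with a toy simulation (all numbers here are hypothetical, invented for illustration): two areas with identical true incident rates, where one starts with more recorded incidents and patrols are allocated in proportion to the records.

```python
import random

random.seed(0)

TRUE_RATE = {"A": 0.3, "B": 0.3}   # identical true incident rates per patrol
observed = {"A": 5, "B": 1}        # but area A starts with more records on file

def allocate(observed, patrols=10):
    """Send patrols in proportion to recorded incidents."""
    total = sum(observed.values())
    return {area: round(patrols * n / total) for area, n in observed.items()}

for day in range(200):
    for area, n_patrols in allocate(observed).items():
        for _ in range(n_patrols):
            # an incident is only *recorded* if a patrol is there to see it
            if random.random() < TRUE_RATE[area]:
                observed[area] += 1

print(observed)  # A's head start snowballs even though the true rates are equal
```

Because the records drive the patrols and the patrols generate the records, the initial disparity compounds; eventually area B can receive no patrols at all and stops appearing in the data entirely, which is the "pernicious feedback loop" in miniature.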
And to my knowledge, most systems or police departments today don't conduct racial impact assessments to know what's occurring once the system is deployed. Those would be after-the-fact adjustments, once the system is churning the data and spewing out forecasts. You could also look to other sources of data, like the National Crime Victimization Survey. You could look to hospital records for data. You could potentially have some model where, if ShotSpotter data is public data, the police department could potentially incorporate the incident of that gunshot, if that is depicted as a felony in commission at that time, into a predictive policing system, ostensibly, which I'd be interested to know if that's ever been talked about. But one issue is that we don't always know exactly what's in the secret sauce of the algorithms, and, yeah, it's proprietary. So that's just another issue. When we think about studying this and addressing it, it's just another challenge. Right. So I think it's really important, because predictive policing is one thing, and to a certain extent, ours is reactive. So we're not predicting anything. We're reacting to events that are taking place. There's not a predictive aspect to what we're doing. Right. There's another question back there. Hi. Along the same lines as, excuse me, the previous question. As I understand it, the predictive policing algorithms that you're talking about use existing historical data, but some social scientists are experimenting with much, much more sophisticated types of modeling. I was at a conference yesterday about using agent-based modeling on a very large scale to try to capture some of the community attitudes, to try to compensate in sophisticated ways for the biases. To your knowledge, is anyone trying to apply that kind of advanced technology to this?
From what I've seen so far in my survey of the existing vendors that are selling systems, no. There are definitely advanced research efforts going on, and the NIJ is funding a $1.2 million challenge right now for a better geospatial prediction system, so I'm sure some of that research will be at play there. But among the systems that I've researched so far, some, à la HunchLab, might employ more sophisticated machine learning techniques and might be more sophisticated in the design of their algorithm, whereas some might be a simpler sort of earthquake-aftershock algorithm. But as to tailoring an agent model to look back and curb against biased data, I haven't seen that in use today. Any more questions? Back there? In thinking about the variability in police departments around the country, surely outside of these predictive or reactive technologies, there are differences in police capability. So I'm wondering if anyone has given thought to the ways in which these technologies, as they're rolled out unevenly, are interacting with existing differences, and what the implications are for policing. Yeah, let's assume these are amazing technologies that actually do make crime obsolete. Are we gonna see wealthy cities benefit while other cities that can't afford them continue to suffer? In our particular case, I don't think it's a funding issue, and I thought the question might have been more around the capabilities of the agencies, and you do see different capabilities out there. I mean, just like baseball, there are really outstanding players and there are kind of average players, and I think we'd have to be honest and say that's true with police departments as well. Even if you have a really amazing glove that can do a lot of things, I mean, if you run a 4.2 forty versus a 4.8 forty, a glove's not gonna help you, right? So.
I think, in terms of police departments, what we're really talking about is police departments sharing best practices, and I think there could be a lot more of that sharing of experiences between police departments. So we have to, as policymakers, figure out ways to make that happen. Yeah, I guess one current observation is that our survey showed that 20 of the largest 50 police departments have adopted predictive policing systems, but other surveys have looked at, for example, smaller police departments, and there seems to be a more rapid adoption of predictive policing systems with smaller departments, with smaller cities. And that might be somewhat of an impulse: they might be even more budget-constrained, so they feel that the right solution, instead of hiring another police officer who might come with pension benefits, that sort of thing, might be to install a $40,000 predictive policing system that, hey, by the way, you got a federal grant for the first year. But there are also some small departments now who are realizing that, with a smaller community, it might not actually be that effective, and there are some departments in California, for example, who had adopted a certain system and then decided to not renew their contract on the grounds that it was diverting resources unnecessarily, or not being as productive or efficient as they wanted it to be. Yeah, I mean, is there a fear that, because the money's coming from federal grants, it's sort of a win-win for the police department to just take it and then get the good PR, it's sort of this gee-whiz advance, and then it just doesn't work?
Yeah, I mean, in terms of an incentive structure, there's very little incentive, once you get a federal grant to acquire a system, to go through a rigorous city council process where a city council will really interrogate the police chief or their lieutenant about what's occurring. We've seen one or two instances, one in Fresno, California, with the system Beware, which is a person-based system that gives you a threat score on a green, yellow, or red scale, and then in Bellingham, Washington, with some concerns, and those are just two instances of many, many grants that have occurred. So I think there's somewhat of a perverse incentive structure currently. I mean, it's a good thing that the federal government obviously is helping law enforcement, but ideally that would come with a few more strings attached, involving the communities in the adoption of that technology. All right, we're about out of time. Thank you so much to our panelists for being here. Thank you for the quick, great questions. Yes, right. Thanks all. We now have a short presentation from Denise Ross, who is here. Hi everyone, I think it might need to be activated. Hi, it's so great to be here. I'm Denise Ross, and I'm one of the co-founders of the White House Police Data Initiative, and I'm really grateful to be here today because I think we've recently reached a tipping point in the culture change toward data transparency in policing. For background: in the wake of Ferguson, in the fall of 2014, there were calls from community activists for data transparency in policing. And as the president assembled his Task Force on 21st Century Policing, my colleague Clarence Wardell and I, we were Presidential Innovation Fellows at the time, started scanning the environment to see if any law enforcement agencies were leaning forward in opening data about police activities.
We found just a handful, Seattle, Dallas, Philadelphia, but for the most part there was no public data available about policing back in 2014. Plenty of agencies were publishing data about crime, but very few about their own activities. So we thought, what if we bring together these lead agencies to help define what the emerging field of data transparency in policing would look like? So in April of 2015, we hosted 14 police departments and their counterparts in information technology on the city side at the White House. Those 14 agencies on that day collectively committed to opening 87 data sets. A month later, the president announced the creation of the Police Data Initiative in Camden, along with the findings of his task force. And those task force recommendations reflect the feedback from hundreds of community members, civic leaders, advocates, researchers, police officers, academics, and others. It was no surprise, with that type of input, that of the 59 recommendations in the task force report, 14 of them had to do with tech, data, or transparency. And if we fast-forward to today, we now have more than 129 agencies across the country who are part of this community of practice committed to opening data on policing. The smallest jurisdiction is Proctor, Minnesota, with a population of 3,000. The largest is LA County, with a population of 10 million. We have four college and university police forces on board, and BART is our first transit agency. So this local leadership in data transparency is already bearing fruit. For example, in New Orleans, since the police department launched their open data initiative, citizen satisfaction with NOPD has increased 16%. And the federal judge overseeing their consent decree just described their data work as, quote, a miraculous transformation.
As this administration comes to a close, the great news is that the DOJ's Office of Community Oriented Policing Services will continue to provide a platform for this peer-to-peer community of practice. PDI fits right into the office's focus on organizational transformation and building trust and legitimacy. But this approach represents really a new way of doing things. Rather than disseminating knowledge from the top down, these 130 agencies are bubbling up local innovations to move the field forward. And just in brief, the basic commitment that these agencies have made has three parts. First, they identify at least three data sets about policing to work on opening. The key here is that this reflects local community priorities, and they define which data sets they're gonna work on. And the data is released on the jurisdiction's own timetable, because different jurisdictions have different capacities to open data. They do not report the data to the federal government; they report it to their citizens. The second element is they assign a point of contact to participate in this community of practice, which has a steady cadence of bi-weekly meetings where agencies report out on progress and successes and barriers and challenges they're facing in opening data. Through these regular check-ins, agencies realize that their challenges are the same ones that others are facing. And more importantly, they're emboldened by the success of others. The final and most important element of the commitment is buy-in from the police chief, mayor's office, and city CIO. Most police departments do not have the resources to launch an open data initiative in a silo, so we're finding our most successful jurisdictions are those who have a strong collaboration with their technology department, where they get economy of scale and technical staff expertise. In just 18 months, more than 175 data sets have been released. That's from basically zero to 175 in 18 months.
These include incident-level open data sets on use of force, officer-involved shootings, traffic stops, and citizen complaints. Our police chiefs also have in their portfolios data sets that allow them to paint a more complete picture of the complexity of modern policing: data sets like proactive community engagement, trainings that their police officers receive, police force demographics, and recruitment data. But the most exciting thing that happens with this data is when the departments use their open police data as a platform for engaging the public. Time and time again, we see that data transparency can shift the dialogue from one of confrontation to collaboration. Here you see 15-year-old Grace Clark teaching the New Orleans police chief how to write his first line of code. This group of young people were the first citizens ever to see NOPD's use of force and citizen complaint data. And the experience not only changed these young folks, but it also inspired NOPD to double down on its commitment to data transparency. Other agencies in the community of practice heard about how successful New Orleans' event was, so they started hosting their own events, but to address their own local priorities. In January, Orlando convened a group of domestic violence and sexual violence victims' advocates, police officers, and civic techies to dive into draft data sets and give feedback on how to best balance the value of open data for transparency and advocacy with protecting victim privacy. This really important work is now being codified in some early recommendations soon to be released by the Police Foundation and the National Network to End Domestic Violence. And my third example is that just this month, Tucson, Arizona, held a mapathon where neighborhood residents, pedestrian and bicycle safety advocates, and uniformed officers sat down and pored through 400 traffic collision reports to create data on distracted driving.
Again, with data as a platform for civic engagement, Tucson now has an energized and informed core of citizens ready to partner to tackle this scourge of distracted driving. And the police department now has the confidence to move forward with more open data innovations because this first one was so successful. So what does it look like when agencies release data and people start using it? I'll give you two quick examples. The first is New Orleans, which released six years of stop-and-search data with details on date, location, type of stop, demographics of the person stopped, et cetera. A local data scientist took that data and ran it through an analysis that looks at the race and gender of people pulled over during traffic stops. The theory here is that if the demographics of the people pulled over change between daytime and nighttime, there might be bias in who's getting pulled over when the officers can conceivably see race and gender. In this case, the analysis revealed no evidence of bias, and this finding meant more coming from an independent third party than if NOPD had released it themselves. Similarly, Austin, Texas, is releasing a ton of data on their policing. Here's their use of force data, and what I really love about their data set is they do a great job with the metadata, making it easy for researchers to analyze the data for patterns. The Center for Policing Equity and the Urban Institute did just that. They analyzed the data in collaboration with the Austin Police Department, and indeed they found that Austin uses force more often in black and Hispanic areas. The police chief, in response, had this to say: you can be an ostrich as a leader and bury your head in the sand, or you can be forward-thinking and be scanning the environment constantly for threats or for opportunities to do better. I'll close with three thoughts about the future of this work, starting with a quote also from Chief Acevedo in Austin. He says, this isn't our data, it's the people's data.
There are police chiefs across the country who are making open data a part of their organizational DNA, and the public now expects that they will have access to data about law enforcement activities. This is the future of government, and it's the future of policing, where data transparency is just another element of being a 21st century law enforcement agency. I would add that we have an awesome opportunity here to extend this transparency to predictive algorithms themselves. What Professor Chavis was talking about, the secret sauce: make that transparent as well, and engage the community around those algorithms. My second thought about the future is that tackling the challenges we face as a community will be an all-hands-on-deck endeavor. We know government can't do it alone. I love this quote from Baltimore's police commissioner. He said, we want to be better, and we can't do it without the community. The future of collaboration will include having a diverse force of sworn officers that looks like the communities they serve. It means having tech talent inside of the police force to support data-driven decision making and a culture of innovation. And it means that neighborhood residents, civic technologists, and law enforcement will roll up their sleeves together to collaborate to bring safety and justice to their communities. And lastly, we'll continue to see innovations rapidly emerge and evolve from local jurisdictions rather than waiting for top-down mandates. Intermediaries that bring together local jurisdictions will play a crucial role in accelerating progress: intermediaries like the Police Data Initiative, What Works Cities, 100 Resilient Cities, and others. Obviously, what works for Chattanooga might not work exactly the same for Anchorage, but we see the innovations adapt to local conditions and thrive.
So if you'd like to learn more about data transparency in policing, you can reach out to our colleagues at the Police Foundation, who are moving this work forward, at pdi@policefoundation.org. Thanks. We're now ready for our second panel: Will crime-fighting technologies make privacy obsolete? This panel is hosted by Laura Moy, who is a visiting assistant professor at the Georgetown University Law Center, as well as a program fellow in New America's Open Technology Institute. I'll let you introduce our other panelists. Great, okay, thank you. Thanks very much. I am joined here on my left by Jennifer Lynch, who is a senior staff attorney at the Electronic Frontier Foundation. She works on privacy and civil liberties issues as part of the street-level surveillance and transparency projects at EFF. And Lauren Kirchner, who is a senior reporting fellow at ProPublica. ProPublica has done a lot of great reporting and FOIA requests on police technology, and she is part of the team that has reported on machine bias, and she's written on police use of DNA databases. Thanks so much for having us and for hosting this great event. So I'm gonna jump right into it and just ask: what does this mean? The title of our panel, of our conversation, is Will Crime Fighting Technologies Make Privacy Obsolete? What is the tension between police technologies and privacy? Maybe you guys can illustrate with a couple examples? Well, I wanted to start by saying, Leon started us off by saying that the answer to all of these questions was no. And I think this is one example where the answer might actually be yes. Think about how much we have changed in the amount of data that we find acceptable for government to collect on us over the last 20 or 30 years. I was thinking recently about airport surveillance, about the data that's collected on us in airports, because I travel a lot; I live in San Francisco and I flew out to DC.
You know, as recently as the early '70s, we didn't even have metal detectors in airports. And the airlines pushed back on the federal government's efforts to install metal detectors in airports because they thought that people wouldn't travel if they felt like they were having their privacy invaded like that. Now, of course, metal detectors came into play. They became normalized. We stopped this crisis of airplanes being hijacked. But I think it's interesting that not that long ago, the American public really believed that just going through a metal detector was an invasion of privacy. And now we're at a point where we find it acceptable to have our bodies go through a body scanner at the airport. We find it acceptable to have cameras everywhere. We find it acceptable for law enforcement agencies to build vast databases on us and to have access to databases that are collected by private companies. We're really living in a very different society today. And we talked in the last panel about denormalizing gun violence, which I think is a very worthy and laudable goal. I would like to get to a point where we denormalize surveillance, where we no longer have the public saying, well, I can't do anything about that because we already have cameras everywhere. Yeah, I agree with all of that. I think what's amazing is just the speed at which technology is advancing, and the infrastructure and the policies of surveillance are building up so fast that oversight and regulation can't possibly keep up. Even public knowledge can't keep up. I mean, we talked a lot in the last panel about community buy-in, but with a lot of these types of technology, the people who are being surveilled aren't even aware of it necessarily. So you can't get community buy-in. You can't have a really valuable debate about security versus privacy if the people who are giving up that privacy don't even know it.
So with some types of technology, the invasion of privacy is the point. For instance, law enforcement collecting DNA swabs to solve crimes and maintaining that information in a database to solve future crimes. Privacy invasion is the point there, obviously. But with other types of technology, like police body cameras or ShotSpotter, privacy invasion might be just an unfortunate side effect. And we're not really always having that conversation when law enforcement agencies quickly decide, either in response to maybe a problem or in response to a funding opportunity, to just implement this technology and start using it and develop best practices along the way, rather than having a real conversation beforehand. But on the upside with police technology, I mean, some of these technologies that we may think of as being privacy-invasive, obviously there are benefits to having them, right? They increase the efficiency of police. If you have a technology that allows you to track down an individual cell phone without having to deploy a large number of people to go find that person on site, or if you have face recognition technology and closed-circuit cameras that can do the same thing, I mean, there are benefits. So, you know, Jen, you were talking a little bit about the normalization of privacy invasion, of surveillance. And you're saying, as a society, we have come to accept this, but what are the harms? How do people feel the harms? Are there harms to community relations? And what is the downside that we should be so concerned about? Well, I think first we have to challenge the assumption that all of these technologies actually solve crime or fix crime or prevent crimes from happening. We don't have a lot of data on some of the more recent proprietary algorithms and the tools that rely on proprietary algorithms. I think we talked a little bit about that problem with predictive policing in the last panel.
But I think that, to your question, the threats to communities that are already over-policed and over-surveilled will only increase as the technology allows for greater and more secretive surveillance. So we know that in the years after 9/11, a lot of police departments increased their surveillance, and the FBI increased its surveillance, of Muslim communities. And there was a great study that was done on Muslim communities in New York and New Jersey that found that there was a real impact of that increased surveillance. And that impact was felt on a very personal as well as community level. It got down to parents encouraging their children not to speak out against the government for fear of what might happen to their children. And these are citizens living in the United States. It got to the point where people were afraid to talk to somebody they didn't know at their own mosque for fear that that person might either be a government informant or might be somebody who was more radical than they were and, purely by association, could put them in disfavor with the government. So I think that is a real challenge that's posed by this increase in surveillance. And also, when we know we're being surveilled but we don't know where it's coming from, it creates the panopticon problem, where we will think twice before we say something that might put us at risk of being targeted by the police or being targeted by the government. There was another study that was done on commenting on Facebook, and it found that if people believed that their friends were watching them and listening to what they said on Facebook, they would moderate what they said and what they read about, what they clicked on. So we're already seeing this starting to happen. And also, if you as an individual, or if we as society, consent to a certain type of surveillance, I don't think that we get asked again if that surveillance changes, if the technology advances.
So for instance, you can knowingly consent to give a police officer a DNA swab if you are the victim of a robbery and they say this will help us solve the crime, because then we'll remove you from the DNA mix that we're collecting from your apartment or whatever. But then that DNA is stored forever, and you may then be implicated, rightly or wrongly, in a crime five years from now, and they don't necessarily come back and ask you, can we retest it for something else? Or if surveillance cameras become normalized, are we gonna have an additional conversation and a debate when those cameras then have additional features like facial recognition technology or AI reading systems or something like that? I mean, it's not too sci-fi paranoid to think about how these could suppress political action and speech. Yeah. Just following up on what Lauren said, I think this is also an example of how the courts and the legal system have not kept up with the changes in technology. So in the consent context for DNA, the example that Lauren gives is correct: oftentimes victims are asked to provide a DNA sample to take their DNA out of the equation, or people are asked to provide a DNA sample to show that they are not guilty of a crime. And there was a case at Maryland's top appellate court a couple years ago where a homeless man consented to have his DNA collected to show that he was not somebody who had raped a person. And he signed a form and consented to that. And then years later his DNA was run against another crime, and he was implicated in that other crime. His defense attorney argued that he had never consented to that future use of his DNA, and the court held, well, he didn't limit his consent explicitly, and so for that reason the law enforcement agency could use his DNA to implicate him in that future crime. We've also seen this come up with hard drive searches, just not too long ago, in another case.
But how do we properly limit our consent, how do we even know how to limit it, when we don't know what the future uses of the data that's being collected on us are going to be? Right, so the question that this raises for me, then: if knowledge about surveillance chills speech, chills First Amendment-protected activity, then I think you're both citing this problem of us not being asked in the future as technology develops, or of not being able to anticipate how information or evidence that is collected from us might be used in the future. But is there a problem, then, that privacy concerns from us, from citizens, from people, actually drive the lack of transparency about what information is being collected and how it might be used, out of fear perhaps that people will not want surveillance technologies to be adopted, maybe in some cases unfairly, out of concerns about privacy? I mean, I just can't see that as a justification for a law enforcement agency not discussing implementing a surveillance technology on the public. We have a perfect example of that in the city of Compton, when this company contracted with the LA Sheriff's Department to fly a plane over Compton and use a sophisticated camera system to record everything that happened on the streets of Compton, to the point where you could rewind and go forward and track vehicles. And it was never discussed, not even with the mayor of Compton. It was just something that the LA Sheriff's Department implemented because Compton's part of LA County. And there was public uproar about that. I mean, here's a community that is, talk about an overpoliced community, and to have this happen without any consultation with the community, I think that's a real issue. Now, ShotSpotter is actually a different example, because oftentimes there is a lot of discussion about whether to put ShotSpotter systems in a community.
And I think a lot of communities have decided that that's a good thing for their communities. So there has to be that discussion in communities, and there has to be that continuing discussion going forward. Yeah, I mean, there's no rule that says that a law enforcement agency has to discuss things with the community before they implement whatever technology they want to. I was surprised, when I was reporting on risk assessment software, which is used to predict whether a defendant in a trial would commit a new crime within two years if they're let out, and also when I was reporting on DNA databases, that the people I spoke to in the communities were not aware that these things were being used. And in a lot of cases, the defense attorneys were not even aware that they were being used. In some towns, the transparency of these agencies really varied. For instance, in Bucks County, Pennsylvania, they were very vocal about the fact that they were using a private database and collecting DNA to solve property crimes and having all these successes. And they were talking about it all the time. But what they weren't public about was how they were getting that DNA, which was stopping people on the street who weren't connected to any crime, you know, just street stops or field encounters, as they call them. And in other counties, they were collecting people's DNA and no one knew that except for the people who happened to have their DNA collected. And then they would wonder, I wonder where that went and I wonder what that's for. So there's just such a wide variation. I don't have a solution to that other than encouraging transparency and, you know, the strengthening of the local press. I don't know what else to say about that. So, you know, bias came up in the first panel, of course, and you alluded to it a couple of times. I would be remiss not to ask a question directly about it.
The harms that we see, or concerns about privacy invasions stemming from police technologies: what is the nature of their impact on different communities? Are there concerns that there's a disproportionate impact on some communities, and what are police doing about that? Well, I think there is definitely a disproportionate impact on certain communities. We can see that with predictive policing. When you have a technology that's based on past crime data, and past crime data is based on biased policing, then if you're predicting future crime, it's, you know, the garbage-in, garbage-out theory: you can only predict crime that looks like past crime. And if that past crime is based on racially biased policing and the data associated with that, then that's where you're gonna look for your future crime, in those neighborhoods that are already overpoliced. But, you know, I think one of the things that's interesting about these technologies is that they can actually be used anywhere. And I watched Enemy of the State on the plane on the way here, and I don't know how many people have seen this movie. It's Will Smith. It is fantastic. Man, it holds up. It was released in 1998. And I had a former NSA employee tell me one time that it was still Hollywood's most accurate portrayal of law enforcement, or excuse me, national security surveillance capabilities, even today. But one of the things that Enemy of the State shows is that anybody can be surveilled when the government wants to turn its powers of surveillance on you. It doesn't matter if you are an upper-middle-class attorney who's never had any run-in with the law. If somebody wants to turn their surveillance capabilities on you, for some reason, they can do it. And that's where we see things like analyzing your call detail records. But that was one of the things that came up in the film.
If law enforcement has access to your call detail records, they know who you called, when you called them, how frequently you called them. A lot of times they'll know your location information, where you went over time. And it's possible to get all that information without a warrant. And usually that is collected on people who are living in communities of color, overpoliced communities, but it can happen to anybody. Sorry, Lauren, I'll give you an opportunity to respond too, but I just wanna get a microphone to the person who's gonna be asking a question first. So raise your hand. Sorry, Lauren, go ahead. Sure, yeah, I just wanted to add real quickly that I think about the way that people can be penalized or implicated just based on their social relationships or their personal associations. Not that this is a new concept in the context of world history or anything, but new types of technology are making that so much easier. So I read something this year: the NYPD arrested a bunch of Harlem teenagers for supposedly being members of a violent gang, but the main evidence used against them in court was that they had liked their friends' pictures on Facebook, or they had appeared in pictures on Facebook with these other guys who had had some gang activity. And in the risk assessment surveys that we looked at, where they were scoring defendants, people were given points, like people were declared more risky, if they had family members or friends or neighbors who had been arrested, or if they lived in neighborhoods where drugs were available, or they lived in areas where there was gang activity. And we can see how those particularly invasive questions can really disproportionately harm people who live in certain areas or just go to school with certain people. So I'm gonna open it up to questions from the audience. Sure, here, this gentleman.
Without getting too apocalyptic here, what's the sort of political endgame if public apathy continues along the same lines? Because one could imagine the dark web consuming some of the legitimate web, or living in a police state, or something along those lines. Hopefully your forecast isn't quite that grim, but assuming nothing changes. I think the responsibility is on all of us as citizens who live in this country to push our people in government to make changes. And this doesn't have to happen at the federal level, because it will probably never happen at the federal level. This happens at city councils. One of the things that the ACLU has developed is a model ordinance for communities that are looking to adopt surveillance technology. So now the ACLU, and this is something we're working on with them, is going around to city councils and encouraging cities and counties to adopt this ordinance. And the ordinance doesn't say don't adopt this technology. The ordinance says that a law enforcement agency that wants to adopt technology needs to go and present the case for that to the public and needs to be transparent about where the money's coming from and how it will be used and how it will benefit the community. And I think that's something that we all have a responsibility to do: to push our cities and counties to make these changes on a local level. Okay. To play a bit of a devil's advocate here: in these kinds of conversations, it's easy to start talking about surveillance as being bad just on its own terms. But if you think about the kind of communities we all say we want, you know, with strong intact families, with strong social fabric and networks, with strong institutions and respected leaders, all the things we say we want, in those kinds of communities, there are ears and eyeballs everywhere. If you step out of line, your mother will know it and you will hear about it.
So, I mean, one way of thinking about the kinds of concerns you're talking about here — with the government and the police stepping in — is that they're trying desperately to fill a gap that these incredibly damaged communities can no longer fill themselves. Anyway, I just wanted you to comment on that, if you would. Well, I guess my first thought in response is that my mother never had the ability to put me in jail for a long period of time. She could ground me or make me clean up my room, but she didn't have the ability to put me in jail and take away my civil liberties. And she didn't have the ability to put anybody else in jail. The communities you're talking about from the past also had a lot of problems. They were not communities where people of color felt like they could speak out. They were not communities where you could be openly gay and walk around with your partner. There are good things that have happened in society due to the ability to be anonymous. We have changed our societal norms because of the fact that you can be anonymous in the world and can take on positions that are not going to hamstring you for the rest of your life. And that's something I worry about going forward: that the increase in surveillance by the cops, by the government, will chill the next social movement — whether it's people who are transgender being able to be open about it, or something we haven't even thought of yet. You know, think about all the social movements that have happened in this country. It used to be that we had slavery. It used to be that women couldn't vote. Up until 1986, the Supreme Court said that a state could outlaw conduct that was only ever prosecuted against homosexuals. So it hasn't been that long in our history that we have had these things, and now our society is better because they have been changed. 
And those changes would not have been possible if the government could track and squelch any social movement before it had the chance to get off the ground. We have time for one more question, maybe. Okay, quick, sorry. Okay, a few more. Okay, all right. In back there. So besides the surveillance, have you ever thought of other alternatives to help citizens be more cooperative with the intelligence community — like building trust between the intelligence community, the police, and community residents, in order to be more cooperative? I'm sorry, I didn't understand the question. Yeah, no, I'm sorry. Can you just repeat it? Yeah — I mean, besides conducting surveillance on citizens, have you thought of other alternatives that promote cooperation between the intelligence community and citizens, in a very cooperative way? So, cooperation between law enforcement and... And citizens. And citizens, or the intelligence community. I think there are communities that are working on cooperation between community members and law enforcement. I'm probably not the best person to speak to that, because it's not my focus. Yeah — I mean, I think some of those programs have similar problems, or in some instances maybe even worse problems, with things like bias, right? I'm not sure if this is exactly what you're getting at, but suspicious activity reporting in cities is something some police departments will use to inform deployment of surveillance technology, and then actual eyes-on monitoring. And when you have some individuals reporting the suspicious activities of others, that is an environment likely to lead to manifestations of bias — particularly racial bias or bias against immigrants — and to disproportionate deployment of police against certain communities, sometimes unfairly. 
Right, that gets back to the earlier question of who are the eyes and the ears, and then what are the consequences of what they see and hear. Do we have any other questions from the audience? Hi — I'm at the NAACP, and we have a lot of local units that have to think through what surveillance looks like in their communities. And there are resources — when you talk about body cameras, what are the things you should be thinking about, and you mentioned the model ordinance that the ACLU has. But I guess I would look to you all for... what's your aspiration? For something like body cameras, which we've seen — I don't know about being a game changer, but at least helping advance a conversation about police accountability and police shootings that I don't think we would have had otherwise. In your 2050, what would body cameras look like in a privacy-respecting world? I think body cameras are a really good example, because they present a really, really, really hard problem. The problem with body cameras is that they have the potential to alter police-community interactions for the better — that's the hope with body cameras. They also have the potential to create a larger surveillance state than we've ever seen. And that's because they're cameras, and because companies like Taser have proposed putting face recognition on the back end of body cameras. So what do we do about that? Well, I think we need to have an open discussion. This is a perfect example of where the community needs to be able to talk to law enforcement, look at recommendations, balance the risks of body cameras, and put in place specific requirements that will ensure the cameras are doing what the community wants them to do. And those aren't necessarily just things like how long the data is stored. It could also be things like analyzing the data for what a law enforcement officer is doing. 
So, for example, is there a time when a law enforcement officer's interactions with community members are worse, and a time when they're better? Maybe it's because he's been on shift for 10 hours, and he's worse at 10 hours than at 2 hours. So then what do you do with that kind of data? One of the things I recommend is allowing public access to body camera footage. Now, there has to be a way of blurring the images, but I think one of the biggest threats from body cameras comes from law enforcement agencies whose default is no community access and no press access, because that is going to defeat the transparency purpose behind body cameras. But like I said, body cameras present a really hard issue. How long you store the data depends on lots of different factors. There are reasons to reduce the amount of time you store it, because that lowers the risk of somebody being tracked over time; but you might want to hold on to the data longer to be able to respond to community feedback on the police and any reports of an officer having a bad interaction with the public. And then, of course, there's the cost of data storage, which is a real impact for a lot of police agencies. Yeah, I am so conflicted about body cameras. It's such a complicated issue. Just as a reporter, I would appreciate the ability to send in a records request and see the body camera footage of a particular interaction. But if I do, then I'm also going to be seeing the people the police officer is interacting with, who are in their most vulnerable moments and who are probably having the worst day of their lives. And should I be getting that — and should I be getting disproportionately more footage from the communities that are being disproportionately policed? So, yeah, it's a very complicated issue. Great. Well, thank you. Thanks. That was such a perfect question to end on. Thank you. I really appreciate you asking that. It's our last question. 
And many thanks to the panelists, of course — to our conversants. Thank you. We're now preparing for our final panel of the day: Will technology improve police-community relations? Here to moderate that panel and to introduce the panelists is Wesley Lowery. Good afternoon, everybody. I appreciate everyone sticking around for the last panel. I guess if you guys want to come up. So, what we're talking about today is: will technology improve police-community relations? And I actually think this is a really excellent assembly of people, whom, rather than stumble through their names myself, I will allow to introduce themselves as we go down the line. We don't have to go straight down the line — we can popcorn around. But if you want to introduce yourself, the work you do, and how technology and policing intersect with the work you are doing. And I'll pick on Sam and make him start. So my name is Sam Sinyangwe, co-founder of Campaign Zero and Mapping Police Violence, which are platforms to support the ongoing work to ensure accountability and end police violence in the United States. In terms of how technology intersects with the work: so much of what I do is enabled by technology. Collecting comprehensive data on people killed by police nationwide, in the absence of federal reporting. Being able to organize, mobilize, and support hundreds of thousands, if not millions, of people across the country with the information, tools, and resources they need to strategize and build toward advocacy campaigns. All of that has been able to scale through technology — and in particular, doing so with very few resources but being able to connect and build an infrastructure for activism that has been particularly powerful in securing policy change. Okay, I guess I'm next. My name is David Oh. 
I'm a councilman at-large from the city of Philadelphia, and one of the most important things I'm responsible for, as elected by the people, is to provide a safe city — and when I say safe, one in accordance with the Constitution and all the rights that people have. But at the end of the day, I think one of the great problems we face is that there are disparities, inequities, and a lack of fairness. Technology provides the opportunity for more objective, more equitable, and more fair treatment of people who are suspected of criminal activity, as well as providing safer communities. One of the problems we have is that, with limited resources and personnel — just as with our public schools, where some neighborhoods have great, well-resourced neighborhood schools and other neighborhoods have very poorly resourced schools — the same is true for police departments. And so technology provides the opportunity to provide good-quality policing in our poorest neighborhoods. Thank you. Good afternoon, I'm Tracie Keesee, the deputy commissioner of training for the NYPD. And so technology, for us, poses some very interesting dilemmas. It is all of what you've heard on the first two panels: not just balance, not just public safety, but the co-production of safety — about keeping the community safe, about keeping officers safe, about how technology is applied, how it's trained on, how it's used and not used, and when it's used, how it's abused — all of those things and more. I think it's going to pose some very interesting challenges as we move toward the future. And one of the bigger concerns I have — and hopefully part of the discussion we'll have today — is about over-reliance on technology when it comes to what I believe is going to be a necessary human interaction in law enforcement, and not beginning to defer those things to technology. 
My name is Charles Katz. I'm director of the Center for Violence Prevention and Community Safety and professor of criminology and criminal justice at Arizona State University. Right now we're working on technology and policing in many different ways. One of them is a lot of work on body-worn cameras: we were one of the first to receive federal dollars to evaluate body-worn cameras, along with the Phoenix Police Department, which implemented that program. We're also working alongside the Bureau of Justice Assistance, and worked with the White House, on developing a body-worn camera toolkit to help agencies implement body-worn cameras. We're also working with the RAND Corporation on a technology-driven project to help small, rural agencies adopt technology to address the unique needs that they have. And then we also do a lot of work using technology to better understand what's going on within communities — whether it's how crime data is collected, gathering crime-related data among user groups that share information with one another online through various forums, or issues related to predictive policing, as others have talked about here. So we've really been working in three different areas: issues of equity, issues related to efficiency, and how we can increase the effectiveness of what the police do. And so as you all introduced yourselves, there were kind of three specific buckets that naturally came out. The first was the invocation of body-worn cameras, which is where a lot of the conversation about technology and policing rests, and where some of the previous conversations today have focused. The second was police and crime data and data analysis — both the collection of and need for those things, as well as the reliance on them to implement and create policies. 
And then the third was predictive policing, which overlays with data but is a separate, potentially ethical or policy conversation. To start off: we're having this conversation an hour after a decision not to charge an officer in Charlotte, North Carolina, following a fatal police shooting that went viral, as many have — the death of Keith Lamont Scott. Now, this was a case where the officer involved was wearing a body camera; there was also a dash camera there, and a bystander video. None of those videos fully captured the totality of the incident, nor did any of them settle beyond a reasonable doubt the question of whether or not there was a gun in the hand of Keith Lamont Scott. The prosecutor said that himself, although the story of the officers was that there was a gun in his hand when he was killed. What was also interesting to me in that case was that it raised several questions about the implementation and use of body-worn cameras. There was a large debate over when and if the video should be released to the public. And once the video was released, there were questions raised about why it was initially missing sound — which was because the camera had not yet been activated. All of that, I think, serves to underscore some of the limitations of body-worn cameras. In a post-Ferguson world, we had a ton of conversations, and many lawmakers at the federal and local levels pointed to body-worn cameras as one of the major steps they were going to take. Now that we are two and a half years into a mass implementation of body-worn cameras — not that they are everywhere, and not that every department has them — what are some of the lessons we've learned about those limitations, and what are some of the practices around both the use and policy of these cameras, as well as how they overlay with the public's right to see these videos, or police officers' own rights? 
So you're looking at me, so I'm going to take that. I can do it. So I think you're spot on. The technology itself is still fairly new when you think about policing technology as a whole. And when we talk about police technology and the concern you raise about the narratives around what body cameras were or were not going to do for the community — that's always been the concern. It's always been my concern, because early on there was this sort of eggs-in-one-basket sense that this was going to be it. And what's happened over the last couple of years is that there are limitations to technology, and we're seeing those limitations, right? And whenever there's technology, there's a human hand involved in it — turning a camera on or off, whether or not it operates at all, all of those things. But I think it goes even a layer deeper than that: what's the policy behind how you deploy? How are those policies developed? What's the voice of the community in how they're developed? When the cameras are deployed, can officers view the footage post-event, before we start to interview them? All of those questions are on the table, and will continue to be on the table, for departments that deploy these tools. And how they choose to address them — whether they choose to have true community conversations — is going to be decided very much at the individual community level, and at the level of the chief or police commissioner. And I think that is what you're seeing: really this sort of complexity, specifically around body cameras. I think the other piece, which goes to the question of this panel, is trust. If trust is not there — what we were, for me, trading off was assuming that a piece of technology was somehow going to take the place of what you can't do 
unless you have it at first. And I think you're seeing that now. If the videotape had shown something completely different, I question whether it would have even been accepted then, right? Because trust is a human component; it's not something technology can build for you. And as we move forward, we will continue to talk about trust, specifically in communities of color. What does that look like? That is something I don't believe technology alone is going to have a hand in. You're going to have to come back to the table and have discussions about what it looks like, how it's defined, and then how it's co-produced — because what we're really talking about here is the co-production of public safety, and what that looks like and what that means. So I think we have to be very careful about how much we allow technology to take the place of a human interaction, and I think that's always been part of the concern around body cams. As a lawmaker and policy maker, I'd say that basically the issues in the city of Philadelphia — which is 1.5 million people in what we call the city of neighborhoods — are not very different from the United States as a whole, in this sense: we don't have a national police force. Even going from Philadelphia to a city just north of us, we run through so many hamlets, municipalities, counties. They don't have one police force in a county; they have many police forces, anywhere from two police cars to twenty. They're trained differently, they're hired differently, they have different protocols, they use technology differently. Some of them have armored personnel vehicles with machine guns; others have Glocks. It's the lack of uniformity in protocol, and the lack of, I would say, transparency and best practices — of taking the experiences of different police forces across the nation and putting them in a place where people can see what is happening. The technology I 
don't think is a problem. I think the technology is a promise of transparency and a uniform protocol. The problem, I think, is when you start dividing up communities and saying, well, we won't have this and we're fine, and you will have this and you'll be okay. And I'll take it to a place where I think many people can understand. In the city of Philadelphia, as I said, though we're a democracy, we have neighborhoods with great schools and neighborhoods with poor schools. But when the people who have good schools are satisfied — we have art now, we have painting, we have music — they go home, and they don't speak for, and they don't care to speak for, the schools that don't have art and music, because they're not affected. I would really argue, in the city of Philadelphia: if you're going to put body cameras in this neighborhood, put them in that neighborhood. If you're going to have drones that deliver Narcan in this neighborhood, have them in that neighborhood. Let's level it out. And if we could begin with technology and establish, as the FAA has with drones and other types of things, some leveling of protocols, so that we are treating these communities the best way we can, then I think we're going to address the fundamental issue that was raised: trust, the lack of trust. And I think the lack of trust begins with a lack of respect, and a sense that you're not being provided the level of service you should be provided. The technology allows us to deal with the reality that wealthier, nicer communities have cleaner police cars, more polite folks, and a certain level of funding, while in poorer communities you're going to have banged-up cars and police officers who are more on edge and more ready to protect their lives. The technology allows for the opportunity to serve the community, so people who feel they're not being served can now be served in a more efficient and appropriate manner. I'm going to push back on that just a little bit, though, because the assumption 
that if you are in an underserved community, a police force will somehow serve you in a way that reflects the community you are in, is a false notion to me, because that's about accountability. If I'm a chief and I allow you to behave a certain way in one community and not another, and don't hold you accountable, that's problematic. And I agree that some technology will allow me to shine a light on the fact that those things are going on, but to lay that on technology alone takes the onus off of leadership and accountability. Forty years of research suggests that we absolutely know it happens. What I think you're seeing is the exhaustion of the community, and even of some law enforcement officers, who internally say: it's accountability. Okay, we know this is happening; let's start making sure we hold people accountable for it. I do believe it doesn't have to be geography-based. I think we've already answered some of these questions in the research, more or less — we can't state them as fact, but we have a lot of information suggesting that body-worn cameras decrease complaints, reduce use of force, and increase trust with the citizens officers are interacting with. At least we know that a little bit, and so we're starting to answer the question: are body-worn cameras useful in that regard? And then there's the larger, 30,000-foot question. When we have folks here talking about the pressures placed on local law enforcement agencies to release data, pressures focusing on issues of equity and equality, and having the feds talk to local agencies about how to address those who are the most disenfranchised — when you look at all of these issues, nobody's bringing up federal law enforcement. Nobody is talking about the Department of Homeland Security and policing the border, and the immigrants who literally have nobody to turn to. The feds don't have civilian review 
boards; they don't have body-worn cameras. And so I think it's a little bit unfair to be putting all this pressure on local agencies that have the fewest resources and are making the most of them, when you have the most well-resourced agencies — like the Federal Bureau of Investigation and the Department of Homeland Security — that aren't being scrutinized at all on these issues and their implementations. That said, I was also going to say, in terms of equity of distribution: I don't know that it has to be geographically distributed. We know that about two to five percent of officers are responsible for more than half of complaints about use of force and excessive use of force. We're in the really early stages, and not all agencies are going to have enough money to throw cameras on everybody, but it is possible to identify those officers who need the most training and the most supervision, and equip them with body-worn cameras until the rest of the agency catches up. Well, so, in terms of community trust — police-community trust — a lot of the conversation happens at this 30,000-foot level, and it's very theoretical. When I think about it, it is very real, right? It is the fact that you cannot have police-community trust in a black community when, according to survey research, the majority of black youth either have personally experienced police violence or personally know somebody who has. No matter what technology you have, you cannot have police-community trust under those circumstances, and technology alone won't solve that question. As you said, I think what technology can do is help with accountability. When we talk about body cameras, we know that the majority of cases in which an officer was indicted last year in a police shooting involved footage of that encounter, and many of those cases involved body cameras — and the proportion of incidents being filmed is increasing over time. So it can 
be important to securing some level of accountability — at least getting charges. Convictions are another issue; we're still not seeing convictions. But when we talk about the front end, prevention: how do we actually prevent police violence? How do we get to a place where the conditions for police-community trust can emerge? That is going to take a lot more in terms of policy. It's going to take policies that identify the bad apples, as you were saying — early warning systems — and actually addressing the issues in police union contracts that make it hard to discipline officers and hold them accountable. But it's also going to take systemic changes: changing the use-of-force policies of police departments. We're now at a place where you can actually look at the data and see that there are policies associated with reductions in police violence — policies like requiring officers to de-escalate situations, or imposing a restrictive deadly-force standard that requires officers to exhaust all other reasonable means before resorting to deadly force. Those things are associated with reductions in police violence, and yet they are the exception, not the norm, in police department use-of-force policies and state legislation. So I think we have to think much more broadly about what types of policies and systems need to be put in place so that the technologies can operate effectively — so that we're actually preventing police violence from occurring, and when it does occur, we have the technologies that can help hold those officers accountable. That's the broad 30,000-foot system that needs to be built, and it can only be built with effective community input and with strengthened community oversight to ensure that those reforms are being implemented on schedule. Since I was pushed back on, I'd just like to point out a couple of things — and it's great, because we are very confident in our opinions. 
You know, the issue of stop-and-frisk, for example, has nothing to do with technology. That has to do with the fact that there is an executive order that tells police to go out and, if somebody looks dangerous or there could be a crime — even though you don't have probable cause or reasonable suspicion or any other information that the person is armed and dangerous — pat them down. Now, what is the human being going to do? The politically intelligent person says: don't be unfair; start with the wealthy white guy; be fair. But the police officer says: listen, I don't have time for that; you look like the person; there's a high probability that you're armed and dangerous. And that's where we have a problem. When we look at the abuses of the past, there was no technology there — it was more abusive in the past. The fact that officers have body cameras is at least a check and a transparency, and it lays it out for them: hey, if you go around and pat down people and they're all minorities, you're going to have to answer for it, because all of that is transparent and public. We link that to journalists; we provide the data; we do those types of things. When you look at stop-and-frisk — which, again, without body cameras or surveillance, we wouldn't know what the police officers were doing, and so nobody would be training them — what happens in Philadelphia, the reality is, while we have police distributed throughout our city, we focus on homicide. I'm not the police commissioner; I'm not the mayor. The police commissioner takes police officers out of different districts and says: go to where people are being shot and shooting each other, and then you do stop-and-frisk. Well, that leaves other communities without police officers to respond to burglaries, prostitution, vandalism, and quality-of-life crimes that make people want to move out of their neighborhoods. So now disenfranchised communities don't have the political juice to say that 
tractor-trailer is parked in front of our house, it's been there for five days, and we've called the police and nobody's coming out to look at it. We now have the ability, through drone technology with thermal imaging, to go look for these children, to look at those trucks, to respond to people in ways where they say, hey, I'm being served now — and they can then interact with human beings. You know, the thing about service is that's what builds a community relationship. When you're being served, you can live a decent quality of life, whether you're poor or rich, and you believe you're getting the services, you're being respected. That's how we build community. Whenever someone helps me and I say thank you, I have a different view of that person and their job — doctor, nurse, police officer, Coast Guard, whomever. And the problem in many of our communities that are upset about the police is that they've only had negative interactions: it's either you don't come, or when you come, you do something bad to me. And I want the good stuff. I want your good service. I want to live a peaceful, decent life in my community. How many people can walk to the Wawa, to the 7-Eleven, even though it's four blocks away, at three o'clock in the morning? What a convenience. But even educated, middle-class people don't want to walk outside, because it's dangerous. There are roads in Philadelphia that look like prison: there's an iron bar outside every window and every door, and people don't come out; they can't play in the park. How much does that lead to domestic violence? I mean, with the delivery of service and the limitation of cost — I'm not saying technology is a fantasy of wonderfulness. I'm saying it's the promise of the opportunity to deal with people fairly and uniformly and give them services, so that at the end of the day they can live the best life they can. And we're challenged in America, always, to live up to the promise of not looking at the differences of people 
and providing them their constitutional rights. Two things come up very often from people who are hesitant about the ever-expanding use of technology in policing — not just body-worn cameras; the questions about body-worn cameras, I think, are applicable more broadly. One is the specificity of, and the community input into, the policy surrounding the technology. To many people, the idea of having cameras throughout their neighborhood, whether mounted or worn on someone, does not necessarily sound like something they would enjoy, especially given a lack of trust already. The second is the conversation about public and broad accessibility: theoretically these are tools that provide transparency, and yet they only provide transparency if departments allow them to. You know, one of the teams I'm on at the Post does a lot of work around both police data and police transparency broadly, right? 
In one in five fatal police shootings, the names of the officers are never released. When a federal officer kills someone, that's a little different, because that's by law. But in the majority of fatal shootings this year captured on body camera, the tape has never been released to the public; more often than not, it is not released. And beyond that, there's a question, since the framing of this conversation is about whether technology can improve trust. As a few of you have alluded to, video can only show you what is happening, and so more videos of watching people get killed by the police, I would surmise, is perhaps not something that will improve trust. I didn't trust the police more after I watched Walter Scott get killed; in fact, it was something that led me to trust the police less, because it speaks to the actual behavior of police in communities, as opposed to the idea that simply having a camera will mean trust is improved. So as technology is adopted at the department level, what is the role, and the best practice, for a department? As this relates to how CitiStat and CompStat models are used, how neighborhood surveillance cameras might be used, how body cameras may be used, what is the role in ensuring that neighborhoods and communities know what technology is overhead, that they know the drone is flying above them? And what is the role, and the best practice, in finding an acceptable level of transparency as it relates to these videos that are being taken? And whoever it is.

Well, I'm going to speak. I'm sorry. So what I would say is this: you know, talking about body cameras, that's when the police arrive. When you have technology that arrives before the police arrive, that is another level of technology. And so, with limited resources, if a drone arrives and assesses a situation, you don't need eight police vehicles; you
could do with one. Or if it arrives and says, okay, there really isn't a fire here, so you can pull off the firefighters, the ambulances, and the police cars, and they can go direct traffic, and the drone says, listen, you're sending them in the wrong direction, send them over here. Or when people say, listen, someone's dumping construction trash and other stuff in my neighborhood, the city will not put up a surveillance camera, but they can send a drone. The issues of mistrust can be dealt with through certifications, and your inability to operate those drones and technologies if you violate certain rules, and by making them accessible in real time, in non-emergency, non-investigative situations, to journalists, making all of this technology available to the public. Because one of the things the drone can do, for example, is arrive at a scene like a hostage crisis and say, of the eight parts of the building, there are four criminals there and one small child here, the same as it can direct firefighting in a large-scale operation to exactly where the fire is, so that it's safer for people. And I think a lot of times, when we tailor the response, we're not putting our personnel at risk, whether they be firefighters or police officers; we are calming them, because they can assess the situation and are less likely to overreact. Plus, they are being surveilled. And if I were a mistrustful citizen and a drone arrived, hopefully we've come to the point where I say, thank goodness there are eyes from someone who's not necessarily a police officer, but someone who is certified, very objective, and a skilled navigator, and I believe those live video feeds are being fed out to journalists and other people. Now I feel much better about my interaction with the police, and I think the police have got to be aware, and now you're asking them to be trained to respond in a much better fashion.

So I'm going to join David in his world. So I'm going to get there. We will, we'll get there. So I think that
if I think about how we address these issues of trust, I can see us getting to this point. And I think what it would look like for me is this: if I were to call for a police officer to come, the information I would get as a community member is that I would know who's coming to respond. So this is before you even get to a drone; the drone has done the assessment. I would know who this officer is. Matter of fact, I'd have the background: I would know exactly how long they've been in this neighborhood, whether or not they've worked this neighborhood, the training history they've had, how many shootings they've been in, whether or not they're bilingual, whether or not they have degrees. I'm going to know all of this. That's the kind of transparency I'm talking about: how up to date their certification is, all of those things, attached to the neighborhood, because the neighborhood has been involved in the selection of the officers and involved in the development of the curriculum that helped train those officers in that neighborhood. So I can see that happening. And I think, though, in order for us to embrace that, these early versions of the body camera, the policy around the body camera, are the stepping stones we're going to have to get through to get there. And how do you get the weigh-in and the buy-in, and do we even need this national model around best practices? The concern around best practices is that what's best for New York is not going to be good for Denver, and I get concerned about thinking of all communities as monolithic, that it's going to work that way. So when we think about 2050, absolutely, but the early conversations we're having now and the research happening now are those stepping stones we're going to have to get through. We have to get through it to get to it, to join you on that end over there, and I think that's absolutely doable.
I think to get there, a good first step, instead of leading with "we just need to deploy the technology and figure out how it works," is asking how we create systems and structures so that those most impacted are shaping those policies and that deployment, how those technologies are being used. And it's very rare. When we talk about the communities most impacted by policing, it's not a mystery which communities those are: it is young black and brown folks, right? And very rarely do young black and brown folks sit on civilian review boards, participate in shaping department policies, participate in department trainings, not just as participants but also in terms of shaping those trainings, providing feedback that actually has an impact on how those officers are being evaluated. So we also have to think about power, right? If we're going to give more funding, more technology, and essentially more power to the police, to law enforcement, that has to come with checks and balances. And the check here is to have the communities that will be receiving that technology playing a role in actually shaping how the strategy is built, what the policies are, and having a role to play, if it is not working, in stopping it from happening. Those structures are very rarely in place in any city in America.

I think you're absolutely right on target. First of all, if we're going to establish trust, we need law enforcement to be very transparent about what technologies they are using. Drones are one aspect of it; aerial surveillance might be another aspect of it. But we also need to be thinking about issues related to license plate readers. We have all sorts of law enforcement agencies that regularly surveil social media sites and record that information, and we need a way of pulling all of that together and being very transparent about some of the general things that law enforcement is engaged in,
so that there can even be a reasonable discussion about it, because I do believe most of the public is unaware of the extent of law enforcement surveillance that's taking place, whether on the micro level, within an individual, a group, or a larger community, or whether you're talking about stationary cameras in cities like Chicago and New York. And then you can start to bring in, for example with body-worn cameras, I don't think there has to be this all-or-nothing approach of no, the public is not permitted to see something after an event, or only the police can see it. There can be a middle ground where a civilian review board has access to it and can communicate with the broader public. If a community doesn't have a civilian review committee, you can have key stakeholders within the community, called upon by police leadership, who are shown what's going on and kept informed, and who understand what needs to remain confidential and what doesn't. But there's this dilemma where people think it needs to be all or nothing, and there's so much of this going on. I do think we're going to have to develop some mechanisms, whether informal or formal, for the police and the public to communicate about how some of these surveillance apparatuses are being used.

Of course. So we're nearing our end, and I'm going to throw one more out there, and by the time we're done talking about it, we'll be over our time, right? But moving forward: it feels as if you all have done work in these spaces for years, although in the last two and a half, three years, there's been an intense national focus in a way that perhaps there had not been in recent history. Looking forward two years, three years, five years, where do you think the conversation around policing and technology is going, and what might we be braced for? You mentioned things like
license plate readers; we've talked a lot about the role of drones. What is the next frontier? Eventually we're going to stop doing these panels, stop doing panels where we just talk about body cameras. What is our next round of panels going to be about?

I think it'll be the integration of all of the forms of data. Right now we're just having this emerging discussion about how we compile and aggregate different forms of electronic data, whether it's video, telephonic evidence, internet-based material, or computer hard drives. It's really going to be integrating all of this information and developing systems in which it can be examined on a reasonable basis, in a more reasonable period of time, versus the very decentralized strategy we have now, where some forms of evidence and intelligence are in one area and other forms of intelligence or information are in another, where it really requires a lot of human interaction with different segments of an agency. I think managing information is going to be the most important thing. And if we're going to be looking years ahead, I think we're going to be looking at why the private sector has been able to move so far, so fast, and why law enforcement is still, to this day, decades behind private industry. Agencies are just now recognizing the cost of real technology and what that means for the tax base, what that means for the community to be kicking in. With body cameras, for example, there are real costs; they're not minimal costs, they're substantial costs. These aren't cheap endeavors, and the idea of relying on and leveraging this technology is going to be very expensive; it's not going to be cheap. If you look at what somebody gets paid in Silicon Valley versus what somebody is getting paid in the Omaha Police Department IT division, it's just a whole different world, and people are going to increasingly recognize this gap between one part of the
world in which most people live most of the time, and where law enforcement is at, which has been largely underfunded and undertrained in these areas.

And I'm going to add to that. So you'll see that coming together of the public-private piece when it comes to technology, and then not just the integration of all of these disparate data sets that we have in policing, but the evidence base. When we talk about training, how do you use those data sets to inform law enforcement, and how is it validated? And when it's not, how do you move it out, when officers are continually holding on to old narratives? How do you get them to move and change, and to think and be more flexible, in the work that they do?

My goal, my dream, would be that we see this technology in policing leading to a reduction in the amount of money we spend locking people up, putting them in prison, putting them through very uncertain circumstances, checking them out based on profiling who has what, and that we begin to have more peaceful communities as part of smart cities: smart transportation, more intelligent and more accurate ways of delivering healthcare, so that we could put more of this money into communities, better education, a more global perspective on how we fit into the world's economy and how we, at least in Philadelphia, are able to benefit other peoples of the world around us. We spend a quarter of our operating budget just on police and prisons, not even getting into probation, courts, and everything else. I really think, with the technology, if it works out well, given what everybody said, we can begin to move toward less violent, less criminal societies and put that money into more equitable education and a better quality of life.

So I think it's not clear where we're going to be, and part of that is because technology is shaped by politics, is shaped by the beliefs and values of the people who develop it and who implement it. And I think there really are two trajectories. There is the trajectory that
you spoke of, which I think is the ideal trajectory, where we're actually using technology to integrate law enforcement with existing systems that are providing care, that are providing support, where we're deploying mental health providers and social workers and crisis negotiators and folks who are able to actually address the underlying needs, oftentimes, of the people who are suspected. And then there's another path. The other path is where we double down on the police state, the surveillance state. You think about Dallas, where for the first time you had a robot blow somebody up, and it was that sort of old-school, really old-fashioned, 1970s robot; but in 20 years, who knows what that could look like. So I think there will be two trajectories, and it's up to us politically to shape which path we're going to take, because they have very different outcomes.

How about a round of applause for our great guests.