Thanks for joining us. Good morning to those in Hawaii; afternoon or evening to those elsewhere. Thanks so much for joining us at ThinkTech Hawaii. We have the good fortune to have with us today Professor Emerita Vernellia Randall from the University of Dayton School of Law; Professor Emeritus Ben Davis from the University of Toledo College of Law, now teaching at the Washington and Lee School of Law; and active professor and chair of the American Bar Association's Dispute Resolution Section, David Larson. Welcome, all of you. So, rather than trying to attack shortcomings in current media (and by media, the spectrum is huge; you can't lump them all into one category), if we were going to see the kind of responsible, balanced media coverage that would help us understand important issues, how to see them, and how to see the choices relating to them, what might that look like? Professor Randall, would you start us off with the example of yet another recent shooting of an unarmed Black man in a traffic stop?

Yes. A minor thing: unregistered license plates. That's what the person was being stopped for. And that's always pretextual. I can't tell you how many times, or for how long, I have driven around with unregistered license plates because I forgot to do it. My son came back from England, and the very day that he drives the car, even though I've been driving it around for months, he gets stopped. So they're not running around stopping everybody; they're using it as a pretext. And in this particular case, the young man, and let me get his name, because I want to say his name: Patrick Lyoya, and I'm probably mispronouncing it, started walking away from the stop. Probably getting mad, and I can understand that. He got out of the car, and he started walking away from the cop.
He walked across the lawn. The cop came after him and tackled him, and then tried to tase him, and the cop had touched him first with the taser. He struggles, walks some more. The cop tackles him a second time. He's on his stomach, lying on the ground face down. The cop takes out his gun and shoots him at point-blank range in the back of the head. Execution style. There are videos from the cop, videos from bystanders; all kinds of videos showing this. So here's the twofold problem we have, and I think this is where the media comes in. Now, the media has been a bit better on the coverage of this, but in general, what they do is slant the discussion so that it sounds like there was just some accident that happened. Where they'll say this killer in New York killed X number of people, or the people doing school shootings killed X number of people, they tend to soft-pedal cop killings of unarmed people. It's not just Black people, but it's disproportionately so. I'd like to explore that some more, especially how we end up not really putting mechanisms in place to cause the cops, the institution of policing, to change their behavior.

There's a lot there; I appreciate all those observations. One thing to ask, in terms of our media and how things are being presented to us, is how diverse is the media outlet? Because if you've got a diverse media outlet, maybe you are going to get some of those different perspectives, so I think that's an important consideration. If we want stories to be reported in a very diverse way in terms of perspectives, then we need a diverse media. And I don't know that that's happening everywhere, although I'd like to think it's getting better. I think as viewers we always have to keep in mind right now that the media has a lot of control when it comes to selecting the stories they want, how they present those stories, and from what perspective.
The other thing that happens, in all kinds of intercommunication, is that we frame our stories, and that sets up the acceptance of the conclusion. You frame the story before you tell it, and then you take it to a conclusion, but the conclusion is a little foretold because of the way you framed the story. We just have to be aware, as consumers of media, that that's happening, and we can't always be led by the nose to a particular conclusion. The other thing with media is really interesting. We talk a lot about explicit and implicit bias in employment; we talk about it all the time, and I think there's a lot of awareness growing in employment that implicit bias exists and that we have to set up some processes to address it and overcome it. I don't hear nearly as much discussion about implicit bias when it comes to news media. We know that there are some explicitly biased stations, and there's a question: is that okay or is it not? It's a little better, maybe, if they tell us their perspective. People are going to have different perspectives, and at least if they're honest about the perspective, that helps a lot. But when it comes to media, the people telling the stories are going to have implicit, unconscious bias just like employers do, and I don't think there's been enough discussion about that.

And maybe let's add another aspect: media are privately owned, and privately owned media is, for the most part, looking to maximize its profitability. You know the famous line, "if it bleeds, it leads," and ratings and advertising and all that. So if a certain way of telling something is not a regularly used frame, and, as was said, that frame is not something that sells, maybe it doesn't get used. You have that difficulty of what will sell and what will not.
I am reminded, and of course everyone here is reminded, of the George Floyd case, where the media was actually kind of stepped past, because you had this private video of what was eight minutes and forty-six seconds that no one could look away from. It was just running the whole time; you hear the people telling the police officer to get off of the person. You couldn't snip it in a way, or frame it in a way, that was anything else than what it was. And although there were efforts to do it, the frame couldn't be kept. Right, the frame couldn't be kept, so then what did the frame turn into? It turned into something like "one loose-cannon police officer." That's another frame, which is basically saying that there's not a systemic problem, that there was one bad police officer. We have all these standard frames that are used in the States, and whenever somebody tries to raise the fact that there's a systemic kind of problem there, well, that's not profitable. We're always pushed away from looking at any systemic problem. I'll give you another example. I just got an email, or something I read, which was looking at the 400 highest-income individuals in the United States and the tax rates that they paid. We never talk about that; it doesn't become a central issue, because it's not a frame that is profitable.

I think profitability is a big factor. I also think the political leanings of the media corporations matter. There are five large ones, and they are conservative, center-right, or center-left, except for one or two that are really far right. There is no really progressive large media outlet.
And one of the things the law could do is break up the media, sort of like they did with AT&T, and basically say: look, you're too big, and if we're going to have room for other voices, for other political viewpoints, there can be no room as long as it's concentrated in four or five companies. You could also forbid distant ownership of local media. Most people don't realize that their local news shows, their local stations, are for the most part owned by one of the large media companies, so we could break that up too. But I think in terms of this whole issue with police officers, it doesn't sell because the American public is uninterested in it. It's not just "if it bleeds, it leads"; there's a lot of bleeding when police kill someone. I think it doesn't sell because we don't want it to sell, because we want this vision of the police, the system and not just individual police, as a bunch of good people.

I think Ben made a good point that we're talking about media and we're talking about television stations, but media is a lot more than that today. The fact that that event, the murder of George Floyd, was filmed from start to finish and then presented meant, as was very properly observed, that you couldn't really edit it without exposing the heavy editing you were doing; you had to present it as it was. It was an extremely powerful event because of that opportunity that social media presented. So as we talk about breaking up the media and breaking up power, we have to talk about Meta, formerly Facebook, and Google and some of the internet powers that are controlling some of the conversation now too. And there's a lot of talk about that, but not much has happened as far as I can tell.
But when we talk about power: pretty much 90% of the population is accessing the internet, probably more than that, as we sit here in 2022. If you want to get people's attention, it's online. So as much as we pay attention to television and radio, we have to pay attention to the internet too, and that raises all kinds of interesting questions, because when you're on the internet, you get these virtual echo chambers set up, where the same message keeps getting repeated and amplified exponentially, over and over, and suddenly people all believe it. We saw that with the January 6 assault on the Capitol: people kept talking about conspiracy and saving America and saving democracy, and people kept hearing it and hearing it and hearing it, and they actually believed it. And so they attacked and committed all kinds of crimes, resulting in the deaths of a number of people. As we think about media and power, we do have to think about the internet, and about two things, really, that scare me. First, these virtual echo chambers that get people hyper-excited. And second, and Ben is always very attentive to this, the power and danger of algorithms: the fact that when we go online, everything we're doing is getting tracked, and everything we're going to receive tomorrow is based on what we accessed ourselves. Pretty soon, all we're going to get fed are things consistent with what we've accessed before, the message is going to get repeated, and we're going to come to believe that that's what the news is and what the world believes. And it isn't. We need to take some responsibility ourselves, understand that things are being pushed out to us in ways that really may not be reliable, and make an effort to find some different viewpoints.
Actually, I agree with you; I just don't know that most people will do that. I'm on different lists online, and I've become disturbed by how readily and freely people share stuff that has no factual basis. When you question them about it... the most recent one for me was, what's his name, the older Black actor. No, not Will Smith. Not Denzel. He plays presidents all the time. Morgan Freeman. Someone had a meme with Morgan Freeman saying something, and my question to the person who shared it was: did he say that, or is this someone's meme where they just used his likeness to put words in his mouth? And their answer was that they didn't know and didn't care. Right. They liked the meme. And I said, but that's how disinformation spreads, because somebody is going to say "Morgan Freeman said," and before you know it, everybody on the internet is saying Morgan Freeman said this, when someone just made up a meme. On the echo chamber point, I had another instance where someone on one of the Facebook groups I'm on said, "Those liberals aren't going to take away my rights as a parent. They're going to be surprised." So I said, okay, what rights are the liberals trying to take away? Quiet; no response. But in the echo chamber they were in, people agreed without even being able to articulate what their fear was. So I think you're right that the internet may be harder to control, but I don't think individual responsibility is going to cut it. The echo chambers are going to be built by people who are not taking individual responsibility, and that's going to be 99% of people, and when people like you and me and Ben and others try to say something to them, they're going to get mad, because they think you're trying to dump on them in an inappropriate way.
And I was just thinking that the feed you get, the algorithm behind the feed, its intent is to get you stimulated so that you keep going. I did a rant the other day about something that I got from somebody; the term was "Negro peon." And when I wrote the term "Negro peon," a Facebook link came up for something called "Negro peons ain't SHYT." If I had clicked on that, I would all of a sudden have been taken to some website, and I was like, wow, interesting, look at this algorithm. I put the words in, and it tried to take me to something to provoke me, just like that, for words I had never used in all these years. So I thought, maybe as part of antitrust law, which I understand is sometimes taught as historic law as opposed to an actual law being applied, and that's kind of tongue in cheek, right, but part of what antitrust could do, if you don't break up the organization, would be requiring the algorithms to be transparently available, publicly available, so that they can be scrutinized and become, in effect, part of the public domain, because of the impact that they could potentially have. I don't know.

But I think that's a great idea. Because, you know, garbage in, garbage out, and one piece of the garbage going in is racism, bias, discrimination. Just because it's being done by a computer doesn't make it free of racial bias. It's not that the algorithm itself is racially biased; it is programmed to perform in a racially biased way. There are people writing on that kind of stuff.
I know, like doing Google searches and what comes up as the top result, those kinds of things. But I want to mention a book coming out soon by a guy named Chris Draper, who has come up with what I think is the greatest term; I haven't heard anybody else use it: "digital robber barons." And I said, that's exactly it. We are only data anymore; we're not ourselves, we're this data that gets exploited. That concept of the 19th-century robber barons and the antitrust reaction to them, and thinking of all of these spaces as dealing with digital robber barons, with the same kind of rapacious, predatory nature, I think is really, really powerful. I saw, for example, that Amazon had a union vote at their Staten Island facility, and Amazon has a sort of employee chat technology that specifies certain kinds of words that can't be used in it. If you use the word "union," I don't know what happens, but you'd probably get fired. Words you can't use in a so-called speaking space, because it's a privately controlled, constructed, algorithmic space that affects I don't know how many thousands, hundreds of thousands, of Amazon employees and the spaces they're in. And that's just within one company. You can think of any company in the world right now that has some kind of linked-employee setup and these dialogue spaces of any kind, and you can ask yourself: to what extent is there some kind of limit being placed on what can be said? I had one other thing.
I worked at a place where there was this thing called Microsoft Viva, and it would read my emails and then send me, the next day, a message with a sort of to-do list that Viva generated based on parsing the emails I had gotten the day before. I tried to delete it; I said, I don't want this. And guess what: it was like Groundhog Day, every day. I am back, here I am. It's heavy, these kinds of things.

I got a similar kind of thing that summarized my daytime activity online: here are the minutes you spent doing this and this and this. And I said, I don't necessarily want this out there, where it could be intercepted. Yeah, that was very disturbing. But I do believe that we need to take some responsibility to educate ourselves. I also appreciate the point brought up by Professor Randall that maybe people aren't going to do it, because it's hard and takes a lot of time. So I particularly like Ben's idea of making this transparent. Okay, so we'll make it easier for you: let's make the information easily accessible, so that you don't have to dig it up; it's right in front of you, and if you want to understand how you're being manipulated, it's very easy to do. I really like that idea, not necessarily as an end run around the antitrust laws, but as a kind of response to the fact that maybe they're not being implemented as aggressively as one would think they should be.

This is a really interesting discussion, and if we could go another way with it: there are degrees of things, right? We've been talking basically in the civil space, but I've seen something about the use, in presentencing reports, of these algorithmic tools to make a decision about how dangerous somebody is in terms of recidivism or something like that, in the criminal sense.
And the idea is that defense counsel would like to know what the algorithm is that the number was based on, right? But it's a contract between the court and the company that provides the tool, and there would be something in it that says you can't reveal this intellectual property. So it's like a black box, right, and there's this whole black-box versus white-box versus gray-box discussion. But that's in a particular setting of criminal processes, where judges make decisions informed in part by something coming out of an algorithm. It's not the same as social media, but you might say there's heightened scrutiny, maybe, in that kind of space as opposed to other areas. I don't know.

And I'm not even convinced that having transparency is enough. I mean, I think transparency is one thing that would be good, but there are a lot of transparencies; there are a lot of really rich people, is what I'm trying to say, who have their own transparencies to start with. At the end of the day, I think the law has to be structured in a way to make organizations and systems responsible, because if you rely on individuals, it's just not going to happen, no matter how much transparency there is. One of the flaws I see with the Civil Rights Act of 1964 as an effective tool against 21st-century discrimination is that so much of the discrimination is hidden from the individual, and relying on an individual to bring a lawsuit means that organizations who would monitor have to spend a lot of time finding someone to bring a lawsuit.
One of the things we could do, in terms of suing in lots of areas, is say that organizations can have standing to sue based on the data and information they've collected, and they don't have to find an individual person who's been harmed by the media or by discrimination. They can bring a lawsuit. That would be a way of having more impact than just individual responsibility, because then you would have organizations who would take it upon themselves to collect data and sue on that data, especially if the law gave them attorney's fees and punitive damages for repeated violations.

Well, that is the theory of the EEOC; that's why we have the EEOC. The problem is that it has always been understaffed. I had a managerial appointment at the appellate division in Washington, and the caseload for EEOC employees was the highest in the federal government. It was so overwhelmed with cases that it just couldn't begin to address the problems around the country. So certainly one thing we could do is better fund and staff the EEOC, the federal enforcement agency, so that it can take on a lot more of these cases. It can file lawsuits, but it doesn't have the resources and personnel necessary to do a comprehensive job.

I agree with you about the EEOC. One of the problems I see is that the EEOC is subject to political, administrative oversight and funding.
And so, whatever the particular emphasis, in all the administrations, Democratic and Republican, they're not that interested in really correcting the funding, or in correcting the problem by changing the law so that nonprofit organizations can bring a lawsuit without having to have an individual plaintiff. There would be organizations whose whole reason for being would be to hunt down violations and file lawsuits against the organizations responsible, sort of like what happens with class actions, but better, because you're not having to articulate a class and get class certification.

Yeah, that's an important observation about the EEOC. If you recall, Clarence Thomas was chair of the EEOC before he came to the Supreme Court, and when he was chair, he said that the EEOC was no longer going to bring disparate-impact claims, one of the theories of discrimination. Well, a disparate-impact claim is one of the best ways to get at any kind of systemic discrimination, because it's a wide-reaching, spanning kind of claim. But yes, the EEOC is subject to political influences, and he was a political appointee from a conservative administration. He said, we're just not going to bring those anymore, because I don't think that's what we should be doing; we should be focusing on the individual case of discrimination.

So, and I hate to do this, but we're out of time for today. We hit another high point, and you folks have somehow wound your way through the mess to point us toward directions that may make more sense for responsible understanding and change. Thank you so much. Come back and see us in two weeks. We'll be back. Thank you all for joining us today. Thank you.

Thank you so much for watching ThinkTech Hawaii. If you like what we do, please like us and click the subscribe button on YouTube and the follow button on Vimeo.
You can also follow us on Facebook, Instagram, Twitter, and LinkedIn, and donate to us at thinktechhawaii.com. Mahalo.