It's July the 21st, 2021. Welcome to What Now, America. I'm Tim Apachele, your host. Today's title is Facebook, Suspended Due to Poor Oversight. This last weekend, President Joe Biden was asked by a reporter what he thought about social media platforms and their contribution to misinformation about the vaccine. And President Biden kind of shot from the hip, and he said, they're killing people. Well, on July 19, President Biden kind of walked that back, and he wanted to clarify it. And he said, my hope is that Facebook, instead of taking it personally that Facebook is killing people, would do something about the misinformation. President Biden then further said, Facebook isn't killing people. These 12 people out there are giving misinformation. Anyone who's listening to it and getting hurt by it, it's killing people. It's bad information. So given the title of the show, and given our discussions in the past about Facebook and Twitter and social media platforms, Jay Fidel, CEO of Think Tech Hawaii, has determined that Think Tech's participation on Facebook should be suspended. And I'd like to get to that question. Before I do, I'd like to introduce our guests. With us today, Jay Fidel, CEO of Think Tech Hawaii, and Cynthia Lee Sinclair. Good morning. Good morning, Tim. Jay, you took a big step. I've got two questions for you. The first question is, do you agree with President Biden's statement about social media, that in a sense they are killing people as a result of the misinformation? And the second is, how did you come to your decision to suspend Think Tech Hawaii from the Facebook platform? Let me address those one at a time. It was sharp rhetoric, no doubt about it, when he charged Facebook with killing people. I mean, it has the implications of murder. It has the implications of an affirmative wrong, for lack of a better term, if you don't mind. And he did walk it back in a mild-mannered kind of way. He walked it back.
And I think that was because Facebook attacked him in a vigorous attack over the use of the words killing people. But you think about it. It was that word that was troublesome. It was that word that he walked back from. And if you think about what's going on here, Facebook has an enormous bottom line. It has really hundreds of millions of customers. I say customers because it makes money on them. And it has a few people, it's not clear how many, who are doing, you say, misinformation. I say disinformation. Those are lies. When you say that vaccines don't work, when you say you shouldn't take a vaccine, that it's bad for you, that it's politically inadvisable to take a vaccine, those are lies. And those aren't just words; they're destructive lies. And we know already, just look at the map, that people are dying. So I guess the problem in using the word killing was that Facebook itself was not the positive agent of the deaths, but it set up a platform where other people would be a positive agent of the deaths, and they are still doing that, by the way. And it did not effectively stop them from doing that. So okay, maybe allowing them to die instead of killing them. The word's too sharp, I suppose. But I wouldn't really disagree with the notion of calling them out on this. They're more interested in their bottom line than they are in saving people's lives. And people are dying as a result of disinformation, lies that are being spread on the Facebook platform. So that's the answer to your first question. The second question is, why do I feel that way? I don't feel we ought to be involved in disinformation of that nature when people's lives are at stake.
I look at the map, and I think the right opinion here is to say, gee whiz, if you don't take a vaccine and you have no good reason for not taking the vaccine, you're participating in a very gruesome national experience where hospitals in a number of counties and cities and states around the country are being overtaken by COVID, where they don't have the resources. It is a replay of last March and April in this country. And it is political. It's the red states. It's the Trumpers. It's quite remarkable that they would do things that destroy themselves, their families, friends, communities and all that intentionally. Intentionally. When will they actually learn? And in fact, we entertained and have included a commentary on that point on our ThinkTech lineup, which is not an official statement of how ThinkTech feels, but it is certainly a commentary worth listening to. So what we have today is disinformation that kills people and disinformation that can be prevented. It's not that Facebook doesn't have the resources to stop these guys. And it's not that Facebook doesn't have the intellectual ability to stop these guys. What's interesting, I suppose, is a piece that appeared on National Public Radio a couple days ago about a fellow named Ben Shapiro who runs something called the Daily Wire, which is a very conservative website, I guess. Maybe it's also a podcast thing. And they do what Fox News is now doing. They say, well, it's okay that you take the vaccine, but let me tell you all the reasons why people feel you shouldn't take the vaccine. And the net effect of all of that is to discourage people from taking the vaccine. It's very clever. It's very slick. And that was the upshot of the piece on National Public Radio. And there are many organizations around that use Facebook to spread that kind of lie and to diminish the president's message, the current government's message, about taking the vaccine to save your life and your family's life.
You know, one of our colleagues isn't here, and that's Winston Welch. And I just saw a quick blurb from Winston just now, and he said, you know, Facebook is a gossip rag. It's become, basically, a gossip rag. And so you're seeing real news being intermixed with gossip and real news being intermixed with fake news, depending on which side of the coin you're looking at. But it's a mishmash of information. And do you think that Facebook has improved, or Twitter has improved? Given all the spotlight they've been under, the scrutiny they've been under from Congress, do you think they've improved somewhat, say, from a year ago? And let's all acknowledge their role, be it Facebook or Twitter or anyone else, in false information about the election, the 2016 election and even to some extent the 2020 election. So they have a hand in that. You think they've improved their ability to ferret out bad information and prohibit it? Well, I think surely they've improved their ability, but I don't think they've used their improved ability. I think a lot of misinformation, and my word, disinformation, gets through. And I'm not satisfied that they're not doing business with the likes of Cambridge Analytica politically. They sold themselves in the 2016 election and made millions and millions of dollars selling themselves to Trump, and had a huge effect on the country. I mean, anybody with a national conscience or a shred of patriotism would realize that the platform that they are providing is doing incredible damage to the country. They know that a lot of people rely on Facebook as a sole source of news. And when these slick guys get in there and they publish things that make you confused, that take you in the wrong direction, they are affecting public opinion by the hundreds of millions, and they know that.
And as a matter of fact, I feel that they are really only making perfunctory efforts to stop that from happening on their platform. This is going to require a huge change in Facebook. And to answer your question, there has not been a huge change in Facebook. They're doing PR, but it doesn't reflect real change. You saw the response they gave to President Biden when he criticized... There was a blog response, which we're going to go through some of right now. Yes, I saw their response, and it's right out of the ABCs of public relations as far as how to respond to criticism. And we're going to go through some of the points here in a minute. But I guess the question is, to what extent, if they don't dramatically change their approach to misinformation, what consequences do they suffer, if any? Well, first, what consequences does the public suffer? What consequences does the government suffer? What consequences does democracy suffer? You know, if people are being misinformed and disinformed, our very democracy is at risk. We've seen that. We saw it all through the Trump administration, and Facebook was part of it. You know, I'm sure they realize the huge leverage they have, the huge effect they have on the national conversation and national public opinion. And so they haven't really done anything to improve that. What was the second part of your question? Well, you've answered it. You know, I mean, the consequences they should bear given their obligation as a social media... Well, if we had a real Congress, if we had a real Congress, Congress would do something. Congress was investigating this and... Okay, but you know, Facebook is international. What about other countries? Should they dive into this and say, well, if you're going to be operating in the UK or in Canada or wherever, you have a social responsibility? You bet.
But let me add one thing: you know, we don't want to undermine the First Amendment, or at least we don't want to violate the Constitution by having, you know, a government control the media. We can't do that. But there are ways to change it. Partly, I suppose, it's the law of defamation and disparagement, the law, for example, that came up in the voting machine case, where statements were made about the voting machine company. I forget its name; Cynthia will remember its name. Dominion. Dominion, where they lied about Dominion and Dominion sued them for slander or libel. And, you know, that is one way that you hold these social media companies at bay and that you have some control. Another way is the provision that was supposed to be included in the law that would allow suits by the public against social media companies when they lie, and make them responsible for the information they repeat on their platform. This would be a huge step forward. But of course, we have no Congress. I say that only because Congress hasn't done anything in years, through the Trump years and especially now. Well, they did do something this morning. They had their cloture vote this morning, you know, a couple of hours ago, on infrastructure. And of course, the Republicans squashed it all, and it doesn't look like there's going to be an infrastructure bill. So if there's not going to be an infrastructure bill, how in the world are they going to reform social media in a Congress which doesn't do anything? Thank you to the Republican Party, question mark. Okay, good point. Great points, Jay. Hey, Cynthia, before I go to you with a question, I'd just like to bullet some of the response from Facebook. They did a blog post, and I'm just going to highlight some of their defense. They basically said, don't point the finger at Facebook. They think they're a responsible corporate citizen, and their response is the following.
They claimed they removed 18 million instances of COVID misinformation, or as Jay would put it, disinformation. Now that's interesting; that's a high number. They have billions of people on Facebook, and they've removed 18 million. But what that response doesn't say, and I want your reaction on this, is how many posts are out there that they didn't catch. They just caught the 18 million. Was it 25 million? Was it 40 million? Was it 1 billion? So you could clearly see this as a public relations response. Your reaction to that one bullet point alone from Facebook. Well, Marjorie Taylor Greene is the perfect example for this question, right? It took her how many times of putting out misinformation, specifically about vaccines, specifically about COVID, specifically about the big lie? And yet now, finally, all she gets is 12 hours of not being able to post anything. To me... Well, let's clarify something. That's Twitter, not Facebook. Twitter, excuse me. Oh, they're owned by the same people? You're pretty sure they're not? No, no. Oh, I guess they're not, that's right. Okay. Yeah, but your point is well founded that she has repeatedly spread disinformation about the election and has whipped up, you know, millions of Americans about the validity of our election versus their perceived, you know, fraudulent election, right? So again, misinformation, disinformation. And now she gets a 12-hour suspension from Twitter for her comments. She said two things. She said non-obese people probably won't get COVID, and that it's not dangerous if you're under 65 years of age. So those are clearly two highly erroneous messages to be putting out there, and she gets slapped on the wrist for 12 hours. Is that appropriate? No, it's not appropriate. And neither is any of the stuff that Facebook is doing. And I agree with you where you say, you know, they blocked all these different ones, but what about all the rest of them? And I see them on Facebook.
So I know they didn't block them all. And I'm still friends with a lot of people from the South, and it's interesting what is on their feeds as opposed to what's on my feed, or our feeds here in a blue state, right? I think they very specifically target people, and until they start to really address the algorithms that they use in order to target people, then they're not doing anything as far as I'm concerned. Well, and thank you for saying the word algorithm, because they're saying their algorithms are catching disinformation, misinformation. I'm not sure I buy that. I'm sure it does some of it. But whatever happened to good old-fashioned employees that would police it? You know, I know it's a lot of work. They're gonna have to hire a lot more police. But why don't they? They have the capital resources to do so. Why don't they have actual human beings looking for bad content? Well, I believe that they are compromised in the same way that Kevin McCarthy is covering his own butt. I believe they are too. I believe they are compromised, and have been. Completely. Compromised is the word, yeah? By not just the Trump administration, but the people that were behind a lot of that misinformation that came through in the 2016 election. So I think they are still compromised by their own complicity back then, and their hands are kind of tied as to what they can do, let alone what they wanna do. We sort of already know, you know, Mark Zuckerberg's level of not wanting to compromise himself, and his being greedy, obviously, because he has made so much money and done so many things that have shown us who he is and that he's not trustworthy. Okay. Hey, we just got a question coming in. We really do appreciate questions that come in while we're doing this live-streamed show. And for you, Cynthia, the question is, how does Facebook determine what is true and what is false? How do they go about doing it? And is there a better way? Yes, there is a better way. It's called the fairness doctrine.
It was in place from 1949 until 1987, and they need to put it back in place so that news is news and social is social. And when we can separate those two, then perhaps we can make some headway. But I think until we separate the two, we're not gonna get very far, because people's opinions are people's opinions, but the opinions that get retweeted, or not retweeted but reposted and reposted and reposted, end up becoming true. It's like the lie that gets told over and over and over becomes true. It's the same sort of thing with this. So until we make news news, and there's consequences for people like Tucker Carlson and Sean Hannity and Laura Ingraham and the whole Fox News Network, until there's consequences under the fairness doctrine where we can catch them time and again, we'll see. Well, let's go to Jay's point that you can't get anything done in Congress. How is that gonna get approved? Well, let me jump in on that. It's us, guys. Sorry about that, I didn't mean to stop the band. Forget about Congress on this, forget about it. It's just a waste of time. So many of the initiatives that we'd like to see passed are being killed and delayed out of existence just to make Joe Biden look bad for the next election. That's why the Republicans are doing that. They have no ideology, no agenda, except to make him look bad. So forget Congress. So the question is, who speaks on this? Well, to answer that, Tim, Cynthia: we speak on it. We're small fry, we're just streaming video, but we're consistent, and we take the position that if they wanna create a platform for disinformation and not take any steps to remove it or stop it, we don't want any part of that. And we should, and we did, suspend them as far as submitting content to them and showing Facebook on our systems. And we'll wait, we'll see what they do. The ball's in their court. Maybe they'll clean it up. I wouldn't hold my breath, but maybe they'll clean it up.
And so how are they affected? Well, they're affected by public opinion. And it has to be guys like us, and maybe a lot of other people, who would say, Facebook, you're off base here. You have to take affirmative steps. So it won't be Congress, it probably won't be government, it'll be the people. Well, wait a minute, wait a minute. What's to prevent President Biden from taking a page out of Donald Trump's book, and that is the power of the executive order? What's to prevent President Biden from saying, from here on in, we're gonna reinstitute the fairness doctrine that was in place for many years, that was deemed legal? It was only by politics that it was rescinded, if you will, erased. What prevents Joe Biden from doing that? Well, thank you, Cynthia, you know more, but wasn't the fairness doctrine a statute? Well, I can tell you, because I've got it right here. Well, it doesn't tell me as much as I'd like to know about this, but 1949 is when it was put in, and it required broadcasters to present both sides of political or controversial issues in an equal and honest way. So we can... Well, that was the equal time rule. Okay, well, yeah, I'm not sure that balanced reporting is what it used to be. If what you're talking about is, quote, balanced reporting, then for every truth there is a lie and we have to report both sides. That's what some people would argue. And the problem is they have no basis for arguing their side of it, because it's not true. So I'm not sure that's a solution. Let me go to the solution that I have in my mind, okay? So it's the people, and the people have to stand up, the people have to say, we're not gonna play on a platform that broadcasts disinformation. It's not good for us, not good for the country, or the world, to have lies being propagated this way. So we're out of here, or we're gonna complain until you change your system. Now, the system, okay? Actually, we have a show this afternoon about artificial intelligence and ethics.
And I think there's a role for that. And Mark Zuckerberg knows there's a role for that. I'm sure he uses AI in the way he propagates whatever he's doing and the way he operates his platform. Remember, even in the international relations and surveillance business, the way it works is you look for key words: key words in email, key words in any kind of text document, key words in the spoken word or the video word. And if you find a combination of words, using AI, you can determine that this is something that hits on truth or hits on lies. You have to program it. And it's always dangerous to have the government program this kind of thing. But if Mark Zuckerberg took it upon himself to program it and say, look, people who are badmouthing the vaccine, we're not gonna let them on the platform. We're not gonna do it. If we lose money, that's okay. We're not gonna do it. This is an ethical, moral question. We have AI to identify every time somebody is doing disinformation on that issue, and we're gonna stop it. But thereupon, we're gonna refer this to a committee, and he can afford to have a committee. He can afford to have thousands of committees. And the committee, human people, is gonna determine whether this is, you know, disinformation. And if the individual who put this information in has a problem with the result of that committee, he can appeal it. But it doesn't get on the platform until it gets processed that way. If the AI lets it in at first, fine. If the AI stops it, it goes through this process. And it doesn't get on the platform without being cleared. And so, you know, I think this can be done in-house. I think if Mark Zuckerberg were an ethical, moral guy, not so interested in power and money, he would do exactly this. He has the resources. He has the funding. He has the technology. He has the people. Why doesn't he do this? It wouldn't be 18 million. It would be many more than that.
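The flow Jay describes, an automated keyword filter that either clears a post or holds it for a human committee, with nothing published until it's cleared, can be sketched in a few lines of Python. This is purely illustrative: the names (`Post`, `ReviewQueue`, `flag_post`, `KEYWORD_COMBOS`) and the keyword lists are hypothetical, and no real platform's moderation system is this simple.

```python
# Minimal sketch of a "flag, hold, review, appeal" moderation pipeline.
# All names and keyword combinations here are illustrative assumptions,
# not any platform's actual system.
from dataclasses import dataclass, field

# Combinations of keywords that, appearing together, trigger a hold
# for human review (Jay's "combination of words" idea).
KEYWORD_COMBOS = [
    {"vaccine", "hoax"},
    {"vaccine", "dangerous"},
    {"covid", "fake"},
]

@dataclass
class Post:
    author: str
    text: str
    status: str = "pending"  # pending -> published | held -> published/rejected

def flag_post(post: Post) -> bool:
    """Return True if the post hits any flagged keyword combination."""
    words = set(post.text.lower().split())
    return any(combo <= words for combo in KEYWORD_COMBOS)

@dataclass
class ReviewQueue:
    held: list = field(default_factory=list)

    def submit(self, post: Post) -> Post:
        # Nothing reaches the platform until it clears this gate.
        if flag_post(post):
            post.status = "held"       # routed to a human committee
            self.held.append(post)
        else:
            post.status = "published"  # the AI lets it through at first
        return post

    def committee_decision(self, post: Post, approve: bool) -> Post:
        # A human committee, not the AI, makes the final call; an author
        # could appeal a rejection (appeals are not modeled here).
        post.status = "published" if approve else "rejected"
        if post in self.held:
            self.held.remove(post)
        return post

queue = ReviewQueue()
ok = queue.submit(Post("a", "get your covid vaccine today"))
bad = queue.submit(Post("b", "the covid vaccine is a dangerous hoax"))
```

The design point is the ordering: the automated check only decides *publish now* versus *hold for humans*, and only the committee can reject, which mirrors the appeal-able, human-in-the-loop process described above.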
And it would be more sophisticated than that. You know, if you think about it, college professors are doing this all the time to make sure their students don't plagiarize. They have that software, and it's readily available. All the algorithms we were just talking about: it's a simple algorithm that they could put on the whole thing, and it would do exactly what you're talking about. But you also said if Mark Zuckerberg had some integrity, and we kind of already know that he doesn't. And that's what I was beginning to say before. Just look at what he did to his partner when he first started his business. He, you know, ripped him off. He does not have integrity. So we cannot expect him to do it. I hate to say this, but I think you're right in the sense, Jay, that it's up to the people. We just need to pull our involvement in Facebook, and when that happens, we'll be doing better. But this is what I don't know. Maybe it'll improve. Maybe it'll improve. You know, for example, he's got stockholders. He's got directors. You know, if they're independent thinkers, if they entertain, you know, rational and patriotic thought, they will say to him, Mark, you can't run the company this way. You'll have to do better. And he will have to do better or lose his job. This is within the company, not government. These are the individuals who will participate in the corporate structure. I would say consequences are a bigger stick, and the consequence of breaking up Facebook, because it's a monopoly of sorts, would be a bigger deterrent than him not getting on the bandwagon with disinformation. So I mean, right now, Jay, I agree with both points you made, as AI would be a nice little helpful tool. But in addition, he's got to spend a lot more money on employees to come up with these committees and to just be on the hunt, not just a technology solution, because I think he's implementing that already.
He's claiming that his algorithms are ferreting out all this bad information, and you and I both know that's not the case. Yeah, we got to do better. He's got to do better. And your point about breaking it up is very interesting, because I was going to tell you before that a year or two ago, there were hearings in Congress about big tech being too big, right? And they were considering this issue, and they were asking him questions. But it's kind of interesting that the Congress doesn't have a great ability to ask probative questions. They have these hearings, and they all make political speeches. They don't know how to ask a question. So they got no answers. So then the hearings were suspended for a year or more, so the staff could go back and do some research and find out what questions would be probative. I think they came back, but I don't think they did anything. What Congress has to do, if it's going to entertain the antitrust approach on this, is know what it's talking about. Yeah. And it's like what Fauci said to, was it Rand Paul? Rand Paul. Yeah, Rand Paul. Yesterday he said, Senator, you don't know what you're talking about. We have to have senators who know what they are talking about. And then maybe they could entertain some kind of breakup strategy. That would be hard politically, make no mistake about it. And as I said before, there is no Congress. But assuming there was, that would be an interesting approach, to break them up and to foment competition. I read the other day that there is actually a small company that is entering the field on Google searches, and they are head-on in competition with Google on searches. And I welcome that. I think that's good for the country. It's good for tech, good for politics and so forth. Good for us. And the same thing could and should happen with Facebook. And in a funny way, Cynthia, you know, Twitter is a different company. It's owned and managed by different people.
Some people think that the management is better at Twitter, and I would be included in that. But the bottom line is we need competition in social media, because if we don't have competition, we have effectively a monopoly on public opinion. This is very threatening to our democracy. Yeah. Hey, we've run out of time, so I'm going to go around for last comments. But I want your last comments to take into account this last thing that I'm starting to watch, and that is, in the last day or so, Sean Hannity has stepped up to the plate. He has full-throatedly endorsed vaccines, which I've never seen him do before, and Mitch McConnell just recently, full-throatedly, encouraged people to get vaccinated. So in your last comments, maybe you could entertain, what happened? What's the turnaround? What prompted some talking heads on Fox News to start getting serious about not putting out disinformation, misinformation, and what prompted Mitch McConnell to step up to the podium and the microphone? Cynthia? Well, I think that it's that their constituents are dying and getting sick. And so, you know, they don't want to come out the other side and have people go, why didn't you tell us to take it, you know, when it's their people that are dying now, because they're the ones who are vaccine hesitant. So I think that's part of it. I'd like to say one last thing about Facebook. You know, yeah, what they're doing about the vaccines and all this stuff is one thing. But what about their involvement in propagating the January 6th insurrection? They put violent stuff all over the place, and you cannot tell me that their algorithms did not know. There's just no way they didn't know it was happening. Yet they did nothing about it. And that's what I'd like to see addressed even more so than anything else. Sure, they, you know, they are trying to look like they're doing better now. They've, you know, claimed their vaccine hesitancy has gone down by 50% and blah, blah, blah, since they've changed their ways.
You know, there's still people talking about January 6th, supporting it, thinking it wasn't bad, talking the big lie all over the place. So why suddenly are we going to look at them just for vaccine issues? I think we need to look at them across the board. They are a danger to our country. Okay, appreciate those words. Jay, your last thoughts. I'm proud of Jay for doing this and saying no more and taking a stand against it. I'm very proud of you. Good job, Jay. Well, I hope they change. I mean, it's certainly possible. It should happen within the corporate structure and without the need for government intervention; that would be better. But let me give my reaction to that question of yours, Tim, about Fox News. Number one, it appears that a lot of those guys are taking the vaccine, and it's really kind of strange to, on the one hand, be taking the vaccine and let that get out into the press, and on the other hand discourage people. That's one factor. Don't know if that's the driving factor. Another factor is they want to build credibility, okay, so that they can do outrageous things on other issues, such as the big lie, for example, Cynthia. I think they'll keep on doing that. And the third possibility that I've been thinking about is that what you have here is a slick operation. It's like the operation that was reported on National Public Radio with regard to the Daily Wire. They don't necessarily come out and tell you that vaccines are bad, but every time they say, ah, you should take a vaccine, they give you some kind of pap about how there are very serious consequences, there are side effects that could kill you, and they talk about mom and pop who decided they weren't gonna do it. So there's all these side stories, and you're left confused at best, and you're left wondering whether you should take their advice when they give you this very soft advice that maybe a vaccine would be a good idea. And so it's deniability.
We never said that it was a bad idea, but we gave you all the reasons you could consider in coming to your own conclusion. It's very slick. It's the new approach, the new approach in disinformation. I heard Tucker Carlson say just two days ago, it's your right as an American to ask solid questions about these things before they force you to take a vaccine, when no one's forcing anybody to take a vaccine yet. Number two, he's using the red, white, and blue, mom and apple pie, to say that you as an American should be questioning everything, and that should prevent you from taking the vaccine. There's one other thing I wanna mention, and that's a reality that we observers of the Trump administration should understand by now: if you say that it's just a sniffle back in March of 2020 and then later you evolve off that point, there's a very substantial number of your base that is looking for signals. And when you first said it was a sniffle, that's the signal they caught. And everything you said after that to the contrary, they don't take seriously. They believe what you're really telling them is it's a sniffle. This is my theory, anyway, of momentum in information. You make your statement at first and you get people to believe that's how you really feel. And then when you change it under pressure, they don't change their minds. They think today it's still a sniffle, and they die over it. So that's my theory on these guys at Fox News: yeah, they're changing their tone, but the guys in their base, in Trump's base, who heard the original message still believe the original message. And I think we're watching that phenomenon in public relations happening right now. Okay. Can I add one thing, Tim? Can I just? Well, we've run out of time, but very quickly. Very quickly, it is a quote from Trump himself. Very quickly. The vaccine thing: people are refusing to take the vaccine because they don't trust Biden's administration.
They don't trust the election results, and they certainly don't trust fake news, which is refusing to tell the truth. So that's behind all of it. All right, all right. One guy behind all of it, just blowing everything up anyway. All righty. Hey, great discussion, great points made. Thank you very much, Jay Fidel. Great points, Cynthia Lee Sinclair. Join us next Wednesday at 11 o'clock for What Now, America. I'm Tim Appachello. I can never say my name correctly when I'm doing this show. I'm Tim Appachello, your host, and we will see you next week. Aloha.