It's Wednesday, October 27th, 2021. Welcome to What Now America. I'm Tim Appichelle, your host, and today's title is The Facebook Papers: Will Congress Break Them Up? Back in February of 2019, the Pew Research Center, a renowned polling research organization, estimated that 54% of Americans get their news from social media either sometimes or often, and their social media platform of choice is Facebook. Now, let's overlay the testimony from Frances Haugen in front of Congress and, this week most recently, in the United Kingdom. What she has testified, backed by thousands of internal Facebook documents, is that Facebook, as a profit model, takes a well-meaning, mainstream audience on the internet and pushes it toward extreme messages — messages that polarize Americans. And the more polarization, the more clicks; the more clicks, the more views; and the more views, the more advertisers want to put their ads there. It's all about viewership. So guess what? Extremism, misinformation, and polarization have worked out quite well for Facebook, and they seem to be a profit model. Which leads us to our question: will Congress do something about it? And here to discuss that are Jay Fidell, Winston Welch, and Cynthia Sinclair. Good morning, everyone. Good morning, Tim. Serious times and serious topics. Jay, how does Congress begin to scratch the surface of either regulating or breaking up a huge company like Facebook? Specifically, how do they regulate an algorithm, or how do they begin to try to limit content, if they do? Well, the first thing they do is understand what's going on. They understand what the algorithm is. They're not going to be able to regulate it very well unless they understand it, and algorithms are not easy to understand. You have to sort of decompress them and look at the individual coding, which is going to be proprietary.
So it's not easy. But Amy Klobuchar is the chair of the committee that's looking into this, and she swears she's going to do something about it — some serious regulation, somehow. A year or two ago, when Congress looked into what was going on at the big tech companies, they got nowhere, because they didn't understand it, their staffers didn't understand it, and they never really drafted any bills. But now I think it's different. They've had some time to study it, they've had a lot of press about it lately, and I think Amy will do something. But to go to the bottom line of your question: this is going to have to go through the Senate. Remember them — they always vote in self-interest. So let's see. Zuckerberg and Facebook like division. Well, the Republicans like division. Zuckerberg and Facebook did some awful things in various elections, including the Brexit election and, believe it or not, the Haiti election years ago — and it didn't do them any good. And arguably — maybe more than arguably, given some evidence from Haugen and the like — Facebook was doing stuff in the 2016 election and the 2020 election, and it was wrong. It was not just dividing the country; it was pushing voters one way or the other, which is hideous and horrendous in a democracy. And they haven't fixed it. They haven't spent the time, money, and resources to fix it. They've been too busy raking in the coins. Okay, so now you ask — you must ask — what are the Republicans going to do? And they're going to do it as a bloc. They're going to say: we like what happened with Cambridge Analytica. We like division. We prey on that. We enjoy that. That is mainstream for us. We want that. Do you think for a moment they're going to vote to regulate or dismember Facebook?
So is Amy Klobuchar kind of in fantasy land when she says she believes she has bipartisan support for a bill that is either going to look into Facebook as a monopoly or look into restricting content and/or the algorithms? Ask Mitch McConnell. I don't think there's bipartisan support that's going to carry the United States Senate, which is completely dysfunctional. Okay. Winston, I'm going to read a quote from Mark Zuckerberg responding to the most recent trove of documents that have come to light in the media. Here's the quote: "Good faith criticism helps us get better. But my view is we are seeing a coordinated effort to selectively leak documents and paint a false picture of our company." Is Mark Zuckerberg the right person to have in front of the camera to address this serious hole in the boat, this public relations nightmare they're now dealing with? Is he the right guy? Because I hear a lot of people say he may know things on a technical basis, but he should not be in front of a camera. He comes off as insincere and at times snarky. So what are they going to do about it? Is he the right guy? He's the main shareholder. He is the main shareholder. Yeah. What's his shareholding? This is very important, because the question has arisen as to whether he should be CEO. If he controls the board, he can stay as CEO. If he doesn't, the board can get rid of him. What's his shareholding? I don't know the answer to that question, but it's interesting to watch. After Cambridge Analytica, you would have thought there would have been a whole bunch of changes coming down the pike. But we have ceded so much control. People worry about the mainstream media — Fox. That's all true. But consider ourselves: our cable provider knows every time we flip the channel, too, and I'm sure they sell that information wherever they can. Google knows everything that we search.
If you go into your "where you've been" settings — whoa, they know exactly where you were and how long you were there. They put every picture and person together, every query you ever made. And to extricate yourself from this is extremely difficult. Amazon, same thing. But when you go to Amazon, you think, oh, what was that gadget I bought? I wanted another one. What was that tea that I bought? You go into your purchase history and you see what you bought. So we've traded all this for convenience and laziness. But as far as Facebook goes, and the specific algorithms they have used — designed to divide the public, designed to foment animosity — it does need to be regulated. I don't think we need an act of Congress for this. I don't think we should expect one. We have something called the Department of Justice, last time I checked. And there are laws they can probably go back to — like when they broke up the big oil trusts and said this is antitrust, plain and simple. These people have too much control over too much information — information being the product. You are the product, in fact. And when you have a tiny handful of companies — and when Instagram is owned by Facebook, that's even more insidious, because it just dives right into whatever you're doing. They're collecting every click. They're selling it. They're manipulating us still, no matter what they say. He goes before Congress and he looks like the boy wonder still: oh, I'm sorry, I'm totally to blame for this. Has anything changed? I don't think so. And as far as Jay's point — if I were a Republican, or a Democrat — who knows what Zuckerberg is? Remember, his job is to make money for his shareholders. But as far as the Republicans go, they always talk about the anti-Republican bias in tech. So what's to keep them from thinking that Facebook next time around wouldn't be much more so? I want to go back to his quote, if I may, Winston.
I want to go back to his quote, because he said this is a coordinated effort to selectively leak documents. In Zuckerberg's mind, who's coordinating this? Is this some proactive attack on Facebook — is that what he's trying to imply? And then, of course, he's saying it's a false picture of our company. So out of a thousand documents that seem to be pretty damning, how could that possibly be false? Is this a bad PR statement that he's put out, or does it further amplify that he's incapable of addressing the problem? Maybe both. Just because someone says something doesn't make it true. And he says, oh, look, they're so mean to us. Oh, that lady's so mean — she left the company and she's exposing us. We did something bad a couple of years ago, but we're not doing it anymore. Everybody knows Facebook is still doing it. Google is still doing it. You remember their old motto, "Don't be evil"? I think their new motto is "Make money at all costs." And we're all part of it. And we're too small to barely even have control over clicking off the buttons. Google any article on how to reset your privacy settings on Amazon, Facebook, Google — it's exhausting to go through everything. And if you don't get to every single one of the little gadgets you use, it's still collecting. And it's even collecting when it says it's not collecting, or it's collecting in ways we don't even know or understand. It does take the Department of Justice. It may take an act of Congress, Jay, just to be fair, but I'm not expecting a lot out of Congress. The Department of Justice — they go in with some antitrust machinery. Well, I think Amy Klobuchar cited the FTC as the agency that would handle that task, and I think they are. But okay, let me move on to Cynthia here.
Cynthia, I want you to tell me what you think is the most egregious aspect that's come to light from these papers — the most egregious policy or procedure Facebook is conducting. Because it's not just about polarizing Americans. It's also about the kids and the things the kids are tuning in on. Even kids under 13 are getting onto Facebook — even though there's a 13-year age limit, they're still getting in — and so there's content that could be potentially harmful and not helpful. Cynthia, in your opinion, what is the most egregious thing Facebook's up to? Oh gosh, well, it's hard to pick one specific thing, because there are so many. Right now, out of the Facebook Papers, as they're calling them, there are eight complaints to the SEC about misleading investors and misleading the public. When it comes to the kids, I believe we're going to keep this bipartisan sort of approach to dealing with Facebook until we get to the hate speech. We'll keep it while it's about the kids, because everybody wants to protect the kids. But I'm along the same lines as Jay: once they start attacking hate speech, it's really going to curtail the ability of the Republicans to spread misinformation like they have been, and that's going to attack their personal way of doing things in politics. So I think they're going to... Let me interrupt at that point. You said when they start attacking hate speech — is "they" meaning government agencies? Well, yeah, when the investigation starts to go after that and not after what's happening with the children. I have an example that I think speaks to the egregious nature of Facebook, period. First off, Facebook is used by more than 3.6 billion people worldwide. That's how huge it is. Now, from the papers, here's one of the things that has been stated by Haugen, the former Facebook product manager. She left in May, and she took a lot of documents with her, and that has become what we now call the Facebook Papers.
And it's important to remember, though, there was a second whistleblower who came out on Friday, and we don't really know what they're saying yet. But this was the specific thing I took out. Vietnam, okay, told Facebook that if they didn't curtail anti-government posts, they would get kicked out of the country. So did Facebook say, well, no, we're not going to do that, we're going to do whatever we want? No — they didn't want to get kicked out of the country. So it turns out that five times the amount of anti-government posts were censored and taken out during the last few months, because of this big thing that was happening for them over there. So we know he doesn't care. He does not care. This is the first time we really see evidence — that we have proof of — of something that is completely below the ethics line, right? And, you know, Zuckerberg isn't about free speech, even though he keeps trying to make himself out to be this free speech zealot, right? He takes that away as soon as it changes his position in business. We can see that from the Vietnam papers. That is hard evidence of just how much we can trust whatever Mark Zuckerberg decides to say. Okay. You had mentioned that a second whistleblower has come out. Now, I guess my question is, given this quotation Mark Zuckerberg gave during a third-quarter investors meeting, how does that sit with employees? Of all the thousands of employees they have, do you think there are people just steaming, going: I know the truth? That besides Frances Haugen, I have a different set of papers, or a different set of knowledge? Do you expect to see more people — more employees — come out? I do. I absolutely do, because they're not treated really well either. I mean, they might as well go to a job at Amazon for the amount of money they're making, yet they're having to really compromise their ethics.
They have to really look inside themselves and say: do I want to be part of a company that is going to, you know, curtail speech — anti-government speech — just so it can stay in a country? Do I want to be part of a company that puts kids at risk? There's a Washington Post article that starts with the fact that they're subsidizing hate. And that is exactly right. You know, we know now from the papers that they make more money when people are fighting and going back and forth. And I have seen in my own experience with Facebook that they send me stuff just to see if I'll respond, when they know full well that I'm not a Republican. I don't agree with that stuff, but they send it to me anyway, from obscure friends, just to see if I'll get a reaction. And I try not to react. And that's what I say to everybody out there: if you use Facebook, when you see these kinds of things happening, don't just use Facebook blindly anymore. We know what they're about. It's great to be able to contact your friends and all that, but it's also important to watch what they're doing, and don't just throw yourself over and trust completely. Okay. Well, I recall in the 60s there was something called the Seal of Good Practice in broadcasting. It doesn't sound like Facebook is following those guidelines anymore. All right, we're going to switch direction here, Jay, and I want to ask you about a civil trial that's recently gotten under way — they're in jury selection. It's the Charlottesville Nine, if you will. The defendants are those who have been identified as organizing the Charlottesville protests, which of course led to a death and many people injured. And again, it was a protest to basically say, we don't like Jews, we don't like anybody. This is a different strategy. This is not a criminal trial; this is a civil trial. And what I've gathered up to this point is that it's effective because it's bankrupting these organizations that perpetrate hate.
The leaders of these organizations are now paying thousands of dollars for defense attorneys, and it's draining their bank accounts. Your thoughts on this pending trial, and the strategy of using civil lawsuits to undermine white supremacist groups? I like the idea of using civil lawsuits, because in effect the victims have greater control than with a prosecutor. The prosecutor may or may not do anything — that's his choice. And if the prosecutor is a Republican, he won't do anything. But I have a bunch of other reactions. Number one: three years is a long time. Our courts are too slow. Sorry — that's one of the problems in this country. Justice delayed is justice denied. Why did it take three years to get to a jury? That's terrible. The other interesting point about this trial is that a number of the defendants who were named and sued have defaulted, and judgments have already been entered against them. I don't know if the dollar amount of those judgments is settled, but they have defaulted, and they are liable for whatever it ultimately is. And what happened to the prosecution, anyway? What happened to the criminal prosecutors? Does it really have to go this way? And finally, this notion that's been in all the papers that you're going to bankrupt these organizations, that they won't be able to function anymore. Really? In what world does that conclusion live? Because if they drop the one organization — sure, it's bankrupt, that's fine — then 24 hours later they form another, even a nonprofit, tax-exempt organization where the mission statement is a lie. And bingo, you've got a replacement organization, that quick. So the bankruptcy of these organizations is not really effective. I'm sorry. You have to go after the individuals.
And I don't know whether they're doing that in the Charlottesville trial. They should be hitting every single officer and every single director of these organizations and bankrupting them individually, not just the organization. Isn't that what the FBI did in the 1930s — made these cockroaches scatter for cover? And they didn't really rear their ugly heads again until, you know, some protests in the 1970s where people were killed. But didn't the United States government take a very aggressive stand against white supremacist groups — the KKK and others? Yeah, in the thirties. But in the thirties there weren't an awful lot of them, and in the thirties it's hard to say that the FBI was all that effective. And it's hard to say that the FBI and other prosecuting organizations are being effective now. So maybe the civil trials would be a solution, because you don't have to rely on governmental institutions and prosecutors who could be politicized. You go against the individuals, though — that means a lot more than going against the organizations. Organizations can be recreated overnight. Yeah. All right, thanks, Jay. Winston, to you, almost along the same line: Christopher Wray, the FBI director, spent an hour or so in front of Congress basically noting the threat of domestic terrorism, and certainly white supremacist groups are a huge element of that. Do you agree with Jay that these organizations can really fold and start up again, and that the United States government, specifically the FBI, is helpless in how it tries to tackle and prosecute those who are trying to create insurrection and racial crimes, hate crimes? Oh, the government's not helpless at all. It's maybe lacking in some willpower, though I don't think manpower. They're tracking tens of thousands of people, I'm sure — and probably hundreds of groups, maybe more, domestically.
And I think they're assisted by other groups like the ACLU or the Southern Poverty Law Center; they track these extremist groups. And I think a lot of the time they're preventing stuff we don't even know they're preventing. It's just happening. And that's a byproduct of our monitored society. One way these groups are communicating is on platforms like Facebook, you know, or Parler or something. So there's something to be said for having them be organized rather than secretive — for being a 501(c)(3) or whatever they are, rather than just people who meet down at the river like they used to in the old days. Are you going to get rid of this? I mean, my question is: what happened to our nation? What happened to just common values, where we have to rely on these things? You know, and just back to Facebook for a second — this is a very confusing issue, because Facebook went down a few weeks ago in a worldwide outage. And I missed it. But apparently a lot of people didn't. "Didn't you hear?" I said, "What?" Facebook was down for 12 hours, and you would have thought the world would come to an end. And for many people, it did, because their businesses rely on it, they do their meetings on it, they have their very good organizations on it. It's the primary method of communication for a lot of these things. How we're going to break this up, how we navigate this very complex system — not to mention the white supremacists and all of that — these are really advanced questions. And I hesitate, you know. We don't have speech laws in this country; we have free speech. In fact, it's enshrined in our Constitution. People can say, in a sense, whatever they want to say, however they want to say it. The only thing that stops them is these corporate regulations that say, hey, you're posting hate speech here; we're banning you for it.
I know I posted some things on Facebook where I got put in little Facebook jail. And I said, hey, this is not hate speech, it's not political speech — it was just for an organization that automatically gets flagged for everything it puts up. What about if, you know, someone decides that ThinkTech had a show they didn't agree with — and then suddenly it gets... Well, that would be this one. It would be this one. But they ban all of ThinkTech. And then they say: Google, you too — you've got to disappear the words "ThinkTech" from your search algorithms, and, you know, Tim's name and all of that, so we can't be found anymore. That's what Cynthia was talking about in Vietnam. They do it in China. They do it in Russia. They've just blanked out entire types of groups. And that's the real danger we run when we say the government needs to step in here and say, oh, this is hate speech, or this group is a hate group. I mean, the NRA might be declared one one day, and, you know, the ThinkTech show What Now America might be declared one the next day. And that's not the society we want. So we've got to navigate very carefully here in how we go about this, I'm sure. Okay — so I'm opining. Yeah. Yeah. Okay. Thank you, Winston, very much. We're almost out of time, but I did want to get to this, and Cynthia, you're going to have the last word on it. You know, one of the FDA's committees has just okayed a vaccine for five- to 11-year-olds. But do you think parents are on board with that? Some are and some aren't. You know, even people who aren't anti-vax are nervous and afraid. It all seems so new to people. They were nervous and afraid to even get it for themselves, so of course they're going to be nervous and afraid about it for their kids. It's just going to take some time. It's going to take a lot of PR.
It's going to take, you know, a lot of positive results. And I think it's important for people to remember that 800 kids have died since the beginning of this. That's a lot of kids. We say it doesn't affect the kids, but it does. I wish I could remember the exact number, but I know five times as many kids were hospitalized from COVID through this pandemic than are normally hospitalized in a year from the regular flu and other things. Five times more — we know they are affected. And, you know, we heard all that talk in the beginning from the last administration: oh, kids aren't affected, it doesn't hurt kids. Well, you know what? In my mind, one kid is too many. So what can we do to stop that? I think that's where our scientists' minds are: one kid is too many — what can we do to stop it? And I think that's really important. And I just need to say one thing about where we were before, regarding this whole trial on behalf of the nine who were injured in Charlottesville. I think something they're trying to do is establish the difference — and get it into court, so there's legal language that separates free speech from planned, encouraged violence, right? And it needs to go through court to be able to get that, and it's precedent we can use for the January 6 people also. I'm pretty sure the Supreme Court in the 1960s decided many cases that did separate free speech from destructive speech, and I'm pretty sure those cases are still on the books. They are, and hopefully they will come into play during this trial, because it really does set a precedent for some of the January 6 cases going forward. Alrighty.
Well, we're out of time, but I want to go around the horn and get everyone's last thoughts and opinions about either what's transpired this week or what's coming up in the next few days or the next week. Jay, to you. It didn't start right now, but it is accelerating right now — the possibility that our First Amendment will degrade. It is degrading. It's hard to say how it's going to come out, but if you regulate the algorithm, if you regulate Facebook, if you try to regulate speech in general, it will be regulated, and the First Amendment will be undermined. I can't say exactly how, but I just feel we're on a sea change there. And the change is little by little: free speech is going away. So get used to it. Get used to it on Facebook; get used to it in all manner of things. And before you know it, we'll be in 1984, where it's not so much that you will feel a lack of freedom to speak, but that other people will have the freedom to take draconian steps against you for what you have said. And fear will be the central point. People will be afraid of expressing their views. I think that's already happening in some ways. How would you like to be a member of that jury in Charlottesville? How would you like to get death threats because you're a voting official? It's not so easy to speak out anymore, and it's going to be less easy going forward, no matter what Amy Klobuchar does — and in fact, maybe in part because of what she does. So I agree with you, Jay. Ask anyone in Congress who dares speak out about dear Donald Trump. So I agree. Winston, your last thoughts? Jay just left me demoralized, but I think he's right. What he's saying — and it's what I was getting at too — is that we get on a slippery slope. Of course you can't incite people to violence. You can't yell "fire" in a crowded theater. That's true.
But when we start going down the slope of "this speech is okay and that speech isn't" — and where that line is drawn, when it may just be unpopular speech... Of course, many of these people are absolutely reprehensible in their speech. We know that and we understand that. And ideally, the sun shining on them has a purifying effect: it puts them out there, and they go into the pillory, the hall of public shame. Unfortunately, our hall of public shame is not the same as it used to be, and people have a different set of values — somehow we just diverged. Facebook was part of that; it exacerbates it. It'll be an interesting thing to see where all of that goes, and how they break it up and how it comes into pieces. Because unless we all stop using it and go back to the Stone Age of communications, it's going to be around. So, a lot of very complex issues — ones that we tackle here on ThinkTech, which is a free speech zone. We can say what we want, and that is a precious thing, not only for our viewers but for our nation and for us as individuals. Well said. Cynthia, your last thoughts? I agree with Winston — that was very well said. I have a quote, but I want to say something real quick first. One of the things this Charlottesville trial can show — they're looking to show that the violence was planned and encouraged, that weapons were used. They dissect the coded language and the cloaked language, which is the most important thing here, I think. They hide their plans under "just joking," right? And there's a precedent that's already been set in that regard, for Taylor Dumpson. In that landmark case, she won $725,000 against the neo-Nazi Andrew Anglin because of the "warm welcome" posts that were put out. They never said, you know, go out and harass this lady; he said, go give her a warm welcome, when everybody knew what they meant. And the same...
Sarcasm — sarcasm, as Donald Trump did a lot of during his presidency. To your quote, please — we've got time. I was done anyway. It's from Robert Kennedy: "Every time we turn our heads the other way when we see the law flouted, when we tolerate what we know to be wrong, when we close our eyes and ears to the corrupt because we are too busy or too frightened, when we fail to speak up and speak out, we strike a blow against freedom, decency, and justice." Robert Kennedy. All right, that's a great way to end the show — a great quote, and it's true, fully true. I'd like to thank everyone who joined us today: Jay Fidell, Winston Welch, and you, Cynthia Sinclair. Thanks for joining us on What Now America. Please join us next Wednesday at 11 o'clock. I'm Tim Appichelle, your host, and we'll see you then. Aloha.