All right. Hello, Charles. Thanks for joining me. How are you doing today?

I'm very good, Chris. It's really good to be here.

Yeah, I'm glad we were able to set this up. I hadn't heard of the book, then I looked at it and thought, OK, I need this. So Social Warming is your latest book. Can you discuss what inspired you to write it? Or, to start, can you define what you mean by social warming?

Let me start with the definition. So social warming is a phrase that I came up with. Obviously, I'm echoing the phrase people use all the time, global warming, which is that very subtle, very gradual, but completely irresistible way in which the planet is getting hotter because of the things that we are doing. All the scientific evidence is there, and we can see it all around us. Very subtle things, like the winters are just a bit warmer, there's a bit less snow. And you get extreme conditions as well, so once-in-a-century hurricanes now hit once every 20 years in Louisiana, for example. All those sorts of behaviors the climate gets up to. And so I was thinking about social effects, about the effects of social networks. The thing that got me really interested in it, which I know has also caught your eye, was the 2016 election of Donald Trump, which stemmed to some extent from a very small number of votes in three critical states: Wisconsin, Michigan, and Pennsylvania. Fewer than 100,000 votes across those states swung each of them from potentially being Hillary Clinton's, because they'd been Obama's before, to being Donald Trump's. And that was a very small fraction of the total number of votes cast, even in just those states.
And compared with the total number of votes cast across the whole of the United States in the 2016 election, it was a tiny fraction. That got me thinking about very small changes that make very big differences, and also about the fact that the Trump campaign had focused a lot of its effort on Facebook advertising, on what to many people seemed like a new medium for trying to persuade people either to go out and vote or, in some demographics, not to go out and vote. It was just a small difference, but it may have been sufficient to make enough of a difference. It might not have been all of those votes you could attribute to Facebook, but it could have been some. And that's the sort of warming I'm talking about. It's these very small, almost analog effects that tip something over from being in one state to being in another state. It's rather like taking water at zero degrees Celsius, where it might still be ice, and then taking it to 0.1 degrees, and you've got water. It's a phase change, a difference in how things behave. So that was really what got me thinking about the whole social effects of social networks, because I've been a technology journalist for a long time. Decades, I'm afraid. I wrote my first stories on literal physical typewriters, with a carbon copy; that was a real thing for me. So I've seen the technologies come along. And social networks have been one of these things that have really made a very big difference, I think, to how people think about what they know, what they don't know, and how they find out about things. Google was one of the things that really changed how people thought about: how do I find stuff out? How do I know stuff? And social networks have had a similarly dramatic effect.
I'm not sure it's been as good as Google was in its first days.

Yeah, exactly. And I really like that analogy you made; it's one of the reasons I was drawn into the book. In a similar way, I'd always been kind of, oh, who cares about politics, who cares about anything, there's nothing I could do. But in 2016, I took a step back and thought, something crazy is going on. That's when I really started reading books, because I wanted to educate myself, and that's why I've read so many freaking books. One thing I thought was interesting, and I'd love to get into your head a little bit, is that there are so many books on social media. There are books saying social media is destroying our minds, books on how it's affecting our kids. But as I mentioned when I reviewed the book and sang its praises on Twitter, you touch on things that I haven't seen elsewhere. So when you were writing this, did you feel there was a gap that needed to be filled? Were you thinking, how come nobody's talking about these aspects of how social media is affecting the world?

I guess I started with the feeling that this was something that was happening, that social warming was a thing. And then I went out to see, in a sort of scientific way, because I've got some scientific training, whether I could put together the bits that would make the hypothesis work. So a lot of what I describe in the book comes out of looking around at: what's the science on this? And there are quite a lot of scientific studies in the book.
I hope I make them as accessible as I can, and I try to relate what they show to the way that people behave. But it's a four-step process. First, we're all tribal. There's a theory that about two million years ago there were really very few humans and survival was at a premium, so you had to stick together, and that made us tribal. You had to be in the tribe. The tribe supported you; you supported the tribe. If you didn't do things that were good for the tribe, you could be chucked out of the tribe, which basically meant you were dead. And that is what gives us the feeling of outrage. If you offend against the tribe's sensibilities, the things that keep the tribe going, then people react by being outraged at you, and you could be chucked out of the tribe. So, in effect, you create the idea of the in-group and the out-group. You are tribal: these people are in our tribe, these people are not in our tribe. And outrage is one of the ways you sort between them. Nowadays, of course, the tribalism doesn't matter in the same way; we're not so worried about our survival. But we still have the function of outrage. The next thing that happens is the social networks get involved. You have algorithms, and the algorithms look at what we do. They have no understanding of the concepts of tribalism or outrage. They're just sorting and selecting for what they think is engagement. To the algorithm, engagement is you spending more time on the social network, and that, to it, is a good thing, whether it's people shouting at each other, calling each other names, or spreading misinformation. If time is spent on it, it's good. So the algorithm reinforces the outrage and the in-group/out-group sorting. And finally, the fourth part of this four-step process is moderation, or the lack of it.
The social networks are not interested in strong moderation. That's not good for them, because if you throw people off your network rapidly for spreading misinformation or stirring up outrage for no reason, then you reduce the time they're going to spend on it, and obviously you get fewer people on it. And this lack of moderation is the thing that allows the whole cycle of tribalism, outrage, and algorithmic reinforcement to keep on happening.

Yeah. That's something I really got interested in as well. In 2019, I had the algorithms working against me, with outraged mobs coming after me. That's when I started learning about tribalism and looking at evolutionary psychology. I wanted to understand group psychology, to see what was going on. And you discuss how these algorithms are amoral; they just do their thing. But one thing I'm curious about whenever I read one of these books: there's that thing, the curse of knowledge, right? This is something I've been interested in. I grew up in the 90s. I was there when social media started, and it helped me because I was socially anxious. I thought, cool, I can talk to people online. So I've been in it ever since. So when you were writing this book, and I ask everybody who discusses this topic: how big an issue do you think the lack of awareness is? Do you think there are still millions or even billions of people on this planet who don't understand that these algorithms have their own agendas and are polarizing us?
Because that's what I'm always wondering: if there are, how do we get these books in front of them?

Sure. So the question is: are people unaware that the algorithm is doing these things to them? I think the vast majority of people aren't aware of the mechanics of it. They're not aware that the things you see are being picked for you, chosen on the basis of the attention you've shown in the past to other things. I forget it myself from time to time. I'll look at a post and think, oh, this is interesting, and the feed says, well, that's it for the posts. And I think: is that really it? Is that really everything? Or have you just algorithmically chosen not to show me anything else? I think the vast majority of people have no conception of the indifference of the algorithms behind these things to their benefit. It's not about your benefit; it's about the network's benefit.

Yeah. So do you think there's a certain demographic that's less aware? For example, I'm 36 years old and my son is 12. Younger people are even more involved in this stuff than we were. Do you think there are certain demographics that are unaware that the algorithms are picking, choosing, and manipulating, trying to keep you on as long as possible by showing you certain things?

I think it varies. Some people are aware of it in some places and not in others. TikTok is interesting. I didn't really have time to examine TikTok in the book, and its influence is a little difficult to discern. But one of the interesting things about TikTok is that it's really, really powerfully driven by the algorithm.
And it's not even explicit. You don't tell it, I'm interested in sports and poetry. None of that. It just looks at what you're interested in. It measures every microsecond you spend looking at things. Do you flick down, do you flick up? Do you spend a lot of time on this? Do you like it? It looks at those things and figures out what you're about really quickly. So I suspect the younger kids are able to see that that's happening; they're able to perceive it in some places. On TikTok and on YouTube, I think it becomes more obvious. And when their idols, people they think are really great, are somehow cast down by the algorithm, I think they notice that too. I've got a son who's 20, and he talks about things he puts on YouTube. He says, yeah, I don't even bother with the SEO; I'm not trying to feed the algorithm on this. So there is awareness of it, to some extent. The problem is the older demographic, the people who think they're well informed and know stuff but are actually badly informed, who don't realize that when they do, for example, a Google search, the search is not neutral. It's not giving you the 10 best results; it's giving you the 10 results it thinks you'll click on. And that's a distinction most people are not aware of.

Yeah, I was actually just listening to a podcast this morning, and it's so true. If you're trying to find the actual quote or source, to make sure it wasn't tweaked or manipulated, it's really hard. There are certain things I've looked for, and I know all the tricks: I'll open incognito mode so it doesn't have any of my cookies, and I'll try to find stuff that way.
And even then... Part of my education is that by day I work in digital marketing, so I'm into SEO and all that. But even when you try to get around it to find different or better information, it's hard, because there are people investing a lot of money who understand how these algorithms work. That's something else I don't know if people understand. But yeah, on TikTok: I've played around with just about every social media app, and TikTok is insane. It will send you down rabbit holes. As I mentioned, I have a 12-year-old son, and my girlfriend has two younger brothers around his age, and seeing the kinds of rabbit holes it leads them down kind of blows my mind. I can see how these things happen. Fortunately, we're able to talk with them about it and explain it to them. So I personally put a lot of responsibility on parents: you need to understand how this works so you know what your kids might be getting into. But with you being in the tech field for so long, writing about all this for decades, you talk about the early days of the social media platforms. Did you or anybody see this coming, this kind of social warming? Or was it so gradual that by the time you realized it, it was too far gone?

No, no one had any conception of what it would be like at all. In the early days of social media, it really was the sort of thing where you'd come in and Facebook would look different, or Twitter had added this whole new feature called retweeting.
And things were sort of incremental. You'd see that there seemed to be more people joining them, but there didn't seem to be any ill effect from it. One of the key ways Twitter has made a difference to the world, I think, is the way it's become a conduit for news. Often, if there's a big event, people who are on Twitter at least go to Twitter to look for things about it, and Twitter tries to encourage that. It has its events and its Moments stuff, and it has various tabs where it tries to show things that are happening in the world. And that's had an effect, as I write in the book, on journalism, on how people try to get their information. It's significant, I think, that people go to Twitter rather than Facebook for news, because the algorithm is so powerful on Facebook that it won't give you breaking news. For example, during the disturbances in Ferguson in the US, in, I think, 2014, the big thing on Facebook was the ice bucket challenge. That was the main thing being shown, whereas on Twitter it was all over the place. So there is that difference. But no, we didn't have any idea that it would come to this, where you'd have waves of outrage flooding around the place, where Facebook would be radicalizing people on the far right in the US to go and protest in bizarre ways and try to invade the Capitol. You'd have to be a science fiction writer, to be honest, back in 2006 or so, to write something like that: to say, give it 15 years or so and this will be wildly out of control. That just didn't seem feasible.
Again, that was back in the days when the principal means of getting information were (a) news websites and (b) Google, which was reflecting what the news websites were telling you. If you wanted to find something out, you'd go to Google and it would probably direct you to a Wikipedia page. And actually, through all of this, Wikipedia remains the place where you can at least get some sort of sensible, reflective, informative news, or certainly information. But to a large extent, all the social networks, certainly YouTube, and to a large extent Google, have been sort of corrupted. Maybe that's too strong a word, or maybe it's the right word. They've been overtaken by the people who want to profit from misinformation, and there's a big market in profiting from misinformation now.

Yeah. And that transitions perfectly into my next question, because you spend a chunk of the book talking about the early days, and a lot of these platforms started with good intentions, right? For example, there used to be only a few main news sources, which was good and bad. Social media created this public forum where people could have conversations; it leveled the playing field so anybody could start a blog and put their opinions out there, and it got rid of some of the gatekeepers. So there were a lot of good intentions when this all started. And I'm curious about your thoughts. You just mentioned some of the money incentives: there's a lot of money to be made by people who want to spread misinformation. Do you think there was a point where some of these platforms went wrong, where their values shifted away from those good intentions?
I don't think there was ever a particular point where that happened. I think it's built into the system. If your basic model is: we get lots of people, we algorithmically hold them on the network by showing them things we think they'll engage with, and we make money from showing them adverts because they spend their time there, then it's built into the system that if people can post their own content and somehow monetize it, you're going to get false content, you're going to get misinformation. The thing about misinformation, and again this is in the book, the reason why fake news, false rumors, spread so much better than the real news, the actual truth, is that with fake stuff you can say pretty much anything you like. You can tweak it to what people are interested in. And there are sites out there that call themselves, quote, satire sites, which are actually just fake news; they're just trolling for ads. They write all sorts of crazy things that aren't true but will encourage people to click on them. That's the advantage they have over the truth: fake news is far more attractive, because you can make it up. It's why stories attract us so much more than the nuance of reality. You see it in things like the Afghanistan pullout; there have been all sorts of stories going around. The most recent one is about a guy who was supposedly hanged from a helicopter. Turns out that's not true. In fact, it seems he was perfectly well alive and trying to paint something. But that's not the story. It's so much more convenient to tell the false story, because it gives you a way to get at the other side. It calls on the tribalism.
So the networks would have to take the gigantic step of moderating everything and saying: well, this isn't true, so I'm going to kick this post off the network. Which, to some extent, they did during the pandemic. They faced up to the whole question of: OK, this is a deadly pandemic, we've actually got to take our responsibilities seriously now. And they really did almost a 180 on what they thought about whether to allow things to stay up or remove them. Facebook in particular, which for years had been saying anything's fine. Mark Zuckerberg had even gone on the record about Holocaust deniers, saying something like, well, everyone's got a point of view. Those weren't quite his words, but it was something like that. When it came to the pandemic, they really changed their tune and started removing people and removing content that they thought was actually going to be harmful. In the same way, they've changed their tune about Holocaust denial, and they now remove that. So there's evolution there, and I think they're starting to see that it's not actually a great model to be like this all the time.

Yeah. That brings up an interesting story. Real quick: I primarily grew my audience on YouTube, and for a while, especially during COVID and all the misinformation, I wanted to debunk some of this stuff. I was really trying to teach people to recognize their own biases and how to think better. Anyway, before YouTube really cracked down on QAnon channels, I made a video debunking a QAnon conspiracy video about COVID, one saying don't wear masks and all this other stuff. This was pre-vaccine.
Well, something happened with YouTube's algorithms that ended up taking my video down and giving my channel a strike, and you get three before it deletes your channel, right? But you do have the opportunity to manually appeal, so I did. You get a pretty quick response, within 24 hours, and they said: nope, we're upholding our decision. That's when I got to the point of thinking: OK, so you have the algorithms, and your second layer of quality assurance is a human, but if the human isn't actually checking, that's a huge issue, right? So my question is, and I try to empathize here, I was talking with somebody about this yesterday: YouTube, for example, has thousands of hours of content uploaded every single minute. So you've got to outsource some of that to the algorithms, say, hey, check this stuff, and then it filters up to the humans. But even still, I wonder: is it even realistic to have enough manpower? So I empathize with them. But my video didn't go back up, and the other video didn't get taken down until I got the press involved. So I'm curious about your thoughts. Is it even something that can be done, having that many humans moderating and checking, if you don't want bad stuff getting out to everyone in the world?

The failure is in the model; it's absolutely built in. You can't do it perfectly. And the algorithms are always going to fail as well, because they don't understand the nuance of reality. What is criticism? Saying the same thing but in a particular tone of voice; what is a tone of voice? You can't program this stuff in. And if it's pulling extracts from a video, with "but this" and "but that", it's just going on trigger words.
And the person who rejected your appeal probably had a thousand of them to deal with. Are they going to take the time to actually go through a whole video? I find it pretty exhausting just trying to watch videos on YouTube anyway, because unless it's a subject I've particularly chosen to watch, all the introductions and getting used to the people is cognitive load. I'd rather just read it. YouTube has set itself up to fail in that respect. It can never be perfect. You could only do it if you basically asked, I don't know, everyone in India and maybe China as well to check each video as it went up. Or else you have trusted sources, and you say: OK, stuff from this TV station, stuff from this producer and that producer, this stuff we accept because it's already had a gatekeeper. Everything else goes into a pile, and we don't entirely trust it. But YouTube doesn't want to do that. They don't want to be difficult in that way and say, well, we don't actually trust 99% of what the world uploads to us, because that puts them into the difficult position of: are we here to put stuff out and be a video platform, or are we actually in some way a publisher? They never want to be a publisher, because that comes with all sorts of other responsibilities. They just want to be a place where stuff happens and they show people ads. They'd be so happy if it wasn't for the humans; this would all be great. It would be a terrific thing if they could just get algorithms to create the video content and algorithms to review it, and humans to watch the ads. They'd be in heaven. But sadly, the world's not like that.
Yeah. So I'm wondering what you think about this, because you reference Wikipedia. Wikipedia is largely crowdsourced fact-checking: there are volunteers, people who come in and say, oh, this is bad information, and change it. I've recently been educating myself on how the wiki ecosystem works. Then recently, Twitter came out with this new feature called Birdwatch, and as somebody who cares about the truth, I got into the program. I'm not super active, but I like to check and see what people are saying. For those who don't know what Birdwatch is: it's the community kind of policing each other; they can go in and say, this is misinformation, or whatever. I'm not a thousand percent sure whether it's trying to train an algorithm or what the end goal is. But I guess what I'm getting at is: do you think that's a potential solution, getting the community to flag stuff as misinformation? I can see how that could go terribly wrong, but I can also see how you then don't have to hire a bunch of people, and you get people who care about the truth policing the community. What do you think?

The difficulty is, I mean, in principle, yes, it works. But the difficulty is always how you reward people for doing it. If the reward for doing it is some sort of status, then the incentives are bad. But if the reward is simply that you get accuracy, then the incentive is good. And people prefer status. People are very status driven. One of the people you had on before, I think it was Chris Pail, was talking about how we're very status driven when we go on these networks.
The thing about Wikipedia is that actually no one sees who the Wikipedia editors are unless you dive into the ecosystem around it, go around the back of the building, as it were. And that, I think, is part of why Wikipedia works: you don't know who the hell the editors are. Whereas if you have something like Birdwatch, where you can say, hey, I did all this fact-checking today, that carries the risk of bad incentives, where people gather around someone and say: this person's fact-checked all these things, so they're right, so they should fact-check these other things. And then you have the problem of domains of expertise. If you're a virologist, are you therefore competent to be an expert on epidemiology, and vice versa? If you're really good at epidemiology, are you the sort of person who should be fact-checking stuff about education? How narrowly do you define the domains? It becomes really problematic. And in the book, I talk about Facebook's fact-checking efforts, where the algorithms, it seems, would pull out things that were questionable and send them to fact-checkers, humans who would then laboriously go and check the facts and slap labels on things to say whether they were false. The problem being that it took too long to do the checking, and the stuff that was false would be up there in the meantime. By the time people had actually checked it, it had basically had its viral lifetime; you couldn't catch it in time. Again, you run into the problem that fake stuff tends to be more popular than true stuff. A lot of what I'm thinking about here is very much in the context of the pandemic and anti-vaccine thinking, which is actually physically dangerous.
And yet the problem is: how do you create incentives for fact-checking this stuff that work effectively? I've basically given up now on trying to discuss this stuff with people who are anti-vax on Twitter. You get people in your replies about it, and I just block them, because I know they're not going to change their mind. I can see the flaws in their thinking. I had someone today saying to me, oh, well, these mRNA vaccines aren't live attenuated. And to explain why an mRNA vaccine cannot be live attenuated... it's a completely meaningless idea. But they'd heard the phrase and thought: live attenuated, that's a thing, that's how vaccines used to be made. And that's not how you make an effective vaccine these days; you couldn't have produced it quickly enough. The level of misconception is colossal, and how you break through it is one of the big problems of our age. The other problem I see is that, rather as global warming is enabled by us all driving cars around, social warming is enabled by the way this sort of misinformation can be picked up and spread. It's really hard to put the genie back in the bottle, to get the misinformation to go away, because the answer to misinformation doesn't seem to be more information. And I'm not entirely sure what the answer is.

Yeah. And I love just seeing what people's views are, and you brought up a lot of nuances and things to think about if we did do this kind of crowdsourcing. You mentioned status: I actually talked to Will Storr, and at the time of recording this, his new book, The Status Game, comes out this week.
And in his book, he talks about how you can gain status within your in-group, right? So it's not even in-group versus out-group; within your in-group, you're rising in status by being even more anti-vax or anti-mask or whatever. So one thing that I'm constantly wondering: let's say the social media platforms, Mark Zuckerberg and Jack Dorsey, do the best they can and come up with some great rules. The question then that I wonder about, and it's kind of something that you're talking about: do people actually care about the truth? Just for example, I had John Petrocelli on here; he wrote the book The Life-Changing Science of Detecting Bullshit, right? And it's about caring about the truth, because bullshitting is a complete disregard for the truth. So I'm wondering, because I'm trying to learn more about group identity and stuff: when you give people the truth, when you give them the evidence, do they care? Even if Facebook and Twitter only showed you the truth, that doesn't get rid of certain news outlets and stuff like that. So do you think people are just driven by their confirmation bias and group identity? You know what I mean? Oh, yeah, people are enormously driven by that. The trouble with the truth is that sometimes it's really hard to understand, and sometimes it's very nuanced; it's got subtleties embedded in it. So for example, in the UK, children between 16 and 18 are only having a single vaccine shot, unlike the US, where they're having two vaccine shots.
And the reason for that is because the committee that looks at the safety of vaccines here reckons that the risk of myocarditis or pericarditis, both forms of heart inflammation, is slightly elevated after the second shot, but not elevated enough to be worrisome after the first shot, which still gives you protection from the virus. So the truth about that is quite subtle. It requires a balancing of risk. And if you say this to people, then they go, well, that means the vaccine is not safe, and it becomes a way of strengthening their existing beliefs, a confirmation bias thing, if that's what they feel. Or else, if they're people who are very much in favor of vaccination, they say, well, really, that's not a problem, it's so small anyway. And the thing of saying, well, you've got to weigh it up, you've got to be open minded to the possibility, but once you've got the definitive evidence, then you stay on one side or the other: that's a difficult thing to explain to people. A lot of it, I think, comes from the scientific process, from being able to consider lots of different things as possibilities, throw them away when they're shown not to work, and then move on to the things that you do know are true. The difficulty is, as you say, you say this stuff to people and they won't necessarily accept it, and high status people, which might just mean someone with a lot of followers, are going to be very reluctant to show that they've changed their mind. They've got to be quite humble to accept that they've been wrong, that they've been saying things that weren't true or that they can't support now. That's one of the most difficult things for people to do generally.
And, you know, weirdly, journalists actually get a lot of practice at this these days, because things come along and you say, OK, well, we thought it was this, and now it's that. So for all the ways that people attack journalists on social media, journalists actually get really good practice at saying, yesterday we told you this; today, actually, it's different. A lot of people are not accustomed to that sort of flexibility of thinking. If the social networks were to show you just the things that were true, then probably another social network would spring up which would say, now we're going to show you the things that are really true. And there'd be fresh biases there. To some extent, the problem is in ourselves. The problem is in our inability to break out of our ways of thinking, our rigidity of thinking. And it takes a lot of practice. Yeah, it's definitely difficult and something that I spend a lot of time trying to figure out, right? Because we get defensive when our beliefs are challenged. And like you mentioned, let's say somebody who's been very anti-vax or anti-mask or whatever realizes they're wrong. There are so many things that keep them from admitting that they're wrong. Their status, right? What if their job were put at risk? Think about a news person who's been doing this and then one day they snap out of it. And what's crazy is we've recently seen, especially here in the States, conservative news hosts dying from COVID who were against the vaccine. You know what I mean?
So I'm like, how many friends do they have that are also loud voices that would say, oh, crap, you know? But then maybe their media organization would get mad at them, because there's this admission of being wrong, and then are there legal issues, and all that. So it seems like there are also incentives to not admit you're wrong when you have that platform. You know what I mean? Oh, absolutely. I think it's three radio hosts now who have said, you know, COVID is not an important thing to worry about, who have died in the U.S. from it. Clearly the problem is being a radio host; it seems to be a real indicator of risk. But in many cases, it's interesting: I've seen reported examples of people on their deathbeds where they said, crap, I really wish I'd got the vaccine, and people who don't at all, who insist to the end that actually they're fine, they're going to be OK, and they don't regret anything. So if you're not going to have a deathbed conversion, then I guess you're pretty set in your ways. And again, how do you get out of an information ecosystem that's, I would say, deteriorated so badly that people can't change their views? I wonder to some extent whether the U.S. has this worse in terms of polarization because it's such a strong two party system, where it's really difficult to be in the middle. There's sort of no center party in the way that there is, for example, in the UK or in other Western democratic countries where you have multiple parties at elections. And I think that to some extent the polarization in the U.S. is also a result of the rural-urban split. But the thing is always that the social networks exacerbate it. Oh, yeah.
It's to their benefit for it to go on, because that's what the algorithms see as engagement. That's what they see as a good thing. This is the effect, this is the social warming effect that you see all around you: people seem to be a bit angrier, they seem to be a bit more set in their ways. And I know, I think it was Chris Bail who was saying that the echo chamber thing is not quite true. People see stuff that comes from outside the echo chamber quite a lot. But what it does is it reinforces what they think. Yeah, they stick more closely to what they believed before. They don't change their minds. And again, this is the problem: how do you get a population that will be open to new ideas? I mean, you look around the world, and is there any country where people are like that? The one place where you might think it sort of happens is China. But the way that happens in China is they tell people to have a different idea. It's an authoritarian system where, if they say, OK, kids are now going to have one hour a week to play video games, then that becomes the way that kids are. And that's not quite the sort of model that Western democracies are going to follow. Yeah. Yeah, it's weird. It almost seems like incentive structures as a whole would have to change, right? There would have to be some kind of incentive for wanting to change your ideas, where the incentive isn't, like you mentioned in China, hey, change your idea to this. But yeah, we were kind of talking about pundits, people with big followings on social media. So something I'm hoping you can unpack a little: one part you mention in the book is that we have this multitude of choices for sources and news sources, and you would think that would create more equality.
But you say it actually creates more inequality because there are so many choices. Can you explain what you mean by that, how that happens? Sure. So there's an effect called the power law, which is seen a lot in nature when you have self organizing systems. Most people are familiar with the idea of the bell curve: how tall are people in the population? How much do babies weigh when they're born? That's the familiar bell curve, the one that goes up, has the big middle, and then goes down. The power law is very different. The power law is really big at one end, and then it falls off really fast and tails away towards infinity. And the power law describes things like how big cities are, how many people are in a city, or the size of beaches, or the size of forests. It's a thing to do with self organization, the way that things come together when they have the chance. And what you find is, for example, with blogs: when blogging first started getting going, that was great. Anyone could start a blog. What soon turned out to be the case was that you had a few bloggers who had really big followings, and then you had a gazillion other bloggers who pretty much nobody linked to, nobody read their stuff. If you were one of the people down the long tail there, it was quite discouraging: you'd write something, and maybe once a week someone would come along and maybe put a like or a little comment on it. And you'd link to the big bloggers, and you'd think that might help, but actually all that seemed to do was push them up in the search results. So the power law is a very powerful process that is a sort of natural outcome of how we are as people. We look at a few sources and we discount many others.
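The power-law shape Charles describes can be reproduced with a toy "preferential attachment" model, where each new link is a little more likely to go to a blog that already has links. This is an illustrative sketch, not anything from the book; the blog count, link count, and seed are made-up parameters:

```python
import random

def simulate_links(n_blogs=1000, n_links=20_000, seed=42):
    """Toy preferential-attachment model: each new link chooses a blog
    with probability proportional to the links it already has (plus one,
    so unknown blogs still have a chance)."""
    rng = random.Random(seed)
    # 'pool' holds one entry per existing link, so picking a random
    # entry is automatically proportional to current link counts.
    pool = list(range(n_blogs))
    counts = [1] * n_blogs
    for _ in range(n_links):
        blog = rng.choice(pool)
        counts[blog] += 1
        pool.append(blog)
    return sorted(counts, reverse=True)

counts = simulate_links()
top_share = sum(counts[:100]) / sum(counts)  # share held by the top 10% of blogs
print(f"top 10% of blogs collect {top_share:.0%} of all links")
```

Run it with different seeds and the same shape emerges every time: a handful of blogs hoard most of the links while the long tail gets almost nothing, even though every blog started out identical.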
And you see the same with social media followings: you have a comparatively small number of people who have gigantic followings, you know, 14 million followers, 20 million followers, whatever. And you have bazillions of people who have 10 followers or whatever. And for them, life on a social network seems like a scrabble, almost, to get attention, which is the thing that we crave as humans. We just love the attention. And so they'll respond to the big accounts; they'll give the big accounts the responses, the interactions, that mean those accounts get even bigger. They follow, hoping that they'll be noticed, you know, sort of "notice me, senpai" stuff. And they never get the love back. So that's a process by which it sorts naturally into this system whereby you have colossal numbers for a few people, very powerful influencers, and most of the rest of the population doesn't get much of a say, doesn't have much influence, doesn't have much effect on what is said. They can agree with some of the big accounts. They can be part of a huge wave, a wave where everyone agrees about something; they can be part of a tribe, a gigantic tribe in that sense. But they're not actually influencing its direction. Yeah. So the power law kind of sounds like, are you familiar with the Matthew effect? They call it the Matthew effect; it sounds similar to the Matthew effect. It got its name from something in the Bible, but it's the rich-get-richer kind of idea. But it's like what you're talking about: a big account gets consolidated up there. Like, for example, coming from the YouTube world, I covered a lot about mental health and addiction recovery and stuff like that.
And I'm very into looking at analytics and stuff like that and seeing. But anyway, knowing how the algorithms work, and, you know, keeping people on the platform, you see that a channel that's big just gets bigger, because that's how the algorithms are set up. They're like, oh, well, OK, so you have two people talking about the exact same subject. One of them gets 50,000 views in the first 24 hours; the other one gets 200 views in the first 24 hours. The algorithm just thinks, hey, there's something about that 50,000-view one that must be good, so now we're going to send more people to it. And the one that has 200 views, we're not going to send people over there. So that's kind of how, yeah, I've seen it happen. And it's harder for smaller people to break through. I've just kind of realized, you know, it's just part of my crazy work ethic: I'm like, OK, well, I just have to work 10 times harder because I don't have a gigantic platform. You know what I mean? But you do see it consolidating at the top. And I'm curious, because there's the halo effect, where people think, oh, this person has a big following, or this person makes a lot of money, they must know what they're talking about. So do you think this is more of an algorithmic issue, or just our issue of thinking, oh, this person has a lot of followers, so they must know what they're talking about? Is it us, rather than the algorithm, not looking for something that might be smaller and kind of independent? You know what I mean? Well, the algorithm follows what the people do. It was interesting: recently Facebook released a report about the most viewed posts on Facebook in the previous three months. And it turned out that a number of those posts were actually plagiarized.
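Chris's 50,000-views-versus-200-views example can be made concrete in a few lines. This is a deliberately crude sketch of "engagement-weighted" recommendation (real platforms weigh many more signals); the view counts and impression batch are the hypothetical numbers from the conversation:

```python
def allocate_impressions(views, new_impressions):
    """Split a batch of recommendation traffic between videos in
    proportion to the views they already have."""
    total = sum(views)
    return [new_impressions * v // total for v in views]

# Two videos on the same subject after their first 24 hours.
views = [50_000, 200]
gained = allocate_impressions(views, 10_000)
print(gained)  # -> [9960, 39]: the big video takes ~99.6% of the new audience
```

Loop this day after day and the gap compounds: the smaller video never sees enough traffic to prove itself, which is the feedback loop behind the "bigger just gets bigger" observation.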
They had already appeared elsewhere, posted by someone else on Facebook, and then been copied by a particular person who just had a bit more leverage, a few more followers, and those things had gone gigantic. So it's not even about the content. It can just be about the person; it can just be pure luck that someone with a following picks it up, retweets it or shares it with their followers, and it gets big. It's too easy to underestimate how much of a part luck plays in this. Luck, and being early, I guess. Being early is always one of the big things on social networks, because that's a way to get recommended by the algorithm, recommended by the early people who build up the big followings. Which is why, whenever a new social network gets started, people are always eager to jump onto it, reserve their handle on it, and try to milk it for as much as they can, because that can be the best chance to get in on the ground floor. With Snapchat, with TikTok, that's very much what we've seen. I think that when it comes to the question of whether it's the person or the algorithm, you can look around at anything. Why do some bands really make it, and other bands, who you think are just as good, don't? It can be the luck that your song got played on the radio at a particular time and someone heard it and played it again. You should never underestimate the role of luck in this world. Yeah. So the algorithm can help, but yes, stupid dumb luck, I think, actually plays a part as well. Yeah, no, exactly, exactly. So I've been dying to ask you, since we're getting towards the end here, because I need everybody to go buy your book: your book was the first one where I saw the Myanmar situation discussed, and how social media played a role.
I've heard about it for the last year, and I understand that there's a problem. I understand, you know, with the Rohingya population and everything like that, I understand the problem, but I didn't understand how social media played a role, and you broke it down. And I know you have almost a whole chapter dedicated to this in the book, but can you simplify it a little and discuss the key points of how social media played a role in the situation in Myanmar? Sure. So Myanmar used to be known as Burma. It's a country down in Southeast Asia; it borders Bangladesh, sort of near Vietnam, all that sort of area. It's got a very mixed ethnic population. There are dozens of different ethnicities there, but the principal religion is Buddhism; it's almost one of the last redoubts of Buddhism in the world. But it's also got a minority Muslim population, principally the Rohingya, some of whom came from Bangladesh to some extent, but some have also been living there for centuries. In 2010, there was a military government which began to open the country up to democratic change, and it opened it up particularly to mobile phones and also to the internet. So Facebook, and it's really only Facebook which is the important social network there, went from zero in 2010 to something like half the population owning smartphones by 2015, I think, and something like half the population being on Facebook. It basically went to: everyone had a smartphone, everyone had a Facebook account. The thing is, this was a population which had no experience with computers. They didn't know what "like" meant; they didn't know what "report abuse" meant, except that it might get the goons to come around and, you know, you disappear in the night.
You had a population entirely innocent of the whole way of thinking about what goes on in a country with the internet, about what the internet is about. And yet you exposed them to a social network which is trying to gain their attention all the time and to show them things. There was an extra problem, which is that there's a language barrier. Facebook works using Unicode for what is written on it. Whereas, because they had grown up basically outside the internet, outside computer use, they used a system called Zawgyi, which is a way of rendering the Burmese language so that it works on the screen. Long story short, Facebook couldn't moderate Zawgyi. It couldn't actually understand what it was; the translations would come out wrong. So it didn't realise when people were fomenting trouble on social media, and basically on Facebook. So you had existing ethnic tensions, which were considerable and predate the arrival of Facebook; there had been genocides before involving the Rohingya, carried out by the Buddhist majority. But what happened was that the arrival of Facebook meant that people who were looking to cause trouble, people looking to cause ethnic conflict, people looking to stir things up, got free rein. No moderation, algorithmic reinforcement, excessive tribalism, excessive outrage meant that a lot of people got killed. Now, the actual genocide that took place in 2017, when the Burmese army drove hundreds of thousands of Rohingya from their villages, burned the villages, bulldozed them, killed people, drove them almost into the sea and over the border into Bangladesh: that's not directly down to Facebook.
Facebook did not foment that, but it did make the conditions exist whereby the government could excuse it, and whereby the Buddhist majority would not question it at all, because they had been led to believe that the Rohingya were the bad people in all this, that they were looking to displace the Buddhist population, which is something that couldn't happen; the Buddhists are just far too numerous compared to the Muslims, who are a tiny proportion. So you had a situation which was almost like a lab experiment: what happens with social warming if you take a population that's never been exposed to the internet, that's completely naive about what the effects would be, and just let this loose on them? And basically it's not warming; it's like turning the gas up to maximum. In the course of six years, you go from a situation where things are a bit uneasy but generally peaceful to one where you have a literal genocide, and Facebook was implicated in it for its role in helping to stir up conflict. Yeah. And I think, going back to what we were talking about earlier, you've been a tech journalist for years; did you see this coming? There's nothing that could prepare you. What happens if you take an entire population unfamiliar with this stuff, drop millions of smartphones and a new social network on them, don't really have the ability to moderate it in any way, and let that just keep going? There's no way to anticipate that. And so, to brighten up the conversation towards the end, you talk a lot about some solutions. One thing you talk about is the debates around regulation. I'm very familiar with the conversations here in the United States, right?
There's always a free speech debate: how much power should corporations have, should they be deciding the truth, and all that. But then country by country it varies as well. So I'm just curious: do you think it's regulation, do you think it's us getting together and having more discussions, or what are some of the solutions? And if you want, you can start with: are there reasonable regulations that you have in mind? My suggestion is that you actually limit the maximum size that a social network can be, the maximum number of users it can have in any country. I think a ceiling of 250 million is actually a pretty good number. The reason why is because as the size of a social network increases arithmetically, the problems with it, the requirement for moderation, increase geometrically. If you have 10 people on a social network, you can have, say, 100 interactions. When you have 20 people on a social network, you suddenly have the potential for 400 interactions. Your moderation problem has gone up geometrically, even though you've only doubled in size. When you've got something like Facebook, which is 2 billion, 3 billion, whatever it is, in the multiple billions, your moderation problem is colossal. It's absolutely enormous. With all the interactions that happen, you simply cannot keep track of it. And if you restrict it to 250 million people and say, if someone leaves, then someone else can join, but beyond that, that's our limit, then suddenly you make the moderation a lot simpler. You reduce the potential for harmful interaction, even if at the same time you reduce the potential for good interaction. But you can't have everything at once. And it's also quite easy to implement. You simply say, this is the maximum number we're going to allow.
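The 10-people/100-interactions arithmetic here is roughly the square of the user count: with n users there are n * (n - 1) possible ordered pairs (everyone can reply to everyone else), so doubling the users quadruples the moderation surface. A quick sketch of that scaling:

```python
def potential_interactions(n_users: int) -> int:
    """Ordered pairs of distinct users: everyone can, in principle,
    interact with everyone else."""
    return n_users * (n_users - 1)

for n in (10, 20, 250_000_000, 3_000_000_000):
    print(f"{n:>13,} users -> {potential_interactions(n):,} possible interactions")
```

Going from the proposed 250 million cap to a 3 billion-user network multiplies the users by 12 but the possible interactions by roughly 144, which is the "arithmetic growth, geometric problem" point in a nutshell.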
And it might be inconvenient, but sometimes you've got to take steps in life. Regulation is about reducing the bad effects of things; that's why you have regulations in the first place. It's why you don't allow people to carry nuclear weapons around, because actually you've decided that's not a great idea. So that, I think, would be the thing that would make the big difference as quickly as possible. It might be tricky to implement, but actually I think you'd start to see the effects, and you'd have a sort of social cooling effect. Rather in the way that, when you don't belong to a social network, when people talk to me about, I don't know, Gab or Parler or Snapchat or whatever, and they say, oh yeah, look at this thing from it, you go and look at it, and it's sort of like going and visiting something in a zoo. You look at it and go, yeah, that's interesting, and then you go away from it, and you don't spend any more time on it. It's a sort of curio, rather than being something that assaults the very fabric of your thinking. It's less important when it's somewhere else. And I think that's a huge benefit. I think that's something that we really need, because social warming is having a bad effect, just the way that global warming is. But one of them is a lot easier to solve than the other. Yeah, no, it's interesting too. Like in the book, you discuss how, with this limitless number of users that can come on, things then spread from country to country, and people aren't as familiar with what's going on. It's almost like having no travel restrictions during COVID, right?
You don't know what's going where. But it's interesting too, because, and maybe it's because my main thing is psychology and human behavior and stuff, I almost think that social issues like this should be delegated to people like Chris Bail, right? They've done controlled experiments, getting people in groups and having their own kind of experimental platforms, to see what decreases polarization, what helps, what hurts, and stuff like that. It's almost like, at the least, social media platforms should be hiring people like this to do that and look into it. Because I could see your suggestion working, limiting that stuff, and I think there are ways to test it and just give things a try. Because one of the things that I look at, and not just with social media but with a lot of issues, is just: OK, I know that you might have issues with this experiment, but the thing is, it's not working right now, so we should at least try something else. Because with the spread of misinformation, like the storming of the Capitol: 2016 was crazy, but sitting here watching that happen over a conspiracy theory that bubbled up online, that blows my mind. People are risking their lives, trying to overthrow the government, over something that we don't have evidence for. That freaks me out as someone who tries to live in reality, you know what I mean? But yeah, at the end of the day, something needs to be done. And yeah, Charles, I have a whole page of notes; we didn't even touch on half the stuff that you go over in your book. There's so much: you talk about distrust of the media, how it increases polarization, you know, inoculation, all sorts of stuff.
So I hope everybody grabs the book. Can you let everybody know where the book is available, whether it's available in all countries, and where they can find you to keep up with your work? So the book is available in the UK and the US so far; I think we sold the rights in Korea recently as well. It's available on Kindle and through Audible as well. You can find me on Twitter; I'm Charles Arthur, A-R-T-H-U-R. Don't bother trying to find me on Facebook, because I never really go on Facebook, you understand. Yeah. And I do a daily newsletter of tech news and links at The Overspill, theoverspill.blog. So yeah. So, a question, since I'm kind of a tech nerd too: what kind of stuff do you cover in the newsletter? Things that interest me. So some of it's always Facebook stuff. Some of it is stuff about, you know, supply chain hassles sometimes. It's a good mix, really: it's technology, it's science, it's all sorts. And sometimes it's completely random stuff, like how to tell if someone's drowning. Because they don't look like they're drowning. But yeah, it's a big variety. I dig it, because I fell in love with your writing from this book, so I'm like, I need some more Charles. So yeah, thank you. Thank you so much. I love the book. And I'll link all that stuff down in the description. And yeah, maybe we'll be able to do this again sometime. Chris, thank you so much.