Hello and welcome to NewsClick. Facebook is in the news again, for the wrong reasons. Whistleblower Frances Haugen has revealed a large trove of information on how the social media giant was profiting while hate speech and divisive content surged on its platform. Today we have with us senior journalist Paranjoy Guha Thakurta and our resident tech expert Bappa Sinha. They're here to discuss what's going on. So Paranjoy, first to you: why is this disclosure of information different? We've had them before. You're right, this story is not new. When Cyril Sam and I wrote a book called The Real Face of Facebook in India in 2019, even at that time it was rather well known. We published a series of articles in NewsClick. Thereafter, the Washington Post, Buzzfeed, Time Magazine, Ozy.com, the New York Times, everybody has been writing about all the hateful speech that Facebook has been propagating. The new part of what Frances Haugen, the whistleblower, has said is that Facebook knew very well. The management of Facebook, headed by Mr. Mark Zuckerberg, knew what was happening but did not do anything about it. They turned a blind eye. So the new part is that this whistleblower has given concrete examples, many of them from India, and many of them pertaining to people from the ruling party, the Bharatiya Janata Party, whose accounts, fake accounts and real accounts, disseminated huge volumes of hateful, false, incendiary speech. And Facebook did very little to constrain or curb them, despite claiming that they had fact checkers. Even now they are on the back foot: when a spokesperson of Facebook was quoted by the BBC's Soutik Biswas, he said the company would do, quote, a "deeper and more rigorous analysis" of its recommendation systems and make "product changes". I mean, everything is a product for them, the human beings behind the hateful speech, the algorithms. I personally think they've just not done enough.
You know, Paranjoy, why should the users of Facebook care? With the millions of users they have in India, their experience of Facebook has been roughly the same for many years. How do you get users to think that something needs to change? When Facebook started in India, opening its office in 2011, it had barely 15 million users. The next year it doubled. Then it was 100 million in 2014. After that it has been an exponential rise: 220 million by the end of 2018, and as of now, we understand there are at least 314 million users of Facebook within India. This is only Facebook in India, not WhatsApp or Instagram. WhatsApp has more users in India; they claim 400 million, and Instagram is somewhere in the region of 150 million. Almost everybody in India who has a mobile phone with internet connectivity is using WhatsApp. The point is, this is the biggest, the largest user base of Facebook anywhere in the world. Yet they spend 87% of their budget on fact checking and tackling misinformation in the US and North America, where they have barely 10% of the users; elsewhere they spend a minuscule amount. By their own admission, they don't have enough people who understand Hindi and Bangla. And you cannot expect them to do all this with artificial intelligence and checking by machines, what they call machine learning, alone; I think you need human intervention. Maybe Bappa knows more about this. Yes, can AI actually solve the problems that Facebook is creating? AI, I think, can solve a lot of the problems. But the question is, does Facebook want its AI to solve the problems? That is the more fundamental question, rather than whether AI can do it or not. See, what is Facebook's business model? Facebook claims its business model is to connect people. But how does Facebook really make money? 98% of Facebook's revenues come from ads. So Facebook is an advertisement company.
And it makes tons of money. Just last quarter it made 30 billion dollars in revenues and roughly 10 billion dollars in profits, in one quarter. And 98% of that is from ads. So Facebook's business is about selling ads. And how does it make money? It makes money when you see an ad and when you click on it. So Facebook's entire technology is driven towards hooking people onto their app and making people spend more and more time on it. How do they ensure that more people will come to their platform? By Facebook's own admission, they have undertaken scientific studies; they have recruited behavioural scientists, people who understand human psychology, to figure out how to make people spend more time on the platform. And in 2018, a group of researchers from MIT published a paper in which they found that fake news and hate speech go viral far more than real news. They looked at news stories and found that a real news story would travel six times slower than a fake one. And if you have a tweet stream, that is, a series of posts linked together to create a narrative, of ten or more, then a fake-news or hate-speech tweet stream is likely to travel 20 times more than a fact-based one. That is from MIT. Now, clearly Facebook's engineers and Facebook's management know this, right? Their goal is to get people hooked, and what gets people hooked is fake news, hate news, things which trigger your inherent prejudices. May I just add a point to what Bappa said: he's absolutely right, the entire business model of Facebook, and they are a private organization, one of the world's biggest private multinational conglomerates, is predicated on their information going viral. So how does one then regulate a business model like this?
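The engagement-driven ranking the panelists describe can be sketched as a toy example. To be clear, this is purely illustrative: the scoring function, the weights, and the post fields are all invented for this sketch and have nothing to do with Facebook's actual systems. It only shows the mechanism being discussed, that a feed ordered purely by predicted engagement will surface the most reacted-to content first, whatever its accuracy.

```python
# Toy sketch of an engagement-optimized feed ranker (illustrative only;
# all field names and weights are invented, not Facebook's real algorithm).

def predicted_engagement(post):
    """Score a post by how likely it is to draw clicks, comments and
    shares. Outrage-inducing content tends to score high on exactly
    these signals, so an engagement ranker amplifies it."""
    return post["clicks"] + 2 * post["comments"] + 3 * post["shares"]

def rank_feed(posts):
    # Order the feed purely by predicted engagement, highest first.
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"id": "sober_report",   "clicks": 120, "comments": 10, "shares": 5},
    {"id": "outrage_rumour", "clicks": 150, "comments": 80, "shares": 90},
    {"id": "family_photo",   "clicks": 60,  "comments": 25, "shares": 2},
]

feed = rank_feed(posts)
# The inflammatory post ranks first even though its click count is
# only slightly higher, because comments and shares dominate the score.
```

The point of the sketch is the one the MIT finding makes empirically: if the objective is engagement and divisive content generates more engagement, nothing else in the pipeline needs to "want" divisiveness for it to rise to the top.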
The point is, once it goes viral, they don't care whether it's correct or false, whether it's hateful or not, so long as they rake in the advertising bucks. This is really the point, and this is what the whistleblower Frances Haugen has shown: even when content violates their own community standards, they tend to be very, very slow in acting. I mean, what was the reason the former policy head Ankhi Das quit? Ostensibly for personal reasons, but everybody knows that she took her own sweet time, and she actually told her colleagues, don't touch them, they are Bharatiya Janata Party people, they are well known: the hate speech of Kapil Mishra, of T. Raja Singh, a member of the Legislative Assembly of Telangana. And then we have the opposite example, where some people who were not spreading hate were censored. You know, to add to what Bappa was saying, there is a person called Alan Rusbridger. He's a journalist and a member of this Oversight Board which Facebook has set up, which is supposed to include eminent individuals, public figures, including journalists, who are all supposed to be independent and can be critical of Facebook. I'm quoting him: what Alan Rusbridger has said is that it is well known that the algorithms reward emotional content that polarizes communities, because that makes it more addictive. You know, social media platforms and drug pushers are the only two sets of people who call their consumers "users". Now, you were telling me about how groups form on Facebook, how communities form, because Facebook, I think in 2018, decided that they would push friends and family circles, but it backfired. The platform became angrier because of that push. How does that work?
It backfired publicly, but like I'm saying, Facebook's business model is what Facebook's business model is. Internally, they knew very well what their algorithms were going to do. They are fully aware of it. It is not a bug; it is what the algorithm is supposed to do, which is to get people hooked. And this is not just an India-specific phenomenon; it's all over the world. Neo-Nazi groups in Germany were banned by the German government, yet Facebook did not pull down their pages. It was the same in Myanmar and Sri Lanka, where only after the New York Times, the media, and United Nations bodies criticized them did they take action. And what happened in New Zealand, where there was a live Facebook broadcast of a person gunning down people outside a mosque? You see, they knew exactly how their system was being abused. They're extremely negligent about taking prior action. Now, they claim they have all these fact checkers, fact-checking partners in 11 different languages; they claim they have tens of thousands of people. But is that enough? Are they taking action? Because even when these fact checkers point out that something is hateful and say do something about it, it's up to Facebook whether to take it down or not. And I'll give you one very interesting quote which came out in the Frances Haugen disclosures. In 2019, a Facebook researcher wrote: "I have seen more images of dead people in the past three weeks than I've seen in my entire life." This is the kind of stuff that goes viral. And how did this come to be? How did this researcher end up seeing more ghastly images than ever before? Well, in the Haugen revelations there is the story about a couple of Facebook researchers who set up a dummy account. They impersonated a 21-year-old Indian woman in North India.
They just created the account and then let it be; they didn't interact with it at all. And what they found was that the account, within days, was filled with pro-Modi and anti-Muslim posts. So just like that, without any manual intervention, purely through Facebook's algorithms working on a new user, you get bombarded with anti-Muslim speech in India. And I'll give you another example. Twelve hours of live video was put up on a page that fanned all kinds of conspiracy theories, all of a communal nature, about the death, the unfortunate suicide, of Sushant Singh Rajput. This is the kind of trash that goes viral, and they do little or nothing to constrain or check it. And with Sushant Singh Rajput, look at the international connections: if you follow the Sushant Singh Rajput conspiracy, they have borrowed heavily from QAnon, which is an American conspiracy movement. Now, the FBI had called QAnon a terrorist threat, and Facebook still did not pull it down for 13 months. So in Germany, where the German government banned these groups, Facebook still allowed them. And in India, where the government actually promotes this kind of hate speech, it's a marriage made in heaven for Facebook and the BJP, right? Can I give you some figures on how huge they are? I had a question first. Yes, please go on. Facebook has claimed that their AI catches 90% of hate speech; independent people say 5%. How can the numbers be so different? Yeah, clearly Mark Zuckerberg is lying, right? For example, like Paranjoy was saying, India is Facebook's biggest market: 340 million on Facebook and 450 million on WhatsApp. Yet Facebook did not have AI modules that worked in Hindi and Bengali for detecting calls for violence and hate speech till this year. It's only after all these outrages that they put in the Hindi and Bengali modules. So they simply didn't care, not about the Delhi riots, not about the West Bengal election.
It's only after that that they put in the Hindi and Bengali modules. It has been very well established that they did little or nothing about the riots of February 2020 in northeastern Delhi or the fake news in the run-up to the elections in West Bengal and other parts of the country. They were extremely negligent. In fact, to use a quote from one of the American publications, they are celebrating violence, they're celebrating communal hatred. And you know how important this is: we in India have a population of 1.35 billion, and the median age of India is around 27, so roughly half the population is below that age. And we know that young people are far bigger users of social media than relatively older people. It has been very, very conclusively demonstrated that more than half of this population is today using these social media platforms, and they are monopolies. So you can imagine, in a country like India, where in most parts of the country there are more SIMs, that is, subscriber identity modules, than human beings. According to the Telecom Regulatory Authority of India, there are over 1.15 billion SIMs. So it's not confined any more to the big metros or the large cities or even the small towns; it's spread across India. That's the unfortunate part. And as Ravish Kumar pointed out, an entire generation of people is growing up hating another community because they are growing up in WhatsApp University. How do you solve this problem? Would independent oversight be possible? Would public pressure work on Facebook? What would work? It has to be a regulatory solution. Look, frankly, Facebook should be shut down, right? But it's not going to be shut down. So in the US there is bipartisan support, from both the Republicans and the Democrats, to do something, to basically use anti-monopoly laws to break it up, at the very least.
Without that kind of action, Facebook left to itself will do what it has been doing for the last decade. We want Facebook to be closed down, but it seems, as he rightly points out, most unlikely that Facebook will be closed down. And Facebook was set up very recently, you know. It was set up as recently as 2004, so it's barely 17 years old, or a little more. Now it's one of the world's biggest conglomerates. Now, the point is, there's been pushback from all kinds of lawmakers in the US; there's pushback from Australia, New Zealand, Germany, France, Canada. But here in India, we pretend that we're acting tough: WhatsApp, please do something about tracing the originator of content. But the fact is, Facebook has tied up with Jio Platforms, which is part of India's biggest privately owned corporate conglomerate; they're investing big bucks in Jio. And in my opinion, the government will just keep shouting but do nothing. Although, for example, with Free Basics, the regulator did step in and say, no, you can't have that. Would you say that the regulator... See, for Free Basics there was a coalition of forces: the free software movement, activists who cared about free speech, but also industry groups who did not want to get tied into Facebook's monopoly. Because with Free Basics, you could have an entire large section of the population whose access to the internet would be through Facebook. It violated the basic principles of net neutrality. And look how hard Facebook lobbied, even with the telecom regulatory authority, the amount of money they spent: they put up hoardings all over, they put advertisements on the front pages of all the newspapers, saying, hello, why are you after us, we're giving you something free.
It's like telling you, you know, I'm giving you free food, but I'll only give you the kind of free food that I want to give you; take it or leave it. Or, here is a library: I'll allow you into this part of it for free, but for that part you have to pay. In many parts of the world, Facebook is virtually synonymous with the internet. I'm thankful that hasn't yet happened in India, but they're very, very powerful. But the pushback on net neutrality also came from the industry, and that's why TRAI did not let it happen. Now the thing is, in this case the industry doesn't care about hate speech or about the propagation of the BJP's philosophy. But in the West, in the US and in Europe, there is concern about Facebook promoting this kind of very divisive agenda, right? In the case of Trump, and in Germany and France, promoting the neo-Nazis and Marine Le Pen's party. So there is concern there. What may actually bring its downfall is that Facebook has been lying not just about these things but also about their ad numbers. Facebook actually fudged their reach numbers, which may affect other businesses. That may hurt, you know, because there was a period, Bappa, if you remember, when some major advertisers pulled their advertising from Facebook. That would hurt their ability to keep taking Indian audiences, or audiences elsewhere, for granted; disturbing social harmony and influencing political outcomes would get limited if their business is hit. Well, there's one more thing, the silver lining: the young generation, my daughter's generation, is not on Facebook. They hate Facebook and they're not going to get on Facebook. So Facebook will, over a generation, die a natural death. I wish I were as optimistic as Bappa is.
I'm not, you know. I mean, let's understand something. There are some people who actually see Facebook as something very, very useful: I can put up my family pictures, I'm going on a holiday and I'll put up all my photos, it reminds me of people's birthdays. So there is a large section of people, and I don't want to delude myself about this, who use Facebook because they perceive it as something quote-unquote useful and friendly, and they often don't realize how these algorithms work, how they live in an echo chamber, shut out from a whole lot of other information, and how Facebook is trying to influence not just what they buy, the clothes they wear, the food they eat, their favourite music or their favourite actors, but the way they think, their political preferences, their behaviour. And, as people like Shoshana Zuboff and others have written, Facebook is trying to predict their behaviour, and not just predict it but even nudge you in a certain direction. Right, to use your habits to feed you to the advertisers. Well, to influence your behaviour. To get you to buy things which you otherwise would not, to influence your political preferences. All these things came out during the Cambridge Analytica episode, right? But Cambridge Analytica was just the tip of the iceberg. Cambridge Analytica was a company which was like a partner to Facebook, and at that time Facebook protected itself by saying, oh, they are a third party, they did bad things, it's not us. But it is Facebook; what Cambridge Analytica did is the Facebook business. We thought it was limited to Cambridge Analytica, but now the extent of information which is out there makes it very clear that something is deeply amiss with this platform and that it is actually changing the course of societies.
It is actually reiterating and emphasizing the worst fears that people had about the way it abuses its monopoly position as a social media platform and seeks to manipulate, and predict, human behaviour, political preferences and human psychology. All of that has now been reiterated and re-emphasized by the Frances Haugen disclosures. And it is making an impact: in the US, for example, Facebook's market cap has dropped by 200 billion dollars since these things came out, just in the last two months. But Mr. Mark Zuckerberg, born on the 14th of May 1984, aged 37, has a personal net worth in Indian rupees of over 11,000 crores. Okay. Thanks, thanks very much for joining us, Bappa and Paranjoy. Thank you for watching NewsClick. Do subscribe to NewsClick and follow us on Twitter, Facebook and Instagram.