Hello and welcome to NewsClick. Facebook is in the news again for the wrong reasons. Whistleblower Frances Haugen has revealed a large trove of information on how the social media giant was profiting as hate speech and divisive content surged on its platform. Today we have with us senior journalist Paranjoy Guha Thakurta and our resident tech expert Bappa Sinha. They're here to discuss what's going on. So, Paranjoy, first to you, why is this disclosure of information different? We've had them before.

You're right, this story is not new. When Cyril Sam and I wrote a book called The Real Face of Facebook in India in 2019, even at that time it was rather well known. We published a series of articles in NewsClick. Thereafter, the Washington Post, BuzzFeed, Time magazine, Ozy.com, the New York Times, everybody has been writing about all the hateful speech that Facebook has been propagating. The new part of what Frances Haugen, the whistleblower, said is that Facebook knew very well. The management of Facebook, headed by Mr. Mark Zuckerberg, knew what was happening but did not do anything about it. They turned a blind eye. So the new part of it is that this whistleblower has given concrete examples, many of them from India, and many of them pertaining to people from the ruling party, the Bharatiya Janata Party, whose accounts, fake accounts and real accounts, disseminated huge volumes of hateful, false, incendiary speech. And Facebook did very little to constrain or curb them, despite claiming that they had fact-checkers. Even now they are on the back foot: a spokesperson of Facebook quoted by the BBC's Soutik Biswas said the company would do, quote, a "deeper and more rigorous analysis" of its recommendation systems and make "product changes". Everything is a product for them, whether it's the human beings behind the hateful speech or the algorithms. I personally think they've just not done enough.
You know, Paranjoy, why should the users of Facebook care? With the millions of users they have in India, their experience of Facebook has been roughly the same for many years. How do you get users to think that something needs to change?

When Facebook started in India, it opened its office in 2011, it had barely 15 million users. The next year that doubled. Then it was 100 million in 2014. After that, it's been an exponential rise: 220 million by the end of 2018, and as of now, we understand there are at least 314 million users of Facebook within India.

This is only Facebook? Only in India?

Facebook in India, not WhatsApp or Instagram. WhatsApp has more users in India; they claim somewhere in the region of 400 to 450 million. Almost everybody in India who has a mobile phone with internet connectivity is using WhatsApp. The point is, this is the largest user base of Facebook anywhere in the world. Yet they spend 87% of their budget on fact-checking and tackling misinformation in the US and North America, where they have barely 10% of their users; on the rest of the world they spend a minuscule amount. By their own admission, they don't have enough people who understand Hindi and Bangla. You can expect them to do all this checking by artificial intelligence, by machines, what they call machine learning, but I think you need human intervention. Maybe Bappa knows more about this.

Bappa, yes. Can AI actually solve the problems that Facebook is creating?

AI, I think, can solve a lot of the problems. But the question is, does Facebook want its AI to solve the problems? That is the more fundamental question, rather than whether AI can do it or not. See, what is Facebook's business model? Facebook claims its business model is to connect people. But how does Facebook really make money? 98% of Facebook's revenues come from ads.
So Facebook is an advertisement company, and it makes tons of money. Just last quarter, it made $30 billion in revenues and roughly $10 billion in profits, in one quarter, and 98% of that is from ads. So Facebook's business is about selling ads. And how does it make money? It makes money when you see an ad and when you click on it. So Facebook's entire technology is driven towards hooking people onto their app and making people spend more and more time on it. How do they ensure that more people will come to their platform? By Facebook's own admission, they have undertaken scientific studies. They have recruited behavioural scientists, people who understand human psychology, to figure out how to make people spend more time on the platform. And in 2018, a group of researchers from MIT published a paper where they found that fake news and hate speech go viral far more than real news. They looked at stories and found that a real news story travels about six times slower than fake news. And most people's feeds are a series of posts linked together, creating a narrative; in a stream of 10 or more posts, a fake news or hate speech stream is likely to travel 20 times more than fact-based news. This is from MIT. Now clearly Facebook's engineers and management know this. So their goal is to get people hooked, and what gets people hooked is fake news, hate speech, things which trigger your inherent prejudices.

Maybe I'll just add a point to what Bappa said. He's absolutely right. The entire business model of Facebook, and they are a private organisation, one of the world's biggest private multinational conglomerates, is predicated on information going viral. So how does one then regulate a business model like this?
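The engagement-driven ranking Bappa describes can be sketched in a few lines. This is a hypothetical illustration of the incentive, not Facebook's actual code; the post fields and weights are made-up assumptions:

```python
# Hypothetical sketch: a feed ranker that maximises predicted engagement.
# Field names and weights are illustrative assumptions, not any real system.

def engagement_score(post):
    # Ad revenue grows with time-on-platform, so the ranker optimises for
    # predicted engagement (reactions, comments, reshares), not accuracy.
    return (post["predicted_reactions"]
            + 2 * post["predicted_comments"]
            + 3 * post["predicted_reshares"])

def rank_feed(posts):
    # Highest predicted engagement first.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "sober_news",   "predicted_reactions": 50,
     "predicted_comments": 5,  "predicted_reshares": 2},
    {"id": "outrage_bait", "predicted_reactions": 40,
     "predicted_comments": 30, "predicted_reshares": 25},
]

# The provocative post scores 175 vs 66, so it tops the feed.
print([p["id"] for p in rank_feed(posts)])
```

Nothing in such an objective function rewards truth: a post that provokes comments and reshares outranks a sober one even with fewer reactions, which is exactly the dynamic the MIT virality numbers describe.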
The point is, once it goes viral, they don't care whether it's correct or false, whether it's hateful or not, so long as they rake in the advertising bucks. This is really the point. And this is what the whistleblower Frances Haugen has shown: even when content violates their own community standards, they tend to be very, very slow in acting. What was the reason the former policy head, Ankhi Das, quit? Ostensibly for personal reasons, but everybody knows that she took her own sweet time, and she actually told her colleagues, don't touch them, they are Bharatiya Janata Party people, they are well known. Take, for instance, Kapil Mishra, or T. Raja Singh, the member of the Legislative Assembly from Telangana.

And then we have the opposite example, where some people were censored who were not spreading hate.

To add to what Bappa was saying, there is a person called Alan Rusbridger. He's a journalist, and he's a member of this oversight board which Facebook has set up, which is supposed to include eminent individuals, public figures, including journalists, who are all supposed to be independent and can be critical of the company. I'm quoting him. What Alan Rusbridger has said is that it is well known that the algorithms reward emotional content that polarises communities, because that makes it more addictive. Social media platforms and drug pushers are the only two sets of people who call their customers "users".

Now, Bappa, you were telling me about how groups form on Facebook, how communities form. Facebook, I think in 2018, decided that they would push friends and family circles, but it backfired. The platform became angrier because of that push.

Maybe it backfired publicly, but like I'm saying, Facebook's business model is what Facebook's business model is.
So internally, they knew very well what their algorithms were going to do. They were fully aware of it. It is not a quirk, it is not a bug. It is what the algorithm is supposed to do, which is get people hooked. And this is not just an India-specific phenomenon; it's all over the world. Neo-Nazi groups in Germany, which were banned by the German government: Facebook did not pull down their pages. And it was in Myanmar, in Sri Lanka, only after the New York Times, after the media, after United Nations bodies criticised them, that they took action. Look at what happened in New Zealand, where there was a live Facebook broadcast of a person gunning down people outside a mosque. They knew exactly how their systems were being abused, and they are extremely negligent about taking prior action. Now, they claim they have all these fact-checkers, fact-checking partners in 11 different languages, tens of thousands of people. But is that enough? Are they taking action? Because even when these fact-checkers point out that something is hateful and say do something about it, it's up to Facebook whether they take it down or not. And I'll give you one very interesting quote, which came out in the Frances Haugen disclosures. In 2019, a Facebook researcher wrote: "I have seen more images of dead people in the past three weeks than I have seen in my entire life." This is the kind of stuff that goes viral.

Bappa, how did this come to be? How did this researcher end up with more ghastly images than ever before?

Well, in these Haugen revelations there is this story about a couple of Facebook researchers who set up a dummy account. They impersonated a 21-year-old Indian woman in North India. They just created the account and then let it be. They didn't interact with it at all.
And what they found was that the account, within days, was filled with pro-Modi and anti-Muslim posts. So without any manual intervention, purely through Facebook's algorithms working on a new user, you get bombarded with anti-Muslim speech in India.

And I'll give you another example. 12 hours of live video was put up on a site that fanned all kinds of conspiracy theories, all of a communal nature, about the unfortunate suicide of Sushant Singh Rajput. This is the kind of trash that goes viral, and they do little or nothing to constrain or check it. And look at the international connections: the Sushant Singh Rajput conspiracy theories borrowed heavily from QAnon, which is an American movement. Now, the FBI had called QAnon a terrorist threat, and Facebook still did not pull it down for 13 months. So in Germany, where the German government banned these groups, Facebook still allowed them. And in India, where the government actually promotes this kind of hate speech, it is a marriage made in heaven between Facebook and the BJP.

Can I give you some figures on how huge they are? Actually, I had a question first. Yes, please go on. Facebook has claimed that their AI catches 90% of hate speech; independent people say 5%. How can the numbers be so different?

Clearly, Mark Zuckerberg is lying. For example, as Paranjoy was saying, India is Facebook's biggest market: 340 million on Facebook and 450 million on WhatsApp. Yet Facebook did not have AI modules which worked in Hindi and Bengali for detecting calls for violence and hate speech until this year. It's only after all these outrages that they have put in these Hindi and Bengali modules. So they simply don't care about curbing it.

You mean after the Delhi riots and the West Bengal election? Yes, it's only after that that they put in the Hindi and Bengali modules.
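Part of the 90%-versus-5% gap is that the two figures measure different things: Facebook's "proactive rate" is the share of *removed* content that its AI flagged before any user report, while critics estimate removals as a share of *all* hate speech on the platform. A quick arithmetic sketch with made-up, illustrative numbers shows how both figures can be reported from the same platform:

```python
# Illustrative (made-up) numbers showing how a 90% claim and a 5% estimate
# can describe the same platform while measuring different denominators.

total_hate_speech = 100_000   # assumed hate-speech posts on the platform
removed = 5_000               # posts actually taken down
removed_by_ai_first = 4_500   # removals flagged by AI before any user report

# What the company reports: AI-flagged removals as a share of removals.
proactive_rate = removed_by_ai_first / removed        # 0.90

# What critics measure: removals as a share of all hate speech.
coverage = removed / total_hate_speech                # 0.05

print(f"proactive rate: {proactive_rate:.0%}")  # 90%
print(f"coverage:       {coverage:.0%}")        # 5%
```

A 90% proactive rate is compatible with 95% of hate speech never being removed at all, which is why the two sides can quote such different numbers without either being arithmetically wrong.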
It has been very well established that in the riots of February 2020 in North-East Delhi, and with the fake news in the run-up to the elections in West Bengal and other parts of the country, they did little or nothing. They were extremely negligent. In fact, to use a quote from one of the American publications, they were celebrating violence, celebrating communal hatred. And you know how important this is? We in India have a population of 1.35 billion, and the median age of India is around 27; roughly half the population is below that age. And we know that young people are far bigger users of social media than relatively older people. It has been very conclusively demonstrated that more than half of this population is today using these social media platforms, and they are monopolies. So you can imagine, in a country like India, where in most parts of the country there are more SIMs, that is subscriber identity modules, than human beings. According to the Telecom Regulatory Authority of India, there are over 1.15 billion SIMs. So it's not confined anymore to the big metros or the large cities or even the small towns; it's spread across India. That's the unfortunate part. And as Ravish Kumar pointed out, an entire generation of people is growing up hating another community, because they are growing up in WhatsApp University.

How do you solve this problem? Would independent oversight be possible? Would public pressure work on Facebook? What would work?

It has to be regulatory action. Look, frankly, Facebook should be shut down. But it's not going to be shut down. In the US, there is bipartisan support, both from Republicans and Democrats, to do something, to basically use anti-monopoly laws to break it up at the very least. Without that kind of action, Facebook left to itself will do what it has been doing for the last decade.
Bappa wants Facebook to be closed down. But it seems, as he rightly points out, most unlikely that Facebook will be closed down. Remember, Facebook was set up very recently, as recently as 2004. It's barely 17 years old, or a little more. And now it's one of the world's biggest conglomerates. Now, the point he's made: there's pushback from all kinds of lawmakers in the US. There's pushback from Australia, New Zealand, Germany, France, Canada. But here in India, we pretend that we are acting tough. We tell WhatsApp, please do something about tracing the originator of content. But the fact is Facebook has tied up with Jio Platforms, which is part of India's biggest privately owned corporate conglomerate. They're investing big bucks in Jio. And in my opinion, the government will just keep shouting but do nothing, unfortunately.

But with Free Basics, for example, the regulator did step in and say, no, you can't have that. Would you say the regulator worked there?

See, for Free Basics, there was a coalition of forces: from the free software movement, from activists who cared about free speech, but also from industry groups who did not want to get tied into Facebook's monopoly. With Free Basics, you could have had an entire large section of the population whose only access to the internet would be through Facebook. It violated the basic principles of net neutrality.

And you should have seen how hard they lobbied the Telecom Regulatory Authority. The amount of money Facebook spent! They put up hoardings all over. They put advertisements on the front pages of all the newspapers saying, hello, why are you after us? We are giving you something free. You know: I'm giving you free food, but I'll only give you the kind of free food that I want to give you, take it or leave it. Or, here is a library: I'll allow you into this part of it for free, but for that part you have to pay.
In many parts of the world, Facebook is virtually synonymous with the internet. I'm thankful that hasn't yet happened in India, but they're very, very powerful.

But the pushback for net neutrality also came from the industry, and that's why TRAI did not let it happen. Now, the thing is, in this case, the industry doesn't care about hate speech or about Facebook propagating the BJP and RSS's philosophy. But in the West, in the US and in Europe, there is concern about Facebook promoting this kind of very divisive agenda: in the case of Trump, and in Germany and France promoting the neo-Nazis and Marine Le Pen's party. So there is concern there. What may actually bring about its downfall is that Facebook has been lying not just about these things; they've also been lying about their ad numbers. Facebook actually fudged their reach numbers, and that may affect other businesses. That may hurt.

Because there was a period, Bappa, if you remember, when some major advertisers pulled their advertising from Facebook. Now that would hurt. So their ability to keep taking Indian audiences, or audiences elsewhere, for granted, disturbing social harmony and influencing political outcomes, would get limited by their business interests.

There is one more thing. The silver lining is that the young generation, my daughter's generation, is not on Facebook, and they hate Facebook. They're not going to get onto Facebook. So Facebook will, over a generation, die a natural death.

I wish I were as optimistic as Bappa is. I'm not. Let's understand something. There are some people who actually see Facebook as something very, very useful. I can put up my family pictures. I'm going on a holiday, I'll put up all our pictures. I remember people's birthdays.
So there is a large section of people, and I don't want to delude myself into believing otherwise, who use Facebook because they perceive it as something quote-unquote useful and friendly. And they often don't realise how these algorithms work, how they often live in an echo chamber, shut out from a whole lot of other information, and how Facebook is trying to influence not just what they buy, the clothes they wear, the food they eat, their favourite music or their favourite actors, but the way they think, their political preferences, their behaviour. And as people like Shoshana Zuboff and others have written, it is trying to predict their behaviour. And not just predict behaviour, but even nudge you in a certain direction.

Right. To use your usage habits to feed you to the advertisers.

Well, to influence your behaviour. To get you to buy things which you would not otherwise have bought. And to influence your political preferences. All these things came out during the Cambridge Analytica affair. But Cambridge Analytica was just the tip of the iceberg. Cambridge Analytica was a partner company to Facebook which was doing this, and at the time Facebook protected itself by saying, oh, they are a third party, they did bad things, it's not us. But it is Facebook. What we thought was limited to Cambridge Analytica is really Facebook's own business.

The extent of information which is now out there makes it very clear that something is deeply amiss with this platform, that it is actually changing the course of societies. It is reiterating and emphasising the worst fears that people had about the way it abuses its monopoly position as a social media platform, and seeks to manipulate human behaviour, political preferences, and human psychology, and to predict it.
So all of that has now been reiterated and re-emphasised by the Frances Haugen disclosures.

It is making an impact. For example, in the US, Facebook's market cap has dropped by $200 billion since these things came out. Just in the last two months.

Mr. Mark Zuckerberg, born on the 14th of May 1984, age 37: his personal net worth in Indian rupees is over 11,000 crores.

Okay. Thanks very much for joining us, Bappa and Paranjoy. Thank you for watching NewsClick. Do subscribe to NewsClick and follow us on Twitter, Facebook and Instagram.