Aya, from your perspective, this is an opportunity to speak to the major heads of industry and governments; all sorts of people are watching. What do you think governments and tech industry leaders need to do differently when it comes to ensuring a healthier digital future for all of us?

First of all, it's great to be on the platform with the Danish Minister, but we have high expectations as Africa for the outcome of the Tech for Democracy initiative and what it really means for Africa. To your question, I think what we need to do differently first is acknowledgment. I don't think we can move the debate forward to solutions if governments and big tech companies do not want to acknowledge their responsibility, be willing to do what is necessary, and be accountable. In Africa, the discussion that is relevant to us right now is really about access and freedom. The reality is that 70% of Africa's population is offline. We can talk about digital transformation, but digital for us is a closed privilege and even a dangerous space. So the question is: what will governments and companies do to close the digital divide, and even to democratize access to technology itself, before using tech for democracy? While you can discuss sophisticated AI, I'm talking about basic infrastructure. 51% of Africans do not have access to electricity or a reliable source of energy. With no energy, there is no digitalization. And it makes total sense why international news, what goes viral, and mainstream political narratives on social media are very much Eurocentric and US-centric, because imagine if 90% of Africa were actually online. We have a population with an average age of 20, which means we would become the largest online population and set the narrative. I think the second biggest debate in Africa right now is really freedom. The tech debate can sometimes become a self-celebratory exercise rather than an actual strategy for empowerment.
If we talk about big tech companies, the future they want is always tied to monetary value. But the future we need to build has to be about freedom, about opportunities for young people to make lives for themselves, to have equal access to tech, to spaces, to information. So if technology is made to extract benefit, exploit people, compromise with governments on data collection, or even make us reliant on technology, then we're becoming part of the problem. To me, access and freedom are just the tip of the iceberg in the discussion on tech for democracy.

Wow. I lost count of how many goosebump moments I had in that, because that was a super battle cry for what we're all here for: tech for democracy. And you laid out a kind of manifesto for what needs to be done. Let's go to you, Mr. Clegg. That was quite a wish list for tech companies. What's your response?

Well, I don't think I'd be able to rival Aya for goosebump moments, but I do agree with her fundamental assertion that even before you have a discussion about AI, content moderation, and so on, people need access to technology in the first place. Interestingly, one of the things I've noticed is often overlooked, certainly in the three years I've been in Silicon Valley, is the huge infrastructure investment that is still needed to connect the world. We talk about a global internet as if it exists, but it doesn't. There are still billions of people who don't have access to the internet. That's why, for instance, Meta, and of course all the other big tech companies are doing this, is investing billions of dollars right now to build a subsea marine cable, just one example among many, around the whole continent of Africa to make sure that connectivity in Africa is significantly improved.
And you see those infrastructure investments around the world. I also agree with Aya that, of course, everyone who is a participant in this, most particularly the governments and the industry concerned, needs to acknowledge their responsibility and their accountability. My own view is that however big these Silicon Valley companies are, they can't, and shouldn't, be making all these decisions on their own about exactly where to draw the line between free expression and content moderation. They shouldn't be making determinations on their own about people's privacy or how elections are conducted online. We desperately need new rules of the road. Take hate speech, for instance. We publish every quarter the prevalence of hate speech, which has now thankfully been brought down to 0.03%. That means that for every 10,000 pieces of content, you'll see three pieces of hate speech. I hope it can go lower, but that's already a real achievement.

So first of all, I'm glad Mr. Clegg agrees with me, but I would just hope he would walk the talk someday. Big tech companies need to first be democratic themselves in order to talk about tech for democracy. And it's interesting that even Facebook has an opinion on this platform when they are a big part of creating and maintaining the problem. So far, I can't seem to understand the transparency process of Facebook or any other big tech company regarding their tracking infrastructure or disclosing information on private surveillance, let alone providing a useful tool for users to take any action. How is data collected? How is it used? You mentioned hate speech. Facebook's AI removes less than 5% of the hate speech viewed on social media. So what regulations are we talking about?
Now, to take the conversation forward, I think this won't be taken seriously unless digital rights become a security issue and are on the table of the UN Security Council, and unless we criminalize the corrupt relationship between big tech companies and politicians and the use of algorithms for manipulation. We cannot move forward in this discussion without that. The UN Security Council's first-ever debate on emerging technology was last May, and its first open debate on maintaining peace and security in cyberspace was last July. We have a long way to go, but I just want to say I'm confident that when Denmark becomes a member of the UN Security Council again, it can help accelerate this agenda. So we hope to hear some commitment from Denmark.

OK. Very briefly, before we go to you, I'd like to give Mr. Clegg the opportunity to reply to Aya. Very briefly, please.

Well, she raised a number of issues. Look, on this fundamental issue about algorithms, because that's often cited, I do think there's a strong case, and I'm sure this will happen over time, either through regulatory compulsion or through choice, that how algorithms work will need to be subject to far greater transparency. What are the signals that the companies use to design their algorithms? My own view, by the way, is that the industry has a significant incentive to do that, because it is often an opaque process to a lot of people, and anything that is opaque is often infused with the very worst possible intentions. So all these narratives develop about what algorithms do, which go way, way beyond the actual, much more mundane reality. For instance, in the case of the Facebook app, if you were to remove the algorithmic ranking, people would see more hate speech, more misinformation, more extremist content, because these algorithms work precisely as a sort of giant spam filter to suppress hate speech and to identify misinformation.
I think the most important way forward now is really less denial and justification of the problem and more action to solve it. As everybody said, we do have progressive policies, especially in Africa, but also globally. And actually, we don't need new recommendations. I was listening to the previous panel: there were many coalitions, there were civil society groups. A couple of years ago, I contributed to something called the Contract for the Web. So things are there, solutions are there, but year after year it's the same demand, because nothing has yet been implemented thoroughly and transparently. Governments know they need to keep all of the internet available at all times and protect people's online privacy. Companies know they need to do the same, and they should develop technology that supports the best in humanity and challenges the worst. But we don't see that happening, because we're not centering solutions around agency and, as Arthur said, people. So I think, for the digital population, we really need to re-center these solutions on action rather than creating new problematic power dynamics.

I heard it said earlier that we should design tech so that people are at their best and not their worst. Who decides what is best and what is worst? Who holds that power? I'm living in a country now, the United States, where there's a culture war going on: one side of the political equation says that companies such as Meta are taking down too much content and censoring too much, and the other half says they're not doing enough. In the end, there has to be a determination about what's good, what's bad, what's best, what's worst, what's acceptable, what's not acceptable, and crucially, of course, what's legal and illegal. We can't have it both ways. We can't on the one hand say these tech companies are too powerful and on the other say they should also determine what's good and bad.
I would just say that we all share responsibility, of course, acknowledging that technology has bad sides as well as good sides. I think the technology companies, certainly the one I'm working for, are investing billions and billions of dollars in order to bear down on what we identify on our platforms as bad.

OK, Mr. LaRocque, I'll come to you in a second, Aya. Let's go to Mr.—

No, no, no, no, I will not, because, you know, a white man has taken so much of the time to intervene. We can't spend 40 minutes giving up.