People around the world are lining up to have their eyeballs scanned by a metallic orb. In exchange, they get free cryptocurrency. I know it sounds crazy, but what I'm describing is actually the most controversial crypto project at the moment. Worldcoin is a cryptocurrency that aims to create a global digital ID by collecting people's biometric data. The project was co-founded by Sam Altman, the CEO of OpenAI and creator of ChatGPT. Critics see Worldcoin as a serious danger to privacy that could lead humanity into a technological dystopia. Since its launch, the project has been under fire from regulators and privacy advocates around the world. In this video, we'll tell you everything you need to know about the Worldcoin controversy and determine how dangerous it really is. I'm Giovanni, and this is a Cointelegraph Report.

Worldcoin offers a verified identity solution called Proof of Personhood, a system designed to distinguish humans from AI while preserving privacy. But why would you need an ID to prove that you are human, you may be asking? It's simple: with the rise of artificial intelligence, it is becoming increasingly difficult to distinguish bots from humans in online interactions. Proof of personhood has multiple use cases. We asked Jake Brukhman, a Worldcoin backer and the CEO of CoinFund, to give us more details about them.

This primitive enables a wide variety of applications for Worldcoin. It enables highly democratic one-person-one-vote voting systems on blockchain, which have basically not been possible before. It enables wide token distributions and fair airdrops. It could be applied toward bot protection on Twitter.

But Worldcoin is not the first project that aims to solve the proof-of-personhood problem and create a verified digital ID. An existing solution is Proof of Humanity.
This protocol requires participants to perform human-specific tasks, such as video selfies or voice verification, to demonstrate they are genuine individuals rather than bots. The problem with this system is that, as AI technology evolves, bots may become sophisticated enough to perform those tasks. Circles, another blockchain network, adopts a different approach: its users need to be vouched for by existing members, with the goal of creating a decentralized web of trust. In this system, however, power risks becoming concentrated among early network users, which creates a centralization problem.

So how is Worldcoin different from these solutions? Worldcoin tries to avoid these problems by using a piece of hardware called the Orb to scan people's irises and collect their unique biometric data. In exchange, it rewards them with a fixed amount of WLD, Worldcoin's native token. Once participants have their eyes scanned, they get a World ID, a sort of digital passport that proves they are actually a human being and not a robot. At the moment, Worldcoin Orbs can be found in 35 cities around the world.

So what's wrong with Worldcoin's approach? The main criticism of Worldcoin has to do with data privacy. It is not entirely clear how the organization behind Worldcoin is handling people's biometric data. A single entity collecting millions of people's data is an obvious reason for concern. To many, it is reminiscent of Libra, Facebook's aborted cryptocurrency project. Like Libra, Worldcoin is being scrutinized by regulators around the world. A privacy watchdog in France recently questioned the legality of Worldcoin's methods for collecting and storing private data. The government of Kenya, where the project has been tested, recently suspended its operations in the country, citing data privacy concerns. To find out more about these privacy issues, we talked to Eileen Wu, an investigative reporter who extensively covered Worldcoin's operations on the ground.
There was just a huge gap between what Worldcoin was saying publicly, which focused on protecting privacy in a very specific way, which is just the technology itself, and then what users actually experienced on the ground. And so that meant deceptive marketing practices, collecting far more data than Worldcoin acknowledged, and, maybe even worse, failing to obtain meaningful informed consent.

But Worldcoin advocates point out that the privacy issue is overblown and that the company is not actually collecting people's data. Using cryptographic technology, users' biometric information is transformed into strings of anonymous code, which will eventually be stored on a public blockchain.

Biometric data never leaves the device. It's collected, it is used to create an iris code, which is a hash, essentially, of your iris information. That hash goes into a system. At no point is a user's World ID ever revealed to a third party.

Still, the privacy problem has to do not so much with the protocol as with the hardware used to collect the data: the Orbs. Those can create a centralization problem. As Vitalik Buterin, the co-founder of Ethereum, pointed out in a recent blog post, it is hard to verify whether the Orbs contain a secret backdoor. Basically, users still need to trust the Orb manufacturer. That is an issue that, for now, has no clear solution.

Hardware is probably kind of the hardest point of a system to decentralize. You probably first have to open-source all of the hardware, and you need to find manufacturers that are willing and able to manufacture kind of alternative devices that still work with the protocol. There needs to be some governance and, like, auditing processes that determine, you know, that these devices are safe.

Finally, another controversy surrounding Worldcoin has to do with the ethics of the methods used to onboard its first two million users. In order to incentivize people to sign up, Worldcoin has been giving out WLD, its native token.
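The advocates' claim described earlier, that only a one-way hash of the iris ever leaves the device, can be sketched in a few lines of Python. This is an illustrative toy, not Worldcoin's actual pipeline: the real system reportedly uses a learned iris-encoding model rather than a bare SHA-256, and the function and template names below are invented for the example.

```python
import hashlib

def iris_code(iris_template: bytes) -> str:
    """Return a one-way hash (an "iris code") of a raw iris template.

    The raw biometric bytes are never stored or transmitted; only this
    digest leaves the capture device. (Illustrative only -- a stand-in
    for Worldcoin's actual feature-extraction pipeline.)
    """
    return hashlib.sha256(iris_template).hexdigest()

# Two scans yielding the same template produce the same code, so
# duplicate sign-ups can be detected without keeping raw iris images.
scan_a = b"example-iris-template"
scan_b = b"example-iris-template"
assert iris_code(scan_a) == iris_code(scan_b)
```

The key property is that the hash is one-way: the system can check that two codes match, but cannot reconstruct the iris image from a code.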
Each user can claim one WLD per week, which is worth about $2. WLD has no real use case at the moment, which makes it a highly speculative investment in Worldcoin's future success. The problem is that Worldcoin has been focusing its airdrops on the Global South, mainly Africa and Asia.

Companies that require a lot of data to make their products and their algorithms work well often test in the Global South, where it is easier to get the data, where regulations may not be as strong, where it is cheaper, and where, you know, $15 or $25 worth of Worldcoin will be a lot more of an incentive for someone to sign up than in the United States or in Europe.

Now, these allegations are serious. Still, as Vitalik Buterin pointed out, these problems can be solved, since they have to do with the organization's practices more than with the underlying technology itself. Overall, it is still too early to say whether Worldcoin will be successful. As we saw, there are plenty of problems to address, both ethical and technical. The project will have to win the trust of privacy activists and regulators around the world, and a certain amount of skepticism is definitely justified. The stakes are very high: if the company manages to achieve its goal of collecting millions of people's biometric data, that could pose a huge security risk if the data is not protected properly. At the same time, the concept of proof of personhood, a way of distinguishing humans from robots, is a valuable idea in the budding era of AI. A truly decentralized global ID, which allows users to interact online in a private and secure way, is a goal worth pursuing. Whether Worldcoin will be able to implement it successfully in practice remains to be seen.

That's all for today's video. I'm Giovanni Rost. Thanks for watching, and see you next time.