The controversial digital ID crypto project Worldcoin launched on July 24 after three years of development.
The project was co-founded by Sam Altman, the CEO of OpenAI — the company behind the popular artificial intelligence (AI)-based chatbot ChatGPT.
Worldcoin made headlines right after its announcement in June 2021, as it promised to create a futuristic digital identity system by scanning people’s eyeballs.
Worldcoin claims that as AI technology advances, it will become increasingly difficult to differentiate between humans and bots online. To address this, it created a digital ID system based on “proof of personhood”: an individual’s iris is scanned to generate a unique “World ID.”
The firm claims all biometric data will be saved on a decentralized blockchain and that the project does not store any personal data; instead, it generates a zero-knowledge proof to verify that the user is human without revealing the data used to create the proof.
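To make the proof-of-personhood idea concrete, the minimal Python sketch below models the flow with plain hashes: the raw iris code never leaves the device, only a commitment is registered, and each application sees a per-app “nullifier” that signals one-person-one-account without exposing the biometric data. This is an illustrative toy, not Worldcoin’s actual protocol; the function names and string labels are invented for the example, and a real deployment would replace the hash checks with an actual zero-knowledge proof so the verifier never sees the commitment at all.

```python
import hashlib
import secrets

def commit_identity(iris_code: bytes) -> tuple[bytes, bytes]:
    # Derive a device-side secret from the iris code plus randomness.
    # Only the commitment derived from that secret is ever published;
    # the raw iris code stays on the device.
    secret = hashlib.sha256(iris_code + secrets.token_bytes(32)).digest()
    commitment = hashlib.sha256(b"commit:" + secret).digest()
    return secret, commitment

def nullifier_for_app(secret: bytes, app_id: str) -> bytes:
    # Per-application identifier: stable for one person within an app,
    # but not linkable across different apps.
    return hashlib.sha256(b"nullify:" + secret + app_id.encode()).digest()

# Enrollment: the public registry stores only commitments, never biometrics.
registry = set()
secret, commitment = commit_identity(b"raw-iris-template-bytes")
registry.add(commitment)

# A third-party app records nullifiers to enforce one action per person.
seen = set()
n = nullifier_for_app(secret, "example-voting-app")
print("first claim allowed:", n not in seen)
seen.add(n)
print("repeat claim blocked:", nullifier_for_app(secret, "example-voting-app") in seen)
```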
Despite lingering controversy since its announcement, Worldcoin secured $115 million in funding in May and garnered over 2 million signups before its official public launch.
Worldcoin ecosystem
Worldcoin combines World ID and the Worldcoin (WLD) token — two essential ecosystem components.
The former is a privacy-focused digital identity that helps people prove their identity and uniqueness online while preserving their anonymity. To receive a World ID, users must go through biometric verification using an “Orb.” After completing this verification, individuals are given a unique World ID and, where permitted by law, WLD tokens.
The digital ID ecosystem also includes the World App, which functions as a wallet and offers decentralized finance services. The app also holds the user’s unique World ID generated by the iris scan. The company claims the app can verify users on any third-party application.
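As a rough sketch of what that third-party verification could look like server-side, the snippet below forwards the proof material returned by the World App to a Worldcoin verification endpoint. The URL path and the payload field names (nullifier_hash, merkle_root, proof) are assumptions made for illustration and should be checked against Worldcoin’s developer documentation before use.

```python
import requests

def verify_world_id(app_id: str, action: str, proof_payload: dict) -> bool:
    # Hypothetical server-side check of a World ID proof produced by the World App.
    # Endpoint path and field names are illustrative assumptions, not a confirmed API.
    resp = requests.post(
        f"https://developer.worldcoin.org/api/v1/verify/{app_id}",
        json={
            "action": action,
            "nullifier_hash": proof_payload.get("nullifier_hash"),
            "merkle_root": proof_payload.get("merkle_root"),
            "proof": proof_payload.get("proof"),
        },
        timeout=10,
    )
    # A successful response means the proof checks out and the nullifier is fresh
    # for this action, i.e., the same person has not already performed it.
    return resp.status_code == 200
```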
Users can scan their iris at designated locations using an Orb. After their iris is scanned, a user receives 25 WLD tokens. Several leading cryptocurrency exchanges, including Binance, Bybit, OKX, Gate.io and Huobi, have already listed the token.
Bybit’s head of communication, Nathan Thompson, told Cointelegraph that WLD was listed on the platform based on community demand. Asked about hurdles the project might face, Thompson said the most significant challenge is public perception.
Worldcoin is not launching WLD in the United States and has clarified in its terms of service that it is “not intended to be available for use, purchase, or access” for U.S. residents.
The U.S. has a history of regulating biometric data, and Fetch.ai CEO Humayun Sheikh told Cointelegraph that the project’s founders seem well aware of this regulatory resistance. Worldcoin has primarily focused its testing on countries with fewer privacy rights and regulations, such as Kenya, Sudan and Ghana.
“Apart from the U.S., the project may also face regulatory troubles from the European Union. While the company claims that it follows the General Data Protection Regulation, the EU has some of the strictest data protection laws in the world. The very nature of Worldcoin’s operations involves building a biometric database. So, they are likely to face challenges in complying with these laws. Even countries like India — where Worldcoin has already started its operations — would be keen to probe into Worldcoin because of their upcoming data protection bill,” Sheikh said.
The United Kingdom’s data regulatory body said it is looking to investigate Worldcoin, while the French privacy watchdog has raised the alarm over the project’s data collection methods.
Fraser Edwards, co-founder and CEO of decentralized data infrastructure provider cheqd, told Cointelegraph that Worldcoin follows elements of best practice (only biometric templates are stored, if the user so chooses) but still ultimately creates a centralized database of those templates, and does so with little to no informed consent, i.e., without actually telling people what they are giving up.
He noted that this approach could be dangerous, citing the recent scandal with Rohingya refugees: “UNHCR [United Nations High Commissioner for Refugees] shared the biometric templates of Rohingya refugees to Bangladesh for aid provisioning, who then shared them directly back to the government of Myanmar, i.e., where the refugees had fled to escape genocide.”
“As soon as these templates are linked to identity (possibly with the Orb), these biometrics can be used to identify people. We need to account for unintended consequences, and a global biometric database is the highest of stakes in unintended consequences.”
As AI and the supposed solutions to the problems it has created continue to develop, it will be crucial for regulators to keep up with the times and create dynamic frameworks to ensure user privacy and security.