A Stanford lab focused on AI research will use the decentralized cloud computing platform EdgeCloud for its work on large language models.
The decentralized cloud could be a solution to AI's vast computing needs. On April 17, Theta Labs announced that a Stanford University AI lab would use Theta (THETA) EdgeCloud in its work on large language models. The lab, led by Assistant Professor Ellen Vitercik, will use the platform for research on discrete optimization and algorithmic reasoning in LLMs.
Stanford joins a growing list of academic institutions using the decentralized platform for research. According to Theta Labs, other EdgeCloud adopters include Seoul National University, Korea University, the University of Oregon, Michigan State University, and more.
Big Tech and Decentralized Services Compete for AI Compute
Large technology companies have rapidly expanded their investments in computing infrastructure, especially infrastructure aimed at powering AI. In 2024, Microsoft invested $3.3 billion in a Wisconsin data center, with support from the Joe Biden administration.
Around the same time, Amazon said it plans to spend $11 billion on data centers in Indiana. Google, meanwhile, has gone global, investing $1.1 billion in its data center in Finland and building another in Malaysia for $2 billion.
Even so, Big Tech is not the only player competing for AI workloads. Unlike most traditional LLM services, Theta EdgeCloud operates as a decentralized cloud computing platform. Its infrastructure is geographically distributed, meaning it does not rely on massive centralized data centers to provide computing power.
Instead, the platform uses blockchain technology to reward smaller GPU providers from user revenue. This allows Theta to operate with lower capital expenses and scale more quickly, which in turn lets it offer users more affordable infrastructure.
Theta Network is a blockchain protocol originally designed for decentralized video streaming. The network has since expanded to provide decentralized infrastructure for cloud computing, with a particular focus on AI applications.