The Artificial Superintelligence (ASI) Alliance has launched ASI: Train, a new program focused on developing domain-specific AI models. The initiative kicks off with the introduction of Cortex, a $100 million brain-inspired robotics model designed to enhance AI capabilities in real-world applications.
According to a Nov. 26 statement, the program targets complex challenges across industries such as science, medicine, and robotics. Current large language models (LLMs) handle general tasks well but struggle with specialized industry needs; domain-specific models are expected to offer greater precision, efficiency, and relevance for those tasks.
Through the platform, researchers, investors, and community members can fund AI development via a decentralized framework and share in the resulting models' success.
“By combining domain-specific models like ‘Cortex’ with decentralized ownership, we’re creating a DeSci ecosystem where individuals support groundbreaking technology and share value creation,” Humayun Sheikh, CEO of Fetch.ai and chairman of the ASI Alliance, noted.
Users can stake FET tokens to gain fractional ownership of AI models under a DAO-like structure, with ownership stakes becoming tradable on secondary markets. ASI: Train will open staking to investors in mid-December, allowing them to participate in the model's development and success.
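The staking-for-ownership mechanic described above can be pictured as a simple pro-rata accounting scheme. The sketch below is purely illustrative: the ASI Alliance has not published the contract or its interface, so every class, method, and number here is an assumption, not the actual implementation.

```python
# Illustrative sketch only: the real ASI: Train staking contract and its
# interface are unpublished; all names and mechanics here are assumptions.

class StakingPool:
    """Pro-rata ownership of a model funded by staked tokens."""

    def __init__(self):
        self.stakes = {}       # staker address -> tokens staked
        self.total_staked = 0

    def stake(self, staker: str, amount: int) -> None:
        # Staking more tokens increases the staker's ownership share.
        self.stakes[staker] = self.stakes.get(staker, 0) + amount
        self.total_staked += amount

    def ownership_share(self, staker: str) -> float:
        # Fraction of the model attributed to this staker.
        return self.stakes.get(staker, 0) / self.total_staked

    def distribute_revenue(self, revenue: float) -> dict:
        # Hypothetical: model revenue split pro-rata across stakers.
        return {s: revenue * amt / self.total_staked
                for s, amt in self.stakes.items()}

pool = StakingPool()
pool.stake("alice", 750)
pool.stake("bob", 250)
payouts = pool.distribute_revenue(10_000.0)  # alice: 7500.0, bob: 2500.0
```

In this toy model, a staker holding 75% of the staked tokens would receive 75% of any distributed revenue; how (or whether) revenue sharing actually works in ASI: Train is not specified in the announcement.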
The first model, Cortex, is scheduled to begin training in December and will undergo a 12- to 14-week training period using GPU compute resources.
The model is expected to generate annual revenue of more than $10 million from customers including educational institutions, warehouse companies, robotics startups, and industrial partners.
The ASI Alliance plans to expand its portfolio with additional AI models in the biotechnology, quantum technology, space technology, and material science sectors.
“This is the future of inclusive, sustainable AI development, and we’re thrilled to have our community at the forefront of this journey,” said Sheikh.