Tuesday, November 26

Lumerin, a protocol on the Arbitrum blockchain, announced that its new Morpheus project for decentralized AI computing will go live Friday on a public test network.

The premise of the technology is to avoid pitfalls of centralized AI models, which might be prone to censorship or monopoly control, according to a press release shared exclusively with CoinDesk on Thursday.

The project relies on “personal AIs,” referred to as “smart agents,” which could be paid for using cryptocurrencies, according to the release. It is being deployed on Arbitrum’s Sepolia test network.

“The new Morpheus public testnet will be used to decentralize and more efficiently allocate AI compute power across the Morpheus AI network and enable users to engage in a decentralized ChatGPT-like interface,” Lumerin said.

Started in 2021, Lumerin describes itself as an “open-source protocol and foundational layer technology that uses smart contracts to control how peer-to-peer data streams are accessed, routed and transacted.”

Lumerin’s first use case was a peer-to-peer, decentralized marketplace for trading Bitcoin hashpower – the computing power needed to find and confirm new blocks on the Bitcoin blockchain.

The project is now “leveraging its existing codebase to build the core node software for Morpheus,” its website reads.

According to the Morpheus technical documentation, or “white paper,” the project is expected to bring functional advantages over existing AI systems such as large language models (LLMs) because it is built natively on “Web3” – shorthand for technologies that run on decentralized networks and are designed to work with cryptocurrencies. Key capabilities could include running decentralized applications (dapps) and interacting with decentralized finance (DeFi) protocols.

“Being Web3 native, the user can buy or sell crypto, send stablecoins, access smart contracts and use dapps and DeFi services, which no LLM is connected to today,” the white paper reads. “Regulatory barriers faced by centralized companies prevent them from offering these tools to users, so their models can chat about tasks but not act on the user’s behalf in a Web3 context.”
