0g
Apr 18, 2024
Introduction
Decentralized AI, also termed “crypto AI” or “on-chain AI”, has been receiving significant attention lately. The reality, however, is that most of it currently occurs off-chain, the actual applications remain limited in scope, and major limitations must be overcome before the true power of crypto x AI can be realized.
A primary limitation is the need to store and process the vast amounts of data that AI requires on-chain, which is not possible with existing data availability (DA) solutions. Instead, computations must occur off-chain and be brought on-chain in real time, using cryptographic techniques such as ZK proofs for verification.
0G provides both a large-scale data storage solution and an infinitely scalable DA layer, enabling AI to at last be supported on-chain.
In this article we will examine:
Data availability and its existing limitations
The role of data availability in decentralized AI
0G’s role in bringing AI on-chain
Data Availability
Ethereum is pursuing a rollup-centric roadmap, whereby Layer 2 (L2) networks handle transactions off-chain to alleviate network congestion, reduce fees, and ultimately scale the Ethereum ecosystem. L2s then publish data back to Ethereum to inherit its security and to prove that their computations were performed correctly. However, despite the recent EIP-4844 (“proto-danksharding”) upgrade, publishing data to Ethereum remains too expensive to scale globally.
This is the data availability problem: how can we ensure that the data behind a summarized commitment was actually published and represents a valid set of transactions, without mandating that every node store all historical data?
Data Availability Layers (DALs) provide an efficient yet secure way to publish data and keep it available for anyone to verify. For example, Data Availability Sampling is a technique whereby light clients download only small random portions of the data, yet can verify with overwhelming probability that all of it was published.
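To build intuition, here is a simplified sketch of why sampling works (illustrative only, not 0G's actual protocol): if a block producer withholds a fraction `f` of the data chunks, a light client drawing `k` independent random samples fails to notice only if every sample lands on an available chunk, which happens with probability (1 − f)^k. With erasure coding, more than half the chunks must be withheld to make data unrecoverable, so each sample has better-than-even odds of detection and a few dozen samples make withholding essentially impossible to hide.

```python
import random

def detection_probability(withheld_fraction: float, num_samples: int) -> float:
    """Closed-form probability that at least one of `num_samples` uniform
    random chunk queries hits a withheld chunk."""
    return 1.0 - (1.0 - withheld_fraction) ** num_samples

def simulate(total_chunks: int, withheld: int, num_samples: int,
             trials: int = 10_000) -> float:
    """Monte Carlo estimate of the same probability."""
    withheld_set = set(range(withheld))  # pretend the first `withheld` chunks are missing
    detected = 0
    for _ in range(trials):
        samples = random.choices(range(total_chunks), k=num_samples)
        if any(s in withheld_set for s in samples):
            detected += 1
    return detected / trials

# With 2x erasure coding, >50% of chunks must be withheld to hide data,
# so with f = 0.5 and just 30 samples, detection is all but certain.
print(detection_probability(0.5, 30))  # ≈ 1 (miss probability is 0.5**30)
```

This is why light clients can each store only a tiny portion of the data while the network as a whole remains confident the full data set exists.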
The downside is that existing DAL solutions are insufficient for the industry’s anticipated exponential growth. A prime example of this is decentralized AI, which requires DALs capable of efficiently storing and retrieving vast amounts of data for purposes such as model training.
Crypto x AI
The interconnection of blockchain and AI has significant implications, bringing decentralization, transparency, and more to an otherwise opaque and monopolized industry. Despite the growing interest, the landscape remains early and we’re only scratching the surface of what’s possible.
One reason for this is that decentralized AI is extremely challenging at present. Not only must we store vast amounts of data efficiently, but we must also retrieve that data quickly and run models in real time, all while minimizing the gas fees associated with complex calculations.
The primary approach is to run computations off-chain and use on-chain Zero-Knowledge Proofs (ZKPs) to verify that models were run correctly. This allows AI results to be brought on-chain quickly while the underlying data is kept confidential (ZKPs do not reveal the underlying information). At the same time, ZKPs are relatively new, and therefore still slow and expensive to generate, and off-chain data continues to rely on centralized storage providers such as AWS.
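The shape of this flow can be sketched in a few lines. To be clear about assumptions: the function names below are illustrative, and the "proof" here is just a hash binding inputs to the claimed result, which is not a real ZKP (a real verifier checks a succinct proof without ever seeing the model inputs or re-running the computation). The sketch only shows the division of labor: heavy computation off-chain, cheap verification on-chain.

```python
import hashlib
import json

def run_model(weights: list[float], features: list[float]) -> float:
    """Stand-in for expensive off-chain inference (a dot product here)."""
    return sum(w * x for w, x in zip(weights, features))

def prove(weights: list[float], features: list[float], result: float) -> str:
    """Placeholder 'proof': a hash committing to inputs and claimed result.
    A real ZKP is succinct and hides weights/features entirely; this only
    illustrates the commit-then-verify pattern."""
    payload = json.dumps({"w": weights, "x": features, "y": result}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def verify_onchain(claimed: float, proof: str,
                   weights: list[float], features: list[float]) -> bool:
    """The 'contract' side. Note this placeholder must see the inputs to
    re-derive the hash; a ZK verifier checks the proof alone, cheaply."""
    return prove(weights, features, claimed) == proof

result = run_model([0.2, 0.8], [1.0, 2.0])   # heavy work, off-chain
proof = prove([0.2, 0.8], [1.0, 2.0], result)
print(verify_onchain(result, proof, [0.2, 0.8], [1.0, 2.0]))        # True
print(verify_onchain(result + 1.0, proof, [0.2, 0.8], [1.0, 2.0]))  # False
```

The point of the pattern is that a tampered result fails verification, so the chain can trust an off-chain computation it never executed.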
There is instead a great need for large amounts of data to be stored on-chain, bringing benefits such as greater transparency, collaborative AI, and more. While it’s true that AI has a large open-source community, ultimately only the largest firms have access to the latest technology. It can also be challenging for the average party to monetize their models, disincentivizing the building and sharing of advanced models.
The combination of crypto and AI thus has many benefits. Yet, as mentioned, existing DALs cannot support the data throughput it requires. Until this is solved, data will be kept off-chain, with all of the aforementioned issues that entails.
0G’s Role
0G provides the infinitely scalable and programmable DA layer needed to bring decentralized AI to fruition.
This is possible through both data partitioning and the ability to introduce any arbitrary number of separate consensus networks, in a process known as shared staking. Under shared staking, 0G validators participate simultaneously in these various consensus networks while their assets are staked once on a single chain of their choice (most likely Ethereum). This keeps the design simple, and a slashable event on any network triggers slashing on the chain where the assets are actually staked.
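The slashing relationship described above can be sketched as follows (a minimal sketch; the class and method names are illustrative, not 0G's implementation): many consensus networks share one validator set, stake lives on a single chain, and an offense reported by any network deducts stake there.

```python
from dataclasses import dataclass, field

@dataclass
class StakingChain:
    """The single chain (e.g. Ethereum) where validator assets are staked."""
    stakes: dict = field(default_factory=dict)

    def slash(self, validator: str, fraction: float) -> None:
        self.stakes[validator] *= (1.0 - fraction)

@dataclass
class ConsensusNetwork:
    """One of arbitrarily many 0G consensus networks sharing the validator set."""
    name: str
    staking_chain: StakingChain

    def report_offense(self, validator: str, fraction: float = 0.1) -> None:
        # A slashable event on *any* consensus network triggers slashing
        # on the chain where the assets actually sit.
        self.staking_chain.slash(validator, fraction)

eth = StakingChain(stakes={"val1": 32.0})
net_a = ConsensusNetwork("0g-net-a", eth)
net_b = ConsensusNetwork("0g-net-b", eth)

net_b.report_offense("val1")  # misbehavior observed on network B...
print(eth.stakes["val1"])     # ...reduces the stake held on the staking chain
```

Because every new consensus network points back at the same staking chain, networks can be added without fragmenting security: economic weight stays in one place while throughput scales horizontally.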
Beyond 0G’s role as a DA layer, there is also 0G Storage, a general-purpose data storage system for both structured and unstructured data. It can store vast amounts of data, providing storage infrastructure for the rest of the Web3 ecosystem.
For more information on 0G’s design, we recommend reading our recent blog post.
Conclusion
A highly scalable DAL is required to bring AI on-chain, providing the infrastructure for vast amounts of data to be efficiently stored and retrieved. Existing DALs may not be prepared to support data needs at such a scale, being better suited to purposes such as custom app chains.
0G’s infinitely scalable DA layer is the solution required to bring AI on-chain at last, and as a first step, we recently announced our testnet that is bringing this to reality.
To learn more, follow us on Twitter.
To learn about our testnet, read our latest post here.