Delphi Digital: Exploring the challenges and future prospects of decentralized AI (DeAI)


The end-state vision of truly composable DeAI compute may be what ultimately justifies blockchains themselves.

  • Author: PonderingDurian, Researcher at Delphi Digital
  • Translation: Pzai, Foresight News

Given that cryptocurrencies are essentially open-source software with built-in economic incentives, and that AI is upending how software is written, AI will have an enormous impact on the entire blockchain space.

[Figure: AI x Crypto overall stack]

DeAI: Opportunities and Challenges

Given the large capital outlay required to train foundation models, and the economies of scale in data and compute, I believe the infrastructure layer is where DeAI faces its biggest challenge.


Scaling laws give the tech giants a natural advantage: in the Web2 era they reaped enormous monopoly rents from aggregating consumer demand, reinvested those profits into cloud infrastructure over a decade of artificially low pricing, and are now trying to corner the AI market by cornering its key inputs, data and compute:

[Figure: Comparison of token counts across large models]

Because large-scale training is capital-intensive and demands extremely high bandwidth, unified super-clusters remain the best option, giving the tech giants the best-performing closed-source models. Their plan is to rent these models out at monopoly rents and reinvest the proceeds into each successive generation.

However, AI moats are proving shallower than Web2 network effects, and leading frontier models depreciate quickly relative to the rest of the field, especially with Meta pursuing a "scorched earth" strategy of open-sourcing frontier models such as Llama 3.1 at SOTA-level performance.

[Figure: Llama 3 model benchmark scores]

On top of this, emerging research into low-latency distributed training methods may commoditize (some) frontier business models: as the price of AI falls, competition shifts (at least in part) from hardware super-clusters (favoring the tech giants) to software innovation (slightly favoring open source and crypto).

[Figure: Capability index (quality) vs. training price]

Given the computational efficiency of "mixture of experts" architectures and of model synthesis and routing, we are likely heading not toward a world of 3-5 giant models, but toward one of millions of models with different cost/performance tradeoffs: an interconnected mesh (hive) of intelligence.
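As a rough illustration of the mechanic behind that efficiency, a mixture-of-experts layer uses a learned gate to activate only a few specialist sub-models per input. The sketch below is a toy version in plain NumPy, with invented experts and weights, not any particular project's implementation:

```python
import numpy as np

def top_k_gate(x, gate_weights, k=2):
    """Toy mixture-of-experts gate: score every expert, keep only the top-k."""
    scores = x @ gate_weights                  # one logit per expert
    top_k = np.argsort(scores)[-k:]            # indices of the k best-scoring experts
    weights = np.exp(scores[top_k])
    weights /= weights.sum()                   # softmax over the selected experts only
    return top_k, weights

# Hypothetical setup: a 4-dimensional input routed across 8 stand-in "experts".
rng = np.random.default_rng(0)
gate_weights = rng.normal(size=(4, 8))
experts = [lambda x, i=i: x * (i + 1) for i in range(8)]   # placeholder expert models

x = rng.normal(size=4)
chosen, w = top_k_gate(x, gate_weights, k=2)
# Only the chosen experts run; their outputs are blended by the gate weights.
output = sum(w_i * experts[i](x) for i, w_i in zip(chosen, w))
print(chosen, output)
```

Because only k experts execute per input, compute cost grows far more slowly than total parameter count, which is what makes a mesh of many specialized models plausible.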

This constitutes a huge coordination problem, which blockchain and cryptocurrency incentive mechanisms should be well-suited to help solve.

Core DeAI Investment Domains

Software is eating the world. AI is eating software. And AI is fundamentally about data and computation.

Delphi is bullish on the various components in this stack:

[Figure: Simplified AI x Crypto stack]

Infrastructure

Given that data and compute are the driving forces of AI, DeAI infrastructure is dedicated to sourcing both as efficiently as possible, often with cryptocurrency incentives. As mentioned above, this is the hardest part of the competition, but given the size of the end market, it may also be the most rewarding.

Computation

To date, distributed training protocols and GPU marketplaces have been constrained by latency, but they aim to coordinate latent, heterogeneous hardware into lower-cost, on-demand compute for those priced out of the giants' integrated offerings. Teams like Gensyn, Prime Intellect, and Neuromesh are pushing distributed training forward, while io.net, Akash, and Aethir are enabling lower-cost inference closer to the edge.
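To make the coordination task concrete, below is a minimal sketch of the kind of matching a GPU marketplace performs. The offer fields and providers are entirely made up (this is not the API of io.net, Akash, Aethir, or anyone else); the point is simply filtering heterogeneous hardware against a job's constraints and taking the cheapest viable option:

```python
from dataclasses import dataclass

@dataclass
class GpuOffer:
    provider: str        # hypothetical supplier ID
    vram_gb: int         # memory available on the card
    latency_ms: float    # network latency to the buyer
    usd_per_hour: float  # asking price

def cheapest_viable(offers, min_vram_gb, max_latency_ms):
    """Return the lowest-cost offer that satisfies the job's constraints, if any."""
    viable = [o for o in offers
              if o.vram_gb >= min_vram_gb and o.latency_ms <= max_latency_ms]
    return min(viable, key=lambda o: o.usd_per_hour) if viable else None

# Made-up supply for illustration only.
offers = [
    GpuOffer("home-rig-3090", 24, 80.0, 0.35),
    GpuOffer("dc-a100",       80, 15.0, 1.60),
    GpuOffer("edge-4090",     24, 25.0, 0.55),
]

# An inference job that needs 20 GB of VRAM and reasonably low latency.
print(cheapest_viable(offers, min_vram_gb=20, max_latency_ms=50))
```

Real networks layer verification, staking, and redundancy on top of this, but the core economics are a matching problem like the one above.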

[Figure: Project niches by aggregated supply]

Data

In a world of ubiquitous intelligence built on smaller, more specialized models, data assets become ever more valuable and monetizable.


To date, DePIN has been praised mainly for building hardware networks at lower cost than capital-intensive incumbents (such as telecoms). But DePIN's largest potential market may lie in collecting the novel datasets that will flow into on-chain intelligence systems: agent protocols (discussed below).

In that world, the largest addressable market of all, human labor, is being displaced by data and compute, and DeAI infrastructure offers non-technical people a way to own the means of production and contribute to the coming networked economy.

Middleware

The ultimate goal of DeAI is effectively composable compute. Much like DeFi's money legos, DeAI compensates for today's gap in raw performance with permissionless composability, incentivizing an open ecosystem of software and compute primitives that compounds over time and (hopefully) eventually surpasses the incumbents.

If Google represents the extreme of "integration", then DeAI represents the extreme of "modularization". As Clayton Christensen has warned, in emerging industries, integrated approaches often take the lead by reducing friction in the value chain, but as the field matures, modular value chains take hold by increasing competition and cost efficiency at each layer of the stack:

[Figure: Integrated vs. modular AI]

We are very optimistic about several categories that are critical to realizing this modular vision:

1. Routing

In a world of fragmented intelligence, how do you choose the right model, at the right time, at the best price? Demand-side aggregators have always captured value (see Aggregation Theory), and routing is what lets a networked world of intelligence navigate the Pareto frontier between performance and cost.
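As a toy version of what such a router optimizes, consider a catalogue of models with quality scores and prices (all invented here): given a caller's minimum quality bar, pick the cheapest model that clears it, which traces out the cost/performance Pareto frontier.

```python
# Hypothetical catalogue: (model name, quality score 0-1, USD per 1M tokens).
MODELS = [
    ("tiny-local",   0.55, 0.02),
    ("mid-open",     0.72, 0.20),
    ("frontier-api", 0.90, 5.00),
]

def route(min_quality):
    """Return the cheapest model whose quality clears the caller's bar."""
    viable = [m for m in MODELS if m[1] >= min_quality]
    return min(viable, key=lambda m: m[2]) if viable else None

print(route(0.60))   # a cheap query routes to ("mid-open", 0.72, 0.20)
print(route(0.85))   # a demanding query routes to ("frontier-api", 0.90, 5.00)
```

Production routers also learn from feedback and condition on context, but the economic core is exactly this trade-off.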


Bittensor has led among first-generation products here, but a number of specialized competitors are emerging.

Allora runs competitions between models across different "topics" in a "context-aware" way that self-improves over time, weighting future predictions by each model's historical accuracy under specific conditions.

Morpheus aims to be the "demand-side router" for Web3 use cases: essentially an open-source local agent that understands a user's relevant context and routes queries efficiently across the emerging components of DeFi's and Web3's "composable compute" infrastructure.

Agent interoperability protocols such as Theoriq and Autonolas aim to push modular routing to its limit, composing flexible ecosystems of agents and components into full-fledged on-chain services.

In short, as intelligence rapidly fragments, demand-side aggregators will hold an extremely powerful position. If Google is a $2 trillion company for indexing the world's information, then the winner among demand-side routers, whether Apple, Google, or a Web3 solution, the one that indexes agentic intelligence, should be larger still.

2. Coprocessors

Because of their decentralized nature, blockchains are severely constrained in both data and compute. How do we bring the compute- and data-intensive AI applications users actually want on-chain? Through coprocessors.

[Figure: Coprocessor applications in crypto]

Acting as "oracles", these projects offer different techniques to "verify" that the underlying data or model being used is valid, minimizing new on-chain trust assumptions while greatly expanding what chains can do. So far projects have used zkML, opML, TeeML, and crypto-economic approaches, each with its own pros and cons:

[Figure: Coprocessor comparison]

At a higher level, coprocessors are critical to making smart contracts intelligent: they provide "data warehouse"-like solutions for querying toward more personalized on-chain experiences, and ways to verify that a given inference was computed correctly.
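The coprocessor flow itself is easy to sketch even though the verification technology (zk proof, optimistic challenge, TEE attestation) differs by project. The Python below uses invented names and a plain hash commitment as a stand-in for a real proof system; it only illustrates the shape of the flow: commit to the request, compute off-chain, and have the chain accept the result only with a checkable attestation.

```python
import hashlib, json

def commitment(obj) -> str:
    """Hash commitment over a request or result (stand-in for a real proof/attestation)."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def offchain_inference(model_id, inputs):
    """Runs off-chain on the coprocessor network; returns a result plus an attestation."""
    result = {"score": sum(inputs) / len(inputs)}   # placeholder for real model output
    attestation = commitment({"model": model_id, "inputs": inputs, "result": result})
    return result, attestation

def onchain_verify(model_id, inputs, result, attestation) -> bool:
    """What a verifier contract would check before a smart contract trusts the result."""
    return attestation == commitment({"model": model_id, "inputs": inputs, "result": result})

request = {"model_id": "credit-risk-v1", "inputs": [0.2, 0.4, 0.9]}   # hypothetical query
result, att = offchain_inference(request["model_id"], request["inputs"])
assert onchain_verify(request["model_id"], request["inputs"], result, att)
```

In practice the attestation is what distinguishes the approaches compared above: a succinct proof in zkML, a fraud-proof window in opML, a hardware quote in TEEs, or staked economic guarantees.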

Trusted execution environment (TEE) networks such as Super, Phala, and Marlin have gained popularity recently thanks to their practicality and their ability to host applications at scale.

Overall, coprocessors are essential for bridging deterministic but low-performance blockchains with high-performance but probabilistic intelligence. Without them, AI simply would not make it into this generation of blockchains.

3. Developer Incentives

One of the biggest problems in open-source AI development is the lack of an incentive mechanism to make it sustainable. AI development is highly capital-intensive, and the opportunity cost of both compute and AI knowledge work is very high. Without proper incentives to reward open-source contributions, the field will inevitably lose out to the hyper-capitalized supercomputers.

Projects ranging from Sentiment to Pluralis, Sahara AI, and Mira all aim to bootstrap networks in which distributed networks of individuals contribute to collective intelligence and are properly rewarded for doing so.

By compensating contributors through the business model, the rate at which open source compounds should accelerate, giving developers and AI researchers a global alternative to big tech and the prospect of generous rewards based on the value they create.

While this is extremely difficult to achieve, and the competition is getting fiercer, the potential market here is huge.

4. GNN Models

Large language models excel at pattern recognition and next-word prediction in large text corpora, while Graph Neural Networks (GNNs) handle, analyze, and learn from graph-structured data. Since on-chain data is primarily composed of complex interactions between users and smart contracts, i.e. a graph, GNNs seem like a reasonable choice to support on-chain AI use cases.
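For intuition, here is a minimal sketch of why on-chain data maps naturally onto a GNN: wallets and contracts become nodes, transactions become edges, and a single round of message passing mixes each node's features with its neighbors'. Addresses, features, and weights are all invented, and a real pipeline would use a library such as PyTorch Geometric rather than raw NumPy:

```python
import numpy as np

# Toy on-chain graph: edges are (sender, receiver) interactions.
nodes = ["wallet_a", "wallet_b", "dex_pool", "lending_mkt"]
edges = [(0, 2), (1, 2), (2, 3), (0, 3)]

# Invented per-node features, e.g. normalized [tx_count, avg_value, account_age].
X = np.array([[0.9, 0.2, 0.5],
              [0.1, 0.8, 0.3],
              [0.7, 0.7, 0.9],
              [0.4, 0.5, 0.8]])

# Symmetric adjacency with self-loops, row-normalized (basic GCN-style propagation).
A = np.eye(len(nodes))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A = A / A.sum(axis=1, keepdims=True)

W = np.random.default_rng(0).normal(size=(3, 4))   # one layer of learnable weights
H = np.maximum(A @ X @ W, 0)                       # message passing followed by ReLU
print(H.shape)   # (4, 4): each node now carries a neighborhood-aware embedding
```

Stacking such layers and training the weights against labels (prices, defaults, airdrop abuse) is what the use cases below amount to.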

Projects like Pond and RPS are trying to build foundation models for Web3, which could be applied to use cases such as:

  • Price Prediction: On-chain behavior models for price prediction, automated trading strategies, sentiment analysis
  • AI Finance: Integration with existing DeFi applications, advanced yield strategies and liquidity utilization, better risk management/governance
  • On-chain Marketing: More targeted airdrops/positioning, recommendation engines based on on-chain behavior

These models will heavily leverage data warehouse solutions like Space and Time, Subsquid, Covalent, and Hyperline, which I'm also very bullish on.

GNN-based large models and Web3 data warehouses may well prove to be indispensable supporting tools for Web3, providing OLAP (online analytical processing) capabilities.

Applications

In my view, on-chain agents may be the key to fixing crypto's notorious user-experience problems. But more importantly, we have poured billions of dollars into Web3 infrastructure over the past decade, yet demand-side utilization remains pitifully low.

No need to worry, Agents are here...

[Figure: Growth in AI test scores across dimensions of human performance]

It also seems only logical that these agents would leverage open, permissionless infrastructure, spanning payments and composable compute, to accomplish ever more complex end goals.

In the coming networked economy of intelligence, economic flows may no longer run B -> B -> C, but user -> agent -> compute network -> agent -> user. The end state of that flow is the agent protocol: an application- or service-oriented business with minimal overhead, running primarily on on-chain resources and a composable network to serve end users (or one another) at far lower cost than a traditional enterprise.

Just as the application layer captured most of the value in Web2, I subscribe to a "fat agent protocol" thesis for DeAI: over time, value capture should migrate toward the upper layers of the stack.

[Figure: Value accrual in generative AI]

The next Google, Facebook, or BlackRock may well be agent protocols, and the components needed to realize them are emerging now.

The Endgame of DeAI

AI will transform our economic paradigm. Today, the market expects that value to be captured by a handful of large companies on the West Coast of North America. DeAI represents a different vision: an open, composable network of intelligence that rewards and compensates even the smallest contributions and is more collectively owned and governed.

While some claims about DeAI are overhyped and many projects trade far above their current traction, the size of the opportunity is substantial. For those with patience and foresight, DeAI's end-state vision of truly composable compute may be what ultimately justifies blockchains themselves.


Original Link

This article is reprinted with permission from Foresight News
