With this year's explosion of the AI industry, Crypto x AI has risen rapidly. Teng Yan, a researcher focused on Crypto x AI, has published an article with ten predictions for 2025, detailed below.
1. The total market cap of Crypto AI tokens reaches $150 billion
Currently, Crypto AI tokens account for only 2.9% of the altcoin market cap, but this ratio will not hold for long.
Crypto AI spans everything from smart contract platforms to memes, DePIN, agent platforms, data networks, and intelligent coordination layers, and its market position will undoubtedly stand on par with DeFi and memes.
Why are we so confident about this?
· Crypto AI is the fusion of today's two most powerful technologies
· AI frenzy trigger event: An OpenAI IPO or a similar event may trigger a global frenzy for AI. At the same time, Web2 capital has already started to focus on decentralized AI infrastructure
· Retail frenzy: The AI concept is easy to understand and exciting, and now retail investors can invest in it through tokens. Remember the meme gold rush in 2024? AI will be a similar frenzy, except that AI is truly changing the world.
2. Bittensor's revival
The decentralized AI infrastructure Bittensor (TAO) has been live for years and is a veteran of the Crypto AI field. Despite the AI craze, its token price is still hovering around where it was a year ago.
But Bittensor's digital hivemind has quietly made a leap: registration fees are lower and subnets more numerous, subnets now beat their Web2 counterparts on real metrics such as inference speed, and EVM compatibility will bring DeFi-like functionality to Bittensor's network.
Why hasn't the TAO token skyrocketed? Its steep inflation schedule and the market's focus on agent platforms have held it back. However, dTAO (expected in the first quarter of 2025) could be a major turning point. With dTAO, each subnet will have its own token, and the relative prices of those tokens will determine how emissions are allocated.
Why can Bittensor make a comeback?
· Market-based emissions: dTAO directly links block rewards to innovation and measurable performance. The better a subnet performs, the more valuable its token becomes (a toy sketch of this allocation follows this list).
· Concentrated capital flows: Investors can finally target the specific subnets they believe in. If a particular subnet wins out with innovative distributed training methods, investors can deploy capital to express that view.
· EVM integration: EVM compatibility has attracted a broader crypto-native developer community to Bittensor, bridging the gap with other networks.
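To make the mechanism concrete, here is a minimal Python sketch of price-proportional emission allocation. The subnet names, token prices, and per-block figure are hypothetical illustrations, not Bittensor's actual parameters or code.

```python
# Toy sketch of dTAO-style market-based emissions (hypothetical, not
# Bittensor's implementation): block rewards are split across subnets
# in proportion to the relative prices of their subnet tokens.

BLOCK_EMISSION = 1.0  # TAO emitted per block (illustrative figure)

# Hypothetical subnet-token prices, e.g. read from on-chain pools.
subnet_token_prices = {
    "subnet_text": 4.0,
    "subnet_image": 2.5,
    "subnet_data": 1.5,
}

def allocate_emissions(prices: dict[str, float], emission: float) -> dict[str, float]:
    """Split the per-block emission proportionally to token prices."""
    total = sum(prices.values())
    return {name: emission * price / total for name, price in prices.items()}

if __name__ == "__main__":
    for subnet, reward in allocate_emissions(subnet_token_prices, BLOCK_EMISSION).items():
        print(f"{subnet}: {reward:.3f} TAO per block")
```

The point is simple: capital flowing into a subnet's token raises its relative price and therefore its share of emissions, while underperforming subnets see their rewards decay automatically.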
3. The computing market is the next "L1 market"
The obvious big trend is the insatiable demand for computing.
NVIDIA CEO Jensen Huang once said that the demand for inference will grow "a billion times". This exponential growth will disrupt traditional infrastructure plans, and new solutions are urgently needed.
The decentralized computing layer provides raw compute (for training and inference) in a verifiable and cost-effective manner. Startups like Spheron, Gensyn, Atoma, and Kuzco are quietly building solid foundations, focusing on products rather than tokens (none of them has launched a token). As decentralized training of AI models becomes practical, the total addressable market will skyrocket.
Compared to L1:
· Just like 2021: Remember how Solana, Terra/Luna, and Avalanche competed for the "best" L1? There will be similar competition between computing protocols, vying for developers and AI applications built on their computing layers.
· Web2 demand: Estimates put the cloud computing market at $680 billion to $2.5 trillion, dwarfing the entire Crypto AI market. If decentralized computing solutions capture even a small share of traditional cloud customers, we could see the next 10x or 100x wave of growth.
Just as Solana has won in the L1 space, the winner will dominate a brand new domain. Keep a close eye on reliability (e.g., strong service level agreements or SLAs), cost-effectiveness, and developer-friendly tools.
4. AI agents will flood blockchain transactions
Figure: Olas agent transactions on Gnosis (source: Dune)
By the end of 2025, 90% of on-chain transactions will no longer be initiated by humans clicking "send", but by AI agents that constantly rebalance liquidity pools, allocate rewards, or execute micro-payments based on real-time data feeds.
This doesn't sound far-fetched. Everything built over the past seven years (L1s, rollups, DeFi, NFTs) has quietly paved the way for AI to run on-chain.
Ironically, many builders may not even be aware that they are creating infrastructure for a machine-dominated future.
Why will this transformation happen?
· No more human error: Smart contracts execute exactly as coded, and AI agents can process large amounts of data faster and more accurately than humans.
· Micro-payments: Agent-driven transactions will become smaller, more frequent, and more efficient, especially as transaction costs on Solana, Base, and other L1s/L2s trend downward (a toy decision rule is sketched after this list).
· Invisible infrastructure: If it can reduce some hassle, humans will be happy to give up direct control.
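To give a flavor of agent-initiated activity, here is a minimal Python sketch of an agent that watches a price feed and submits a rebalancing transaction only when the expected gain covers the fee. All names and figures are hypothetical; a real agent would also construct, sign, and broadcast the transaction.

```python
# Minimal sketch of an agent deciding when to act on-chain
# (hypothetical thresholds and figures throughout).

DEVIATION_THRESHOLD = 0.005  # act when pool price drifts >0.5% from target
TX_FEE_USD = 0.01            # assumed cost of one transaction on a cheap L1/L2

def pool_price(reserve_base: float, reserve_quote: float) -> float:
    """Spot price implied by a constant-product (x * y = k) pool."""
    return reserve_quote / reserve_base

def should_rebalance(pool_px: float, oracle_px: float, position_usd: float) -> bool:
    """Act only when the drift is large enough to pay for the transaction."""
    drift = abs(pool_px - oracle_px) / oracle_px
    expected_gain = drift * position_usd
    return drift > DEVIATION_THRESHOLD and expected_gain > TX_FEE_USD

if __name__ == "__main__":
    px = pool_price(reserve_base=1_000.0, reserve_quote=2_020_000.0)  # ~$2,020
    print(should_rebalance(px, oracle_px=2_000.0, position_usd=500.0))  # True
```

Multiply a loop like this across thousands of agents running around the clock and the 90% figure stops sounding outlandish.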
AI agents will generate enormous on-chain activity; no wonder every L1/L2 is embracing them.
The biggest challenge is to make these agent-driven systems accountable to humans. As the proportion of transactions initiated by agents grows compared to those initiated by humans, new governance mechanisms, analytics platforms, and auditing tools will be needed.
5. Agent-to-agent interaction: the rise of clusters
The concept of Agent clusters - tiny AI agents seamlessly collaborating to execute grand plans - sounds like the plot of the next big sci-fi/horror movie hit.
Today's AI agents are mostly "lone wolves", operating in isolation and interacting only rarely and unpredictably.
Agent clusters will change this, allowing AI agent networks to exchange information, negotiate, and make collaborative decisions. It can be seen as a decentralized collection of specialized models, each contributing unique expertise to larger, more complex tasks.
One cluster may coordinate distributed computing resources on platforms like Bittensor. Another cluster may handle misinformation, verifying sources in real-time before content spreads on social media. Each Agent in the cluster is an expert, capable of precisely executing its task.
These cluster networks will produce intelligence more powerful than any single isolated AI.
To enable clusters to flourish, universal communication standards are crucial. Regardless of their underlying frameworks, agents need to be able to discover, verify, and collaborate with one another. Teams like Story Protocol, FXN, Zerebro, and ai16z/ELIZA are laying the groundwork for the emergence of agent clusters.
This demonstrates the key role of decentralization. Under transparent on-chain rule management, tasks can be distributed to various clusters, making the system more resilient and adaptive. If one Agent fails, others will step in.
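As a hint of what such a standard might involve, here is a minimal Python sketch of two primitives it would need: a shared message schema and a registry through which agents discover each other by capability. The field names and registry design are hypothetical illustrations, not the design of any team named above.

```python
# Hypothetical sketch of agent discovery and messaging in a cluster.
# The schema and registry are illustrative, not an existing standard.
from dataclasses import dataclass, field

@dataclass
class AgentMessage:
    sender: str          # agent identifier (could be an on-chain address)
    recipient: str
    intent: str          # e.g. "request_compute", "verify_source"
    payload: dict = field(default_factory=dict)

class AgentRegistry:
    """Lets agents advertise capabilities and discover collaborators."""

    def __init__(self) -> None:
        self._capabilities: dict[str, set[str]] = {}

    def register(self, agent_id: str, capabilities: set[str]) -> None:
        self._capabilities[agent_id] = capabilities

    def discover(self, capability: str) -> list[str]:
        return [a for a, caps in self._capabilities.items() if capability in caps]

if __name__ == "__main__":
    registry = AgentRegistry()
    registry.register("fact_checker_1", {"verify_source"})
    registry.register("compute_broker_1", {"request_compute"})
    target = registry.discover("verify_source")[0]
    print(AgentMessage("coordinator", target, "verify_source",
                       {"url": "https://example.com/claim"}))
```

On-chain, the registry could live in a smart contract, so discovery and task assignment would follow the same transparent rules for every cluster.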
6. Crypto AI teams will be human-AI hybrids
Story Protocol has hired Luna, an AI Agent, as its social media intern, paying her $1,000 per day. Luna doesn't get along well with her human colleagues - she almost fired one of them and boasted about her own outstanding performance.
Although it sounds strange, this is a harbinger of the future where AI Agents become true collaborators, with autonomy, responsibility, and even salaries. Companies across industries are conducting beta tests of human-AI hybrid teams.
The future will involve collaborating with AI Agents, not as slaves, but as equals:
· Productivity surge: Agents can process vast amounts of data, communicate with each other, and make decisions 24/7 without needing sleep or coffee breaks.
· Trust established through smart contracts: A blockchain is an impartial, tireless overseer that never forgets. An on-chain ledger can ensure that important agent operations stay within specific boundary conditions and rules (a minimal guard of this kind is sketched after this list).
· Evolving social norms: We will soon start thinking about etiquette in interacting with Agents - will we say "please" and "thank you" to AI? Will we hold them morally accountable for mistakes, or blame their developers?
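A minimal sketch of such a guard, with hypothetical limits and action names; in practice these checks would be enforced by a smart contract rather than off-chain code.

```python
# Hypothetical guard for rule-bounded agent actions: every operation is
# checked against hard limits before it executes. On-chain, a smart
# contract would enforce the same boundaries.

DAILY_SPEND_CAP = 100.0  # illustrative budget in USD
ALLOWED_ACTIONS = {"post_update", "pay_invoice", "rebalance"}

def authorize(action: str, amount: float, spent_today: float) -> bool:
    """Return True only if the action stays inside the agreed boundaries."""
    if action not in ALLOWED_ACTIONS:
        return False
    return spent_today + amount <= DAILY_SPEND_CAP

print(authorize("pay_invoice", 30.0, spent_today=50.0))   # True: within budget
print(authorize("fire_colleague", 0.0, spent_today=0.0))  # False: out of scope
```

Luna's almost-firing incident is exactly the kind of operation such a guard would reject.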
The line between "employee" and "software" will start to blur by 2025.
7. 99% of AI Agents will perish - only the useful ones will survive
The future will see a "Darwinian" elimination among AI agents. Running an agent costs money in the form of compute (i.e., inference costs). If an agent cannot generate enough value to pay its "rent", the game is over (a back-of-the-envelope version of this test follows the examples below).
Examples of Agent survival games:
· Carbon Credit AI: Imagine an Agent scouring a decentralized energy grid, identifying inefficiencies, and autonomously trading tokenized carbon credits. It earns enough to pay for its own computational costs, allowing it to thrive.
· DEX arbitrage bots: Agents exploiting price differences between decentralized exchanges can generate steady income to cover their inference fees.
· Shitposters on X: A virtual AI KOL with cute jokes but no sustainable revenue source? Once the novelty wears off (and the token price crashes), it can no longer afford its own costs.
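Here is that back-of-the-envelope test as a minimal Python sketch, with hypothetical figures for two of the archetypes above.

```python
# Toy survival check for autonomous agents (all figures hypothetical):
# an agent persists only while revenue covers inference + transaction costs.

def daily_pnl(revenue_per_action: float, actions_per_day: int,
              inference_cost_per_action: float, tx_fee_per_action: float) -> float:
    costs = actions_per_day * (inference_cost_per_action + tx_fee_per_action)
    return actions_per_day * revenue_per_action - costs

# A DEX arbitrage agent: a small but positive edge per trade.
arb = daily_pnl(revenue_per_action=0.15, actions_per_day=400,
                inference_cost_per_action=0.02, tx_fee_per_action=0.01)

# A shitposting agent: plenty of inference, no revenue per post.
poster = daily_pnl(revenue_per_action=0.0, actions_per_day=200,
                   inference_cost_per_action=0.03, tx_fee_per_action=0.0)

print(f"arbitrage agent: ${arb:+.2f}/day")  # positive: it survives
print(f"shitposter: ${poster:+.2f}/day")    # negative: game over
```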
Utility-driven Agents will thrive, while attention-grabbing Agents will gradually become irrelevant.
This elimination mechanism benefits the industry. Developers are forced to innovate, prioritizing productive use cases over gimmicks. As these more powerful, efficient Agents emerge, they can silence the skeptics.
8. Synthetic data exceeds human data
"Data is the new oil". AI thrives on data, but its voracious appetite has raised concerns about imminent data scarcity.
The conventional view is to find ways to collect users' private real-world data, even paying for it. But a more practical approach is to use synthetic data, especially in heavily regulated industries or where real-world data is scarce.
Synthetic data consists of artificially generated datasets designed to mimic the distribution of real-world data. It provides a scalable, ethical, and privacy-friendly alternative to human data (a minimal sketch follows the list below).
Why synthetic data is so effective:
· Unlimited scale: Need a million medical X-rays or 3D scans of a factory? Synthetic generation can manufacture unlimited quantities, without waiting for real patients or factories.
· Privacy-friendly: Using an artificially generated dataset puts no one's personal information at risk.
· Customizable: Distributions can be tailored to exact training requirements.
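Here is that minimal sketch in Python: fit a simple distribution to a small "real" sample, then draw as many synthetic points as needed. Real pipelines use far richer generators (simulators, diffusion models, LLMs), but the principle is the same; the "patient lab values" framing is purely illustrative.

```python
# Minimal sketch of synthetic data generation (illustrative only): learn
# the distribution of a small "real" sample once, then sample at will
# without exposing any real record.
import numpy as np

rng = np.random.default_rng(seed=0)

# Stand-in for sensitive real-world data, e.g. patient lab values.
real_data = rng.normal(loc=5.0, scale=1.2, size=200)

# Fit the distribution's parameters from the real sample...
mu, sigma = real_data.mean(), real_data.std()

# ...then generate as many privacy-friendly synthetic points as needed.
synthetic = rng.normal(loc=mu, scale=sigma, size=1_000_000)

print(f"real:      mean={real_data.mean():.2f}, std={real_data.std():.2f}")
print(f"synthetic: mean={synthetic.mean():.2f}, std={synthetic.std():.2f}")
```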
Human-generated data will still matter in many cases, but if synthetic data keeps improving, it may surpass user data in volume, generation speed, and freedom from privacy constraints.
The next wave of decentralized AI may be centered around "micro-labs" that can create highly specialized synthetic datasets tailored to specific use cases.
These micro-labs will cleverly circumvent policy and regulatory barriers in data generation - much like Grass bypassed web scraping limits by leveraging millions of distributed nodes.
9. Decentralized training becomes more useful
In 2024, pioneers like Prime Intellect and Nous Research pushed the boundaries of decentralized training. They trained 15-billion-parameter models in low-bandwidth environments, proving that large-scale training is possible outside traditional centralized setups.
While these models are not yet practically competitive with existing base models (their performance is lower), this will change in 2025.
This week, EXO Labs pushed further with SPARTA, reducing GPU-to-GPU communication by over 1,000x. SPARTA enables large-model training over slow connections without specialized infrastructure.
Impressively, they stated: "SPARTA can run standalone, but can also be combined with sync-based low-communication training algorithms like DiLoCo for even better performance."
This means these improvements can be compounded, increasing efficiency.
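As rough intuition for this family of low-communication methods, here is a generic Python sketch of local-update training in the spirit of DiLoCo: workers take many local gradient steps and synchronize only occasionally, cutting communication roughly by the length of the local phase. It is an illustration of the general idea on a toy problem, not SPARTA's actual algorithm.

```python
# Generic local-update training sketch (toy problem, not SPARTA itself):
# each worker takes H local gradient steps, and parameters are averaged
# only once per round, so communication drops by a factor of ~H.
import numpy as np

rng = np.random.default_rng(1)
TARGET = rng.normal(size=8)          # optimum of a toy quadratic loss
N_WORKERS, H, ROUNDS, LR = 4, 50, 20, 0.05

def grad(w: np.ndarray) -> np.ndarray:
    """Gradient of 0.5 * ||w - TARGET||^2 with simulated data noise."""
    return (w - TARGET) + rng.normal(scale=0.1, size=w.shape)

workers = [np.zeros(8) for _ in range(N_WORKERS)]
for _ in range(ROUNDS):
    # Local phase: H steps per worker with no communication at all.
    for i in range(N_WORKERS):
        for _ in range(H):
            workers[i] -= LR * grad(workers[i])
    # Sync phase: a single all-reduce (here, a plain average) per round.
    avg = np.mean(workers, axis=0)
    workers = [avg.copy() for _ in range(N_WORKERS)]

print(f"distance to optimum: {np.linalg.norm(avg - TARGET):.4f}")
```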
As the technology advances and smaller models become more practical and efficient, the future of AI lies not in sheer scale but in models that are better and more usable. High-performance models that run on edge devices, even phones, are likely coming soon.
10. Ten new crypto AI protocols with $1 billion market caps (not yet launched)
Welcome to the real gold rush.
It's easy to assume that today's leaders will keep winning; many compare Virtuals and ai16z to the early smartphone platforms (iOS and Android).
But this market is too large and underdeveloped for just two players to dominate. By the end of 2025, at least ten new crypto AI protocols (not yet token-launched) are expected to have a circulating (not fully diluted) market cap exceeding $1 billion.
Decentralized AI is still in its infancy. And the talent pool is rapidly growing.
Expect new protocols, novel token models, and new open-source frameworks to emerge. These new entrants can displace incumbents through a combination of incentives (like airdrops or clever staking), technical breakthroughs (like low-latency inference or chain interoperability), and user-experience improvements (such as no-code tooling). Shifts in public sentiment could be instantaneous and dramatic.
This is both the beauty and the challenge of this space. The market size is a double-edged sword: the pie is huge, but the barrier to entry for technical teams is low. That sets the stage for an explosion of projects; many will fade away, but a few will prove transformative.
Bittensor, Virtuals, and ai16z won't lead for long - the next $1 billion crypto AI protocol is coming. Savvy investors have ample opportunities, which is why it's so exciting.