The top ten Crypto+AI trends worth watching


Author: Archetype

Compiled by TechFlow

1. Agent-to-Agent Interaction

The inherent transparency and composability of blockchains make them an ideal substrate for seamless agent-to-agent interaction, in which agents developed by different organizations for different purposes collaborate to complete tasks. There have already been some exciting early experiments, such as agent-to-agent transfers and joint token launches. We look forward to agent-to-agent interaction expanding further, both by creating new application scenarios, such as agent-driven social platforms, and by improving existing enterprise workflows such as platform authentication, micropayments, and cross-platform workflow integration. - Danny, Katie, Aadharsh, Dmitriy

aethernet and clanker jointly launching a token on Warpcast
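To make the idea of agent-to-agent transfers concrete, below is a minimal sketch of one agent paying another for a completed task, written against ethers.js v6. The RPC endpoint, the private-key environment variable, the recipient address, and the memo format are all placeholders invented for illustration; a real deployment would discover counterparties through a registry or social protocol and settle over whatever payment rails the agents share.

```typescript
// Minimal sketch: one agent paying another on-chain (ethers.js v6).
// RPC_URL, AGENT_A_KEY, and the recipient address are placeholders, not real values.
import { ethers } from "ethers";

const provider = new ethers.JsonRpcProvider(process.env.RPC_URL);
const agentA = new ethers.Wallet(process.env.AGENT_A_KEY!, provider);

// Agent B's address would normally be discovered via an on-chain registry
// or a social protocol such as Farcaster; hard-coded here for illustration.
const AGENT_B_ADDRESS = "0x0000000000000000000000000000000000000000";

async function payForService(amountEth: string, memo: string) {
  // The memo is attached as calldata so the receiving agent can match
  // the payment to the task it performed.
  const tx = await agentA.sendTransaction({
    to: AGENT_B_ADDRESS,
    value: ethers.parseEther(amountEth),
    data: ethers.hexlify(ethers.toUtf8Bytes(memo)),
  });
  await tx.wait();
  console.log(`paid ${amountEth} ETH for "${memo}" in tx ${tx.hash}`);
}

payForService("0.001", "summarize-thread:42").catch(console.error);
```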

2. Decentralized Agentic Organizations

Large-scale multi-agent collaboration is another exciting research direction: how can multi-agent systems coordinate to complete tasks, solve problems, and even govern protocols and systems? In "The Promise and Challenges of Crypto + AI Applications", published in early 2024, Vitalik proposed using AI agents in prediction markets and adjudication, arguing that at scale, multi-agent systems have great potential for "truth" discovery and autonomous governance. We look forward to seeing how far the capabilities of these multi-agent systems can be pushed, and what more "collective intelligence" can demonstrate in experiments.

Collaboration between agents and humans is also worth exploring: how communities can interact around agents, or how agents can organize humans into collective action. We hope to see more agent experiments aimed at large-scale human coordination. Such experiments will need verification mechanisms, especially when tasks are completed off-chain, but the exploration may lead to some unexpected and wonderful results. - Katie, Dmitriy, Ash
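As a toy illustration of the adjudication idea Vitalik describes, the sketch below aggregates verdicts from several independent agents using a simple stake-weighted majority. The agent names, stakes, and threshold are invented; a real system would also need the verification mechanisms mentioned above for off-chain work.

```typescript
// Toy sketch of multi-agent adjudication: independent agents each return a
// verdict on a claim, and a simple stake-weighted majority decides the outcome.
// The agents and weights here are invented for illustration only.

type Verdict = { agent: string; vote: boolean; stake: number };

function adjudicate(verdicts: Verdict[]): { outcome: boolean; support: number } {
  const totalStake = verdicts.reduce((sum, v) => sum + v.stake, 0);
  const yesStake = verdicts
    .filter((v) => v.vote)
    .reduce((sum, v) => sum + v.stake, 0);
  const support = yesStake / totalStake; // fraction of stake voting "true"
  return { outcome: support > 0.5, support };
}

const verdicts: Verdict[] = [
  { agent: "news-checker", vote: true, stake: 40 },
  { agent: "chain-analyst", vote: true, stake: 35 },
  { agent: "skeptic", vote: false, stake: 25 },
];

console.log(adjudicate(verdicts)); // { outcome: true, support: 0.75 }
```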

3. Agentic Multimedia Entertainment

The concept of digital virtual personas has existed for years. Hatsune Miku (2007) has sold out concerts in 20,000-seat venues; Lil Miquela (2016) has over 2 million followers on Instagram. More recent examples include the AI virtual streamer Neuro-sama (2022), with over 600,000 subscribers on Twitch, and the anonymous K-pop boy band PLAVE (2023), which has accumulated over 300 million views on YouTube in less than two years. With advances in AI and the use of blockchains for payments, value transfer, and open data platforms, these agents are expected to become more autonomous and may open up a new mainstream entertainment category by 2025. - Katie, Dmitriy

Clockwise from top left: Hatsune Miku, Luna of Virtuals, Lil Miquela, and PLAVE

4. Generative/Agentic Content Marketing

In some cases the agent itself is the product, while in others the agent complements a product. In the attention economy, a steady stream of engaging content is key to the success of any idea, product, or company. Generative/agentic content gives teams a powerful tool for a scalable, 24/7 content-creation pipeline. The field has accelerated alongside the debate over what separates memecoins from agents: agents are a powerful way for memecoins to go viral, even when they are not yet fully "agentic".

Gaming is another example: the industry is increasingly pursuing dynamism to maintain user engagement. One classic approach is to encourage user-generated content, and purely generative content (in-game items, NPCs, or even fully generated levels) may be the next stage of this trend. We are curious to see how far agent capabilities will push the boundaries of content distribution and user interaction by 2025. - Katie

5. Next-Gen Art Tools/Platforms

In 2024, we launched IN CONVERSATION WITH, an interview series with crypto artists working in music, visual art, design, and curation. This year's interviews made one trend clear to me: artists interested in crypto are often also drawn to emerging technologies more broadly and want to embed them more deeply in their creative practice, for example AR/VR objects, code-generated art, and live coding.

The pairing of generative art and blockchain technology is not new, and it has made blockchains a natural home for AI art as well. These art forms have long been difficult to showcase and present on traditional platforms. Art Blocks offered an early exploration of how digital art could be displayed, stored, monetized, and preserved on-chain, greatly improving the experience for both artists and audiences. Beyond that, AI tools now let ordinary people create their own artworks. We look forward to seeing how blockchains can further extend these tools by 2025. - Katie

KC: Given your frustrations and disagreements with crypto culture, what motivates you to keep participating in Web3? What has Web3 brought to your creative practice: experimental exploration, economic returns, or something else?

MM: Web3 has been positive for me personally, and for other artists, in several ways. Personally, the platforms that support publishing generative art have been especially important to my practice. You can upload a JavaScript file, and when someone mints or collects a piece, the code runs in real time, generating a unique work within the system I designed. That live generative process is a core part of my practice. Introducing randomness into the systems I write and build, both conceptually and technically, has profoundly shaped how I think about art. But that process is often hard to convey to audiences when a work is not shown on a platform built for this art form, or is shown in a traditional gallery.

In a gallery, the algorithm might run live on a projection or screen, or a selection of its outputs might be shown in physical form. But for audiences who are not familiar with code as an artistic medium, it is hard to grasp the significance of randomness in the process, which is an important part of the practice of every artist who uses software generatively. When the final presentation is just an image posted to Instagram or a printed physical work, I sometimes find it difficult to emphasize the core idea of "code as a creative medium".

The emergence of Non-Fungible Tokens (NFTs) has excited me, as they not only provide a platform to showcase generative art, but also help popularize the concept of "code as an artistic medium", allowing more people to understand the uniqueness and value of this creative approach.

Excerpt from IN CONVERSATION WITH: Maya Man
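The mint-time generation Maya Man describes typically works by deriving every "random" decision from the token's mint hash, so each collector receives a unique but permanently reproducible piece. The sketch below illustrates that pattern with an invented token hash and a small seeded PRNG; it is a generic illustration, not the API of Art Blocks or any specific platform.

```typescript
// Sketch of deterministic generative art: every "random" choice is derived
// from the token's mint hash, so the same token always re-renders the same piece.
// The hash below is made up for illustration.

const tokenHash =
  "0x1f9840a85d5af5bf1d1762f925bdaddc4201f9841f9840a85d5af5bf1d1762f9";

// Fold the hex hash into a 32-bit seed.
function seedFromHash(hash: string): number {
  let seed = 0;
  for (const ch of hash.slice(2)) {
    seed = (seed * 31 + parseInt(ch, 16)) >>> 0;
  }
  return seed;
}

// mulberry32: a tiny deterministic PRNG returning values in [0, 1).
function mulberry32(seed: number): () => number {
  return () => {
    seed = (seed + 0x6d2b79f5) >>> 0;
    let t = seed;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

const rand = mulberry32(seedFromHash(tokenHash));

// "Features" of the piece, fixed forever by the mint hash.
const palette = ["crimson", "ochre", "teal", "ivory"][Math.floor(rand() * 4)];
const strokeCount = 50 + Math.floor(rand() * 200);

console.log({ palette, strokeCount });
```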

6. Data Markets

Ever since Clive Humby coined the phrase "data is the new oil", companies have hoarded and monetized user data. Users, meanwhile, are increasingly aware that their data is the foundation these tech giants are built on, yet they have almost no control over how it is used and see little of the value it creates. The rapid advance of powerful AI models makes this tension sharper. On one side, the misuse of user data must be addressed; on the other, as ever larger and higher-quality models exhaust the "resource" of public internet data, new data sources become increasingly important.

Decentralized infrastructure offers a wide design space for returning control of data to users. This calls for innovation across many areas: data storage, privacy protection, data quality assessment, value attribution, and monetization mechanisms. On the supply-shortage side, the question is how to use these technical advantages to build competitive solutions, for example creating higher-value data products through better incentive mechanisms and filtering methods. Especially while Web2 AI still dominates, how smart contracts can be combined with traditional service-level agreements (SLAs) is a direction worth exploring in depth.
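As one concrete (and deliberately simplified) reading of "value attribution and monetization", the sketch below records contributions by content hash with a quality weight and splits revenue pro rata. The registry, weights, and contributor names are invented; a production system would keep the ledger on-chain and derive weights from real filtering or evaluation mechanisms.

```typescript
// Minimal sketch of pro-rata value attribution in a data market:
// each contribution is recorded by content hash with a quality weight,
// and revenue from a dataset sale is split in proportion to that weight.
// The scoring and records are invented for illustration.
import { createHash } from "node:crypto";

type Contribution = { contributor: string; dataHash: string; qualityWeight: number };

const ledger: Contribution[] = [];

function register(contributor: string, data: string, qualityWeight: number) {
  const dataHash = createHash("sha256").update(data).digest("hex");
  ledger.push({ contributor, dataHash, qualityWeight });
}

function distribute(revenue: number): Map<string, number> {
  const total = ledger.reduce((s, c) => s + c.qualityWeight, 0);
  const payouts = new Map<string, number>();
  for (const c of ledger) {
    const share = (revenue * c.qualityWeight) / total;
    payouts.set(c.contributor, (payouts.get(c.contributor) ?? 0) + share);
  }
  return payouts;
}

register("alice", "sensor log 2024-11-01", 3);
register("bob", "labeled image batch #7", 5);
console.log(distribute(100)); // Map { 'alice' => 37.5, 'bob' => 62.5 }
```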

7. Decentralized Compute

In the development and deployment of AI, computing power is as crucial as data. Over the past few years, large data centers, with their exclusive access to facilities, energy, and hardware, have dominated the development of deep learning and AI. However, this landscape is gradually being disrupted due to the limitations of physical resources and the advancement of open-source technologies.

The v1 stage of decentralized AI compute looked much like a Web2 GPU cloud, with no clear advantage on either the hardware supply side or the demand side. In v2, we are seeing teams build more complete stacks: orchestration, routing, and pricing systems for high-performance compute, plus proprietary features to attract demand and improve inference efficiency. Some teams focus on optimizing inference routing across heterogeneous hardware through compiler frameworks, while others are building distributed model-training frameworks on top of their compute networks.
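A minimal sketch of the routing-and-pricing layer described above might look like the following: choose the cheapest provider that satisfies a job's memory and latency constraints. The providers, specs, and prices are invented for illustration; real orchestrators weigh many more factors, such as reliability, verification, and locality of model weights.

```typescript
// Toy sketch of the "routing and pricing" layer: pick the cheapest GPU
// provider that satisfies the job's requirements.
// Providers and their specs are invented for illustration.

type Provider = {
  id: string;
  gpu: string;
  vramGb: number;
  pricePerHour: number;   // USD
  regionLatencyMs: number;
};

type JobSpec = { minVramGb: number; maxLatencyMs: number };

function route(providers: Provider[], job: JobSpec): Provider | undefined {
  return providers
    .filter((p) => p.vramGb >= job.minVramGb && p.regionLatencyMs <= job.maxLatencyMs)
    .sort((a, b) => a.pricePerHour - b.pricePerHour)[0];
}

const providers: Provider[] = [
  { id: "dc-a", gpu: "A100", vramGb: 80, pricePerHour: 1.8, regionLatencyMs: 40 },
  { id: "dc-b", gpu: "4090", vramGb: 24, pricePerHour: 0.6, regionLatencyMs: 25 },
  { id: "dc-c", gpu: "H100", vramGb: 80, pricePerHour: 2.9, regionLatencyMs: 90 },
];

// An inference job that needs a large-memory GPU and low latency.
console.log(route(providers, { minVramGb: 80, maxLatencyMs: 60 })); // -> dc-a
```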

In addition, an emerging "AI-Fi" market is turning compute and GPUs into yield-bearing assets, or using on-chain liquidity to give data centers new financing channels. Whether decentralized compute can realize its potential, however, still depends on closing the gap between the idea and actual demand.

8. Compute Accounting Standards

In decentralized high-performance computing (HPC) networks, coordinating heterogeneous compute resources is a major challenge, and the current lack of unified accounting standards makes it harder. AI model outputs vary across model variants, quantization, and the randomness introduced by temperature and sampling hyperparameters; differences in GPU architectures and CUDA versions also change hardware outputs. Together, these factors make accurately accounting for the capacity of models and compute markets across heterogeneous distributed systems an urgent problem.

Partly because these standards are missing, we saw multiple cases this year, in both Web2 and Web3 compute markets, where the performance and quality of models and compute resources were accounted for incorrectly. This has forced users to verify the real performance of AI systems by running their own benchmarks, or to rate-limit their use of compute markets.

The crypto space has always emphasized "verifiability", so we hope that by 2025 the combination of crypto and AI makes system performance more transparent: ordinary users should be able to easily compare the key output characteristics of a model or compute cluster, and to audit and evaluate a system's real performance.
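One simple form of the verifiability we are hoping for is spot-checking: re-run a canonical prompt with greedy decoding and compare a hash of the output against a reference. The sketch below assumes nothing about any particular marketplace's API; the provider call is injected as a plain function.

```typescript
// Sketch of a spot-check for compute marketplaces: re-run a canonical prompt
// with greedy decoding and compare a hash of the output to a reference value.
// The provider call is passed in as a function because real marketplace APIs
// differ; nothing here reflects a specific vendor's interface.
import { createHash } from "node:crypto";

type ProviderCall = (
  prompt: string,
  opts: { temperature: 0; seed: number }
) => Promise<string>;

async function spotCheck(
  callProvider: ProviderCall,
  prompt: string,
  expectedSha256: string
): Promise<boolean> {
  const output = await callProvider(prompt, { temperature: 0, seed: 42 });
  const digest = createHash("sha256").update(output).digest("hex");
  // Even with greedy decoding, quantization, kernel, or CUDA-version differences
  // can change the output, which is exactly the accounting gap described above;
  // a mismatch flags the provider for closer inspection rather than proving fraud.
  return digest === expectedSha256;
}

// Example with a stubbed provider that always returns the same string.
const stub: ProviderCall = async () => "the quick brown fox";
const reference = createHash("sha256").update("the quick brown fox").digest("hex");
spotCheck(stub, "canonical prompt", reference).then((ok) => console.log("match:", ok));
```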

9. Probabilistic Privacy Primitives

Vitalik noted a unique tension in his article "The Promise and Challenges of Crypto + AI Applications": "In cryptography, open-source is the only way to achieve security, but in AI, making models (and even training data) public greatly increases the risk of adversarial machine learning attacks."

Privacy is not a new research direction for blockchains, but the rapid development of AI is accelerating the adoption of privacy-related cryptography. Significant progress was made this year in privacy-enhancing technologies such as zero-knowledge proofs (ZK), fully homomorphic encryption (FHE), trusted execution environments (TEEs), and multi-party computation (MPC), which are being applied in scenarios like computation over private shared state on encrypted data. At the same time, tech giants such as Nvidia and Apple are using proprietary TEEs to enable federated learning and private AI inference while keeping hardware, firmware, and models consistent.

Going forward, we will watch how privacy can be preserved in probabilistic state transitions, and how these technologies enable practical decentralized AI on heterogeneous systems: decentralized private inference, encrypted data storage and access pipelines, and fully autonomous execution environments.

Apple's Apple Intelligence stack and Nvidia's H100 GPU
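Of the primitives listed above, multi-party computation is the easiest to show in miniature. The sketch below implements additive secret sharing over a small prime field: a value is split into random shares so that no single party learns it, yet shares of different values can be summed locally and the result reconstructed. The field size and the use of Math.random are toy choices for readability; real MPC uses large fields and cryptographically secure randomness.

```typescript
// Toy additive secret sharing (one building block of MPC): a value is split
// into random shares modulo a prime, no single share reveals anything, and
// parties can sum shared values without seeing the underlying inputs.

const P = 2147483647n; // a prime field modulus (2^31 - 1)

function share(secret: bigint, parties: number): bigint[] {
  const shares: bigint[] = [];
  let sum = 0n;
  for (let i = 0; i < parties - 1; i++) {
    const r = BigInt(Math.floor(Math.random() * 2 ** 31)) % P;
    shares.push(r);
    sum = (sum + r) % P;
  }
  shares.push((((secret - sum) % P) + P) % P); // last share makes the total correct
  return shares;
}

function reconstruct(shares: bigint[]): bigint {
  return shares.reduce((a, b) => (a + b) % P, 0n);
}

// Two "users" share their private values; each party adds its shares locally.
const a = share(1234n, 3);
const b = share(5678n, 3);
const summedShares = a.map((s, i) => (s + b[i]) % P);

console.log(reconstruct(summedShares)); // 6912n, without any party seeing 1234 or 5678
```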

10. Agentic Intents and Next-Gen User Trading Interfaces

One important application of AI agents is helping users complete on-chain transactions autonomously. Over the past 12-16 months, however, the definitions of terms like "agentic intent", "agentic behavior", and "solvers" have remained fuzzy, and the line between them and traditional "bot" development is not clearly drawn.

In the coming year, we expect more sophisticated language systems, combined with other data types and neural network architectures, to push this field forward. Will agents keep using existing on-chain systems to transact, or will they develop entirely new tools and methods? Will large language models (LLMs) remain the core of these systems, or will they be replaced by other technologies? At the interface level, will users transact through natural language? Will the long-standing "wallets as browsers" thesis finally play out? These are all questions worth exploring.
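For a sense of what separates an "intent" from a raw transaction, the sketch below declares only the outcome the user wants and lets a naive solver pick the best quote that satisfies it. The intent fields, venues, and quotes are invented and do not follow any existing intent standard.

```typescript
// Sketch of a declarative trading intent: the user states the outcome they want,
// and a solver (here, a naive one) searches known venues for a route that satisfies it.

type Intent = {
  sellToken: string;
  buyToken: string;
  sellAmount: number;
  minBuyAmount: number; // the user's worst acceptable price
  deadline: number;     // unix timestamp
};

type Quote = { venue: string; buyAmount: number };

function solve(intent: Intent, quotes: Quote[]): Quote | null {
  const viable = quotes
    .filter((q) => q.buyAmount >= intent.minBuyAmount)
    .sort((a, b) => b.buyAmount - a.buyAmount);
  return viable[0] ?? null; // best execution, or no fill if constraints can't be met
}

const intent: Intent = {
  sellToken: "ETH",
  buyToken: "USDC",
  sellAmount: 1,
  minBuyAmount: 3400,
  deadline: Date.now() / 1000 + 300,
};

const quotes: Quote[] = [
  { venue: "dex-a", buyAmount: 3390 },
  { venue: "dex-b", buyAmount: 3425 },
];

console.log(solve(intent, quotes)); // { venue: "dex-b", buyAmount: 3425 }
```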
