
Another AI+Crypto project with a high-profile VC lineup has arrived: @withvana. In short, Vana aims to tokenize users' private data, building a network that allocates and incentivizes ownership, control, and future economic benefit of user data, addressing the data scarcity problem in training large AI models.

AI development revolves around three core challenges: compute, algorithms, and data. In the compute direction there are io and Aethir; in the algorithm direction, Bittensor and SaharaAI. Vana has locked onto "data", which is seen as the fuel of AI. With a large enough data source, AI can apply multi-modal learning, continual learning, self-supervised learning, and other methods to expand the application scenarios and scope of large models.

At the current stage, training large AI models faces challenges such as privacy and unbalanced data sources: generic web text is in serious oversupply, while high-quality data from specific domains (medical, legal) and real-time data (news, technology) are severely scarce. How can we break through traditional industry data silos, reduce data-labeling costs, and effectively solve hard problems like privacy?

A preliminary read of Vana's technical documentation suggests they are trying to build:

1) A data liquidity network (Data Liquidity layer), where data can be used as flexibly on the Vana network as tokens are in the DeFi system;

2) A data portability layer, essentially an ecosystem in which data providers, developers, and platforms collaborate to keep data flowing in an orderly way, letting developers consume data directly through tool interfaces while high-quality data contributions are recorded and incentivized (see the first sketch below);

3) Connectome, a "neural network system" for data: a distributed ledger that records real-time data transactions across the ecosystem, plus a PoS consensus mechanism that keeps the DLP liquidity layer running, with compatibility for external EVM environments (see the second sketch below).

Together these form the core infrastructure on Vana's mainnet for tackling AI's data problem, and the key to turning "data" into quantifiable value and traceable liquidity.
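To make the contribution-and-incentive idea in points 1 and 2 concrete, here is a minimal Python sketch of how a data liquidity pool might record hashed contributions and split an epoch reward by attested quality. All names here (DataLiquidityPool, submit, distribute_rewards, quality_score) are illustrative assumptions for this article, not Vana's actual API or reward formula.

```python
# Hypothetical sketch of a Data Liquidity Pool (DLP). Names and reward
# logic are illustrative assumptions, not Vana's actual implementation.
from dataclasses import dataclass, field
import hashlib

@dataclass
class Contribution:
    contributor: str
    data_hash: str          # only a hash is recorded; raw data stays off-chain
    quality_score: float    # assumed to come from validator attestations (0..1)

@dataclass
class DataLiquidityPool:
    reward_per_epoch: float
    contributions: list[Contribution] = field(default_factory=list)
    balances: dict[str, float] = field(default_factory=dict)

    def submit(self, contributor: str, raw_data: bytes, quality_score: float) -> str:
        """Record a contribution by hash, keeping the raw data private."""
        data_hash = hashlib.sha256(raw_data).hexdigest()
        self.contributions.append(Contribution(contributor, data_hash, quality_score))
        return data_hash

    def distribute_rewards(self) -> None:
        """Split the epoch reward in proportion to attested quality."""
        total = sum(c.quality_score for c in self.contributions)
        if total == 0:
            return
        for c in self.contributions:
            share = self.reward_per_epoch * c.quality_score / total
            self.balances[c.contributor] = self.balances.get(c.contributor, 0.0) + share
        self.contributions.clear()

pool = DataLiquidityPool(reward_per_epoch=100.0)
pool.submit("alice", b"anonymized domain-specific records", quality_score=0.9)
pool.submit("bob", b"generic web text", quality_score=0.3)
pool.distribute_rewards()
print(pool.balances)  # {'alice': 75.0, 'bob': 25.0}
```

The point of the hash-only design is that ownership and contribution can be proven and rewarded on-chain while the sensitive data itself never leaves the provider.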
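For point 3, the following sketch shows the generic mechanism behind stake-weighted proposer selection, the basic idea of the PoS consensus the article attributes to Connectome. This is a textbook illustration under assumed names, not Vana's actual consensus code, and the seed stands in for whatever randomness beacon a real chain would use.

```python
# Minimal sketch of stake-weighted proposer selection in a PoS system.
# All names are illustrative; this is not Vana's consensus implementation.
import random

def select_proposer(stakes: dict[str, float], seed: int) -> str:
    """Pick a block proposer with probability proportional to stake."""
    rng = random.Random(seed)  # placeholder for a verifiable randomness beacon
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

stakes = {"val-a": 400.0, "val-b": 350.0, "val-c": 250.0}
for epoch in range(3):
    print(epoch, select_proposer(stakes, seed=epoch))
```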
