Podcast Ep. 193 - 0G's Challenge to Break the Wall of Centralization

Hello. This is TokenPost, delivering the future of digital assets. Today, we're going to delve deeper into 0G Labs, which is building a new infrastructure layer for decentralized AI, or DAI. Based on the Messari Research data we received, we'll clearly analyze what 0G is, what problems it aims to solve, and why its technology is so important now. Hello. 0G is a Layer 1 network for AI model execution and data processing. It presents a very interesting approach to solving the problem of the overly centralized nature of existing AI infrastructure. Today, we'll take a look at 0G's core technology, ecosystem, and token model. This talk will be especially useful for those interested in the combination of AI and blockchain. Alright. Now, let's get straight to the point. Let's start by explaining what 0G is trying to do.

01:00

The data says it's an AI-centric Layer 1 network. What does that mean specifically? It's a basic blockchain designed to handle the entire process of training AI models, performing inference, and storing and publishing the massive datasets required for this, all within a single, verifiable network. Simply put, the goal is to provide infrastructure that enables AI development and operation without relying on any specific company's servers. Ah, DAI, building infrastructure for decentralized AI. So, what about the existing AI environment? What aspects of the centralized systems largely dominated by big tech companies does 0G aim to change? That seems to be the key point. Yes, that's correct. We're trying to solve three major problems.

01:55

First, as you know, AI models and datasets are so large that moving them is expensive and somewhat inefficient. Therefore, 0G aims to provide a way to move this large amount of data more cheaply and efficiently. Second, the computing resources that run AI models need to be available reliably and immediately, but currently, they're tied to specific companies. 0G aims to provide this on a decentralized network. And finally, this entire process must be transparent and verifiable. Currently, centralized systems make it difficult for external parties to know how the results were obtained or what data was used. 0G aims to record this process on-chain, returning verifiability, accessibility, and control to users. So, this is the vision behind 0G. So, it's about directly addressing the cost, accessibility, and transparency issues of centralized AI infrastructure.

02:55

So, what specific services does 0G actually provide? Looking at the data, it seems to be broadly divided into three: compute, storage, and data availability. Yes, that's correct. Let's look at them one by one. First, 0G Compute is a marketplace built around pre-trained AI models, such as the popular Llama 3 70B model, where you can perform tasks like inference, deriving results, and fine-tuning. When a user requests fine-tuning of a Llama 3 model with their own data, computing resource providers on the network execute the request, cryptographically sign the results, and return them. Costs are automatically calculated based on usage. The cryptographically signed results seem important here. Does this connect to the verifiability you mentioned? And how does this compare to existing cloud services? You've spotted it exactly.
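To make the flow just described more concrete, here is a minimal Python sketch of the request, sign, and verify loop. Everything in it is illustrative rather than 0G's actual API: the InferenceRequest fields, the mock result, and the HMAC-based "signature" are placeholders, and a real network would use asymmetric signatures tied to the provider's on-chain key.

```python
# Minimal sketch of the compute-marketplace flow: submit a request, have a
# provider execute and sign it, verify the signature before paying.
# HMAC with a shared secret stands in for a real asymmetric signature scheme.
import hashlib
import hmac
import json
from dataclasses import dataclass

@dataclass
class InferenceRequest:
    model: str          # e.g. "llama-3-70b" (hypothetical identifier)
    prompt: str
    max_tokens: int

def provider_execute(req: InferenceRequest, provider_secret: bytes) -> dict:
    """Pretend to run the model, then sign a digest of (request, result)."""
    result_text = f"[mock completion for: {req.prompt[:20]}...]"
    payload = json.dumps({"req": req.__dict__, "result": result_text}, sort_keys=True)
    signature = hmac.new(provider_secret, payload.encode(), hashlib.sha256).hexdigest()
    return {"result": result_text, "payload": payload, "signature": signature}

def user_verify(response: dict, provider_secret: bytes) -> bool:
    """Recompute the signature over the returned payload before accepting it."""
    expected = hmac.new(provider_secret, response["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response["signature"])

secret = b"provider-demo-key"
resp = provider_execute(InferenceRequest("llama-3-70b", "Explain Merkle proofs", 128), secret)
assert user_verify(resp, secret)  # only release payment if the signature checks out
```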

03:55

The cryptographic signature proves that the result was produced by the corresponding computing provider. But more importantly, the system is designed to verify and reproduce not only the result, but also the execution process itself. So, if the same inputs and model are used, the same results should be produced by different providers, and this can be verified on-chain. While existing cloud computing often operates as a black box, 0G ensures this transparency, allowing users to compare prices and speeds across providers and easily switch to better options if necessary. This goes beyond simple cost savings to a matter of trust and flexibility. Yes. Next is 0G Storage. This is a distributed storage layer that stores and serves the AI model itself, the dataset used for training, checkpoints (intermediate results), and final outputs. It's designed so that data access remains intact even if a few specific nodes go down.

04:55

It also uses techniques like Merkle proofs to efficiently prove that data hasn't been tampered with. Merkle proofs. Could you explain them a little more simply? Oh, yes. Merkle proofs are a way to prove, mathematically and very efficiently, that a specific portion of a large dataset is the original, unaltered data, without having to verify the entire dataset piece by piece. It's like checking that a specific page in a large document hasn't been tampered with, instead of copying and comparing the entire document. Think of it as verifying a unique digital fingerprint for that page. This is what ensures data integrity. Developers can easily upload and download data through standard APIs, and data replication and recovery are handled behind the scenes. And finally, there's the 0G data availability (DA) layer.
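The "page fingerprint" analogy maps directly to a Merkle tree: one chunk can be checked against a single root hash using only a handful of sibling hashes. The sketch below is a generic illustration of that idea, not 0G Storage's actual proof format.

```python
# Minimal Merkle tree: prove one chunk belongs to a dataset using O(log n)
# hashes, without re-reading the whole dataset.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    """Return sibling hashes (hash, sibling_is_on_the_right) from leaf to root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling > index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    node = h(leaf)
    for sibling, is_right in proof:
        node = h(node + sibling) if is_right else h(sibling + node)
    return node == root

chunks = [f"chunk-{i}".encode() for i in range(8)]        # stand-in for dataset pages
root = merkle_root(chunks)
assert verify(chunks[3], merkle_proof(chunks, 3), root)   # chunk 3 is untampered
```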

05:54

This technology efficiently publishes large amounts of data to a blockchain network, allowing anyone to verify that the data was actually there when needed. This is especially important for scaling solutions like rollups. These solutions need to store their transaction data somewhere, and with 0G DA, you can sample only a portion of the data, without downloading all of it, to confirm it's available. This is thanks to technologies like erasure coding. It integrates with rollup frameworks like OP Stack and Arbitrum Nitro, enabling them to securely and inexpensively publish their processed data to 0G DA. Oh, and Compute, Storage, and DA seem like independent services, yet they're interconnected. The modular design of these services allows users to choose only what they need, or choose a different provider for each. This mix-and-match approach sounds quite interesting. Yes, that's right.
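Here is a deliberately simplified sketch of the sampling idea. Real DA layers use Reed-Solomon-style erasure codes and cryptographic commitments; a single XOR parity chunk and a few random sample requests are used here only to illustrate the principle of checking availability without downloading everything.

```python
# Toy data-availability check: encode data with one parity chunk, then have a
# light client sample a few random chunks instead of fetching the full blob.
import random

def encode(chunks: list[bytes]) -> list[bytes]:
    """Append one XOR parity chunk so any single missing chunk can be rebuilt."""
    parity = bytes(len(chunks[0]))
    for c in chunks:
        parity = bytes(a ^ b for a, b in zip(parity, c))
    return chunks + [parity]

def sample_availability(serve, total: int, samples: int = 4) -> bool:
    """Ask for a few random chunks; a withheld chunk fails the check."""
    for i in random.sample(range(total), samples):
        if serve(i) is None:
            return False
    return True

data = [f"blob-{i}".zfill(8).encode() for i in range(7)]
stored = encode(data)                                   # 7 data chunks + 1 parity chunk
assert sample_availability(lambda i: stored[i], total=len(stored))
```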

06:54

That's one of 0G's major advantages. For example, if you have your own computing resources but need to store large amounts of data, you can use 0G Storage alone. Or, you could use 0G Compute from provider A and Storage from provider B, and so on. Because each service interface is standardized at the protocol level, such combinations and substitutions are possible without compatibility issues. From a developer's perspective, you can gradually introduce 0G infrastructure, starting from the parts you need, without the burden of replacing the entire existing system all at once. This provides flexibility. Yes, these modular services seem quite powerful. So, what is the underlying infrastructure that ensures these services operate reliably and transparently? As you mentioned, managing and coordinating everything on-chain requires a robust framework. Let's take a closer look at 0G Chain and 0G Identity.
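One way to picture the mix-and-match idea is as code written against standardized interfaces, so that a compute backend from one provider can be combined with a storage backend from another. The class and method names below are illustrative, not 0G's actual SDK.

```python
# Sketch of modular, interface-driven services: the pipeline depends only on
# the Compute and Storage interfaces, not on who provides them.
from typing import Protocol

class Storage(Protocol):
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class Compute(Protocol):
    def infer(self, model: str, prompt: str) -> str: ...

class ProviderAStorage:
    """In-memory stand-in for a storage provider."""
    def __init__(self):
        self._db: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._db[key] = data
    def get(self, key: str) -> bytes:
        return self._db[key]

class ProviderBCompute:
    """Stand-in for a compute provider."""
    def infer(self, model: str, prompt: str) -> str:
        return f"[{model} answer to: {prompt}]"

def pipeline(compute: Compute, storage: Storage, prompt: str) -> str:
    answer = compute.infer("llama-3-70b", prompt)
    storage.put("latest-answer", answer.encode())
    return storage.get("latest-answer").decode()

# Swapping either provider requires no change to the pipeline itself.
print(pipeline(ProviderBCompute(), ProviderAStorage(), "What is DA sampling?"))
```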

07:54

The 0G Chain is an EVM-compatible Layer 1 blockchain. All task requests, such as "Execute this task using the Llama 3 model," references to where the required data lives in 0G Storage, verification of the result receipt upon completion of the task, and ultimately payment to the computing or storage provider in 0G tokens: all of these processes are recorded and coordinated on the 0G chain. Because the user's task intent is recorded in the form of a smart contract, and every step is transparently tracked, it serves as the core foundation for making the entire workflow auditable and repeatable. 0G Identity is a crucial element for improving usability. Blockchain addresses are long, complex strings that typically start with 0x, making them difficult to remember and write. 0G Identity is a system that converts these into human-readable, understandable names, like alice.0g.
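To show what "recorded and coordinated on-chain" might look like in practice, here is a small sketch of a task lifecycle as an append-only event log. The event kinds, field names, and the 0g-storage:// reference format are all hypothetical, not the actual 0G contract schema.

```python
# Sketch of on-chain coordination: task intent, result receipt, and payment
# appended to a shared log so the workflow can be audited and replayed.
from dataclasses import dataclass, field
import time

@dataclass
class Event:
    kind: str          # "task_submitted" | "result_verified" | "payment_settled"
    payload: dict
    timestamp: float = field(default_factory=time.time)

chain_log: list[Event] = []   # stand-in for events emitted by a smart contract

chain_log.append(Event("task_submitted", {
    "model": "llama-3-70b",
    "input_uri": "0g-storage://dataset-cid",     # hypothetical reference format
    "requester": "alice.0g",
}))
chain_log.append(Event("result_verified", {
    "result_uri": "0g-storage://result-cid",
    "provider_signature": "0xabc...",
}))
chain_log.append(Event("payment_settled", {"amount_0g": 1.25, "to": "provider-7"}))

for e in chain_log:            # anyone can replay the log to audit the workflow
    print(e.kind, e.payload)
```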

08:51

Think of it as similar to how we remember website addresses as domain names instead of IP addresses. This allows users, as well as increasingly important AI agents, to recognize and interact with each other using identifiable names, making tasks like authorization and payments much easier. So, does this 0G Identity only work on the 0G chain? Or can it be used on other blockchains? For example, I'm wondering if I can use the name alice.0g on Ethereum or other chains. That's a good question. One of the key features of 0G Identity is interoperability across chains. So, the name alice.0g can be recognized and verified not only on the 0G chain but also on other linked blockchains.

09:41

If an AI agent performs some task on Ethereum, stores the result in 0G Storage, and then uses 0G Compute for further analysis, it can use a consistent identifier, alice.0g, throughout the entire process. This can make complex workflows across multiple blockchain ecosystems much smoother. An ID system that works across multiple blockchains is certainly convenient. However, looking at the data, there is also a concept called the AI Alignment Node, which seems a bit unique. Is this different from a typical blockchain validator? You can think of it as a kind of independent monitoring or verification layer that exists separately from the validators. The main role of these nodes is to oversee the health of the entire network.
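The cross-chain resolution just described can be pictured as a name registry that maps one human-readable identifier to addresses on several chains. The registry contents and addresses below are purely illustrative.

```python
# Sketch of cross-chain name resolution: the same name, alice.0g, resolves to
# the registered address on whichever chain the agent is currently acting on.
REGISTRY = {
    "alice.0g": {
        "0g":       "0x1111111111111111111111111111111111111111",  # placeholder
        "ethereum": "0x2222222222222222222222222222222222222222",  # placeholder
    }
}

def resolve(name: str, chain: str) -> str:
    """Look up the address registered for a name on a given chain."""
    try:
        return REGISTRY[name][chain]
    except KeyError:
        raise ValueError(f"{name} has no address registered on {chain}")

# The same identifier works at every step of a multi-chain agent workflow.
print(resolve("alice.0g", "ethereum"))
print(resolve("alice.0g", "0g"))
```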

10:37

For example, if a compute node fails to deliver the promised performance, a storage node loses data, a problem occurs in the DA layer, or, going further, the quality of the output produced by an AI model significantly deteriorates or specific policies, such as prohibitions on generating harmful content, are violated, these nodes monitor and evaluate it. Through this, they play a role in increasing the reliability and stability of the entire network. But what's interesting here is that the person who holds the license for an AI Alignment Node and the person who actually operates the node can be separate. The license holder can operate it directly, but if they lack the technical capability, the operation can be entrusted to a trusted professional operator. The reward for node operation is then distributed to both the license holder and the operator, as sketched below.
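As a rough illustration of that reward sharing, the split could work as in the snippet below. The 70/30 ratio is a made-up placeholder, not a parameter published by 0G.

```python
# Sketch of splitting an AI Alignment Node reward between the license holder
# and the delegated operator (ratio is illustrative only).
def split_node_reward(reward: float, operator_share: float = 0.30) -> dict:
    return {
        "license_holder": reward * (1 - operator_share),
        "operator": reward * operator_share,
    }

print(split_node_reward(100.0))   # {'license_holder': 70.0, 'operator': 30.0}
```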

11:33

This can be seen as an incentive design that encourages active contributions to maintaining network health, rather than passive participation simply by holding a license. The technical aspects seem to be very sophisticated. However, no matter how good the technology is, it is meaningless if an ecosystem of actual users and developers does not exist, right? What efforts is 0G making to expand this ecosystem? I heard that testnet participation was significant. That's right. It seems they are as serious about building the ecosystem as they are about the technology. The 0G Foundation created an Ecosystem Growth Fund worth a total of $88.88 million. They also operate a guild program and an accelerator, actively supporting early-stage projects.

12:28

In particular, they are sowing the seeds of the ecosystem by providing subsidies, service credits, and technical integration support to early teams in fields essential to the DAI ecosystem, such as AI agents, data-related tools, and integration with existing services. The response during the testnet period was also noteworthy. Not only did global corporations such as Google Cloud, AWS, Alibaba Cloud, and Japan's NTT Docomo participate as node operators, but over 100 validators, including specialized infrastructure companies such as Blockdaemon, Figment, and Coinbase Cloud, also took part. This can be seen as an indicator of the high initial market interest in 0G's technology. The fact that stake was already relatively evenly distributed among the top 10 active validators just one week after the mainnet launch is also a positive sign for the initial level of decentralization.

13:25

Are there any specific examples of partnerships? I'm curious about what types of projects are currently being built on top of 0G. Yes, there are some interesting examples being mentioned. For example, they're collaborating with Space ID, a Web3 domain service, to develop an AI agent that operates on 0G Identity. They're also building a system with Beacon Protocol, a decentralized identity and data protocol, that allows users to control and manage AI agents' access to their personal data. These examples show that 0G isn't just providing infrastructure; it's laying the foundation for real-world DAI applications and services. They're clearly putting considerable effort into building the ecosystem. Now, let's take a look at the fuel that powers all of this: the 0G token.

14:21

When was it released, and what exactly is it used for? The tokenomics design will be important, too. Yes. First, it's used for network service fees. When using the 0G Compute, 0G Storage, and 0G Data Availability (DA) services I explained earlier, users pay with 0G tokens. The fees vary depending on the type of service. For example, compute fees are calculated based on the AI model used, the number of tokens processed, and the GPUs and time consumed, while storage fees are calculated based on the amount of data stored and the storage duration. Second, there are gas fees. All transactions occurring on the 0G chain, such as sending tokens or executing smart contracts, require fees paid in 0G tokens. These fees are used to reward the validators who maintain the network and process transactions.
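The usage-based pricing just described can be illustrated with a couple of simple formulas. All of the rates below are made-up placeholders, not actual 0G prices; only the billing dimensions (tokens processed, GPU time, stored gigabytes, and duration) follow the description above.

```python
# Sketch of usage-based fees: compute billed per processed token and GPU-hour,
# storage billed per GB-month. Rates are hypothetical placeholders.
COMPUTE_RATE_PER_1K_TOKENS = 0.002    # hypothetical, in 0G tokens
GPU_RATE_PER_HOUR = 0.50              # hypothetical
STORAGE_RATE_PER_GB_MONTH = 0.01      # hypothetical

def compute_fee(tokens_processed: int, gpu_hours: float) -> float:
    return (tokens_processed / 1000) * COMPUTE_RATE_PER_1K_TOKENS + gpu_hours * GPU_RATE_PER_HOUR

def storage_fee(gigabytes: float, months: float) -> float:
    return gigabytes * months * STORAGE_RATE_PER_GB_MONTH

print(compute_fee(tokens_processed=250_000, gpu_hours=2))   # an inference / fine-tuning job
print(storage_fee(gigabytes=140, months=6))                 # keeping a model checkpoint
```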

15:20

Beyond network service payments, this creates a fundamental demand for 0G tokens. Third, there's staking. To strengthen the security of the 0G network, token holders can stake their 0G tokens by delegating them to validators or by directly operating validator nodes. In return, staking participants receive a portion of transaction fees and newly issued tokens, known as block rewards. Conversely, if a validator violates network rules or misbehaves, a portion of its staked tokens is deducted in a penalty known as slashing, which encourages honest behavior. The service fees, gas fees, and staking uses of the token are all very clear. But how is token distribution handled? There are concerns that if initial investors' or team members' allocations unlock all at once, it could affect the price. I'm curious about how this is designed. Yes, token distribution and vesting.
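The staking mechanics described above amount to pro-rata reward sharing plus a penalty on misbehavior. The sketch below illustrates that logic; the delegation amounts and the 5% slashing fraction are illustrative, not 0G parameters.

```python
# Sketch of staking economics: delegators back a validator, block rewards are
# shared pro rata, and misbehavior burns a slice of every delegated stake.
stakes = {"alice.0g": 1_000.0, "bob.0g": 3_000.0}   # delegated 0G tokens (illustrative)

def distribute_rewards(stakes: dict, block_reward: float) -> dict:
    total = sum(stakes.values())
    return {who: block_reward * amount / total for who, amount in stakes.items()}

def slash(stakes: dict, fraction: float = 0.05) -> dict:
    """Penalize every delegator of a misbehaving validator by the same fraction."""
    return {who: amount * (1 - fraction) for who, amount in stakes.items()}

print(distribute_rewards(stakes, block_reward=10.0))  # {'alice.0g': 2.5, 'bob.0g': 7.5}
print(slash(stakes))                                   # 5% deducted from each stake
```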

16:17

So, the unlock schedule is a really important part of tokenomics. Of the total supply of 1 billion tokens, 56% was allocated to the community and ecosystem. Specifically, the largest allocation was the ecosystem growth fund, at 28%; 15% was allocated to AI Alignment Node operation rewards, and 13% to community rewards, including airdrops. The remaining 44% went to the project's core team, early contributors, and advisors, and to the initial investors referred to as "backers" in the materials, with the backers taking 22%. However, the key point to note here is the circulating supply at the token generation event (TGE), when the tokens were first released to the market. At the TGE, only 21.3% of the total supply was unlocked, and this amount came entirely from the community and ecosystem allocation. The core team and early investors account for 44% of the total.

17:15

This portion was not released at all at the time of the TGE and is completely locked for 12 months. After that cliff, it is released in equal monthly installments over 36 months, or three years, under a linear vesting schedule. This is a relatively conservative and stable design that mitigates sudden selling pressure in the early market and contributes to the project's long-term success. The remaining portion of the community allocation that wasn't released at the TGE will be gradually distributed over 24 to 36 months. It's impressive that they've limited the initial circulation and established a long-term vesting schedule to ensure stability. So, summarizing what we've discussed so far, what would you say is the ultimate significance of 0G in the DAI space? And what do you see as the biggest challenges to its success?
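Putting the quoted figures together, the team and backer allocation unlocks roughly as in the sketch below: a 12-month cliff, then 36 months of equal monthly releases. The function is a simple arithmetic illustration of the schedule described in the episode, not an official emission contract.

```python
# Sketch of the quoted unlock schedule: 44% of a 1B supply, fully locked for
# 12 months, then vesting linearly over the following 36 months.
TOTAL_SUPPLY = 1_000_000_000
TEAM_AND_BACKERS = 0.44 * TOTAL_SUPPLY      # 440M tokens
CLIFF_MONTHS, VESTING_MONTHS = 12, 36

def unlocked_team_tokens(months_after_tge: int) -> float:
    if months_after_tge <= CLIFF_MONTHS:
        return 0.0
    vested_months = min(months_after_tge - CLIFF_MONTHS, VESTING_MONTHS)
    return TEAM_AND_BACKERS * vested_months / VESTING_MONTHS

for m in (6, 12, 24, 48):
    print(m, f"{unlocked_team_tokens(m):,.0f}")
# 6 -> 0, 12 -> 0, 24 -> ~146.7M, 48 -> 440M (fully vested)
```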

18:15

I see this as a truly ambitious and timely attempt to decentralize the three key elements essential for AI development and operation: computing power, data storage, and data availability. As highlighted in the Messari report, the current AI infrastructure is overly concentrated in the hands of a few large companies, leading to growing concerns about cost, censorship, data privacy, and the lack of transparency in results. 0G is proposing a technological solution to these problems. If successfully implemented, it has the potential to significantly improve the transparency, reproducibility, and, most importantly, accessibility of the AI development and execution workflow. In particular, I believe that the ability for anyone to verify model execution results, clearly trace the source of the data used, and increase cost efficiency through competition among various service providers will contribute significantly to the DAI ecosystem.

19:14

However, clear challenges lie between this idealistic vision and realistic market adoption. The biggest hurdle, of course, is securing actual usage. The key is growing both sides of the market in a balanced way: the supplier ecosystem, which provides resources like GPUs and storage, and the demand ecosystem of developers and users who run AI models and build services on top of 0G. No matter how advanced the technology is, it's useless if no one uses it. Furthermore, the project must survive competition from established centralized cloud giants like AWS and Google Cloud, which possess massive capital, established infrastructure, and, most importantly, convenience. Success will hinge on how clearly it can demonstrate its value to users and provide compelling incentives for them to change their existing workflows.

20:09

Beyond technical completeness, there remains an arduous process of achieving actual market adoption. While there's certainly potential for DAI to establish itself as core infrastructure within this massive convergence of cryptocurrency and AI, there are also clear, realistic challenges. It will be worth keeping an eye on how 0G balances its technological vision with market demands and continues to evolve. This concludes our in-depth analysis of 0G Labs. It was a compelling case study demonstrating that the combination of AI and blockchain technology can go beyond mere experimentation and lead to real-world infrastructure change. We hope this provides an opportunity to reflect more deeply on the future direction of the technology.

20:58

It will be fascinating to see how 0G's distinctive initiatives, such as its modular approach, result verification mechanism, and AI Alignment Nodes, will impact the overall DAI ecosystem. Especially as concerns about the reliability of AI-generated results and data sovereignty continue to grow, I believe the importance of infrastructure like 0G, which aims for a transparent and decentralized structure, will only increase. Listeners, which of the various 0G components discussed today do you think will have the greatest impact on the future of DAI? For example, which of the compute, storage, and data availability services will be the first to replace or complement existing systems? Or will 0G Identity truly revolutionize the way AI agents interact?

21:58

It would be a meaningful time to ask yourself these questions and imagine the changes this technology will bring. Did you find today's story interesting? I'll return with a more in-depth analysis in the next story.


Copyright © TokenPost. Unauthorized reproduction and redistribution prohibited.

#TokenPost #Podcast #Cryptocurrency #Report #Analysis #0G #Blockchain #MarketOutlook #Research #Web3 #DigitalAssets
