The essence of the token economy is to measure, price, and trade the intelligent output of AI like industrial products.
By Ge Yao, China Securities Journal
“Tokens are the new commodities.” At NVIDIA’s 2026 Global Developer Conference (GTC), NVIDIA founder and CEO Jensen Huang first proposed the concept of token economics.
Jensen Huang proposed a formula: Revenue = Tokens per watt × Available gigawatts. He explained that data centers have now become "token factories" operating around the clock, taking in electricity and data and outputting tokens. A factory's revenue depends on the product of the efficiency and the scale of token production.
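The formula above can be made concrete with a back-of-the-envelope calculation. All figures below (efficiency, power, and price per million tokens) are assumed purely for illustration and do not come from the article:

```python
# Hypothetical illustration of Huang's revenue formula:
# revenue scales with (tokens per watt) x (available gigawatts).
# Every number here is an assumption, not a reported figure.

TOKENS_PER_JOULE = 1.0          # assumed efficiency: 1 token per watt-second
GIGAWATTS = 1.0                 # assumed data-center power budget
PRICE_PER_MILLION_TOKENS = 0.5  # assumed selling price in dollars

watts = GIGAWATTS * 1e9
tokens_per_second = TOKENS_PER_JOULE * watts
tokens_per_day = tokens_per_second * 86_400
revenue_per_day = tokens_per_day / 1e6 * PRICE_PER_MILLION_TOKENS

print(f"{tokens_per_day:.3e} tokens/day")   # 8.640e+13 tokens/day
print(f"${revenue_per_day:,.0f}/day")       # $43,200,000/day
```

Doubling either factor, efficiency (tokens per watt) or scale (gigawatts), doubles the output, which is exactly the "efficiency times scale" logic of the formula.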
Liu Liehong, director of China's National Data Administration, recently stated that by March of this year, China's daily average number of token calls had exceeded 140 trillion, more than 1,000 times the 100 billion at the beginning of 2024.
The "meta-economy" is giving rise to a new industrial chain.
What is the token economy?
A token is the basic unit a large model uses to process information. When a user asks an AI model a question, the model first breaks the user's input down into tokens, then processes them, and finally reassembles the resulting tokens into a reply. Every token generated consumes GPU computing power in the data center, and with it electricity.
A token is therefore a natural unit of measurement. Large model vendors charge for their APIs by the token, and cloud service providers price their computing power by the token. Tokens are to AI what kilowatt-hours are to electricity.
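The per-token billing logic can be sketched in a few lines. Note this is a toy: real models count tokens with subword tokenizers such as BPE, not whitespace splitting, and the prices here are assumed, not any vendor's actual rates:

```python
# Toy sketch of per-token API billing. Real tokenizers (e.g. BPE)
# produce different counts; prices are assumed for illustration.

def count_tokens(text: str) -> int:
    """Naive stand-in for a tokenizer: one token per whitespace-split word."""
    return len(text.split())

def api_cost(prompt: str, completion: str,
             in_price: float = 0.5, out_price: float = 1.5) -> float:
    """Bill input and output tokens at (assumed) dollars per million tokens."""
    n_in, n_out = count_tokens(prompt), count_tokens(completion)
    return (n_in * in_price + n_out * out_price) / 1e6

cost = api_cost("What is a token?", "A token is the basic unit of model I/O.")
print(f"${cost:.8f}")
```

The structure mirrors how token metering actually works: input and output tokens are counted separately and billed at different rates, just as electricity meters bill by the kilowatt-hour.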
However, for a considerable period, tokens were merely a cost concept. In 2023 and 2024, competition among models centered on parameter counts and training data volume; tokens were regarded as a cost, and no one treated them as a "product".
The change came after AI entered the inference stage. Over the past two years, AI has been widely deployed in commercial scenarios, and every user conversation and task execution continuously consumes tokens. Under the prevailing charging model, many AI companies bill users per token: the more tokens consumed, the more they sell. At this point, tokens become a commodity that can be mass-produced, priced in tiers, and traded at scale.
At GTC 2026, Jensen Huang first proposed the concept of the token economy, saying, "Tokens are the new commodities." In his description, data centers are token factories operating around the clock, with data and electricity as raw materials and tokens as products.
He proposed a new metric, "tokens per watt," which he believes will measure the future revenue-generating capacity of data centers: "within a fixed power limit, whoever has the highest token throughput per watt will have the lowest production cost." NVIDIA's technological iterations have always revolved around token production efficiency.
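Why higher tokens per watt means lower production cost follows directly from the arithmetic: at a fixed power budget, the electricity bill is fixed, so cost per token is inversely proportional to efficiency. The power budget, efficiencies, and electricity price below are assumed for illustration:

```python
# At a fixed power budget, electricity cost per token is inversely
# proportional to tokens-per-watt. All numbers are assumptions.

POWER_W = 1e6          # fixed power budget: 1 MW
PRICE_PER_KWH = 0.08   # assumed electricity price in dollars

def cost_per_million_tokens(tokens_per_watt_second: float) -> float:
    """Electricity cost (dollars) to produce one million tokens."""
    tokens_per_hour = tokens_per_watt_second * POWER_W * 3600
    dollars_per_hour = POWER_W / 1000 * PRICE_PER_KWH  # kWh drawn x price
    return dollars_per_hour / tokens_per_hour * 1e6

for eff in (0.5, 1.0, 2.0):  # doubling efficiency halves the cost
    print(f"{eff} tokens/W-s -> ${cost_per_million_tokens(eff):.4f} per M tokens")
```

This is the whole competitive logic in one line: the power bill is the same either way, so the producer with more tokens per watt sells each token at a lower cost base.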
In short, the essence of the token economy is to measure, price, and trade the intelligent output of AI like industrial products.
The token economy is already happening. OpenAI CEO Sam Altman said in a speech earlier this year: "Fundamentally, our business, and the business of every AI model provider, will essentially become selling tokens."
Liu Liehong, director of the National Data Administration of China, recently stated that by March of this year, the daily average number of token calls in China had exceeded 140 trillion, more than 1,000 times the 100 billion at the beginning of 2024 and more than 40% higher than the 100 trillion at the end of 2025.
Liu Liehong believes that tokens are not only the value anchor of the intelligent era but also the "settlement unit" connecting technology supply with business demand, giving business models a quantifiable basis for implementation.
"Word Factory" Industry Chain
"A new industrial revolution is underway: data and electricity enter the factories (data centers), and tokens come out," Huang said.
Like a manufacturing plant, a "token factory" requires facilities, equipment, logistics, and sales. Following this logic, and drawing on research reports from multiple securities firms, the token economy can be broken down into four stages.
#1 Production Stage
Sectors involved: AI chips and servers, AIDC (Artificial Intelligence Data Center) infrastructure, liquid cooling, and power supply systems.
Generating tokens is essentially a process of inference: converting electricity and data into tokens. A data center's capacity ceiling is set by its physical hardware, including the AIDC server rooms, AI chips and servers, liquid cooling systems, and power supply facilities. Together, these determine power utilization efficiency, that is, how many tokens each watt of electricity can be converted into.
Jensen Huang stated, "A 1-gigawatt factory will never become a 2-gigawatt factory; this is a law of physics." This means that competition in the production process is essentially a competition of efficiency. With the same amount of electricity, whoever can produce more units will gain a greater advantage.
#2 Optimization Stage
Sectors involved: Inference optimization algorithms, scheduling systems, optical modules, etc.
Once a data center is built, its total power is fixed. With the hardware unchanged, the key to increasing revenue is to generate more billable tokens per watt of electricity.
At GTC 2026, Jensen Huang cited an example: Fireworks AI and Lynn, without changing any hardware, raised their token generation speed from approximately 700 to nearly 5,000 tokens per second simply by updating NVIDIA's software stack and inference algorithms. Technologies such as scheduling algorithms and inference optimization can thus significantly improve a factory's output without adding hardware.
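One intuition for how software alone produces gains of that size is request batching: a single decode step on a GPU often takes nearly the same wall time for a small batch as for one request, so serving several requests per step multiplies throughput. The toy model below uses assumed latencies chosen only to echo the 700-to-5,000 figures; it is an illustration of the batching idea, not NVIDIA's actual optimization:

```python
# Toy model of why software-level batching raises token throughput on
# fixed hardware. Latency and overhead figures are assumptions.

STEP_LATENCY_S = 0.0014   # assumed wall time per decode step, batch of 1
BATCH_OVERHEAD = 0.10     # assumed extra step latency when batching

def tokens_per_second(batch_size: int) -> float:
    """Each step emits one token per request in the batch."""
    step = STEP_LATENCY_S * (1 + BATCH_OVERHEAD * (batch_size > 1))
    return batch_size / step

print(f"{tokens_per_second(1):.0f} tokens/s")   # ~714, one request at a time
print(f"{tokens_per_second(8):.0f} tokens/s")   # ~5195 with a batch of 8
```

The hardware never changes in this model; only the scheduler does, which is the point of the optimization stage.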
#3 Distribution Stage
Sectors involved: CDN (Content Delivery Network), cross-border private networks, submarine optical cables
Once a token is produced, it must be delivered to the end user with extremely low latency. Unlike physical goods, the production and delivery of tokens usually happen simultaneously.
CDN edge nodes handle the "last mile" of delivery, while cross-border private networks and submarine optical cables serve as the international logistics channels when tokens must be delivered across borders.
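The "production and delivery occur simultaneously" point reflects how model servers typically stream each token to the client the moment it is decoded (for example over Server-Sent Events or WebSockets) rather than waiting for the full reply. A minimal sketch, with a stand-in generator playing the role of the model:

```python
# Sketch of simultaneous production and delivery: each token is
# flushed to the client as soon as it is decoded. The "model" here
# is a hypothetical stand-in that emits whitespace-split tokens.

import time
from typing import Iterator

def decode_tokens(reply: str) -> Iterator[str]:
    """Stand-in for a model emitting one token at a time."""
    for tok in reply.split():
        time.sleep(0.001)  # pretend per-token decode latency
        yield tok

def stream_to_client(reply: str) -> list[str]:
    delivered = []
    for tok in decode_tokens(reply):
        delivered.append(tok)  # a real server would flush to the socket here
    return delivered

print(stream_to_client("tokens arrive one by one"))
```

Because the first token reaches the user while the rest are still being produced, network latency in the distribution stage is felt on every single token, which is why edge nodes and low-latency cross-border links matter.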
"Word unit export" also occurs in this stage. With its significant advantage in inference cost, domestically developed models are exporting words on a large scale through overseas API platforms, and the network infrastructure supporting cross-border flow constitutes the basic channel for going global.
#4 Application Stage
Sectors involved: Large model vendors, Agent applications, vertical industry SaaS, multimodal generation platforms
The application stage is also where the token economy's value is finally realized. At GTC 2026, Jensen Huang predicted that in the future every SaaS company will become an Agent-as-a-Service company, and every engineer will have an annual token budget.
As AI applications continue to land, the consumption scenarios for tokens will extend far beyond today's conversational AI to intelligent agents, multimodal content generation, financial analysis, and more. The greater the consumption, the more it drives capacity expansion upstream, creating a positive cycle that serves as the flywheel keeping the entire industry chain running.
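Huang's "annual token budget" remark can be made concrete with a minimal tracker. This class and its numbers are hypothetical, a sketch of the accounting idea rather than any real product:

```python
# Hypothetical sketch of an engineer's annual token budget: usage is
# debited against a yearly cap, like a utility allowance. The cap and
# session size below are invented for illustration.

class TokenBudget:
    def __init__(self, annual_cap: int):
        self.cap = annual_cap
        self.used = 0

    def spend(self, tokens: int) -> None:
        """Debit a token expenditure, refusing overspend."""
        if self.used + tokens > self.cap:
            raise RuntimeError("annual token budget exhausted")
        self.used += tokens

    @property
    def remaining(self) -> int:
        return self.cap - self.used

budget = TokenBudget(annual_cap=10_000_000)
budget.spend(250_000)      # one coding-agent session, assumed size
print(budget.remaining)    # 9750000
```

Treating tokens as a budgeted line item is exactly the shift the article describes: from an invisible cost inside the model vendor to a metered commodity allocated to each consumer.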
Focus on investment areas such as computing power infrastructure
A research report from Great Wall Securities suggests that OpenClaw represents a powerful new acceleration point for AI, significantly raising the rate of token burn: token consumption under this model can grow severalfold, or even by dozens of times.
From an investment perspective, the rapid development of the token economy will first benefit the production stage of token factories, including computing infrastructure such as AI chips, data centers, liquid cooling, and power supply. This is also the direction on which institutions currently have the strongest consensus.
A research report from CITIC Securities indicates that ByteDance's token consumption roughly doubles every three months. It predicts that major domestic cloud providers will begin to feel computing power strain when daily token consumption reaches 30 trillion tokens, and will face a clear computing power gap when it reaches 60 trillion.
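Doubling every three months compounds quickly. The calculation below uses an assumed starting level (chosen only so the thresholds fall out cleanly); the 30-trillion and 60-trillion thresholds are the ones from the CITIC report:

```python
# Compounding at the CITIC-reported pace: consumption doubles every
# quarter. The starting figure is assumed purely for illustration.

START_TRILLIONS = 3.75   # assumed daily consumption, trillions of tokens

def after_quarters(q: int) -> float:
    """Daily consumption (trillions of tokens) after q quarters."""
    return START_TRILLIONS * 2 ** q

for q in range(5):
    print(f"after {3 * q:>2} months: {after_quarters(q):g} trillion tokens/day")
# 3.75 -> 7.5 -> 15 -> 30 (strain threshold) -> 60 (gap threshold)
```

At this pace, the gap between "strain" and "gap" is a single quarter, which is why the report treats the shortage as imminent rather than gradual.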
Jiang Ying, chief analyst of the communications industry at Open Source Securities, believes the demand chain runs from ByteDance's token consumption to AI chips (domestic computing power plus computing power leasing) and on to AIDC. A research report from Guojin Securities states that the computing power industry chain will enter a "full-chain inflation" cycle in 2026, with prosperity spreading from chips to AIDC, cloud services, and power equipment.
In addition, computing power leasing and token export are also popular areas benefiting from the token economy.
Great Wall Securities believes the essence of token export is that Chinese domestic AI models provide inference services to the world through API interfaces, charging by volume processed and thereby realizing the "digital export" of computing power and electricity. The core advantage that lets Chinese large models rapidly seize global market share is highly competitive cost control, especially on electricity costs.
According to calculations by the Shenwan Hongyuan Computer Team, the overall inference cost of domestically produced AI models is only one-sixth to one-tenth of that of overseas models.
"The word-based industry chain is essentially a revolution that transforms the electricity of the physical world into the intelligence of the digital world," Great Wall Securities believes. The price increase logic of this industry chain follows a path of "explosive overseas demand → shortage of in-memory computing hardware → energy/infrastructure bottlenecks → reassessment of costs across the entire chain." Upstream, cost-advantaged green electricity and ultra-high-voltage transmission form the cost base, locking in the lower limit of gross profit; midstream computing power and storage layers are the core capacity bottlenecks restricting supply; the secondary midstream model and scheduling layers obtain technological premiums through algorithm optimization; and downstream applications and overseas expansion unlock the upper limit of profits thanks to high global willingness to pay.
Great Wall Securities believes that, from an investment perspective, the focus should be staged. The first stage is storage and graphics memory, capturing the maximum price elasticity from short-term supply-demand mismatches; the second is computing chips and servers, locking in medium-term performance; the third is power equipment and green electricity operation, which carry long-term barriers to entry; and the fourth is leading companies able to land real-world scenarios and monetize overseas at high premiums.



