Tokens, computing power, chips... a major transformation is underway.

Article author and source: Xinhua News Agency

In the world of AI, a token is a unit of text, the basic unit by which a model processes and generates information. A token can be a word, a piece of code, or even a pixel in an image or video. Simply put, AI does not directly recognize characters or words; instead, it breaks language down into tokens and maps them to numeric IDs that the model can understand and process. When a user poses a question and the AI reasons its way to an analysis report or generates a high-definition video, it does so by first producing tokens at high frequency and on a massive scale.
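The token-to-ID mapping described above can be sketched with a toy example. This is a minimal illustration only: the vocabulary, helper names, and whitespace splitting are invented for demonstration, and real tokenizers use far more sophisticated subword schemes.

```python
# Toy illustration of tokenization: text is split into tokens,
# and each distinct token is mapped to a numeric ID that a
# model can process. Vocabulary and function names are illustrative.

def build_vocab(corpus):
    """Assign a unique integer ID to each distinct token in the corpus."""
    vocab = {}
    for text in corpus:
        for token in text.lower().split():
            if token not in vocab:
                vocab[token] = len(vocab)
    return vocab

def encode(text, vocab, unk_id=-1):
    """Map each token to its ID; unknown tokens get unk_id."""
    return [vocab.get(tok, unk_id) for tok in text.lower().split()]

corpus = ["the model processes tokens", "tokens become numeric ids"]
vocab = build_vocab(corpus)
print(encode("the model processes numeric tokens", vocab))
```

In practice, production tokenizers build their vocabularies from subword fragments rather than whole words, so rare words decompose into several known pieces instead of falling back to an unknown-token ID.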

Along with this process, the role of data centers is undergoing a fundamental transformation. Traditionally, data centers were mainly used for static data storage and network exchange, and were regarded by the industry as "electronic warehouses." Now, as AI agents begin to autonomously call tools and execute complex logical tasks, data centers are transforming into "token factories" that continuously process massive amounts of data and output intelligent content.

Industry insiders generally believe that whether it is the so-called "token factory" or AI factory, its essence is to redefine the data center as an intelligent production line: the input is power, data, models and scheduling systems, and the output is the execution capability of AI agents and the actual productivity in industry scenarios.

Shanghai Securities News Focus | Major Changes in Computing Power

Throughout human civilization, different eras have had different strategic commodities. In the fast-approaching era of artificial intelligence, computing power equates to national strength. In 2026, China explicitly proposed implementing new infrastructure projects such as ultra-large-scale intelligent computing clusters and computing-electricity collaboration.

"This year, the industry must bid farewell to the extensive era of 'competing on production capacity and talking about concepts' and enter a major transformation of 'competing on ecosystem and looking at implementation'." This is the most intuitive feeling of the Shanghai Securities News reporter after surveying more than ten computing power companies.

Trends and Hidden Worries

Recent investigations by reporters have revealed that industry insiders have reached a certain consensus on the current trend, but also have some concerns.

First, global capital investment in the computing power industry will continue to grow this year. Second, "going to the sky" (space-based computing power) and "going to the ground" (edge computing power) have become the main lines of industry evolution. In 2026, the industry will both look up at the stars, pushing past geophysical limits, and cultivate the ground, letting intelligence permeate every terminal device.

Soaring capital expenditures are the "master switch" for the global computing power industry.

Global AI spending is projected to reach $2.52 trillion in 2026, a 44% year-on-year increase. In the past two years, a wave of intelligent computing center expansion has swept across China. In the A-share market, some listed companies are readily investing billions or even tens of billions of yuan to purchase computing power in order to enter the computing center business.

Behind the booming expansion of computing power, some people are anxious and worried that the frenzied investment might end up in a mess.

Guo Yufeng, deputy general manager of Phytium Information Technology Co., Ltd., found in his research that many intelligent computing centers have a computing power utilization rate of less than 30%, with a large amount of computing resources lying idle for a long time. He believes that the industry has prominent structural problems of "emphasizing computing power but neglecting applications" and "emphasizing construction but neglecting effectiveness".

“Currently, there is a mismatch between supply and demand in computing power. There is an oversupply of low-end computing power and a shortage of high-end intelligent computing power; the utilization rate of general-purpose computing power in the west is low, while the supply of intelligent computing power urgently needed by industries in the east is tight. At the same time, the phenomenon of computing power silos is serious, and it is difficult to efficiently circulate computing power resources across regions and entities,” said Lian Yuming, founding president of the Beijing International Urban Development Research Institute.

Industry insiders also revealed that many local data centers are still built mainly around CPUs, which suit traditional IT and cloud service scenarios but cannot meet the demands of training and inference for large AI models. In some locations, the chips' computing architecture is poorly matched to demand, leaving application scenarios narrow; even where electricity is cheap, the capacity goes unused.

Behind these phenomena is the fact that China's current data center construction boom stems both from genuine AI demand and from local governments and investors chasing hot trends and launching projects blindly.

Ascending to the sky and diving into the sea

While tech giants are quite optimistic about the future of AI, turning large sums of money into intelligent computing centers is becoming increasingly difficult. Energy supply, heat dissipation capacity, and water consumption are gradually becoming bottlenecks for the growth of ground-based computing power.

Therefore, humanity turned its gaze to the stars.

"This year, the 'Three-Body Computing Constellation' will launch another 50 satellites, with plans to complete the networking of 1,000 computing satellites by 2032, forming a space computing power constellation capable of interconnection and providing services for artificial intelligence. At that time, the total computing power will reach 100 quadrillion calculations per second." In the view of Wang Jian, an academician of the Chinese Academy of Engineering and director of the Zhejiang Laboratory, sending computing power into space is as valuable as the invention of electricity, and will give rise to many unimaginable new values.

In addition, China is quietly testing the waters of offshore computing. On February 10, the world's first "direct-connection" seabed data center with offshore wind power went into operation in Lingang, Shanghai. It directly connects offshore wind power with the seabed data center, achieving a green electricity ratio of up to 95% and utilizing seawater for natural cooling.

Beyond the "sky and sea" applications of computing power, the development of edge computing power (which can be described as "grounding") represents an even more immediate and significant transformation, poised to become a core engine driving the upgrade of the consumer electronics and automotive industries by 2026. From the compact smartphone form factor to the Mac Mini popularized by OpenClaw, numerous landmark examples have emerged. Automobiles, with their high-performance computing chips, human-machine interfaces, and ample power, have become an ideal scenario for the deployment of edge AI hardware.

"AI has undergone multiple rounds of evolution and has entered a new stage centered on inference," said Xie Liming, Vice President of Huawei's Storage Product Line and President of the Flash Memory Domain, at the 2026 Huawei Data Storage Spring Festival Launch Conference on March 17. On the same day, Huawei officially released new AI data infrastructure for AI inference scenarios: an AI data platform for central inference scenarios and the FusionCube A1000 AI hyper-converged appliance for branch and edge inference scenarios.

Domestic production speeds up

In a competition of computing power, the most important factor is the computing chip.

At the turn of the year, several domestic chip "little dragons"—Moore Threads, Muxi Technology, Biren Technology, and Tianshu Zhixin—have successively entered the capital market. This is not only the latest practice of the capital market in supporting science and technology innovation enterprises, but also a momentous occasion for the Chinese computing chip industry to enter the capitalization stage.

"This year, chip supply will evolve from simply meeting the basic needs of 'availability' to providing differentiated, scenario-based solutions that address 'quality' and 'accuracy'," said a representative from Tianshu Intelligent Chip.

"In 2026, chips will focus more on ease of use, security, and high energy efficiency," Zhang Lei, founder and CTO of Hanbo Semiconductor, told reporters. "With the release of downstream demands such as AI inference, industrial quality inspection, and digital twins, the competitiveness of domestic computing power is expanding from hardware parameters to full-stack solution capabilities."

Although the development of domestic computing chips has been rapid, there is still a long way to go before we can truly move from "following" to "keeping pace".

In the design and manufacturing stages, the lack of advanced EDA tools and insufficient high-end process capacity remain major bottlenecks. Several domestic computing chip companies report that domestically produced advanced process capacity is in high demand, and even if advanced chips are designed, large-scale mass production is quite difficult. "Chip design companies are essentially working for wafer foundries," said a senior semiconductor industry analyst. "Beyond the dazzling chip specifications, we are more concerned about whether companies can achieve tape-out and mass production."

Deng Zhonghan, an academician of the Chinese Academy of Engineering, recently stated that the large-scale application of domestically produced high-end computing chips faces three core challenges: First, insufficient technical adaptability. Existing chips are mostly based on traditional architectures, which do not match the diverse computing needs of AI large models and intelligent computing clusters, resulting in the practical pain points of "difficulty in computing power adaptation and high cost of scenario implementation." Second, shortcomings in the ecosystem. Software and hardware collaboration, standardization, and scenario verification have not formed a closed loop, which cannot meet the stability requirements of large-scale applications. Third, low computing power utilization efficiency. The industry's "brute-force computing" mode has significantly increased energy consumption costs, and it is also difficult for domestically produced chips to leverage their architectural innovation advantages due to limitations in process technology.

Computing-Electricity Collaboration

On the other side of the Pacific, the US computing industry also has its own "troubles".

Recently, news emerged from Washington that U.S. government officials are demanding that tech companies such as Microsoft and Alphabet commit to ensuring that their data centers will not drive up electricity prices or impose additional burdens on consumers.

This action aims to address the political and public relations issues arising from the expansion of data centers across the United States. In some areas, technology companies have faced growing public resistance: activist groups oppose the construction of energy-intensive data centers, arguing that they will strain local infrastructure, water resources, and electricity supplies. Several areas, including Atlanta and New Orleans, have already restricted new data center construction.

The current situation in the United States reveals a global problem: the social cost of the computing power boom is moving from being implicit to being explicit.

How can these problems be solved? The concept of "computing-electricity collaboration" has emerged. A core element of China's new infrastructure strategy, it was included in the government work report for the first time in 2026.

Some telecom operators believe that as AI enters a phase of intense competition for electricity, China's "computing-electricity synergy" model is more likely to prevail in terms of large-scale, sustainable green computing power supply, because it addresses the fundamental geographical and temporal mismatch between energy and computing power. Meanwhile, if the United States does not resolve its grid fragmentation and slow expansion issues, the energy bottleneck for its AI development may become increasingly prominent. Globally, China's practices offer a "Chinese solution" that deeply integrates the digital economy with energy transformation.

The wave of artificial intelligence is unstoppable. As the core infrastructure of AI, the research, iteration, and expansion of intelligent computing power, along with the coordinated development of computing power and electricity, are equally unstoppable. Against the backdrop of a competition for computing power that is essentially a competition for national strength, the drama of a major shift in computing power may have only just begun.
