A stunning reversal! Chinese large AI models' weekly usage surpasses that of the US, with the mysterious "unicorn" Hunter Alpha storming the leaderboard.


Article author and source: AIBase

In the second half of the AI computing-power race, China's large AI models are overtaking on sheer application penetration. According to the latest data released by OpenRouter, the weekly call volume of Chinese large AI models set a new record last week (March 9 to March 15), surpassing US models on the international platform for the second consecutive week and demonstrating explosive growth.

The data show that weekly usage of Chinese large AI models has surged to 4.69 trillion tokens, up 11.83% week-on-week; by contrast, weekly usage of US models fell 9.33% to 3.294 trillion tokens. Chinese models now occupy all three of the top spots in the global usage ranking.
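As a quick sanity check, the reported week-on-week percentages can be inverted to estimate the prior week's totals. The current-week figures below come from the article; the helper function and the derived prior-week numbers are illustrative, not reported data.

```python
# Back-of-the-envelope check of the reported week-on-week changes.
# Current-week figures (in trillions of tokens) are from the article;
# the prior-week totals are derived from them, not reported.

def implied_prior_week(current: float, pct_change: float) -> float:
    """Invert a week-on-week percentage change to recover last week's total."""
    return current / (1 + pct_change / 100)

cn_now, cn_change = 4.69, +11.83   # China: 4.69T tokens, up 11.83%
us_now, us_change = 3.294, -9.33   # US: 3.294T tokens, down 9.33%

print(f"China, prior week: ~{implied_prior_week(cn_now, cn_change):.2f}T tokens")
print(f"US, prior week:    ~{implied_prior_week(us_now, us_change):.2f}T tokens")
```

The implied prior-week totals (roughly 4.19T for China versus 3.63T for the US) are consistent with the article's claim that Chinese models also led the week before.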

MiniMax M2.5: champion for the fifth consecutive week, with a weekly call volume of 1.75 trillion tokens.

Step3.5 Flash: riding fast responses and free pricing, it broke into the top three for the first time, taking second place with weekly usage up 79% over the previous week.

DeepSeek V3.2: with 1.04 trillion tokens, it holds firmly onto third place.

The most eye-catching figure in this computing-power frenzy is a mysterious newcomer called Hunter Alpha. Launched only on March 11, the model surged to seventh place globally with 0.666 trillion tokens.

According to OpenRouter platform data, Hunter Alpha is a trillion-parameter model built specifically for agent applications, with an ultra-long context window of 1 million tokens. It performs strongly on long-horizon planning, complex logical reasoning, and multi-step task execution, and shows exceptional reliability and instruction-following accuracy, particularly when paired with agent frameworks such as OpenClaw.

From MiniMax's continued dominance to the meteoric rise of Jieyue Xingchen (maker of Step3.5 Flash) and the stunning debut of the mysterious Hunter Alpha, Chinese large models are reshaping the global AI application landscape through rapid iteration and precise targeting of usage scenarios. As token usage becomes a new benchmark for measuring AI prosperity, this trend of computing power "shifting eastward" looks increasingly unstoppable.

Disclaimer: The content above is only the author's opinion which does not represent any position of Followin, and is not intended as, and shall not be understood or construed as, investment advice from Followin.