In the week since Pony Alaph was confirmed to be Zhipu's GLM5, Zhipu's stock price has more than doubled at its peak. This contrasts sharply with US AI stocks, whose performance has been volatile and fatigued, marked by sharp price swings.
Barring unforeseen circumstances, DeepSeek V4 and Qianwen 3.5 will, during this year's Spring Festival, decisively shatter the sacred Scaling Law narrative that is the lifeblood of US AI stocks.
Those still belatedly adding US dollar assets and trimming their RMB holdings deserve only the leftover broth from the feast enjoyed by those who saw what was coming and were already piling into RMB assets!

However smart GLM 5 may be, does it really justify a +60% jump?
However, financial markets have always loved speculating on expectations; actual implementation isn't their primary concern.
A budget-friendly, all-you-can-eat version of Claude Opus 4.5
Yesterday, two new domestically produced models were released: GLM 5 and MiniMax 2.5.
The Spring Festival Gala battle has finally moved from red envelopes to AI.
The current playbook of the US AI industry is unhealthy and unsustainable: OpenAI's massive fundraising props up stock sales; Microsoft, Google, and Amazon issue long-dated, high-grade bonds to inflate their share prices and profit at retail investors' expense; and Nvidia runs external circular deals, a Ponzi-like arrangement, to legally fabricate earnings.
Is the AI saga, fueled by capital investment, coming to an end?
To be honest, I'm quite puzzled: What kind of cutting-edge engineering optimization techniques did the domestically developed AI models use to catch up with the intelligence levels of the three latest international AI models in such a short time?
I've seen many domestic models claiming to be optimized versions of DeepSeek 3.2.
In that case, DeepSeek's open-source release of R1 last year was truly invaluable; no amount of praise is too much.
Given the constraints on computing power, domestically developed AI has poured more of its R&D effort into training economics. Interestingly, because intelligence and sparsity turn out to be highly correlated, domestic AI actually ends up outperforming its American counterparts.
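To make the training-economics point concrete, here is a back-of-the-envelope sketch, not a figure from the article: it compares approximate training compute for a dense model versus a sparse mixture-of-experts (MoE) model using the common ~6 × N × D FLOPs rule of thumb, where N is the number of parameters active per token and D is the number of training tokens. The parameter counts and token budget are illustrative assumptions, not disclosed numbers for GLM 5, DeepSeek, or any other model.

```python
# Back-of-the-envelope training-compute comparison (illustrative assumptions only).
# Rule of thumb from the scaling-law literature: training FLOPs ~= 6 * N_active * D,
# where N_active is the parameters touched per token and D is training tokens.

def training_flops(active_params: float, tokens: float) -> float:
    """Approximate total training FLOPs for a transformer."""
    return 6 * active_params * tokens

TOKENS = 10e12            # assumed 10T training tokens (hypothetical budget)

dense_params = 300e9      # hypothetical dense model: 300B params, all active per token
moe_total    = 1000e9     # hypothetical sparse MoE: 1T total params...
moe_active   = 30e9       # ...but only ~30B active per token via expert routing

dense_cost = training_flops(dense_params, TOKENS)
moe_cost   = training_flops(moe_active, TOKENS)

print(f"Dense 300B model : {dense_cost:.2e} FLOPs")
print(f"Sparse 1T MoE    : {moe_cost:.2e} FLOPs")
print(f"Compute ratio    : {dense_cost / moe_cost:.0f}x cheaper per token for the MoE")
```

Under these assumed numbers the sparse model needs roughly a tenth of the compute per training token; that is the sense in which sparsity buys training economics on a fixed compute budget. Whether it also buys intelligence is the article's claim, not something this arithmetic proves.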
Speaking of computing-power constraints, one cautionary tale deserves mention: xAI, where money talks and brute force carries the day. The current AI war is somewhat reminiscent of the War of the Currents a century ago.
However, the difference is that the world is finally no longer forced to choose between alternating current and direct current.
From Twitter