The most dangerous thing in the era of AI-driven compression

What truly happens in the AI era is not just a technological revolution, but a "massive compression of human capabilities."

Article author and source: LV, hashclaw

Over the past two years, a strange phenomenon has emerged on the internet. Many people are frantically learning AI tools every day:

Today I'm studying Prompts, tomorrow Agents, and the day after Workflows. I just learned Midjourney, and now I'm learning Runway; I just finished studying LangChain, and now I'm chasing MCP.

Everyone is afraid: "Will I miss out on the next wave of the AI revolution?"

As a result, many people fell into a strange state—

I study every day, I feel anxious every day, and I feel like I'm falling behind every day.

The problem is that the speed at which tools are iterating has begun to exceed the speed at which humans learn.

If you base your life on "tool proficiency," then your value will be constantly reset.

The tools you are most proficient in today may be obsolete in six months; the skills you are proud of today may be built into the next generation of models.

Many people haven't realized yet:

What truly happens in the AI era is not just a technological revolution, but a "massive compression of human capabilities."

Professor Li Jin's theory of "massive compression" can help us understand what will become cheaper and what will become more expensive in the future, and what a person's true core competitiveness is in the AI era.

I. The biggest misjudgment in the AI era: mistaking tool capabilities for personal capabilities

The logic of the industrial age was simple: whoever mastered complex tools possessed higher productivity.

People who could edit video were scarce, people who could design were scarce, and people who could write code were scarce.

Because the tools themselves have a learning curve.

But the AI era has changed all of that.

Skills that used to require ten years of training can now be accomplished with a single sentence; tasks that used to require a professional team can now be completed by one person plus AI.

As a result, knowledge is compressed, skills are compressed, industries are compressed, and professions are compressed.

This is called "massive compression".

II. Professor Li Jin's theory of massive compression: all costs approach zero

Professor Li Jin proposed that technological progress, organizational innovation, economies of scale, and cognitive upgrades will lead to a continuous decrease in all social costs.

Ultimately, the costs of information, dissemination, creation, learning, and collaboration all approach zero.

And AI is the most powerful accelerator of this "massive compression".

Because AI is not just compressing industries, it is compressing human capabilities themselves.

III. The people most likely to be eliminated in the future: those who live their lives as "tools"

Many people are still stuck in the old mindset: "As long as I am skilled enough, I am safe."

But the real danger in the AI era is that the more you resemble a tool, the easier it is for you to be replaced.

Because AI is naturally adept at standardization, repetition, process optimization, and modularization. It is faster, cheaper, and more stable than humans, and can work 24 hours a day.

As a result, many people find themselves in a dilemma: they work very hard, but their value diminishes.

Because what they are striving to improve is precisely the part that AI excels at.

IV. What's truly valuable in the AI era is "incompressible capability."

What is incompressible capability? It is capability that is difficult for AI to replace.

For example: judgment, aesthetics, emotional appeal, insight, worldview, creative drive, and personal charisma.

The biggest difference in the future will no longer be "who is better at using tools", but "what will you have left after everyone has tools".

V. Why is it that the more powerful AI becomes, the more important the humanities become?

Many people believe that the most important thing in the AI era is to learn technology, but the opposite is true.

The more powerful AI becomes, the more people should return to philosophy, history, literature, art, sociology, psychology, and basic science.

Because these disciplines do not study "how to do it", but rather "why humans exist".

VI. The most dangerous people in the future: those who have skills but no independent thought.

In the future, a large number of people like this will emerge:

Proficient in many tools and workflows, able to generate content quickly and efficiently;

But they lack a worldview, aesthetic sense, a spiritual core, and long-term judgment.

As a result, they become increasingly anxious—because with each update to the tools, the value they have built their lives on is reset.

This will create a new ailment of our times: "the confusion of high efficiency".

Many people are busy every day, but they don't know why they live.

VII. The true value of philosophy: establishing a spiritual coordinate system

Why is philosophy needed even more in the AI era?

Because the amount of information in the future will expand infinitely, but human attention will not.

Therefore, one of the most important abilities for the future is to remain clear-headed amidst chaos.

Philosophy is essentially about training the ability to think independently—not about believing everything others say, but about how you form your own judgments.

The ability to cope with uncertainty

The AI era will bring increasingly rapid changes, leading to industry restructuring, the disappearance of professions, and the invalidation of career paths.

Many people would break down because of this. But philosophy will make you realize that change is the very nature of the world, and you will become more stable.

The ability to confront nothingness

The future of AI will create a sense of crisis for many: "If AI can do everything, what is the point of being human?"

Philosophy has been studying this problem for thousands of years.

From Zhuangzi to Wang Yangming, from Nietzsche to Heidegger, all great ideas are essentially discussing how people can find their own value when the external world collapses.

VIII. The strongest people of the future will be a combination of "technology + humanities".

The most outstanding people in the future will not just be engineers, nor will they just be artistic youths.

Rather, they will be people who understand both technology and human nature.

Because AI can only answer "how to achieve", but the truly important question is always "why do it".

What will be truly scarce in the future are humanists who understand technology, and technologists who possess a humanistic spirit.

IX. In the AI era, personal growth is no longer an "S-curve".

In the past, a person's growth resembled an "S-curve": slow accumulation, long-term uphill climb, and gradual growth.

But the AI era is more like a "Z-curve".

Skills that used to take 10 years to develop can now be acquired in a year; skills that were once only available to large companies can now be acquired by ordinary people.

As a result, the upward mobility of ordinary people will increase exponentially, but at the same time, the rate of elimination will also accelerate exponentially.

Therefore, the most dangerous people in the future are those who build their entire lives on a single skill—because skills will inevitably be compressed.

X. Basic sciences are the true "underlying operating system" of the future.

Tools are just add-ons, but basic disciplines determine your "cognitive framework".

• Studying mathematics: trains your logic

• Studying physics: trains you to understand laws

• Studying history: trains you to understand the cycles of civilization

• Studying literature: trains you to understand human nature

• Studying philosophy: trains you to rethink all problems from the ground up

One of the most dangerous things in the AI era is that humanity will lose its ability to think deeply. Basic sciences, in essence, help you retain "depth as a human being."

XI. Truly intelligent people are withdrawing from "instrumental involution."

Many people are still desperately chasing the latest models, the latest agents, and the latest workflows.

But truly top-tier individuals have begun to realize that the speed at which tools iterate has surpassed the speed at which humans learn.

Continuing to invest a significant portion of your life in the details of tools will only lead to decreasing marginal returns.

The truly correct strategy for the future is to leverage AI to amplify ourselves, rather than turning ourselves into AI's appendages.

XII. The most valuable people in the future will be those who cannot be compressed.

Professor Li Jin's theory of massive compression tells us that in the future, everything will become cheaper and cheaper—knowledge, skills, tools, and productivity will all become cheaper.

But there is one thing that will become increasingly expensive: "irreplaceable people".

Therefore, the real question in the AI era is not "how do I avoid being replaced by AI?", but "what will be uniquely mine when everyone has AI?"

Because the real competition in the future will not be about "who is more like AI", but about "who is more like a human".

Disclaimer: The content above is only the author's opinion which does not represent any position of Followin, and is not intended as, and shall not be understood or construed as, investment advice from Followin.