AI + New Materials: The "GPT Moment" and Paradigm Revolution in Materials Science


AI+new materials, a disruptive technology born of the deep integration of artificial intelligence and materials science, is driving a paradigm revolution in materials research and development, from "experience-based trial and error" to "intelligent creation." With the qualitative leap in AI's ability to understand complex structures, generate innovative solutions, and perform cross-scale reasoning, materials science is undergoing a fundamental transformation from "experience-driven" to "intelligence-driven." The "GPT moment" has arrived across global research and industry, forming a development pattern driven by data, algorithms, and automated experiments.

Yunxiu Capital systematically outlines the development path and technological trends of the integration of AI and materials science, analyzes the competitive advantages in this industrial ecosystem, and explores investment opportunities.

In the grand narrative of artificial intelligence, the emergence of generative pre-trained models is undoubtedly a watershed moment. It not only redefines the boundaries of human-computer interaction but also, with its astonishing versatility and creativity, heralds the dawn of the AGI era. As this disruptive technological wave sweeps into the ancient and fundamental field of materials science, a "GPT moment" for new materials is dawning.

For a long time, the discovery and development of new materials has been like "finding a needle in a haystack" in the vast universe, relying on scientists' intuition, experience, and countless trial-and-error experiments. From Edison testing thousands of substances to find the right filament material to modern researchers spending years optimizing an alloy formula, the "alchemy" of materials research has always been a bottleneck restricting industrial progress. However, with the qualitative leap in AI models' ability to understand complex structures, generate innovative solutions, and perform cross-scale reasoning, materials science is undergoing a fundamental transformation from "experience-driven" to "intelligence-driven." AI is no longer just a tool to assist in calculations, but has become a "research partner" capable of autonomously proposing hypotheses, designing experiments, and even discovering entirely new forms of matter.

The Singularity Has Arrived: AI Reshapes the Paradigm of Materials Research and Development, Moving From Trial and Error to Rational Design

The integration of AI and materials science is not something that can be achieved overnight. Its development process can be clearly divided into three stages, with each iteration marking a leap in R&D efficiency and cognitive depth.

The 1.0 Era: The Foundation of Computational Materials Science (Late 20th Century to Around 2010)

The core of this stage was "computational assistance." Computational methods represented by density functional theory (DFT) and molecular dynamics (MD) gave scientists powerful tools to simulate and predict material properties at the atomic scale. During this period, researchers built a number of high-throughput computing databases, such as the Materials Project, laying a valuable "data foundation" for subsequent data-driven research. However, the computational cost of methods such as DFT is extremely high, making it impractical to screen millions or tens of millions of candidate materials. Their applications were therefore largely limited to mechanistic studies of known materials and small-scale performance optimization.

The 2.0 Era: Data-Driven AI Exploration (2010-2023)

With the rise of machine learning algorithms and the continuous expansion of materials databases, AI+new materials entered the "data-driven" 2.0 era. Classical machine learning algorithms such as random forests and support vector machines were widely used to model the relationships among composition, processing, structure, and performance. The breakthrough of this stage was that AI began to learn patterns from massive amounts of historical experimental data, enabling rapid prediction of material properties and sharply reducing unnecessary experiments. However, limited by data quality, algorithm generalization, and a shallow grasp of the intrinsic physicochemical mechanisms of materials, the AI models of this period played the role of "predictor" more than "creator," and their ability to discover new materials remained limited. A minimal sketch of the 2.0-era workflow follows.
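In the sketch below, the descriptors and data are synthetic stand-ins rather than any published pipeline; real work would compute features with a library such as matminer over curated experimental datasets.

```python
# A minimal sketch of the 2.0-era workflow: a random forest fitted as a
# structure-property model. Descriptors and targets here are synthetic toys.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy descriptors: [mean atomic radius, electronegativity spread, processing temperature]
X = rng.uniform(0.0, 1.0, size=(500, 3))
# Toy target property (e.g., hardness) with a hidden nonlinear dependence plus noise
y = 10 * X[:, 0] ** 2 - 5 * X[:, 1] * X[:, 2] + rng.normal(0, 0.5, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")

# Once fitted, the model screens new candidates in milliseconds instead of weeks of experiments
model.fit(X, y)
candidates = rng.uniform(0.0, 1.0, size=(10_000, 3))
predicted = model.predict(candidates)
print("best candidate descriptor vector:", candidates[np.argmax(predicted)])
```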

The 3.0 Era: Intelligent Creation Led by Large Models (2024 to Present)

With breakthroughs in pre-training techniques, we have witnessed the rise of "large-scale materials models." These models undergo self-supervised learning on massive, multimodal scientific literature, crystal structure databases (such as ICSD and Materials Project), and experimental data, thereby mastering the "universal syntax" of the materials world.

They are gradually acquiring three core characteristics similar to GPT:

Emergent capability: The model can understand cross-disciplinary material knowledge, discover implicit laws that are difficult for human experts to detect, and achieve performance prediction across material systems.

Generative creation: AI is no longer limited to screening known materials, but can "generate" entirely new, theoretically stable crystal structures or molecular formulas based on performance requirements, just like generating text.

Transfer learning and physics enhancement: A general-purpose foundation model pre-trained on massive amounts of known-materials data carries rich prior chemical and physical knowledge. Faced with an entirely new system, the model does not need to be trained from scratch; instead, transfer learning combined with active-learning strategies fine-tunes it and corrects its boundaries using a small amount of high-confidence experimental or DFT data, as the sketch below illustrates. This dramatically reduces experimental cost while keeping predictions consistent with the laws of thermodynamics.
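The sketch makes its assumptions loudly: a toy MLP plays the role of a "pretrained backbone," the expensive DFT labels are simulated by a cheap hidden function, and the acquisition rule is a crude uncertainty proxy rather than a production active-learning strategy.

```python
# Transfer learning + active learning in miniature: freeze a "pretrained" backbone,
# fine-tune only the head on a handful of expensive labels. Everything is a toy stand-in.
import torch
import torch.nn as nn

torch.manual_seed(0)

def dft_label(x):  # stand-in for an expensive high-fidelity (DFT) calculation
    return (x ** 2).sum(dim=1, keepdim=True)

backbone = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
head = nn.Linear(64, 1)
model = nn.Sequential(backbone, head)
for p in backbone.parameters():      # pretend the backbone was pretrained: freeze it
    p.requires_grad = False

pool = torch.rand(2000, 8)           # unlabeled candidate structures
labeled_x, labeled_y = torch.empty(0, 8), torch.empty(0, 1)
opt = torch.optim.Adam(head.parameters(), lr=1e-2)

for round_ in range(5):
    with torch.no_grad():            # crude proxy for model uncertainty (toy acquisition)
        uncertainty = model(pool).squeeze().abs()
    idx = uncertainty.topk(20).indices
    new_x = pool[idx]
    labeled_x = torch.cat([labeled_x, new_x])
    labeled_y = torch.cat([labeled_y, dft_label(new_x)])  # only a few expensive labels
    for _ in range(200):             # fine-tune the head on the small labeled set
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(labeled_x), labeled_y)
        loss.backward()
        opt.step()
    print(f"round {round_}: {len(labeled_x)} labels, loss {loss.item():.4f}")
```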

The arrival of this moment signifies that materials research and development has officially entered a new era of "intelligent generation and precise design".

Value Anchor: Crossing the "Valley of Death" from Laboratory to Production Line, Where Engineering Implementation Is the Only True Solution

According to QYResearch data, the global AI for Science market was approximately US$4.538 billion in 2025 and is projected to reach US$26.23 billion by 2032, a compound annual growth rate of 28.9%. This is already a large market, but the potential of AI-enabled materials extends far beyond it.

Across the six major downstream industries of chemicals, pharmaceuticals, new energy, alloys, displays, and semiconductors, AI4S addresses a combined downstream market of nearly US$11 trillion. At an R&D penetration rate of 2.5%, the annual output value it enables could exceed US$140 billion.

The most profound transformation in AI+new materials is not merely using artificial intelligence to accelerate scientific discovery (AI for Science, AI4S); it is the strategic leap from "laboratory intelligence" to "engineering and manufacturing intelligence" (AI for Engineering & Manufacturing).

There is a saying in the industry that "one generation of materials leads to one generation of industry," which means that innovation in materials technology is the foundation and precursor to industrial upgrading and development, and breakthroughs in materials directly determine the technological level and form of an industry.

In other words, if a virtual material with excellent properties in a database cannot be mass-produced stably and economically, it has no industrial value. The key indicator of competitiveness for AI+new materials companies has therefore shifted from "how many new materials have been discovered" to "how many AI-designed materials have been successfully turned into mass-producible products."

This new paradigm of "implementation is key" requires AI systems to transcend pure scientific computing and deeply integrate with engineering thinking and manufacturing constraints. It is no longer an isolated algorithm model, but an intelligent hub that runs through the entire "design-experiment-manufacturing" chain.

First, at the design stage, AI must possess the forward-thinking concept of "Design for Manufacturing" (DFM). This includes three key points:

Engineering constraints must be considered upfront: AI models must incorporate engineering and manufacturing constraints such as raw material costs, synthesis path complexity, equipment compatibility, and environmental safety as part of their optimization objectives from the initial design stage, rather than considering them afterward.

Physical closed-loop verification: Every "thought" the AI produces must be verifiable quickly and at low cost in the physical world. Deep coupling with automated laboratories (lights-out labs) to form a "dry-wet" iterative flywheel is the key to ensuring that designs are feasible.

Full life cycle perspective: An excellent AI platform should not only design good materials, but also be able to predict their long-term stability, recyclability and environmental impact in end products, thereby providing customers with comprehensive solutions that go beyond the materials themselves.

In the execution phase, AI-driven automated experimental platforms must evolve from "passive execution" to "autonomous decision-making," building a high-throughput, high-precision physical verification closed loop. Three core capabilities are crucial; a minimal control-loop sketch follows the list:

Unmanned operation: The AI system needs to coordinate and manage automated synthesis, characterization, and testing equipment to achieve fully unmanned operation, from raw-material proportioning and reaction-condition control to performance testing. For example, by combining robotic arms with microfluidics, hundreds of formulations can be synthesized and screened in parallel within a day, an order-of-magnitude efficiency gain over manual methods.

Real-time data feedback and model iteration: The massive data streams generated by experiments (temperature, pressure, spectral signals, and so on) must be fed back to the AI model in real time, driving it to dynamically optimize subsequent experimental plans. This "dry-wet" iterative flywheel quickly corrects deviations between theoretical predictions and experimental results, forming a "prediction-verification-optimization" closed loop.

Anomaly detection and autonomous error correction: AI needs to have the ability to perceive and process experimental anomalies in real time. When equipment malfunctions or the reaction deviates from expectations, the system can automatically trigger emergency plans (such as pausing the reaction and adjusting parameters) and learn from historical data to avoid similar problems, ensuring the continuity and reliability of the experiment.
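The skeleton of such a loop can be sketched in a few lines. Everything below is hypothetical: the Robot class stands in for vendor-specific lab APIs, and both the anomaly model and the proposer are toys.

```python
# A minimal "propose - execute - check - learn" loop for an automated lab.
# Robot.run is a hypothetical placeholder for real synthesis/characterization APIs.
import random

class Robot:  # hypothetical stand-in for an automated synthesis/testing station
    def run(self, recipe):
        if random.random() < 0.05:                   # occasional hardware/chemistry anomaly
            raise RuntimeError("reactor temperature out of range")
        return -sum((v - 0.6) ** 2 for v in recipe)  # toy measured performance

def propose(history, n=8):
    """Toy proposer: random search biased toward the best recipe seen so far."""
    if not history:
        return [[random.random() for _ in range(3)] for _ in range(n)]
    best = max(history, key=lambda h: h[1])[0]
    return [[min(max(v + random.gauss(0, 0.1), 0), 1) for v in best] for _ in range(n)]

robot, history, failures = Robot(), [], []
for round_ in range(10):
    for recipe in propose(history):
        try:
            result = robot.run(recipe)
            history.append((recipe, result))         # feed data back for the next proposal
        except RuntimeError as err:
            failures.append((recipe, str(err)))      # log anomaly, skip, keep the loop alive
print(f"best performance: {max(h[1] for h in history):.4f}, anomalies handled: {len(failures)}")
```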

Ultimately, the essence of this transformation is to upgrade AI from a powerful "scientific research assistant" to a "chief technology officer" driving industrial value. This signifies that the focus of AI+new materials development has shifted from exploring unknown scientific frontiers to solving real-world industrial pain points.

Ecosystem Reconstruction: Breaking the "Island Effect"—A Deep Coupling Strategy of Computing Power, Data, and Scenarios

Mass production is not a breakthrough in a single technology but a complex systems-engineering project in which every link is interconnected. Under the traditional research model, "going it alone" no longer meets the demands of the AI era. Universities and research institutes possess the most cutting-edge algorithms and theoretical innovations, but they often lack real industrial scenarios and pilot-scale verification platforms. Traditional materials companies, while holding clear market pain points and rich application scenarios, generally suffer from weak computing infrastructure, thin accumulation of high-quality data, and a shortage of digital talent.

This supply-demand mismatch leads to serious waste of resources. Only through deep collaboration across industry, academia, research, and application, closely coupling the computing power of hardware manufacturers, the data scattered among many parties, the algorithms of technology companies, and the scenarios of industry leaders, can the "last mile" from theoretical design to large-scale production truly be bridged.

The future of AI-powered new materials will no longer be a simple software procurement, but a systemic revolution encompassing "computing power infrastructure, data standards, intelligent algorithms, and physical verification":

At the underlying layer, GPU manufacturers provide the engine, and platform providers set the standards to awaken dormant data. This layer addresses the issues of "computing power bottlenecks" and "data silos," with the core focus on connectivity rather than ownership.

GPU companies are the physical engine of the entire ecosystem. AI materials research requires handling massive amounts of quantum mechanics calculations and molecular dynamics simulations, which places extremely high demands on parallel computing capabilities. GPU manufacturers not only provide the core accelerator cards, but also the underlying parallel computing architecture, which determines the speed and efficiency of upper-layer model training.

Data standard setters (governments, industry associations) act as connectors throughout the ecosystem. The data itself is not controlled by any single platform; it is dispersed across thousands of professors' labs and corporate R&D departments. The standard setters' core value lies in establishing unified standards for data collection, storage, and interaction, and in building a secure, trusted circulation mechanism. Through technologies such as federated learning or data spaces, data from academia and industry can be accessed for training on GPU clusters without compromising privacy.

The middle layer is an intelligent hub connecting the underlying infrastructure and the top-level applications. Its core value lies in building a dual-drive system of AI algorithms and simulation software, which completely solves the pain point of difficulty in achieving both efficiency and accuracy in materials research and development.

This layer of technology/service providers no longer relies solely on a single technology, but rather deeply integrates both:

AI algorithm platforms act as "accelerators." By utilizing pre-trained large models and generative AI, candidate materials can be rapidly screened in a vast chemical space, compressing the screening process that originally took months into hours, thus solving the efficiency problem of "finding a needle in a haystack."

Simulation software vendors act as "verifiers." By introducing first-principles-based physical simulations, they provide rigorous physical mechanism verification for AI predictions, ensuring that the design not only conforms to data patterns but also withstands scientific scrutiny, thus solving the credibility problem of "black box predictions."

At the top level, automation labs and industry giants compress R&D cycles from "years" to "days" through a "human-machine collaboration" closed loop.

This layer represents the endpoint of value realization. Industry giants such as Wanhua Chemical and Shengquan Group have opened up real production line needs and verification scenarios, deeply integrating them with automated laboratories. The formulas provided by AI are quickly verified in automated laboratories, and the new data generated feeds back into the mid-level algorithms and the underlying platform, forming a data flywheel.

This deep collaboration across the entire supply chain signifies a shift in industry competition from single-point technological contests to rivalry between ecosystem alliances. Only innovative consortia capable of integrating GPU computing power, aggregating dispersed data, and targeting specific industry scenarios can build an unshakeable systemic advantage in the AI + new materials race.

Technological Collision: The deep integration of AI and MGE is constrained by the structural challenge of "data scarcity".

As the ecosystem gradually takes shape, technological evolution is also accelerating. AI is no longer just an auxiliary tool, but has undergone a profound integration with materials genome engineering (MGE), like a chemical reaction.

Materials Genome Engineering (MGE) draws inspiration from the concepts of biological genomics, treating the microstructure of materials (such as atomic arrangement, chemical composition, and crystal defects) as "genes" and their macroscopic properties (such as strength, conductivity, and heat resistance) as "phenotypes." Its goal is to change the passive "trial and error" model of traditional materials research and development by constructing a structure-property relationship database that connects "composition, process, structure, and properties," thereby enabling the rational design and efficient development of new materials.

Although MGE laid the foundation of data and high-throughput methods, it still faces formidable challenges in practice that have kept its full potential unrealized:

The dilemma of abundant data but scarce information: MGE generates massive amounts of high-dimensional data such as crystal structures, band diagrams, and stress-strain curves, but human scientists struggle to extract deep, nonlinear physical laws from them. For example, the subtle effects of trace-element doping on high-temperature creep performance are often buried in the noise of tens of thousands of data points, effectively invisible to the human eye.

The trade-off between computational cost and accuracy: High-throughput computation is fast but of limited accuracy (e.g., classical force fields); high-precision computation (e.g., DFT) is accurate but extremely expensive, potentially taking days or even weeks for a single complex system. Faced with a candidate space of hundreds of millions of materials, traditional computational methods alone cannot screen efficiently.

Strong at forward screening, weak at reverse design: MGE mainly "selects" within known or simple combinations of materials, i.e., "given this structure, compute its properties." For a requirement such as "a material that withstands 2000 ℃ with a density below 3 g/cm³," MGE lacks effective reverse-design capability and struggles to proactively create entirely new material structures.

The collision with AI effectively equips MGE with a "super brain" and an "autonomous driving system".

Value 1: AI acts as a pattern decoder, deciphering high-dimensional structure-function relationships.

AI can automatically extract deep physicochemical features from MGE's high-throughput computational and experimental data and build surrogate models with millisecond-level response. First-principles calculations that once took days are replaced by near-instant high-precision predictions, slashing screening costs and letting researchers evaluate millions of candidate materials quickly: a genuine leap from looking at data to understanding patterns.
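A hedged illustration of the surrogate idea: train a small model on a few thousand points labeled by a stand-in for DFT, then screen a million candidates in a fraction of a second. The descriptors and the property function here are synthetic; production systems use graph networks over crystal structures.

```python
# Surrogate-model screening: learn from scarce "DFT" labels, predict cheaply at scale.
import time
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X_train = rng.uniform(size=(3000, 10))
y_train = np.sin(X_train @ rng.normal(size=10))   # stand-in for a DFT-computed property

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=1)
surrogate.fit(X_train, y_train)

candidates = rng.uniform(size=(1_000_000, 10))
t0 = time.perf_counter()
# Predict in chunks to keep memory bounded; still milliseconds per batch vs. days per DFT run
scores = np.concatenate([surrogate.predict(chunk) for chunk in np.array_split(candidates, 20)])
print(f"screened 1,000,000 candidates in {time.perf_counter() - t0:.2f} s")
print("top candidate index:", int(np.argmax(scores)))
```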

Value 2: AI acts as a reverse designer, enabling on-demand customization.

The introduction of generative AI (such as diffusion models and variational autoencoders) has endowed MGE with powerful "reverse design" capabilities. Now, the R&D logic has been completely reversed: users only need to input target performance indicators (such as "band gap 1.5 eV, yield strength > 800 MPa"), and AI can directly "generate" entirely new crystal structures or molecular formulas that meet these conditions within a vast chemical space. This capability breaks the boundaries of human experience, making it possible to customize entirely new materials that have never been discovered before.
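A minimal sketch of the reverse-design logic. Production systems use diffusion models or variational autoencoders over crystal structures; the toy below keeps only the core inversion step, optimizing a design vector by gradient descent through a differentiable property predictor. The predictor is an untrained, frozen network standing in for one trained on real materials data, and the unit rescaling is arbitrary.

```python
# Reverse design in miniature: search design space against target specs.
import torch
import torch.nn as nn

torch.manual_seed(0)
predictor = nn.Sequential(nn.Linear(16, 64), nn.Tanh(), nn.Linear(64, 2))  # -> [band gap, strength]
for p in predictor.parameters():
    p.requires_grad = False                      # pretend it is already trained and frozen

target_gap, min_strength = 1.5, 800.0            # "band gap 1.5 eV, yield strength > 800 MPa"
design = torch.rand(1, 16, requires_grad=True)   # latent/composition vector to optimize
opt = torch.optim.Adam([design], lr=0.05)

for step in range(500):
    opt.zero_grad()
    gap, strength = predictor(design)[0]
    strength = strength * 1000                   # toy rescaling into MPa
    # Penalize distance from the target gap and any shortfall below the strength floor
    loss = (gap - target_gap) ** 2 + torch.relu(min_strength - strength)
    loss.backward()
    opt.step()

print("optimized design vector:", design.detach().numpy().round(3))
print(f"predicted gap {gap.item():.2f} eV, strength {strength.item():.0f} MPa")
```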

Value 3: AI acts as a closed-loop commander, building an autonomous evolutionary flywheel.

This is the crucial step in MGE's move toward intelligence. In traditional MGE processes, computation, experimentation, and data analysis are isolated from one another. By coupling with automated laboratories (lights-out labs), AI builds a closed-loop autonomous decision-making system that integrates dry and wet processes. AI is not only the designer but also the commander: it proposes hypotheses, directs robots to perform high-throughput synthesis and testing, receives experimental feedback in real time, and dynamically adjusts the next round of experiments using algorithms such as Bayesian optimization. This self-iterating "design-prediction-experimentation-learning" mechanism forms a constantly accelerating R&D flywheel: every failed or successful experiment becomes nourishment for the model, the system grows smarter with use, and the R&D cycle is compressed from "years" to "weeks."
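A minimal sketch of this flywheel: Bayesian optimization with a Gaussian-process surrogate and an expected-improvement acquisition rule. The "experiment" here is a toy function standing in for a robotic synthesis-and-test run; each result immediately reshapes the model that plans the next run.

```python
# Closed-loop Bayesian optimization: plan, run, learn, repeat.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def experiment(x):  # stand-in for automated synthesis + testing of one condition
    return float(np.exp(-((x - 0.7) ** 2) / 0.01) + rng.normal(0, 0.02))

X = list(rng.uniform(0, 1, 3))       # a few seed experiments
y = [experiment(x) for x in X]
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for round_ in range(15):
    gp.fit(np.array(X).reshape(-1, 1), np.array(y))
    grid = np.linspace(0, 1, 500).reshape(-1, 1)
    mu, sigma = gp.predict(grid, return_std=True)
    best = max(y)
    # Expected improvement: balance exploiting good regions and exploring uncertainty
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = float(grid[np.argmax(ei), 0])
    X.append(x_next)
    y.append(experiment(x_next))     # the robot runs it; the result feeds the next round

print(f"best condition found: x = {X[int(np.argmax(y))]:.3f}, value = {max(y):.3f}")
```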

Value 4: AI acts as a cross-scale connector, bridging the gap between the micro and macro levels.

A long-standing challenge for MGE is "scale discontinuity": quantum-mechanical calculations (microscopic) offer high precision but limited system size, while finite-element analysis (macroscopic) covers large scales but relies on empirical parameters. AI bridges this gap. Through machine-learned interatomic potentials, AI can simulate the dynamics of tens of thousands or even millions of atoms with near-quantum-mechanical precision, accurately passing the evolution of the microstructure up to mesoscopic and macroscopic models. This makes it possible to understand failure mechanisms at the atomic level and directly predict macroscopic lifetime and performance under real operating conditions, achieving full-scale digital twins and resolving the multi-scale modeling problem that has long plagued materials scientists.
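On the simulation side, a hedged sketch with ASE. The Lennard-Jones calculator below is a placeholder for a machine-learned interatomic potential (for example, a trained MACE or NequIP model), which would plug into the same calculator slot and deliver near-DFT accuracy at force-field cost; the parameter values are arbitrary.

```python
# Molecular dynamics on a small metal supercell; swap the calculator for an MLIP in real work.
from ase.build import bulk
from ase.calculators.lj import LennardJones
from ase.md.langevin import Langevin
from ase import units

atoms = bulk("Cu", "fcc", a=3.6).repeat((4, 4, 4))   # 256-atom copper supercell
atoms.calc = LennardJones(sigma=2.34, epsilon=0.4)   # placeholder potential; an MLIP goes here

# Langevin thermostat: evolve the microstructure at 800 K with a 2 fs timestep
dyn = Langevin(atoms, timestep=2 * units.fs, temperature_K=800, friction=0.02)

def report():
    epot = atoms.get_potential_energy() / len(atoms)
    print(f"E_pot = {epot:.3f} eV/atom")

dyn.attach(report, interval=50)   # log every 50 steps while the structure evolves
dyn.run(200)
```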

To put it more vividly: MGE provides the standardized "ingredients" and "recipe library" (data and high-throughput methods), while AI is the "master chef" who understands the ingredients and invents new dishes (discovering patterns and generating designs).

The vision is wonderful, but in reality, it faces multiple technical barriers, the most critical of which is insufficient data.

Fragmentation and Non-standardization:

Current materials data are scattered across laboratories worldwide, various computational software (such as VASP, Gaussian, and LAMMPS), and literature. Data from different sources exhibits inconsistent formats, missing metadata, and even inconsistent naming conventions. AI models are data-driven, and this heterogeneity makes building large-scale, universal databases extremely difficult.

The contradiction between small sample size and high-dimensional features:

Compared to the hundreds of millions of labeled data points in computer vision or natural language processing, high-quality experimental data in materials science often consists of only a few hundred to a few thousand data points. Faced with a high-dimensional space containing tens of thousands of element combinations and process parameters, existing AI models are prone to overfitting and struggle to achieve accurate predictions in data-scarce fields (such as novel high-temperature alloys and complex ceramics).

Missing negative data:

The scientific community tends to publish successful experimental results (positive data) while discarding large numbers of failure cases (negative data). Yet negative data is crucial for AI to learn "what doesn't work" and to delineate the boundaries of material stability. Its absence pushes AI models toward overly optimistic property predictions and false confidence.

In addition, there are issues such as non-equilibrium process modeling and lack of synthesis feasibility assessment, which pose significant challenges and opportunities for the industry.

Business Upgrade: From Selling Shovels to Building a Community of Interests Sharing Gold Mines

Under the dual pressure of technology and data, the traditional linear business model of "software licensing" or "project-based formula sales" is no longer sustainable. Faced with high R&D costs and uncertain delivery outcomes, no single company can bear all the risk alone. The business logic of AI + new materials is therefore being fundamentally inverted: value no longer accrues to whoever owns the technology, but to whoever can unite resources from all parties and solve the final problem.

This shift has given rise to a variety of innovative business paradigms characterized by "deep integration," four of which are illustrated below:

Community of Interests

Operating model: AI technology providers enter the market through a model of "technology investment + joint R&D + industrialization profit sharing," forming capital ties with end customers such as military and new energy companies.

Applicable material types: Strategic new materials, such as metamaterials and solid-state battery materials.

Core logic: Transform the client-vendor relationship into a "partner" relationship, jointly sharing R&D risks and market growth.

Turnkey Delivery

Operating Model: Drawing inspiration from the CRO/CDMO model in the biopharmaceutical field, the platform has spawned the CRDO (Contract Research and Design Organization) model in the materials field. By leveraging "AI + automated pilot production lines," the platform delivers to companies not just a string of molecular formulas, but a complete set of validated and directly deployable production line process parameters.

Applicable material types: Basic materials for large-scale chemical and fine chemical industries, such as specialty plastics and electronic chemicals.

Core logic: Shift from selling formulas to selling production capacity, lowering the barriers to transformation for traditional manufacturing, and charging based on delivery results.

Regional Empowerment

Operational Model: A technology operator possessing AI algorithms collaborates with a computing power company, with local government funding the construction of an intelligent computing center. The operator then builds an AI-powered new materials experimental platform to incubate or empower local industries.

Applicable material types: Materials related to local specialty industries, such as bio-based materials and industrial mold steel.

Core logic: "Computing power in exchange for industry." A regional innovation community yields a three-way win: the government gains tax revenue, the computing center gains stable utilization, and the technology operator profits through technology service fees and incubation equity returns.

Inclusive Empowerment

Operational Model: Standardized performance prediction tools are provided through a cloud-based SaaS platform (charged per usage), while localized private deployments are offered to leading enterprises. For example, small and medium-sized paint companies subscribe to cloud-based models to screen for environmentally friendly formulations, while large automakers customize exclusive corrosion protection material models.

Applicable material types: General-purpose functional materials, such as anti-corrosion coatings and 3D printing consumables.

Core logic: A dual-track system of public domain traffic acquisition + private domain cultivation ensures both broad technological coverage and builds competitive barriers through a data flywheel.

Although these models have different entry points, they are all essentially addressing the same core contradiction: the contradiction between the high barriers to entry and high investment in AI technology and the long cycle and high risk of the materials industry.

Therefore, to assess the feasibility and explosive potential of a new AI + new materials business model in the future, we might consider the degree to which the following four dimensions are met:

Profit mechanism: Shifting from one-off transactions to risk-sharing and profit-sharing, technology providers can successfully share in the long-term industrialization dividends by binding with customers.

Delivery model: Shifting from virtual data to physical capabilities, with deliverables upgraded to deterministic productivity such as producible processes and reusable platforms.

Resource organization: Shifting from individual efforts to ecosystem building, integrating government, capital, computing power, research and scenarios to create a win-win closed loop for all parties.

Data logic: Shifting from one-way acquisition to a closed-loop flywheel, business cooperation has become a moat for acquiring exclusive high-quality data and feeding back into model iteration.

Ultimate Barrier: Rejecting Single-Point Competition, Building a Four-Dimensional Composite Moat of "Data-Algorithm-Validation-Implementation"

In summary, the industry barriers for AI+new materials technology and service providers are not advantages in a single dimension, but rather a complex moat composed of data, algorithms, experimental verification, and industrial application capabilities.

The barriers to integrating "high-quality data" with "domain-specific algorithms"

Data dimension: This is not simply a compilation of publicly available literature data, but a cleaned, structured, and standardized proprietary high-fidelity dataset, including first-principles calculation data, experimental measurements, and failure data.

Algorithm dimension: Physics-driven algorithms that embed prior physical and chemical knowledge (symmetry, conservation laws, thermodynamic constraints) into deep-learning architectures, rather than pure data-fitting black boxes; a minimal sketch follows this list.

Its core competitive advantage lies in the leap in prediction accuracy and the compression of the search space.
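As a sketch of what "physics-driven" can mean in practice: rather than penalizing violations after the fact, build the symmetry into the architecture. Below, total energy is predicted as a sum of per-atom terms, making the model permutation-invariant and size-extensive by construction; the descriptors and data are toy stand-ins for real atomic environments.

```python
# Embedding a physical prior in the architecture: E_total = sum_i f(env_i).
import torch
import torch.nn as nn

class AtomicEnergyModel(nn.Module):
    """Invariance to atom ordering is built in, not fitted from data."""
    def __init__(self, n_features=8):
        super().__init__()
        self.per_atom = nn.Sequential(nn.Linear(n_features, 32), nn.SiLU(), nn.Linear(32, 1))

    def forward(self, envs):                  # envs: (n_atoms, n_features)
        return self.per_atom(envs).sum()      # summing enforces the symmetry exactly

model = AtomicEnergyModel()
envs = torch.rand(20, 8)                      # toy local-environment descriptors for 20 atoms
e1 = model(envs)
e2 = model(envs[torch.randperm(20)])          # shuffle atom order
print(f"energy invariant under permutation: {torch.allclose(e1, e2)}")  # True
```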

The experimental verification barrier of "dry-wet closed loop"

Hardware facilities: Equipped with a high-throughput automated experimental platform (robot cluster) to achieve 24/7 unmanned operation.

Software collaboration: Establish a full-chain digital closed loop of "AI design - robot synthesis - automated characterization - data feedback".

The core competitive advantage it provides lies in the exponential increase in R&D efficiency and the precipitous decrease in trial and error costs.

The industrialization barrier of interdisciplinary talent and engineering implementation

Talent density: The team comprises not just AI experts or materials scientists with a single background, but a cross-disciplinary team possessing combined capabilities in materials science, AI algorithms, and engineering. Members must have a deep understanding of materials mechanisms and be able to translate this understanding into algorithmic language.

Practical know-how: The engineering capability to translate laboratory formulations into industrial production processes, including a deep understanding of pilot-scale scale-up, production-line adaptation, and supply-chain management, rather than mere reliance on papers or software.

The core competitiveness it provides lies in its ability to bridge the "last mile" from theory to product and its extremely high team efficiency.

Systemic barriers to business model innovation and ecosystem building

Mechanism Design: It is not a simple buyer-seller relationship, but a distribution mechanism based on "deeply aligned interests and co-creation of value." Through joint R&D, revenue sharing, and data asset ownership confirmation, raw material suppliers, R&D companies, manufacturers, and users can all obtain excess benefits in the cooperation, thereby actively joining the ecosystem.

Platform Dimension: Building an "open collaborative innovation network" rather than a closed system. By defining standard interfaces, sharing infrastructure, and accumulating a foundation of industry data, we lower the entry barriers and trial-and-error costs for partners, creating a network effect where "the more participants, the higher the ecosystem value."

The core competitive advantage it offers lies in its strong customer loyalty and its self-reinforcing ecological moat.

Track Player Classification

Based on each company's value anchor in the industry chain, we roughly divide the players in this field into three major categories (excluding AI + drug development):

*Given the non-public nature of information in the primary market, our current classification of this sector is primarily based on publicly disclosed data and general industry understanding. While this approach provides a general market overview, it largely remains at the surface level of business models. In reality, the depth of each company's technology stack, business boundaries, and true core competitiveness are often more complex and dynamic than what is presented in publicly available information. Therefore, this classification serves more as a preliminary reference framework than a final assessment of each company's strength.

Platform and Tool Type

These companies typically don't manufacture materials themselves; instead, they develop AI algorithms, simulation software, or databases. They act as "super brains" for researchers, primarily charging through software licensing, subscription fees, or cloud services.

Their technical focus is the versatility of underlying algorithms, multi-scale simulation capability, and computational efficiency.

*Yunxiu Capital summarizes and organizes information based on publicly available data.

R&D Service and Agent Type

These companies use large AI models or automated laboratories to directly help clients solve specific R&D problems, such as finding catalysts and screening formulations.

Their technology centers on the "dry-wet closed loop" (AI design plus robotic experimentation) and large vertical models for specific domains; they charge by project or by results.

*Yunxiu Capital summarizes and organizes information based on publicly available data.

Vertical Product Type

This is the most direct business model: these companies not only use AI for research and development, but also build their own factories and sell materials directly.

Its technological advantage lies in having its own production lines and physical products, realizing a closed-loop chain from algorithm to physical product.

*Yunxiu Capital summarizes and organizes information based on publicly available data.

Conclusion

The wave of AI and new materials is by no means a fleeting technological frenzy, but a fundamental paradigm shift in how humanity perceives the material world. We are standing at the dawn of a new era: moving from Edison-style accidental trial and error to the inevitable realm of rational design.

The ultimate vision of this transformation is to build a new paradigm of material creation where "what you imagine is what you get." In this paradigm, AI is not only a tool, but also a bridge connecting atoms and bits; data is not only a record, but also the fuel driving innovation. When high-quality data, specialized algorithms, automated experiments, and interdisciplinary talent are integrated, we will unlock not only breakthroughs in single materials, but also an accelerated evolution of the entire industrial civilization.

The future is here; only change is constant. In this grand narrative of reshaping the material foundation, the true winners will be those pioneers who can establish a complete closed loop of "cognition-verification-implementation." They are not only developing new materials, but also defining the industrial rules and the cornerstone of civilization for the next century.

This article is from the WeChat official account "Yunxiu Capital" (ID: winsoulcapital) , authored by Yunxiu Capital, and published with authorization from 36Kr.
