Based on the perspectives of the a16z crypto team and invited contributors, this report focuses on key future trends in the crypto industry, covering core areas such as stablecoins, AI agents, and privacy and security.
Written by: Adeniyi Abiodun, Ali Yahya, Andrew Hall, Arianna Simpson, Christian Crowley, Daejun Park, Elizabeth Harkavy, Guy Wuollet, Jeremy Zhang, Justin Thaler, Maggie Hsu, Miles Jennings, Pyrs Carvolth, Robert Hackett, Sam Broner, Scott Duke Kominers, Sean Neville, Shane Mac, and Sonal Chokshi
Compiled by: Saoirse, Foresight News
This week, a16z released its annual "Key Insights" report, drawing on perspectives from its partners across the Apps, American Dynamism, Bio, Crypto, Growth, Infra, and Speedrun teams. Below are 17 observations on industry trends for 2026 from a16z partners in the crypto space (along with several invited contributors), covering AI agents, stablecoins, tokenization and finance, privacy and security, prediction markets, SNARKs and other applications, and closing with the direction of industry development.
On stablecoins, RWA tokenization, payments and finance
1. A higher quality and more flexible stablecoin deposit and withdrawal channel
Last year, stablecoins settled an estimated $46 trillion in transaction volume, continuously breaking records. To put this in perspective, that is more than 20 times PayPal's volume, nearly 3 times that of Visa, one of the world's largest payment networks, and is rapidly approaching the volume of the Automated Clearing House (ACH), the US electronic network that processes transactions such as direct deposits.
Today, sending a stablecoin takes less than a second and costs less than a cent. But the core issue that remains unresolved is how to integrate these "digital dollars" with the financial system people use every day—that is, the "deposit/withdrawal channels" for stablecoins.
A new generation of startups is filling this gap, driving the integration of stablecoins with more widespread payment systems and local currencies: some companies are using cryptographic proof-of-delivery technology to allow users to privately exchange their local currency balances for digital dollars; others are integrating regional networks to enable interbank transfers using features such as QR codes and real-time payment channels; still others are building truly interoperable global wallet layers and card issuance platforms, allowing users to directly use stablecoins at everyday merchants. These solutions collectively expand the reach of the digital dollar economy and may accelerate the adoption of stablecoins as a mainstream payment tool.
As payment gateways mature and digital dollars are directly integrated into local payment systems and merchant tools, new application scenarios will emerge: cross-border workers can receive payments in real time, merchants can receive global dollars without bank accounts, and applications can instantly settle value with global users. At that time, stablecoins will transform from a "niche financial tool" into a fundamental "internet settlement layer."
— Jeremy Zhang, a16z crypto engineering team
2. Reconstructing RWA tokenization and stablecoins with a "crypto-native mindset"
Banks, fintech companies, and asset managers have shown great interest in bringing traditional assets on-chain, including US stocks, commodities, indices, and other traditional assets. However, as more traditional assets come on-chain, tokenization often falls into a skeuomorphic trap: it merely replicates the existing form of real-world assets and fails to exploit crypto's native capabilities.
Synthetic derivatives such as perpetual futures not only offer deeper liquidity but are also easier to implement. Moreover, the leverage mechanics of perpetual contracts are easy to understand, making them, in my view, the crypto-native derivative with the strongest product-market fit. Emerging-market equities are also among the asset classes best suited to "perpetualization" (for some stocks, the zero-days-to-expiration (0DTE) options market is already more liquid than the spot market, so making them perpetual would be a highly valuable endeavor).
This is essentially a choice between "fully on-chain vs. tokenization," but in any case, we will see more "crypto-native" RWA tokenization solutions in 2026.
Similarly, stablecoins entered the mainstream in 2025, with outstanding supply continuing to grow; in 2026, the sector will shift from simple tokenization to innovative issuance models. Stablecoins that lack sound credit infrastructure resemble "narrow banks," holding only specific, highly secure liquid assets. The narrow-banking model has its merits, but in the long run it is unlikely to become a core pillar of the on-chain economy.
Several newer institutions, asset managers, and protocols have begun exploring on-chain asset-backed lending against off-chain collateral. However, these loans are typically originated off-chain first and then tokenized. I believe tokenization adds very little value in this model, serving only users already inside the on-chain ecosystem. Debt should instead be originated directly on-chain rather than originated off-chain and then tokenized: on-chain origination reduces loan-servicing and back-end architecture costs while improving accessibility. Compliance and standardization remain challenges, but developers are actively working to address them.
— Guy Wuollet, General Partner in the Crypto Space at a16z
3. Stablecoins drive upgrades to bank ledgers, unlocking new payment scenarios
The software most banks run today is almost unrecognizable to modern developers. In the 1960s and 70s, banks were early adopters of large-scale software systems; in the 1980s and 90s, second-generation core banking software emerged (such as Temenos' GLOBUS and Infosys' Finacle). But this software has gradually become outdated, and it updates extremely slowly: today, banking, and especially the core ledger (the critical database that records deposits, collateral, and other liabilities), still often runs on mainframes, is written in COBOL, and relies on batch file interfaces rather than APIs.
The vast majority of global assets are stored in these "core ledgers with decades of history." Although these systems have been proven in practice, have gained regulatory approval, and are deeply integrated into complex banking scenarios, they also severely hinder innovation: adding key functions such as real-time payments (RTP) may take months or even years, and requires dealing with layers of technical debt and regulatory complexity.
This is precisely where stablecoins add value: over the past few years they not only achieved product-market fit and entered the mainstream, but by 2025 traditional finance (TradFi) institutions had fully embraced them. Stablecoins, tokenized deposits, tokenized treasuries, and on-chain bonds let banks, fintech companies, and financial institutions build new products and serve new customers, and, crucially, without forcing those institutions to rebuild legacy systems that are aging but have been stable for decades. Stablecoins give financial institutions a low-risk path to innovation.
— Sam Broner
4. The Internet will become the "next generation of banks"
With the widespread adoption of AI agents, more business activities will be completed automatically in the background (rather than relying on user clicks), which means that the way value (money) is transferred must change accordingly.
In a world where systems act on intent (rather than on instructions)—for example, where AI agents automatically transfer funds after recognizing needs, fulfilling obligations, or triggering results—value transfers must possess "the same speed and freedom as current information transfers." Blockchain, smart contracts, and new protocols are key to achieving this goal.
Today, smart contracts can complete global dollar payments in seconds. By 2026, emerging foundational protocols such as x402 will enable programmable, responsive settlement: agents can pay for data, GPU compute, or API calls instantly and permissionlessly, with no invoicing, reconciliation, or batch processing; developers can ship software updates with built-in payment rules, limits, and audit trails, without fiat integrations, merchant onboarding, or reliance on banks; and prediction markets can settle automatically in real time as events unfold, with odds updates, agent trades, and global payouts completed in seconds, without custodians or exchanges.
When value can flow in this way, the "payment process" will no longer be an independent operational layer, but will become a "network behavior": banks will be integrated into the internet infrastructure, and assets will become infrastructure. If money can flow like "routable data packets on the internet," the internet will no longer "support the financial system," but will "become the financial system itself."
— Christian Crowley, Pyrs Carvolth, a16z Crypto Market Development Team
5. Wealth management services accessible to everyone
Traditionally, personalized wealth management has been reserved for banks' high-net-worth clients: customized advice and portfolio adjustments across asset classes are costly and operationally complex. But as more asset classes are tokenized, crypto rails allow personalized, AI-advised strategies to be executed instantly and rebalanced at low cost.
This goes far beyond robo-advisors: everyone can now access active portfolio management, not just passive management. In 2025, traditional financial institutions increased crypto allocations in client portfolios (banks recommended allocating 2%-5%, directly or through exchange-traded products (ETPs)), but this is just the beginning. In 2026, we will see the rise of platforms aimed at wealth accumulation rather than just wealth preservation; fintech companies like Revolut and Robinhood, as well as centralized exchanges like Coinbase, will leverage their technological advantages to seize this market.
Meanwhile, DeFi tools such as Morpho Vaults can automatically allocate assets to lending markets with the "most risk-adjusted returns," providing a "core return allocation" for the investment portfolio. Holding idle liquidity in stablecoins (rather than fiat currency) and in tokenized money market funds (rather than traditional money market funds) can further expand the potential for returns.
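A toy Python sketch of the "allocate idle balances to the best risk-adjusted yield" idea above. The markets, numbers, and the single-factor risk haircut are all invented for illustration; real vaults such as Morpho's use much richer on-chain risk parameters and split allocations across markets.

```python
# Illustrative only: pick the lending market with the best risk-adjusted
# APY for an idle stablecoin balance. Markets and the risk model are made up.
from dataclasses import dataclass

@dataclass
class LendingMarket:
    name: str
    apy: float              # advertised supply APY
    risk_penalty: float     # haircut for collateral/liquidity risk, 0..1

    def risk_adjusted_apy(self) -> float:
        return self.apy * (1.0 - self.risk_penalty)

def allocate(idle_usd: float, markets: list[LendingMarket]) -> tuple[str, float]:
    """Route the whole idle balance to the best risk-adjusted market."""
    best = max(markets, key=lambda m: m.risk_adjusted_apy())
    return best.name, idle_usd

markets = [
    LendingMarket("market-a", apy=0.09, risk_penalty=0.50),  # 4.5% adjusted
    LendingMarket("market-b", apy=0.06, risk_penalty=0.10),  # 5.4% adjusted
]
target, amount = allocate(10_000.0, markets)
```

Note that the nominally higher-yield market loses once risk is priced in; that ordering flip is the entire point of "risk-adjusted" allocation.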
Finally, tokenization, while meeting compliance and reporting requirements, also makes it easier for retail investors to access illiquid private market assets (such as private credit, pre-IPO corporate equity, and private equity). Once all assets in a balanced portfolio (from bonds to stocks, private equity, and alternative assets) are tokenized, rebalancing can be completed automatically without wire transfers.
— Maggie Hsu, a16z Crypto Market Development Team
On agents and AI
6. From KYC to KYA
The bottleneck of the agent economy is shifting from intelligence to identity.
In financial services, non-human identities (such as AI agents) already outnumber human employees 96 to 1, yet these identities remain ghosts that cannot access the banking system. The core missing primitive is KYA: Know Your Agent.
Just as humans need credit scores to obtain loans, agents need cryptographically signed credentials to transact, credentials linked to the agent's principal, its constraints, and its accountability. Until this is solved, businesses will keep blocking agents at the firewall. An industry that spent decades building KYC infrastructure now needs to solve KYA in months.
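A minimal sketch of what such a credential could look like: a signed statement binding an agent to a principal, a constraint, and an expiry. Every field name here is illustrative rather than a standard, and HMAC stands in for a real digital signature scheme such as Ed25519 (kept to the standard library so the sketch is self-contained).

```python
# Toy "Know Your Agent" credential: the issuer signs claims that link the
# agent to its principal (accountability) and a spend limit (constraint).
# HMAC is a stand-in for a real signature; field names are hypothetical.
import hmac, hashlib, json, time

ISSUER_KEY = b"demo-issuer-secret"   # in reality: the issuer's private key

def issue_credential(agent_id: str, principal: str, spend_limit_usd: float,
                     ttl_seconds: int) -> dict:
    claims = {
        "agent": agent_id,
        "principal": principal,            # who is accountable for the agent
        "max_spend_usd": spend_limit_usd,  # constraint enforced by verifiers
        "expires_at": time.time() + ttl_seconds,
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_credential(cred: dict, spend_usd: float) -> bool:
    payload = json.dumps(cred["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cred["sig"]):
        return False                       # forged or tampered credential
    c = cred["claims"]
    return time.time() < c["expires_at"] and spend_usd <= c["max_spend_usd"]

cred = issue_credential("agent-7", "acme-corp", spend_limit_usd=500.0,
                        ttl_seconds=3600)
ok_small = verify_credential(cred, spend_usd=100.0)   # within constraints
ok_large = verify_credential(cred, spend_usd=9000.0)  # exceeds limit
```

The verifier rejects on either a bad signature or a violated constraint, which is the article's point: the credential encodes who answers for the agent and what it may do, not just that it exists.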
— Sean Neville, co-founder of Circle, USDC architect, and CEO of Catena Labs
7. AI will empower "substantive research tasks"
As a mathematical economist, in January 2025, I still struggled to get consumer-grade AI models to understand my workflow; but by November, I could send abstract tasks to AI models as if giving instructions to doctoral students—and sometimes they would even return "innovative and correctly executed" results. Beyond my personal experience, the application of AI in research is becoming increasingly widespread, especially in the "reasoning domain": AI not only directly assists in discovery but can also "autonomously solve Putnam problems" (considered the world's most difficult university-level mathematics exams).
What remains to be explored is which fields these research aids are most valuable in, and exactly how they should be applied. I anticipate, however, that AI will foster and reward a new model of polymath research, one that prizes the ability to speculate about connections between ideas and to reason quickly from highly speculative answers. Those answers may not be accurate, but they point in the right direction (at least within a specific logical framework). Ironically, this is akin to harnessing the power of model hallucination: when a model is intelligent enough, giving it room for abstract exploration may produce nonsense, but it may also yield crucial discoveries, just as humans are most creative in a nonlinear state without a specific goal.
To achieve this reasoning model, a "new AI workflow" needs to be built—not only "interaction between intelligent agents," but also "nested intelligent agents": multi-layered models assist researchers in evaluating "methods of preceding models," gradually filtering effective information and eliminating invalid content. I have used this method to write papers, while others have used it for patent searches, creating new art, and even (unfortunately) discovering new ways to attack smart contracts.
Note, however, that running a cluster of nested reasoning agents to support research requires solving two key problems, interoperability between models and identifying and fairly compensating each model's contribution, and crypto can provide solutions to both.
— Scott Duke Kominers, member of the a16z crypto research team and professor at Harvard Business School
8. The "Hidden Tax" of Open Networks
The rise of AI agents is imposing a "hidden tax" on the open internet, fundamentally undermining its economic foundation. This damage stems from the increasing misalignment between the internet's "context layer" and "execution layer": currently, AI agents extract data from "ad-supported websites" (context layer), providing convenience to users while systematically bypassing "revenue sources supporting content creation" (such as advertising and subscriptions).
To avoid the decline of the open network (while protecting the diverse content that "fuels AI"), a large-scale deployment of "technology + economics" solutions is needed, such as "next-generation sponsored content," "micro-attribution systems," or other new funding models. Existing AI licensing agreements are essentially "financially unsustainable stopgap measures"—compensation for content providers is often only a fraction of the revenue they lose due to AI diverting traffic.
Open networks require a new techno-economic model where value flows automatically. A key shift in 2026 is from static authorization to real-time, pay-as-you-go pricing. This means testing and scaling a blockchain-based micropayment system with precise attribution criteria—automatically rewarding all stakeholders who contribute to an agent's task completion.
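The "precise attribution" step above reduces, at its simplest, to splitting one agent micropayment pro-rata across every source that contributed to the task. The sketch below shows only that accounting step; the attribution weights themselves (here invented) are the hard part a real system must produce.

```python
# Sketch of pay-as-you-go attribution: split one micropayment across all
# sources that contributed to an agent's task, pro-rata by weight. The
# weights and source names are invented for illustration.

def split_payment(total_usd: float, weights: dict[str, float]) -> dict[str, float]:
    """Return each contributor's share of a micropayment, pro-rata."""
    norm = sum(weights.values())
    return {src: round(total_usd * w / norm, 6) for src, w in weights.items()}

# One cent paid for an answer that drew on three sources:
payouts = split_payment(0.01, {"news-site": 3.0, "blog": 1.0, "docs": 1.0})
# news-site contributed 3/5 of the context, the others 1/5 each
```

In the model the author describes, this settlement would run automatically on-chain per request, replacing static licensing deals with real-time, usage-based flows.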
— Elizabeth Harkavy, member of the a16z crypto investment team
On privacy and security
9. Privacy will become the most important moat in the crypto space
Privacy is a key prerequisite for "global finance on the blockchain," but almost all blockchains currently lack this feature—for most chains, privacy is merely an "extra consideration."
Today, privacy capabilities alone are enough to make a blockchain stand out from its competitors. More importantly, privacy can create chain-level lock-in, what might be called a privacy network effect, especially at a time when competing on performance alone is no longer sufficient.
Thanks to cross-chain bridge protocols, migration between different chains is very easy as long as the data is public; however, the situation is completely different when privacy is involved: "Transferring tokens across chains is easy, but transferring secrets across chains is difficult." When entering or leaving the "privacy zone," observers of the chain, mempool, or network traffic may be able to identify the user; and transferring assets "between a privacy chain and a public chain" or "even between two privacy chains" will leak metadata such as transaction time and amount correlation, increasing the risk of users being tracked.
Currently, many "undifferentiated new blockchains" are experiencing transaction fees approaching zero due to competition (the on-chain space is essentially homogenized); while blockchains with privacy capabilities can build stronger "network effects." The reality is: if a "general-purpose blockchain" does not have a thriving ecosystem, killer applications, or unique distribution advantages, users and developers have no reason to choose it, build on it, let alone develop loyalty.
On public blockchains, users can easily transact with users on other blockchains, making the choice of which blockchain to use irrelevant. However, on privacy blockchains, the choice of which blockchain to use is crucial. Once users join a privacy blockchain, they may be reluctant to migrate due to concerns about exposing their identities, leading to a "winner-takes-all" scenario. Since privacy is a fundamental requirement in most real-world scenarios, a few privacy blockchains may dominate the crypto space.
— Ali Yahya, General Partner, a16z crypto
10. The (Near) Future of Instant Messaging: Not Only Quantum-Resistant, But Also Decentralized
As the world prepares for the quantum computing era, encrypted messaging apps from Apple, Signal, and WhatsApp have taken the lead and achieved significant results. The problem, however, is that all mainstream communication tools rely on private servers operated by a single entity, servers that are vulnerable to government intervention to shut down, implant backdoors, or forcibly obtain private data.
If a country can shut down the servers, if companies hold the server keys, or indeed own the servers outright, what is the point of quantum-resistant encryption? Private servers ask users to "trust me"; no private servers means "you don't need to trust me." Communication needs no intermediary (a single company), only open protocols that require trust in no one.
The path to achieving this goal is "network decentralization": no private servers, no single application, fully open-source code, and the use of "top-level encryption technology" (including resistance to quantum threats). In an open network, no individual, company, non-profit organization, or country can deprive people of their right to communicate—even if a country or company shuts down an application, 500 new versions will appear the next day; even if a node is shut down, the economic incentives provided by technologies such as blockchain will allow new nodes to immediately fill the gap.
When people "control their messages with keys" (like controlling funds), everything will change: applications may iterate, but users will always control their messages and identities—even if they no longer use an application, ownership of the messages will remain with the user.
This is not just about "quantum resistance" and "encryption," but also about "ownership" and "decentralization." Without these two, what we build is nothing more than "unbreakable encryption that can be shut down at any time."
— Shane Mac, Co-founder and CEO of XMTP Labs
11. "Secret as a Service"
Behind every model, agent, and automated system lies a simple foundation: data. Yet most data pipelines today, whether feeding into or out of models, are opaque, tamperable, and unauditable. This may not matter much for some consumer applications, but many industries and users, including finance and healthcare, demand privacy protection for sensitive data; it is also a major obstacle to institutional efforts to tokenize real-world assets.
So, how can we achieve secure, compliant, autonomous, and globally interoperable innovation while protecting privacy? There are many solutions, but here I focus on "data access control": Who controls sensitive data? How does data flow? Who (or what entity) has the right to access the data?
Without data access control mechanisms, any entity currently seeking to protect data confidentiality must either rely on centralized services or build a customized system—a process that is not only time-consuming, labor-intensive, and costly, but also hinders traditional financial institutions and other entities from fully leveraging the functions and advantages of on-chain data management. Furthermore, as intelligent agent systems begin to autonomously browse information, complete transactions, and make decisions, users and institutions across industries require "encryption-level security," rather than "best-effort trust commitments."
This is why I believe we need "Secrets-as-a-Service": leveraging new technologies to enable programmable native data access rules, client-side encryption, and decentralized key management—clearly defining who can decrypt which data under what conditions and for how long, with all rules enforced on-chain. Combined with verifiable data systems, "data confidentiality protection" will become part of the internet's fundamental public infrastructure, rather than an afterthought patch at the application level, truly making privacy a core infrastructure.
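A minimal sketch of the rule layer described above: grants that specify who may decrypt which data, under what condition, and for how long. Everything here (the AccessGrant fields, the PolicyEngine class) is hypothetical; a real Secrets-as-a-Service system would enforce these rules on-chain and pair them with client-side encryption and decentralized key management, neither of which is shown.

```python
# Toy policy engine for programmable data-access rules: each grant binds a
# grantee, a dataset, a purpose, and an expiry. Names are illustrative.
import time
from dataclasses import dataclass

@dataclass
class AccessGrant:
    grantee: str        # who may decrypt
    dataset: str        # which data
    purpose: str        # under what condition
    expires_at: float   # for how long

class PolicyEngine:
    def __init__(self):
        self.grants: list[AccessGrant] = []

    def grant(self, g: AccessGrant):
        self.grants.append(g)

    def may_decrypt(self, who: str, dataset: str, purpose: str) -> bool:
        """A decrypt request succeeds only if a live, matching grant exists."""
        now = time.time()
        return any(g.grantee == who and g.dataset == dataset
                   and g.purpose == purpose and now < g.expires_at
                   for g in self.grants)

engine = PolicyEngine()
engine.grant(AccessGrant("auditor-1", "loans-2026", "audit", time.time() + 3600))

allowed = engine.may_decrypt("auditor-1", "loans-2026", "audit")   # matching grant
denied  = engine.may_decrypt("auditor-1", "loans-2026", "resale")  # wrong purpose
```

The purpose field is what distinguishes this from plain access control: the same party can be allowed to decrypt for one condition and refused for another, which is the compliance property institutions ask for.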
— Adeniyi Abiodun, Chief Product Officer and Co-founder of Mysten Labs
12. From "Code is Law" to "Rules are Law"
Recent DeFi hacks have targeted protocols that have been tested and proven over time, have strong teams, rigorous auditing processes, and have been running stably for many years. These incidents reveal a disturbing reality: current mainstream security practices largely remain at the level of "judgment based on experience" and "case-by-case handling."
To drive DeFi security towards maturity, two major shifts are needed: from "patching vulnerability patterns" to "protecting design-level attributes," and from "best-effort protection" to "principles-based systemic protection." This can be approached from two aspects:
In the static, pre-deployment phase (testing, auditing, formal verification): systematically prove global invariants (the core rules the entire system must always satisfy), rather than just verifying hand-picked local rules. Several teams are developing AI-assisted proof tools that help write specifications, propose candidate invariants, and dramatically reduce the proof-engineering work that previously had to be done by hand, work that used to be extremely costly and hard to scale.
In the dynamic, post-deployment phase (runtime monitoring, runtime enforcement, and so on), those invariants can be turned into real-time guardrails as a last line of defense: they are encoded directly as runtime assertions, and every transaction must satisfy them to execute.
In this way, we no longer need to assume that "all vulnerabilities have been fixed," but instead enforce key security attributes through the code itself—any transaction that violates these attributes will be automatically rejected.
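As a concrete illustration of a runtime assertion, here is a simplified automated-market-maker pool that enforces one global invariant (with fees, the constant product k may only grow) on every state transition and rejects any transaction that would violate it. The pool model is deliberately minimal and not any specific protocol's implementation.

```python
# Runtime-enforced invariant, in the spirit described above: every state
# transition must preserve the pool invariant x*y >= k_before, or it is
# rejected. Simplified toy AMM, not a real protocol.

class InvariantViolation(Exception):
    pass

class Pool:
    def __init__(self, reserve_x: float, reserve_y: float):
        self.x, self.y = reserve_x, reserve_y

    def execute(self, dx: float, dy: float):
        """Apply a trade (dx, dy are signed reserve changes) only if the
        invariant still holds afterwards; otherwise reject the transaction."""
        k_before = self.x * self.y
        new_x, new_y = self.x + dx, self.y + dy
        if new_x <= 0 or new_y <= 0 or new_x * new_y < k_before:
            raise InvariantViolation("transaction violates pool invariant")
        self.x, self.y = new_x, new_y       # commit only on success

pool = Pool(1000.0, 1000.0)
pool.execute(+100.0, -90.0)        # fair trade: k grows, accepted
try:
    pool.execute(-500.0, +10.0)    # drains reserves: k shrinks
    drained = True
except InvariantViolation:
    drained = False                # the exploit-shaped trade is rejected
```

The second trade is exactly the shape of a drain exploit; the check rejects it without knowing anything about the attacker's method, which is the shift from patching vulnerability patterns to protecting design-level properties.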
This is not just theoretical speculation. In fact, almost all hacker attacks to date would have triggered such security checks during execution, which could have prevented the attacks outright. The once popular notion of "code is law" is therefore evolving into "rules are law": even when mounting novel attacks, attackers must still respect the core security properties that preserve system integrity, and whatever attacks remain will either have minimal impact or be extremely difficult to pull off.
— Daejun Park, a16z crypto engineering team
On other industries and applications
13. Prediction markets: larger scale, wider coverage, and greater intelligence
Prediction markets have entered the mainstream. In 2026, deep integration with crypto and AI will further expand their scale, broaden their coverage, and raise their intelligence, while also posing important new challenges that developers urgently need to address.
First, prediction markets will launch more contracts. This means we'll not only have access to real-time odds on major elections and geopolitical events, but also odds on outcomes across various sub-sectors and complex, overlapping events. As these new contracts continue to release information and integrate into the news ecosystem (a trend already evident), society will face important questions: How do we balance the value of this information? How can we improve the transparency and auditability of prediction markets through optimized design (this can be achieved using encryption technology)?
To address the significant increase in the number of contracts, a new consensus mechanism is needed to complete contract settlement. While centralized platform settlement (confirming whether an event actually occurred and how to verify it) is important, its limitations have been exposed by controversial cases such as the "Zelensky litigation market" and the "Venezuelan election market." To resolve these marginal cases and promote the expansion of prediction markets into more practical scenarios, new decentralized governance mechanisms and Large Language Model (LLM) oracles can assist in determining the authenticity of disputed results.
Beyond LLM oracles, AI is opening up even more possibilities for prediction markets. For example, AI agents trading on prediction platforms can extensively collect various signals to gain short-term trading advantages, thus providing new insights into understanding the world and predicting future trends (projects like Prophet Arena have already demonstrated the potential in this area). These agents can not only serve as "advanced political analysts" providing insights, but by analyzing their autonomously formed strategies, they can also help us discover the core factors influencing complex social events.
Will prediction markets replace polls? The answer is no. On the contrary, they can improve the quality of polls (polling information can also be integrated into prediction markets). As a political scientist, I most look forward to the synergistic development of prediction markets and a "rich and vibrant polling ecosystem"—but this requires the reliance on new technologies: AI can optimize the survey experience; encryption technology can provide new ways to prove that poll respondents are real humans and not bots, etc.
— Andrew Hall, Crypto Research Advisor at a16z, Professor of Political Economy at Stanford University
14. The Rise of Staked Media
Traditional media models tout "objectivity," but their drawbacks have long been apparent. The internet has given everyone a voice, and today, more and more practitioners, builders, and stakeholders are directly conveying their views to the public—their perspectives reflecting their own "interests" in the world. Ironically, audiences respect them not "despite their interests," but "precisely because they have interests."
The new development in this trend is not the rise of social media, but rather the emergence of cryptographic tools—tools that allow people to make publicly verifiable commitments. As AI drastically reduces the cost and simplifies the process of generating massive amounts of content (generating content from any perspective and from any identity—real or not), relying solely on human (or bot) statements is no longer convincing. Tokenized assets, programmable lock-up periods, prediction markets, and on-chain historical records provide a more solid foundation for trust: commentators can prove their consistency between words and actions (backing their opinions with funds); podcasters can lock up tokens to prove they won't manipulate the market or engage in speculative maneuvers; and analysts can link their predictions to publicly settled markets, creating auditable performance records.
This is precisely the early form of what I call "staked media": media that not only endorses the concept of "stakeholder" but also provides concrete evidence. In this model, credibility comes neither from "pretending to be neutral" nor from "unfounded claims," but from "transparent and verifiable commitments of interest." Staked media will not replace other media forms but rather complement the existing media ecosystem. It sends a new signal: it's no longer "Believe me, I am neutral," but rather "This is the risk I'm willing to take, and this is how you can verify what I'm saying."
— Robert Hackett, a16z crypto editorial team
15. Cryptography provides new foundational components beyond the blockchain
For years, SNARKs—a cryptographic proof technique that verifies computation results without re-performing the computation—have been largely confined to applications within the blockchain field. The primary reason is their "excessive cost": the amount of work required to generate a computational proof could be a million times greater than directly performing the computation. This technology is only valuable in scenarios where the cost can be amortized across thousands of verification nodes (such as blockchain); otherwise, it is impractical.
But this is about to change. By 2026, zero-knowledge virtual machine (zkVM) prover overhead will drop to roughly 10,000x (meaning generating a proof takes about 10,000 times the work of running the computation directly), with a memory footprint of only a few hundred megabytes: fast enough to run on a phone and cheap enough for widespread adoption. One reason 10,000x may be a critical threshold is that the parallel throughput of a high-end GPU is roughly 10,000 times that of a laptop CPU. By the end of 2026, a single GPU will be able to prove CPU execution in real time.
This will realize the vision proposed in older research papers: "verifiable cloud computing." If you need to run CPU workloads in the cloud due to reasons such as "insufficient computational power for GPU processing," "lack of relevant technical capabilities," or "legacy system limitations," you will be able to obtain "cryptographic proofs of computational correctness" at a reasonable additional cost. The prover is already GPU-optimized, and your code will be usable without additional adaptation.
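The arithmetic behind the "critical threshold" claim is worth making explicit: if proving costs ~10,000x the underlying computation, and one GPU offers ~10,000x a laptop CPU's parallel throughput, the two factors cancel and a single GPU keeps pace with the CPU it is proving. Both constants below are just the rough figures quoted in the text.

```python
# Back-of-the-envelope check of the real-time-proving argument above.
# Both constants are the approximate figures from the surrounding text.

PROVER_OVERHEAD = 10_000   # units of proving work per unit of CPU work
GPU_SPEEDUP     = 10_000   # GPU parallel throughput vs. one laptop CPU

def proving_lag(cpu_seconds: float) -> float:
    """GPU-seconds needed to prove cpu_seconds of CPU execution."""
    return cpu_seconds * PROVER_OVERHEAD / GPU_SPEEDUP

lag = proving_lag(60.0)    # one minute of CPU work -> about one GPU-minute
```

When the ratio is 1:1 as here, proving can stream alongside execution instead of lagging arbitrarily behind it, which is what "real time" means in this context.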
— Justin Thaler, member of the a16z crypto research team and Associate Professor of Computer Science at Georgetown University
On industry development
16. Trading: a waypoint for crypto businesses, not a destination
Today, aside from stablecoins and some core infrastructure companies, almost all high-performing crypto companies have either shifted to trading or are in the process of transforming into trading businesses. But what will happen if "all crypto companies become trading platforms"? A large number of companies crowding into the same sector will not only distract users but also lead to a situation where "a few giants monopolize the market, and most companies are eliminated." This means that companies that shift to trading too quickly will miss the opportunity to build a "more competitive and sustainable business model."
I fully understand the founders' initial motivation to achieve business profitability, but "pursuing short-term product-market fit" also comes at a price. This issue is particularly prominent in the crypto space: the unique dynamics of token characteristics and speculative attributes can easily lead founders to choose the path of "instant gratification" in the process of "finding product-market fit"—which is essentially similar to the "marshmallow experiment" (testing the ability to delay gratification).
The transaction business itself is not problematic; it is an important market function, but it should not be the "ultimate goal" of a company. Founders who focus on the "essence of the product in product-market fit" are more likely to become industry winners in the end.
— Arianna Simpson, General Partner, a16z Crypto
17. Unleashing the full potential of blockchain: When the legal and technical architectures finally align
One of the biggest obstacles to building blockchain networks in the US over the past decade has been legal uncertainty. Expansive readings of securities law and inconsistent enforcement have forced founders into a regulatory posture of designing for the business, not the network. For years, avoiding legal risk has trumped product strategy, and engineers have taken a back seat to lawyers.
This situation has led to numerous distortions: founders are advised to avoid transparency; token distribution becomes arbitrary in legal terms; governance degenerates into a mere formality; organizational structures are "primarily designed to circumvent legal risks"; and token designs deliberately "avoid carrying economic value" or "don't include a business model." Worse still, crypto projects that "disregard the rules and operate in gray areas" often develop faster than those built by "honest and compliant" developers.
But now, the US government is closer than ever to passing crypto market structure legislation, which is expected to eliminate all of the distortions above by 2026. If passed, it will incentivize transparency, establish clear standards, and replace arbitrary enforcement with clear, structured pathways for fundraising, token issuance, and decentralization. The passage of the GENIUS Act previously drove a significant increase in stablecoin issuance; market-structure legislation will bring about even greater change, this time centered on blockchain networks.
In other words, this type of regulation will allow blockchain networks to "truly operate as a network": open, autonomous, composable, trustworthy, neutral, and decentralized.
— Miles Jennings, Member of the Crypto Policy Team and General Counsel at a16z