In-depth conversation with Silicon Valley's "Dark Prophet" Peter Thiel: Humanity has entered "technological stagnation"

Bitpush
07-21

By Ross Douthat

Compiled by: Liam

Source: Carbon Chain Value

Original title: Peter Thiel's understanding of the next stage of the world


Editor’s Note:

On June 26, Ross Douthat, the American columnist and host of Interesting Times, had a 60-minute, hardcore conversation with Peter Thiel, a veteran of the tech-industry right. The topics ranged across the stagnation thesis, the Antichrist, Mars colonization, cryptocurrency, artificial intelligence, transhumanism, seasteading, and other frontier questions for humanity.

Peter Thiel spends most of his time behind the scenes, but when he does give interviews he tends to discuss "hardcore" topics that stand the test of time. So I decided to compile and organize this conversation, which has been sitting for a while, and share it with readers. I hope it brings a sense of freshness and inspiration.

Peter Thiel is closely associated with a handful of labels: the "Silicon Valley Mafia," the right wing, the stagnation thesis, the Antichrist, the earliest Silicon Valley tech investor to back Trump against all odds, and an early patron of JD Vance. Recently the media even revealed that his investment firm Founders Fund had presciently invested in Bitmine Immersion before it built a treasury company around Ethereum, taking a 9% stake.

Peter Thiel says he broadly believes in the "stagnation thesis": the more than 200 years from 1750 to 1970 were a period of accelerating change. We kept accelerating: faster ships, faster railways, faster cars, faster airplanes. The acceleration peaked with the Concorde and the Apollo moon landings. Since then, things have slowed in virtually every dimension. But Thiel did not offer a specific cause for this relative stagnation. His thinking recalls the view Robert Gordon expressed in his famous work "The Rise and Fall of American Growth": "Progress continued after 1970, but it was mainly concentrated in entertainment, communications, and information technology. Progress in these fields was not as fruitful or spectacular as the by-products of the great inventions, nor did it burst out as suddenly; instead, change was evolutionary and continuous."

Robert Gordon attributes the relative slowdown to four headwinds: inequality, education, demographics, and debt repayment, which have slowed US economic growth and reduced the growth of real disposable income for the bottom 99% of the income distribution to near zero.

On the religious topic of the Antichrist: Peter Thiel said the atheists' framing is "One World or None," the title of a short film released by the Federation of American Scientists in the late 1940s. The Christian framing is, to some extent, the same question: Antichrist or the end of the world? Either the one-world rule of the Antichrist, or we sleepwalk toward Armageddon. "One World or None" and "Antichrist or the end of the world" are, in a sense, the same question.

How would the Antichrist rule the world? He would give some demonic, hypnotic speech, and people would fall for it: a devil ex machina.

The following is an edited transcript of the latest episode of Interesting Times. If you want to hear the podcast, search for "Peter Thiel and the Antichrist" and you should find the original episode.

Ross Douthat: Is Silicon Valley too reckless? Should we worry more about the end of the world or stagnation? Why is one of the world's most successful investors worried about the Antichrist?

Our guest is the co-founder of PayPal and Palantir, and an early investor in the political careers of Donald Trump and J.D. Vance. Peter Thiel is a veteran of the tech right, known for funding all sorts of conservative and anti-establishment ideas. But we're going to discuss his own views, because quite apart from his billions, there's a good case that he's the most influential right-wing intellectual of the past 20 years.

Peter Thiel, welcome to Interesting Times.

Peter Thiel: Thank you for the invitation.

Douthat: I want to start about 13 or 14 years ago. You wrote an article for the conservative magazine National Review called "The End of the Future." Its core claim was that the supposedly dynamic, fast-changing modern world is not nearly as dynamic as people think: we have actually entered an era of technological stagnation. Digital life is a genuine breakthrough, but it falls far short of what people expected, and the world has largely come to a standstill.

Peter Thiel: Yes.

Douthat: You’re not the only one to make this argument, but yours has special weight because you’re a Silicon Valley insider who made a fortune in the digital revolution.

So I'm curious: in 2025, does that diagnosis still hold?

Peter Thiel: Yes. I still fundamentally believe in the stagnation thesis. It was never an absolute claim. The argument is not that we are completely stagnant; it is that in some respects the rate of change has slowed.

It's not total stagnation, but the period from 1750 to 1970, more than 200 years, was one of accelerating change. We kept accelerating: faster ships, faster railways, faster cars, faster planes. This culminated in the Concorde and the Apollo moon landings. Since then, things have slowed down in every dimension.

I always make an exception for the digital world (bits), so we have computers, software, the internet, and mobile internet. The emergence of the cryptocurrency and AI revolutions over the last 10 to 15 years, I think, is pretty significant in some senses. But the question is: is it enough to really get out of this general sense of stagnation?

To get back to the future, we can start with an epistemological question: how do we even know whether we are stagnating or accelerating? One feature of late modernity is that people are hyper-specialized. Can you really judge whether we are making progress in physics unless you've spent half your life on string theory? Or in quantum computing? Or in cancer research and biotech and all these other verticals? How do you weigh progress in cancer research against progress in string theory? You have to assign weights to all of these things.

In theory, this is a very hard question to get a handle on. And the fact that it is so hard to answer, and that the circle of people qualified to judge keeps narrowing, is itself suspect. So overall I think we are still in a relatively stagnant world, though not a completely stagnant one.

Douthat: You mentioned Back to the Future. We just showed the kids the original Back to the Future — the first one, with Michael J. Fox.

Thiel: That's going back 30 years, from 1985 to 1955. And then Back to the Future Part II jumps from 1985 to 2015, which is now 10 years behind us. There were flying cars. The 2015 they imagined was very different from 1985.

Douthat: Back to the Future Part II did put a Donald Trump-like Biff Tannen in power, so it was somewhat prescient. But yes, the most striking thing is how different the built environment looks. One of the strongest arguments for stagnation I've heard is this: if you put someone in a time machine from an earlier era, they would know immediately they were in a completely different world. If they left 1860 and landed in--

Thiel: Or, 1890 to 1970, that's 80 years of your life. That's about it.

Douthat: But for my kids, even kids in 2025 looking back at 1985: the world is a little different, the cars are a little different, nobody has cell phones, but it looks pretty similar. That's not a statistic, but--

Thiel: It's common sense.

Douthat: That's common sense. But what makes you believe we're in a takeoff period? Is it just economic growth? Or productivity growth? What data do you look at to measure economic stagnation and dynamism?

Thiel: Sure, there's the economic data: how does your standard of living compare with your parents'? If you're a 30-year-old millennial, how are you doing compared with your baby-boomer parents when they were 30?

There are intellectual questions: How many breakthroughs have we achieved? How do we quantify them? What is the reward for doing research?

There really are diminishing returns to science, and to academic research generally, and maybe that's why it has come to feel like an antisocial, Malthusian institution: you have to put in more and more effort to get the same return, and at some point people give up and the whole system collapses.

Douthat: Let's talk about this. Why do we pursue growth and dynamism? Because, as you've pointed out in some of your writing about this, there was a cultural change in the Western world in the 1970s—around the time when you think that economic growth slowed and began to stagnate—where people started to get very anxious about the costs of growth, especially the environmental costs.

That thinking culminated in a widely held belief that we are already wealthy enough, and that if we push too hard for more wealth, the planet won't sustain us and environmental conditions will deteriorate, so we should be content with the status quo. What's wrong with that argument?

Thiel: Well, I think there are deep reasons why economic stagnation happens. When you look back at history, you always ask three questions: What happened? Then you ask: What should we do about it? But there's also a middle question: Why did it happen?

People have run out of ideas. To some extent institutions have degraded and become risk-averse; some of these cultural shifts we can describe. But on the other hand, people also have some very reasonable worries about the future: if we keep accelerating, are we accelerating toward environmental disaster, nuclear disaster, things like that?

But I think that if we don't find a way back to the future, society - I don't know. It's going to fall apart, it's not going to function.

The middle class - I define it as people who expect their children to be better than they are. When that expectation is broken, we no longer have a middle class society. Maybe you can have a stagnant feudal society, or maybe you can develop into a very different society. But that's not the way the Western world works, and that's not how the United States worked for the first 200 years.

Douthat: So you think that ordinary people will eventually not accept stagnation? They will revolt, and in the process destroy everything around them?

Thiel: They might revolt. Or maybe our institutions simply don't work, because all our institutions are premised on growth.

Douthat: Our budget is certainly premised on growth.

Thiel: Yeah. Take, I don't know, Reagan and Obama. Reagan advocated consumer capitalism, which is a contradiction in terms: a capitalist is supposed to save money, not borrow it. And Obama advocated low-tax socialism, which is just as contradictory as Reagan's consumer capitalism.

I prefer low-tax socialism to high-tax socialism, but I worry that it's not sustainable. At some point, taxes will go up, or socialism will end. So it's very, very unstable. That's why people are not optimistic. They think we haven't achieved some kind of stable, Greta-like future yet. Maybe it can work, but we're not there yet.

Douthat: Since her name is likely to come up again in this conversation: that's Greta Thunberg, the climate activist, who for you represents, I'd say, a symbol of an anti-growth, effectively authoritarian, environmentalist-led future.

Thiel: Absolutely. But we're not there yet. Not there yet. If you really get into this, it's going to be a very, very different society --

Douthat: If you actually lived in a small underdeveloped Scandinavian village.

Thiel: I'm not sure it would be North Korea, but that would be extremely oppressive.

Douthat: One thing that always strikes me is that when a society has this sense of stagnation and decadence, to use a word I favor, people eventually come to crave a crisis, a moment that can radically break the status quo. I tend to think that once affluent societies reach a certain level of wealth, people become comfortable and risk-averse, and without a crisis it's hard to escape decadence and move to something new.

So the original example for me was this: After 9/11, there was a widespread belief among foreign policy conservatives that we had been stagnant and it was time to wake up and start a new journey and reshape the world. Obviously, that ended badly. But things like that --

Thiel: But Bush told people to go shopping right away.

Douthat: So it’s not anti-decadence enough?

Thiel: For the most part, yes. There were some neoconservatives, some foreign-policy circles, where people LARPed their way out of decadence. But the mainstream was still Bush telling people to just go shopping.

Douthat: So what are the risks you're willing to take to get rid of decadence? Now it seems that people who want to be anti-decadence do have to take a lot of risks. They have to say: Look, you have a nice, stable, comfortable society, but guess what? We want a war, a crisis, or a complete reorganization of the government. They have to rise to the occasion.

Thiel: Well, I don't know if I can give you a definitive answer, but my directional answer is: more. We should take more risks. We should do more.

I can name all these different verticals. Take biotech: diseases like dementia, Alzheimer's -- we've made no progress in 40 or 50 years. People are completely fixated on beta-amyloid. It clearly isn't working. It's a stupid racket that just keeps reinforcing itself. So yes, we need to take more risk in that space.

Douthat: Just to make us more specific, I want to use this example to explain: OK, what does it mean to say that we need to take more risk in anti-aging research? Does it mean that the FDA has to step back and say: Anyone who has a new treatment for Alzheimer's can sell it on the open market? What exactly does risk look like in the medical field?

Thiel: Yes, you take more risk. If you have a fatal disease, you might be able to take more risk. Researchers can take more risk.

Culturally, I imagine something like the early modern period, when people thought we could cure disease and radically extend lifespans. Immortality was part of the project of early modernity -- the idea of Francis Bacon, of Condorcet. Perhaps it was anti-Christian, perhaps downstream of Christianity; either way, it was in competition with it. If Christianity promised the resurrection of the body, science couldn't succeed unless it promised the same thing.

I remember in 1999 or 2000, when we were still running PayPal, one of my co-founders, Luke Nosek, was into Alcor and cryonics and thought people should freeze themselves. One day we took the whole company to a cryonics party. You know how at a Tupperware party they sell Tupperware? At the cryonics party, they sold--

Douthat: Is it just your heads? What gets frozen?

Thiel: You can choose the whole body or just the head.

Douthat: It's cheaper to choose "Head Only".

Thiel: What was disturbing was that their dot-matrix printer wasn't working properly and they couldn't print the policies.

Douthat: Technology is stagnant again, right?

Thiel: In retrospect, though, it was also a symptom of decline. In 1999 it wasn't a mainstream view, but there were still fringe baby boomers who believed they could live forever. That was the last generation. I've always been anti-boomer, but maybe even in that fringe boomer narcissism we lost something: at least a few boomers still believed science could cure all their diseases. No millennial believes that now.

Douthat: I think some people now believe in a different kind of immortality, though. Part of the fascination with AI stems from a particular vision of transcending limits. We'll come back to that; first I want to ask about politics. One thing that struck me about your original stagnation argument was that it was primarily about technology and economics, yet it could be applied much more broadly. When you wrote that essay you were interested in seasteading -- essentially building new polities independent of the ossified Western world -- but then in the 2010s you made a shift.

You were one of the few Silicon Valley notables—perhaps the only one—to support Donald Trump in 2016. You supported a few hand-picked Republican Senate candidates. One of them is now Vice President of the United States. As an observer, after reading your discussion of decadence, I think you are basically a kind of political venture capitalist. You said: There are some disruptive forces here that could change the political status quo, and it's worth taking a certain amount of risk. Is that how you think?

Thiel: Of course, there are many levels to this. One level is that we hope to change the course of the Titanic away from the iceberg, or to use any metaphor, to really change the course of society as a whole.

Douthat: Through political change.

Thiel: Maybe a more narrow wish is that we can at least have a conversation about this. So when someone like Trump says, “Make America Great Again” — well, is that a positive, optimistic, ambitious agenda? Or is it just a very pessimistic assessment of where we are, that we’re no longer a great country?

I didn't expect Trump to do anything positive, but I thought that at least for the first time in a hundred years we had a Republican who was not feeding us Bush-style sweet talk. It was certainly not progress, but at least we could have a conversation. Looking back now, it was an absurd fantasy.

In 2016, I had two thoughts—you always have these subconscious thoughts—but I didn’t combine them: One, if Trump loses, no one will be mad at me for supporting him. Two, I think his chances of winning are 50-50. I had this thought in my mind—

Douthat: If he loses, why isn't anyone mad at you?

Thiel: It would just be a weird thing and it wouldn't really matter. But I think the odds of him succeeding are fifty-fifty because the problems are so entrenched and the stagnation is so frustrating. And the reality is, people are not ready for it.

Maybe we've advanced enough to have this conversation in 2025, a decade into the Trump era. And of course you're not a zombie leftist, Ross--

Douthat: Peter, I've been called a lot of names.

Thiel: But I will try to make progress.

Douthat: From your perspective, there are two levels of assumptions. The first level is: this society needs change and needs adventure; Trump is change and adventure. The second level is: Trump is actually willing to tell the truth about the decline of the United States.

As an investor and venture capitalist, what do you think you gained during Trump's first term?

Thiel: Yeah.

Douthat: What do you think Trump did to fight decadence or stagnation during his first term? If anything, the answer is probably "nothing."

Thiel: I think it's been much slower and longer than I expected, but we've gotten to the point where a lot of people think something is wrong. I didn't have this conversation from 2012 to 2014. I debated Eric Schmidt in 2012, I debated Marc Andreessen in 2013, and I debated Jeff Bezos in 2014.

I was talking about “there is a stagnation problem” and all three of them were saying “everything is going great.” I think at least those three have updated and adjusted to varying degrees. Silicon Valley has adjusted.

Douthat: But Silicon Valley has adapted—

Thiel: About stagnation.

Douthat: Exactly. But a large portion of Silicon Valley ended up supporting Trump in 2024—obviously the most prominent of which was Elon Musk.

Thiel: Yeah. In my view it goes hand in hand with the stagnation issue. These things are always extremely complicated, and again, I hesitate to speak for all these people, but take someone like Mark Zuckerberg, or Facebook, or Meta. In some ways I don't think he's very ideological; he doesn't think these things through very carefully. The default was liberalism, and the question was always: if liberalism isn't working, what do you do? Year after year, the answer was: do more. If something isn't working, you just do more of it. You keep upping the dose, you spend hundreds of millions of dollars, you go completely woke, and everybody hates you.

And at some point you think: OK, maybe this isn't going to work.

Douthat: So they pivoted.

Thiel: This is not a pro-Trump thing.

Douthat: This is not a pro-Trump thing, but there is a sense, both in public and in private conversations, that Trumpism and populism in 2024—maybe not in 2016, when Peter was the only supporter, but now, in 2024—can be a vehicle for technological innovation, economic dynamism, and so on.

Thiel: You really put it in a very optimistic way.

Douthat: I know you're pessimistic. But people --

Thiel: When you put it that optimistically, you're saying these people are going to be disappointed -- that they're set up for failure, things like that.

Douthat: I mean, there's a lot of optimism being expressed. Elon Musk voiced apocalyptic worries about the budget deficit destroying us all, but he went into government, and the people around him basically said: we have a partnership with the Trump administration, and we're on a path to technological greatness. I think they're optimistic.

You are more pessimistic, or more realistic. I am asking about your assessment of our current situation, not theirs. Do you think the populism in Trump 2.0 will become a carrier of technological vitality?

Thiel: This is still our best option. Is Harvard going to continue to muddle through and repeat the same old approach that hasn't worked for 50 years to cure dementia?

Douthat: It's just a rationale: It can't get any worse; let's disrupt it. But the current critique of populism is: Silicon Valley is aligned with the populists, but at the end of the day, the populists don't care about science. They don't want to spend money on science. They want to cut off funding to Harvard simply because they don't like Harvard. Ultimately, you're not going to get the kind of future investment that Silicon Valley wants. Is that wrong?

Thiel: Yes. But we have to go back to the question: how does science actually get done? That's what the New Dealers did -- whatever their problems, they pushed science hard, funded it, gave money to people, and scaled it up. Today, if an Einstein wrote a letter to the White House, it would get lost in the mail room. A Manhattan Project is unthinkable.

When we call something a "moonshot" -- as Biden did with cancer research -- remember that in the '60s a moonshot still meant you actually went to the moon. Now a moonshot means something completely made up that will never happen: oh, you'd need a moonshot for that. It no longer means something like Apollo. It means it will never, ever happen.

Douthat: But you still seem to be in this mode where, for you, not like some other people in Silicon Valley, the value of populism is to tear off the veil and the fantasy. We are not at the stage where we expect the Trump administration to do the Manhattan Project and the moon landing. It's more like, populism helps us see that all this is fake.

Thiel: You need to try to do both. They're very intertwined.

Nuclear power is being deregulated, and we'll start building new plants again, or designing better plants, and maybe even fusion reactors. So, yes, there's a deregulation and deconstruction part of it. And then you actually start building, and that kind of thing. You're clearing the site, in a sense, and then maybe --

Douthat: But you personally have stopped funding politicians?

Thiel: I have a schizophrenic attitude toward these things. I think this is extremely important, but also extremely harmful. So I think about how to do it—

Douthat: Extremely harmful to you personally?

Thiel: It’s detrimental to everybody involved. It’s a zero-sum game. It’s crazy. And then at some point —

Douthat: Because everyone hates you and associates you with Trump. Does this do any harm to you personally?

Thiel: It's very harmful because it's a zero-sum world. The stakes are really, really high.

Douthat: Do you end up having enemies that you didn't have before?

Thiel: It's damaging, in different ways, to everyone who was involved. And there's a political dimension to getting back to the future. This is a conversation I had with Elon in 2024 -- we had all sorts of conversations. I raised the seasteading version with him and said: if Trump doesn't win, I want to leave the country. And Elon said: there's nowhere to go. There's nowhere to go.

And then you always come up with some plausible counterargument. About two hours after we finished dinner, back home, it hit me: wow, Elon doesn't believe in Mars anymore. 2024 is the year Elon stopped believing in Mars -- not as a science project, but as a political project. Mars was supposed to be a political project; it was about building an alternative. In 2024 Elon came to believe that if you went to Mars, the socialist US government, the woke AI, would follow you there.

Years earlier, I had facilitated a meeting between Elon and DeepMind CEO Demis Hassabis.

Douthat: This is an artificial intelligence company.

Thiel: Yes. The conversation went something like this. Demis said to Elon: "I'm working on the most important project in the world. I'm building a superhuman AI."

Elon replied: "No, I'm working on the most important project in the world: I'm turning us into an interplanetary species." Then Demis said: "You know, my AI will be able to follow you to Mars." And Elon went silent. But in my telling, it took years for that remark to really sink in; it wasn't until 2024 that he truly accepted it.

Douthat: But that doesn't mean he doesn't believe in Mars. It just means he's decided he has to beat the budget deficit, or wokeness, in order to get to Mars.

Thiel: Yes, but what does Mars mean?

Douthat: What does Mars mean?

Thiel: Well, is this just a science project? Or is it, like Heinlein, portraying the moon as a libertarian paradise or something?

Douthat: A vision of a new society populated by the descendants of Elon Musk.

Thiel: Well, I don't know that it had crystallized, but once you crystallize it, you realize the Mars program is not just a science project; it's a political project. And then you have to start thinking: well, woke AI will come after you, socialist governments will come after you. And then maybe you need to do something besides just going to Mars.

Douthat: So if we're still stagnant, AI seems to be the biggest exception: it has made remarkable progress, and to many people, surprising progress.

This is also - we were just talking about politics - I think the Trump administration has largely given AI investors a lot of what they wanted, both in terms of taking a back seat and also in terms of getting them to form public-private partnerships. So this is an area of progress and government involvement.

You are an investor in artificial intelligence, what do you think you are investing in?

Thiel: Well, I don't know. There are a lot of layers to this. Let's ask: how big do I think AI is? My dumb answer is: it's more than a nothing-burger, but it's not going to totally transform our society either. I think it's roughly the scale of the internet in the late '90s. I'm not sure it's big enough to truly end economic stagnation. It's probably big enough to spawn some great companies. The internet added a couple of percentage points to GDP, maybe 1% of GDP growth a year for 10 to 15 years, and it raised productivity. That's roughly how I think about AI.

But it's the only thing we have, and that is unbalanced and a little unhealthy. I wish there were more dimensions of progress. I wish we were going to Mars. I wish we were curing dementia. Still, if AI is all we get, I'll take it. It has risks; obviously the technology has dangers. But there are also--

Douthat: So you're skeptical of what's called the superintelligence cascade theory. The basic idea is that if AI succeeds, it will become so smart that it will allow us to make advances in the atomic world, like: Well, we can't cure dementia. We can't figure out how to build a perfect factory to make rockets to Mars. But AI can.

At some point you cross a threshold where we get not only more digital progress but all these other forms of progress as well. It sounds like you don't believe that, or you think it's unlikely.

Thiel: Yeah, I don't know if that's really the limiting factor.

Douthat: What do you mean by limiting factor?

Thiel: This is probably the ideology of Silicon Valley. It's a little strange -- maybe more liberal than conservative -- but people in Silicon Valley are obsessed with IQ and think everything comes down to smart people: if you have more smart people, they'll do great things.

The economic counterargument to the IQ obsession is that smart people are actually doing worse. The smarter they are, the worse they're doing. They don't know how to apply what they know, or our society doesn't know what to do with them, and they don't fit in. Which suggests the limiting factor isn't IQ but some deeper problem with our society.

Douthat: But is that a limit on intelligence itself, or a question about the kind of personality a superintelligence would have?

I don't buy the idea -- I discussed this on a podcast with an AI accelerationist -- that some problems can be solved just by adding intelligence. Add intelligence and Alzheimer's gets solved; add intelligence and AI figures out how to automate processes and build a billion robots overnight. I'm an intelligence skeptic, in the sense that I think there may be limits to what intelligence alone can do.

Thiel: Yeah, that's hard to prove. Proving these things is always hard.

Douthat: Until we have superintelligence.

Thiel: But I agree with you because I think we have a lot of smart people, but things are stuck for other reasons. So maybe these problems are unsolvable, and that's the pessimistic view. Maybe dementia is untreatable, and that's an unsolvable problem. Maybe death is untreatable, and that's an unsolvable problem.

Maybe it's cultural. So the question is not how smart an individual is, but how they fit into our society. Can we tolerate smart people who aren't mainstream? Maybe we need non-mainstream smart people to run crazy experiments. If AI is just smart in the conventional sense -- and if we define "woke," a word that's admittedly too ideological, simply as conformist -- then maybe that kind of intelligence won't bring about any change.

Douthat: So are you worried that AI could end up deepening its own stagnation? Highly intelligent, but creative only in a routine way -- like the Netflix algorithm: it makes a ton of decent movies for people to watch, generates a ton of plausible ideas, puts a lot of people out of work, and yet in a way deepens the stagnation. Is that a fear?

Thiel: That's -- (sighs). It's very possible. It's certainly a risk. But my bottom line is: I still think we should try AI, because the alternative is stopping completely.

So, yes, all kinds of interesting things could happen. Maybe drones in the military combined with AI, which could be scary or dangerous or dystopian or change the status quo. But without AI, wow, nothing is going to happen.

There's a version of this same debate about the internet: Has the internet produced more conformity, more wokeness? In many ways, the internet has not delivered the explosion of rich and diverse ideas that libertarians imagined in 1999. But counterfactually, I still think the internet beats the alternative; without it, things would probably be worse. The same goes for AI: it's better than the alternative, and the alternative is nothing.

And notice, this is another place where the stagnation thesis gets reinforced. We're only talking about AI -- there's always an implicit concession that, outside of AI, we're almost completely stuck.

Douthat: But the world of AI is clearly full of people who at least seem to have a more utopian, more transformative (whatever word you want to use) view of the technology than you express here. You mentioned earlier that the modern world once promised radical life extension, but that’s no longer the case. It seems clear to me that many of the people who are deeply involved in AI see it as a transhumanist mechanism—transcendence of our mortal bodies—either the creation of some successor species or some kind of mind-machine fusion.

Do you think these are irrelevant fantasies? Or is it just hype? Do you think people are pretending to build a deus ex machina in order to make money? Is it delusion? Is it something you worry about?

Thiel: Well, yes.

Douthat: You want the human race to continue, right?

Thiel: Um—

Douthat: You are hesitating.

Thiel: Well, I don't know. I would -- I would --

Douthat: That's a long hesitation!

Thiel: There are so many problems here.

Douthat: Should humans continue to exist?

Thiel: Yes.

Douthat: Okay.

Thiel: But I also hope we can get at these issues at their root. So here is how I always think about transhumanism: the ideal is a radical transformation, transforming your natural human body into an immortal one. There's a line of criticism that puts this in a sexual register -- a transvestite is someone who changes clothes, who cross-dresses, and a transsexual is someone who changes their sex organs, and then we can debate how effective those surgeries are. But we want a transformation far beyond that. The criticism isn't that it's weird or unnatural; the criticism is: man, that's far too little. We want more than a change of clothes or of sex organs -- we want you to be able to change your heart, change your mind, change your whole body.

By the way, the criticism of this from orthodox Christianity is that these things don't go far enough. Transhumanism is not just about changing your body, you need to change your soul, you need to change your whole self. So -

Douthat: Wait, wait. I generally agree with your view that religion should be a friend to science and to the idea of scientific progress. I think any notion of divine providence has to account for the fact that we have made progress, achieved things, done things our ancestors could not have imagined.

But the Christian promise ultimately still seems to be that, through God’s grace, people can achieve a perfect body and a perfect soul, and those who try to achieve that on their own, relying on a bunch of machines, are likely to end up as dystopian figures.

Thiel: Well, let's be clear about that.

Douthat: You can also have a heretical form of Christianity that says something else.

Thiel: Yeah, I don't know. I don't think the word "nature" appears even once in the Old Testament. So I take the inspiration of the Judeo-Christian tradition to be transcending nature, overcoming things. The closest it comes to a concept of nature is that man is fallen. From a Christian perspective, fallenness is natural -- man is messed up. That's true. But sometimes, with God's help, you should be able to transcend and overcome it.

Douthat: Yes. But most people who work on building a hypothetical deus ex machina—with the exception of those in this room—do not think of themselves as working with Yahweh, the Lord of Hosts.

Thiel: Sure, sure. But --

Douthat: They think they are creating immortality on their own, right?

Thiel: There's a lot bound up in this. My criticism was: they're not ambitious enough. From a Christian perspective, these people are not ambitious enough. So let's take that up: well, they --

Douthat: But they are not ambitious enough morally and spiritually.

Thiel: So, are they even ambitious enough on the physical level? Are they really transhumanists at all? Well, man, cryonics feels like a relic of 1999 -- there aren't many cryonicists left. So they're not physical transhumanists. OK, maybe it's not cryonics, maybe it's uploading. But really -- I'd rather have my own body. I don't want a computer program that simulates me.

Douthat: Yes, I agree.

Thiel: So uploading seems like a step below cryonics. But even so, it's part of the discussion, and this is where it gets hard to keep score. I don't want to say they're all making it up, that it's all fake, but I don't --

Douthat: Do you think some of these are fake?

Thiel: I don't mean "fake" in the sense that people are lying; I'd say it's just not really where their focus is.

Douthat: Yes.

Thiel: So there's a language of fertility, there's a language of optimism.

I was talking to Elon Musk about this a few weeks ago. He said the United States will have a billion humanoid robots within ten years. I said: well, if that's true, you don't need to worry about the budget deficit, because we'll have enormous room for growth, and the growth itself will solve the problem. And yet -- he's still worried about the budget deficit. That doesn't prove he doesn't believe in the billion robots, but it does mean he may not have thought it through, or he doesn't think it will transform the economy, or there's a big disconnect somewhere. Yes, in certain respects these things haven't been fully thought through.

If I have one criticism of Silicon Valley, it's that it has a poor grasp of what technology means. Its discussions stay stuck at the micro level: What's the AI's IQ or Elo score? How do we define AGI? We get into endless technical debates, while the mid-level questions that I think really matter go unasked: What does the technology mean for the budget deficit? For the economy? For geopolitics?

I recently discussed a question with you: does this change China's calculus on taking Taiwan? If we accelerate the AI revolution, will China fall behind militarily? On the optimistic reading, that deters China, because they've effectively already lost. On the pessimistic reading, it accelerates their timeline, because they know it's now or never -- if they don't take Taiwan now, they'll fall behind for good.

Regardless — and this is a very important thing — it hasn’t been thought through very carefully. We haven’t thought about what AI means for geopolitics. We haven’t thought about what it means for macroeconomics. These are things that I hope we can explore further.

Douthat: You're also interested in a macro question. Let's talk a little bit about religion. You've been talking a lot lately about the concept of the Antichrist, which is a Christian concept, an Apocalyptic concept. What does that mean to you? What is the Antichrist?

Thiel: How much time do we have?

Douthat: We have the time you would like to spend discussing the Antichrist.

Thiel: Okay. Well, I could go on for a long time. I think there has always been a question of how we articulate existential risks, the challenges we face, and it gets framed in dystopian, science-fictional terms of runaway technology: the risk of nuclear war, the risk of environmental disaster -- perhaps specifically climate change, though there are plenty of other candidates -- the risk of bioweapons. All these science-fiction scenarios. And obviously there are certain kinds of risk from artificial intelligence.

But I've always thought that if we're going to use this framework to talk about existential risk, maybe we should also talk about the risk of another kind of bad singularity, which I'd describe as the one-world totalitarian state. Because the default political solution people reach for with all of these existential risks is one-world governance. Nuclear weapons? We get a really powerful United Nations controlling them under an international political order. Artificial intelligence? We need global compute governance -- a one-world government that controls all the computers and logs every keystroke to make sure nobody writes a dangerous AI program. And I've always wondered whether that is jumping out of the frying pan into the fire.

The atheist framing is "One World or None" -- a short film the Federation of American Scientists put out in the late 1940s. It opens with a nuclear bomb destroying the world, and the obvious conclusion is that a world government is needed to stop it: one world or none. The Christian framing is, in a way, the same question: Antichrist or Armageddon? Either the one-world state of the Antichrist, or we sleepwalk toward the end of the world. "One World or None" and "Antichrist or Armageddon" are, in a sense, the same question.

Now, I have a lot of thoughts on this topic, but here is one question -- and it's a plot hole in all the Antichrist books: how does the Antichrist actually take over the world? He gives some demonic, hypnotic speech and people just fall for it? That's the devil as deus ex machina --

Douthat: That's completely -- that's not believable.

Thiel: It's a wildly implausible plot hole. But I think we now have the answer. The way the Antichrist takes over the world is by talking nonstop about Armageddon, nonstop about existential risk -- and that is what has to be regulated. It's the opposite of the 17th- and 18th-century Baconian picture, where the Antichrist is some evil tech genius, an evil scientist, who invents a machine to rule the world. People are terrified of that figure.

In our world, what resonates politically is the opposite: we need to stop science, we need to say "stop" to all this. In the 17th century, I could imagine someone like Dr. Strangelove or Edward Teller taking over the world. In our world, it's far more likely to be Greta Thunberg.

Douthat: I'd like to suggest a middle ground between these two options. In the past, the legitimate fear of the Antichrist was some kind of technological wizard. Now, the legitimate fear is someone who promises to control technology, make it safe, and bring about (from your perspective) universal stagnation, right?

Thiel: Well, that's closer to my account of how things actually played out.

Douthat: Yes.

Thiel: I think people are still afraid of the Antichrist of the 17th century. We're still afraid of Dr. Strangelove.

Douthat: Yes, but you say that the real Antichrist will exploit this fear and say: You have to follow me to avoid Skynet, to avoid the Terminator, to avoid nuclear Armageddon.

Thiel: Yes.

Douthat: I guess my point is, looking at the world right now, you need some kind of novel technological advancement to make that fear concrete.

So if the world is convinced that AI is going to destroy everyone, I'm totally fine with the idea that the world might move to someone who promises peace and regulation. But I think to get to that point, you need one of the accelerationist doomsday scenarios to start playing out. To get peace and security, you need more technological progress.

For example, one of the key failings of 20th century totalitarianism was its lack of knowledge - it had no idea what was going on around the world. So you need artificial intelligence or something to help totalitarian rule achieve peace and security. So don't you think your worst-case scenario essentially requires some kind of leap forward progress that is then tamed and used to impose stagnant totalitarianism? You can't get there directly from where we are now.

Thiel: Yeah, sure.

Douthat: Just like Greta Thunberg protesting Israel on a boat in the Mediterranean. I just don’t think that the promise of protection from the threats of AI, technology, or even climate change is a powerful global rallying cry right now, absent accelerating change and a real fear of outright catastrophe.

Thiel: Man, these things are hard to tell, but I think environmentalism is pretty strong. I don't know if it's strong enough to create a one-world totalitarian state, but man, it is --

Douthat: I think in its current form, that's not the case.

Thiel: I would say it's the only thing the people of Europe still believe in. They believe in the green far more than in sharia law, far more than in totalitarian Chinese communism -- all futures very different from the present. The only three options Europe has right now are green, sharia, and a totalitarian communist state. And green is by far the most powerful.

Douthat: Europe is declining and decaying and is no longer the dominant power in the world.

Thiel: Absolutely. It always depends on the specific situation.

There's a very complicated history of how nuclear technology played out, and -- well, we didn't actually move toward a totalitarian one-world state. But by the 1970s, one explanation for the stagnation was that the runaway progress of technology had become so frightening that Baconian science ended at Los Alamos.

Then it became: OK, this is the end of it, we don't want to go on any further. Charles Manson took LSD in the late '60s and started committing murders. What he saw from LSD, and what he learned, was that you can be like a Dostoyevsky antihero and do whatever you want.

Of course, not everyone turned into Charles Manson. But it felt as though everyone were turning into Charles Manson -- the hippies were taking over.

Douthat: But Charles Manson didn't become the Antichrist and take over the world. We're heading toward the end of time, and you—

Thiel: But my reading of the history of the 1970s is that the hippies did win. Man landed on the moon in July 1969, and Woodstock happened three weeks later. In hindsight, that’s when human progress stopped. The hippies won. And it wasn’t Charles Manson who won—

Douthat: OK. I want to stay on the Antichrist and press a little further. You're backing off: you say environmentalism already supports stagnation, and so on. Fine, we agree on that.

Thiel: No, I'm just saying there's something powerful.

Douthat: But we are not living under the reign of the Antichrist. We are just stagnant. And you are assuming that something worse is going to happen, perpetuating the stagnation and driving it by fear. I think for that to happen, there has to be a technological advance like the one at Los Alamos that scares people.

I have a very specific question. You are an investor in artificial intelligence, with a lot of money in Palantir -- military technology, surveillance technology, war technology, and so on. You've told me a story about the Antichrist coming to power by exploiting people's fear of technological change to establish a world order, and it seems to me this Antichrist might use the very tools you've helped build. So does the Antichrist say: well, no more technological progress -- but I do like what Palantir has built so far? Isn't that a worry? Isn't it one of history's ironies that people who publicly worry about the Antichrist could accidentally hasten his arrival?

Thiel: Look, there are all sorts of different scenarios. I obviously don't think that's what I'm doing.

Douthat: I mean, to be clear, I don't think that's what you're doing either. I just want to know how anyone gets the world to accept a permanent dictatorship.

Thiel: Well, we can take this question at several levels. But is what I just said really so absurd as a summary of the last 50 years -- a world that has stagnated because it succumbed to "peace and security"? That is the fulfillment of 1 Thessalonians 5:3 -- the slogan of the Antichrist is "peace and security."

Take the FDA: it regulates not only drugs in the United States but effectively drugs all over the world, because the rest of the world defers to the FDA. The Nuclear Regulatory Commission effectively regulates nuclear power plants worldwide. You can't design a modular nuclear reactor and build it in Argentina -- nobody will trust the Argentine regulators; they'll defer to the United States.

So this bears at least on the question of why we've had 50 years of stagnation. One answer is that we ran out of ideas. Another is that something happened at the cultural level that stopped permitting progress. The cultural answer could be bottom-up -- humans turning into a more docile species -- or at least partly top-down: government institutions becoming these agencies of stagnation.

Nuclear power was supposed to be the energy source of the 21st century. But somehow it has been put on the back burner around the world.

Douthat: So in a sense, we are already living under the mild reign of the Antichrist. Do you believe that God is in control of history?

Thiel: (Pauses) Man, here we go again - I think there's always room for human freedom and human choice. These things are not predetermined.

Douthat: But God is not going to leave us forever under the rule of a mild, stagnationist Antichrist, is He? That can’t be the end of the story, can it?

Thiel: There's always a danger in attributing too much causation to God. I could cite various verses, but take John 15:25, where Christ says, "They hated me without a cause." All these people who persecuted Christ had no cause, no reason to persecute him. If we read that verse in terms of ultimate causation, the persecutors would be saying: I persecute because God told me to -- God is the root cause of everything.

The Christian view is anti-Calvinist in this sense: God is not simply behind history, God is not the puppet master of everything. If you say God caused everything --

Douthat: But wait, but God is-

Thiel: You are scapegoating God.

Douthat: But God was behind Jesus Christ entering history, because God would not leave us in a stagnant, decadent, corrupt Roman Empire, right? So, at some point, God would intervene.

Thiel: I'm not that kind of Calvinist. And --

Douthat: That's not Calvinism, that's Christianity. God doesn't leave us to stare at screens forever, listening to Greta Thunberg. He doesn't abandon us to our fate.

Thiel: I think humans have a great deal of room for action and freedom, for better or for worse. If you believe these things are fated, then you might as well accept that the lion is coming: do your yoga, meditate piously, and wait for the lion to eat you. I don't think you should do that.

Douthat: I agree. I guess, at that point, I was just trying to express hope and suggest that we should have hope of success in exercising our human freedoms in trying to resist the Antichrist, right?

Thiel: We agree on that.

Douthat: Okay. Peter Thiel, thank you for accepting my interview.

Thiel: Thank you.
