With apologies to Robert Frost:
Some say the world will end in fire
Some say in ice
From what Phil Torres has tasted of Prometheus’ desire
He holds with those who favor rampaging nanobots, global war,
Hostile superintelligence
And microbes
Okay, that last bit is a little clunky. But the end of something is never entirely pretty, is it?
Phil Torres is a scholar of the end times. He’s also an atheist, an affiliate of the Institute for Ethics and Emerging Technologies, and a thinker in the field of existential risk studies. His new book, The End: What Science and Religion Tell Us About the Apocalypse, is an in-depth survey of all the ways your world might collapse.
Most apocalypse-minded folks worry about God and divine judgment. Torres is curious about the more immanent ways that the world may reach its imminent demise. Some of these potential disasters sound familiar (nuclear holocaust). Others do not (an artificial superintelligence goes rogue and tries to wipe out humanity; aliens attack the world). From the plausible to the outlandish, Torres takes every scenario seriously, as long as he thinks it’s possible.
The End: What Science and Religion Tell Us about the Apocalypse
Phil Torres
Pitchstone, February 2016
The End is not light reading. Then again, existential risk studies is not a light discipline. The field is an annex of transhumanism, which envisions a world where human beings merge with technology until we become something posthuman, and perhaps immortal. Existential risk studies is the dystopian yin to transhumanism’s utopian yang. Where transhumanism wonders how far technology can take the human species, existential riskology wonders how those same tools might wipe us all out.
In The End, Torres offers a primer in secular eschatology—essentially, the study of end-times, but from a non-religious perspective. His outlook is grim. Nature is dangerous. New technologies are creating even more vectors for disaster. And some apocalypse-minded religious groups, Torres argues, might harness those new technologies in order to manufacture their own end-of-days.
Plus, there could be aliens.
While reading The End, I spent a lot of time wondering whether Torres was prescient or nuts. Much to his credit, Torres is happy to discuss both possibilities. I met him at a coffee shop near his home in Carrboro, N.C., where we spoke about ISIS, technophobia, and why he doesn’t plan to have kids.
This interview has been edited for length and clarity.
So. You’re scaring me. Is this fear healthy?
I think there’s a lot of fear-mongering about various scenarios that are improbable, or frankly just not worth worrying about. But I do think our policies should be guided by a kind of intelligent anxiety.
But how do you write about end-times scenarios without acting like a fear-monger?
Everything that we value in the great experiment of civilization depends on avoiding some sort of grand catastrophic scenario. The reason existential risk studies isn’t well-funded probably has a lot to do with the fact that it’s super-uncomfortable thinking about these sorts of things.
I grew up in a fundamentalist evangelical household, where talk of the Rapture and end-times events was kind of frequent. Frankly, I think I suffered a bit psychologically because I took it quite seriously.
What did taking it seriously look like for you?
I must have been in elementary school when Bill Clinton was elected. [I have a vivid memory of] being overcome with a sense of fear that he was the anti-Christ. Eventually I left fundamentalist evangelical Christianity, and then just started researching apocalypticism.
If you look across history, millenarian end-times thinking is ubiquitous. Judaism has some millenarian elements. Christianity probably began as an apocalyptic movement. Islam—a lot of people have talked about that having been started as an end-times, eschatologically motivated religion. All the way through Marxism. You know, Nazism had integral millenarian components. Then you get a new kind of millenarian thinking after the atomic age started. Suddenly, people are starting to think about human extinction. You get things like Ray Kurzweil’s techno-rapture Singularitarianism [in which a superintelligence emerges and technological growth outpaces human understanding].
And then you get this new field of existential risk studies, where the conclusions are not dissimilar in many ways to a lot of the religious views. You know, there’s going to be some sort of catastrophic event in the future, perhaps we might even survive it, but we’ll enter into this permanent state of privation. Is it the case that these new narratives are different, or are they just an extension [of the old ones]? Because it seems like if history reveals one thing, it’s that humans have a tendency to wish for the world to end.
Just look at the popularity of the Left Behind series. The end times are fun to talk about!
I do think there’s something really captivating about certain versions. I mean, dispensationalism in particular has that kind of thriller-movie series of events.
This new version of end-times thinking is different in kind from the millennia of end-times thinking that preceded it, because it’s scientific. It’s based on evidence. These are scenarios that can genuinely come true, although some of them are without a doubt quite speculative at the moment.
That’s true. Some are more far-fetched.
Even if there’s only a slight possibility of something like ecophagic nanobots or aliens—it sounds crazy, but I really don’t think it is, for evidential reasons.
Most of science is utterly unintuitive. It’s not a question of whether it sounds preposterous. It’s a question of whether there are good reasons for believing the claims.
Evolution might not be intuitive, but we can observe it. We don’t see the aliens.
Right. But there are also lots of other things we don’t see, like quarks and leptons and whatnot. The relevant distinction in terms of what you should believe is entirely epistemological. And I think there are good reasons for thinking that superintelligence could lead to a catastrophe. Without a doubt we’re on the vanguard of research here, but there seems to be enough evidence, in my best judgment at least, to not dismiss it.
Most of the threats you describe in the book are enormous, invisible, and almost impossible for most of us to imagine. Is rational, evidence-based discussion in the public sphere a sufficient tool to grasp something as large and amorphous as a global catastrophe? What does it take?
I think you’re exactly right that the mental machinery that we have was not designed for thinking about probably most of science, and certainly really big phenomena like global warming.
Beyond just sort of encouraging big-picture thinking, I don’t know. I really struggled in the last chapter—I was trying not to make policy recommendations that are too specific. Just like we’ve got the IAEA to monitor nuclear weapons, we should set up a similar entity to monitor biological weapons.
Or some sort of catastrophic risk clearinghouse. I can imagine someone thinking about these issues with the policy tools available to them.
Yeah, totally. And some people have done good work. Although there’s been an incredible paucity of work there.
It’s hard to take [these concerns] into the public arena and go, “you know, you really ought to be worried, in thirty years there might be this issue,” without it devolving into an exercise of fear-mongering.
Something that I appreciate about the transhumanist scene is your willingness to sound kind of nuts.
Initially I found that movement really unappealing. I thought there were a lot of hints of religion in it, in this sort of techno-utopianism. I think [transhumanist thinker Ray] Kurzweil’s eschatology is pretty close to some Christian ideas. He divides cosmic evolution into various epochs, and the beginning of the fifth one is when the Singularity will happen—
—and he’s searching for the defeat of death! Which is a very New Testament concept.
Everlasting life, guaranteed to all the believers. Literally he’s got a line in The Singularity Is Near, where he is debating with somebody who’s like, What if I decide not to incorporate nano-tech or upload my mind? Kurzweil’s like, Well, I don’t know, you’re just gonna get left behind.
Again, I do admire that kind of total long-shot thinking. But I’m skeptical of transhumanist ideas, in part, because they’re coming from such a homogeneous group. For an illustration: seventeen people blurbed your book. Sixteen of them are men. Are these ideas just going to seem like they’re emerging from a clubby fringe, or is your argument ready for primetime?
So on the one hand, the atheist movement has become quite an influential cultural phenomenon. It’s pretty visible, has pretty strong representation. Existential risk studies, not so much, but I think that’s changing. For example, the guy who founded the Institute for Ethics and Emerging Technologies [Nick Bostrom] has been named one of Foreign Policy’s Top 100 Thinkers. He recently spoke at the UN. He’s a pretty visible guy.
There’s reason to believe that doomsday machines may proliferate to the point at which the probability might be considerable that we don’t make it into the twenty-second century. So I feel like there’s every reason to think that this could gain some…
It also matters a lot to me—you mentioned all the men who gave me blurbs.
That seems to be the case for most books from Pitchstone Publishing, the atheist press that’s putting out your book.
Yeah, I mean, really just most books. It’s a patriarchal society.
I think getting greater gender diversity is crucial. It is kind of crazy that I look over those blurbs and it’s just a sausage fest. But frankly, looking over the list of people who would be candidates for blurbs—you know, the New Atheist movement is extremely masculine, and perhaps even sexist to some extent. I think the existential risk studies community is much more egalitarian, is much more feminist in its approach, but still there’s asymmetry in terms of what gender is involved.
If these ideas did become mainstream, do you think that they would elicit a new kind of technophobia or tech-skepticism?
It seems like a very dismal future, one that is largely dystopic because of technology. Why the hell would you want the technology then? That was my initial impression.
Technology is an enterprise that’s not going to be stopped. Moratoria on different domains I think are just totally unfeasible. Especially in the long run.
We’re on a ship right now, and the wind is blowing in a direction, so let’s just accept what appears to be inevitable to me.
I’m curious: do you agree?
On the whole, yes. But I think there’s been so little effort to regulate many technologies. I’m curious what might be possible. We’ve done it with nuclear technology, somewhat—we’ve placed moratoria on the cleanest fuel source we’ve got.
I don’t disagree at all. My response to that would be that the accessibility aspect is what really differentiates nuclear technology from some of the other stuff. Uranium and plutonium are really hard to get.
Fair point. It’s easier to regulate nuclear power, because it’s easier to cut it off at the source.
Right.
Unlike, say, synthetic microbes that people may one day be able to make using widely available tools.
You’d probably need even more intrusive surveillance apparatuses to regulate biotechnology or something like that.
I guess I’m pretty skeptical about regulations in the future. But who the hell knows? History is full of surprises.
I don’t feel great about it. I wouldn’t have a kid.
Because you’re immersed in all of these doomsday scenarios?
Before I started writing this, I had a bunch of friends who had kids. A lot of them are very open to debate. And so we went out on a number of occasions and had two or three drinks and talked about the ethics of having a kid. There are legitimate questions to be asked, just at an existential level, about whether it’s right to bring something into the world that will also experience pleasure, but will suffer and eventually perish.
Let’s imagine the best world we can imagine. Is it right to bring a kid in, because they’re going to live for eighty years, or whatever, and perish? Now look at this world. This is a world where Donald Trump is leading in the polls. It’s a world full of nuclear weapons. It’s one where there are at least some people a bit anxious about the future.
All my friends had kids anyway. They were like, Oh you have a good point, but I’m just gonna do it.