Existential Risks to Humanity, Phil Torres Interview by Mike Gilliland and Euvie Ivanova on Future Thinkers Podcast

Mike: Thanks, Phil, for joining us on the show. It’s good to have you.

Phil: Yeah, thanks for having me, this is wonderful.

Mike: You are the author of a book called The End, sounds very scary.

Phil: Yeah, the topics that it discusses are rather unsettling but I would say that the ultimate message of the book is quite optimistic.

Mike: What motivated you to write the book?

Phil: I’ve always been interested in the subject of eschatology, which is the study of the end of the world. [00:01:30] Traditionally, that’s been a branch of theology, going back many millennia – several millennia before Christ was born – to the ancient Persians, one of whom was a guy named Zoroaster, who founded Zoroastrianism. He introduced the first eschatological narrative. It had essentially an Armageddon-like battle at the end, resurrection of the dead, final judgement, and so on. Then Christianity and Islam borrowed quite a bit from this. I grew up in a very evangelical, fundamentalist [00:02:00] household. I think that is what instilled the interest in end times scenarios.

Then what’s fascinating is that in the early 2000s a field emerged – one essentially founded in 1986 by a guy named John Leslie, who wrote a book called The End of the World, in which he gives a rigorous look, from a scientific, evidence-based perspective, at various scenarios that could result in human extinction or some sort of global catastrophe that pushes us back into a [00:02:30] Palaeolithic state for an extended amount of time. In the early 2000s, people became inspired by this topic and started to develop a rigorous framework for talking about these end times scenarios. It fascinated me that the millenarian tendency extends across time all the way back to the origins of human civilization, but only recently has it been infused with this new epistemology, which is scientific and evidence-based, rather than [00:03:00] based on faith in revelations that were privately revealed to prophets.

Euvie: Yeah, I think in the last few decades a lot of scientists have been talking about specifically climate change as being a major threat to us. In 2015, there were so many articles saying that it was by far the hottest year on record and that things are getting really serious now.

Phil: Yeah, totally. The Bulletin of the Atomic Scientists – I’m not sure if you’re familiar – they have a Doomsday Clock which is supposed to [00:03:30] represent our collective proximity to some sort of global catastrophe. They move the minute hand back and forth, where midnight represents doom. In 2015, they moved it forward from five minutes to three minutes. The only time it’s been closer than three minutes was in 1953, when it was at two minutes, and that was when the US and the Soviet Union both detonated hydrogen bombs. Their Science and Security Board consists of many very respectable individuals [00:04:00] and, according to them, we are really dangerously close to some sort of catastrophe.

In fact, one of the two biggest reasons they cite is climate change. The Paris climate accords are a good gesture, but there are no penalties if countries withdraw or fail to meet the various standards that they’ve set.

Mike: Aside from everyone dies.

Phil: Yeah.

Euvie: Pretty big penalty.

Phil: I would agree. I think there’s a big difference between being an alarmist and being [00:04:30] alarmed. I think those are oftentimes totally confused in public conversations. The difference between them is the degree to which one’s level of alarm is based on the best available evidence. When you look at the evidence, there really is pretty good reason for being a little bit alarmed about the climate, about nuclear weapons. Then there’s existential risk studies, the field I was mentioning earlier; the Bulletin looks at immediate threats – climate change and nuclear weapons are immediate threats.

Existential risk scholars tend to [00:05:00] take a more futurological view and peer a couple of decades into the future, where suddenly there’s this whole other array of really big picture risks from biotechnology and nanotechnology and even superintelligence. Something to look forward to.

Mike: We were just talking about this today, about the danger of a home garage chemical lab, how that’s becoming more widely available, and how an individual terrorist with a bit of knowledge about chemistry [00:05:30] could create a biological weapon easily in his garage.

Euvie: Or someone with some programming knowledge could create artificial intelligence that goes rogue.

Phil: Yeah, definitely. I think one of the biggest issues is that nuclear weapons continue to be very difficult to make. You really need quite a sophisticated understanding of the device and the physics. You also need some very rare materials, like uranium and plutonium. Not only are they very hard to get a hold of, but they are extremely dangerous to handle. That’s not the case with [00:06:00] microbes. Once you get anthrax, you’re not going to suddenly run out in the lab. It’s a self-perpetuating material. I think one of the biggest issues is that a lot of these technologies are not only becoming more powerful, not only enabling humanity to rearrange and manipulate the physical world in really unprecedented ways, but some of them are also becoming much more accessible.

That’s not the case with nuclear weapons, at least not so far, but it definitely is the case with biotechnology, and probably with future nanotechnology like molecular manufacturing. It could also be the case, at least in theory, with AI: it could possibly be the case that some individual in his or her basement hits upon the right code to create a self-improving program. I completely agree. Every generation has said, “The end is near. We’re in a unique time in history.” But I think there’s actually a pretty good evidence-based argument for [00:07:00] why we really are on the brink of a new era. The ability for mass destruction is going to be distributed among citizens all around the world, among many, many people.

Mike: I’m interested in how your research into religion and prophecies about the end times plays into this new technology, and your insights from that. What is your background with religion, and how do you see it playing any kind of role with these new threats?

Phil: My background? I’ve studied religious eschatologies quite a bit, particularly the [00:07:30] Islamic narratives and Christian narratives as well. In particular, dispensationalism – any time you’ve heard talk of the rapture, the reference is dispensationalism; it’s the most popular view among Americans. I believe even in Africa there’s a huge number of people who accept this particular eschatological framework.

Euvie: Can you explain quickly what that is, just for our listeners who don’t know?

Phil: Yeah, sure, sure, sure.

Mike: And for me.

Phil: For Christians of this particular camp – [00:08:00] and there have been many; Ronald Reagan was a champion of this view, and some of the Republican presidential candidates right now have a long history of close ties with really fanatical Christians who are raving about the rapture being imminent and wars in the Middle East being essentially a good sign, something to celebrate rather than despair over. Basically, the view is that believers around the world right now will be raptured into the clouds [00:08:30] to meet Jesus. He’ll come back to earth and then gather up all of the believers at the time, and they’ll go to heaven for a seven-year period called the tribulation.

During that time, the anti-Christ will rise up, take over the UN or the EU, depending on who you ask, and he’ll sign a peace agreement with Israel. Halfway through the tribulation – three and a half years in – he will break that agreement. There are going to be mass wars, supernatural catastrophes, [00:09:00] disease, hailstones that are a hundred feet wide falling from the sky and, ultimately, that’s going to end with the battle of Armageddon, which will happen in Israel, right around that area. After that, Jesus will come back to earth, kill the anti-Christ, gather all of his followers, and then he’ll create this marvellous, paradisiacal millennial kingdom on earth which will last for 1,000 years.

Then various other events happen after that. This [00:09:30] ties into politics, which has implications for catastrophe in the future, because some of the biggest conflicts in the world right now have been fuelled by this particular view. For example, I’m sure you’ve heard a lot of Republicans talking, for many years now, about pre-emptively attacking Iran. One of the main reasons is that the Bible, on the one hand, says those who bless Israel will be blessed themselves. Another reason is that if you don’t have a Jewish state in Palestine, the tribulation [00:10:00] cannot commence. Without the tribulation, you’re not going to get eternal peace, which is ultimately what they want. There is an eschatological imperative to defend Israel at all costs, no matter what, no matter how many UN sanctions there have been against the Israeli state.

In fact, with the founding of Israel itself, there was a document called the Balfour Declaration, from the early 20th century, which made explicit the UK’s intention to create a Jewish state in Palestine. Balfour [00:10:30] himself was a dispensationalist or, at least, he grew up in a dispensationalist household. There’s reason to believe that he was explicitly influenced by Biblical prophecy. I think there’s a tangle of influences of eschatology from Christianity and certainly Islam – the Islamic State is very much an apocalyptic cult – that go back a long way and are integral to how unstable global politics is right now. With religion, it’s also not just what it’s doing.

If there’s an event, [00:11:00] a possible future event, that doesn’t fit into a narrative, there’s a tendency – one could argue – for religious people to ignore it. Climate denialism is highest among Christians in the US. Some major Christian leaders have been explicit about that. John Shimkus is a Republican congressman who said in Congress, probably five years ago or so – people can watch the video online – “In Genesis, God promises Noah after the great deluge that never again will there be a flood to destroy the earth; therefore, we ought not to [00:11:30] worry about global warming.” It just doesn’t fit into the narrative. It’s a huge danger in that respect, as well.

Mike: Yeah, I had never thought about how the belief in these end times and the narrative that you’re describing would motivate them to push it forward and make it happen. Then there’s the whole ignorance of the other side of things, where there are actual real risks that they’re dismissing: “It doesn’t fit in with our narrative, it’s not going to happen.”

Phil: [00:12:00] Yeah, I think that’s totally right. Actually, to zoom out just a little bit, to give the book more context: the issue that I’m really interested in – a lot of people have said fairly sophisticated things about the technologies that could be exploited to induce some sort of global catastrophe. We mentioned them earlier: biotechnology, somebody in their garage or something with a little lab messing with advanced nanotechnology, something of that sort. People have said, “Here’s some scenario, the grey goo scenario, there’s the possibility of engineered pathogens,” [00:12:30] and so on and so on. What I’m really interested in – and I feel like nobody’s really talked about – is that those tools aren’t, by themselves, going to initiate a catastrophe.

They require an agent. A risk’s potential cannot be realized unless you have an agent–tool coupling. Almost nobody has said anything about the agents. Who would want to destroy the world? Who would want to bring about a global catastrophe? That’s why I end up spending a lot of time talking about religion and, in particular, apocalyptic movements [00:13:00] that not only believe the end is imminent but see themselves as catalysts for accelerating that whole process. As many, many terrorism scholars have written, apocalyptic groups are the most dangerous because they’ve got nothing to lose; as one author wrote, they don’t want a seat at the table, they want to destroy the table and everybody at it. Moving forward, not only should we keep an eye on what technologies are going to make possible as a means to an end, but we also need to look at the various agents who might exploit those means to bring [00:13:30] about an end that isn’t compatible with the future flourishing of humanity, or with us reaching a post-human state or something of that sort.

Mike: It’s such a complicated and difficult subject when you have these groups of people that are entirely motivated to make everything bad happen to us.

Euvie: If they believe that they’re righteous and they’re going to meet their God or Jesus or Allah or whatever, they think it’s actually going to be a good thing and they will benefit from it.

Phil: Yeah, you could certainly imagine, moving forward, with climate change and biodiversity loss, that these [00:14:00] things, rather than waking people up to the reality of our precarious situation on spaceship earth, could actually reinforce their eschatological views. It’s the most unproductive reaction, but they could respond to climate change with glee and eschatological elation: “It’s finally happening, the end is near, Jesus is going to come back and finally this weary world of sin and suffering will be destroyed for good.”

Mike: What do you see as a solution for this? Easy question, right?

Phil: Yeah, that’s an easy one. [00:14:30] In the last chapter of the book I put forward some possible strategies, like perhaps having an agency that monitors nanotechnology or something – a non-governmental institution that keeps an eye on what’s happening around the world. I don’t really talk about specific things; I really talk about big picture possibilities. I think education is certainly one, although I have doubts as to its efficacy. If you look back [00:15:00] at history, there’s a long record of people adopting just crazy views. I think there’s something about our cognitive hardware that makes us susceptible to certain kinds of delusions.

I’m not sure that education alone could ultimately solve the problem. The three biggest things – and I would love to know what you guys think about this – I feel like the three biggest options are a bit futuristic but also, I would argue, entirely within the realm of future possibility within the next several decades or so. One is space colonization. The idea is that the wider [00:15:30] you spread out in the world, the less likely it is that a single incident is going to have worldwide consequences. That makes sense, of course: if you had a colony on Mars, you could have a great disaster on earth and the Martians would survive. Elon Musk has talked about, I think it was the late 2020s, as when he hopes to have this; NASA’s talked about the 2030s. It could be in our lifetime that we see this dream realized.

Superintelligence is a possibility. If you had superintelligent [00:16:00] foresight guiding us forward through this obstacle course of risk, we might have a much better probability of surviving in the end. Of course, superintelligence itself poses perhaps the greatest existential risk to our long-term survival. I take it that education is a software-level modification. You’re essentially giving people better software: “Here are some tools for thinking, here’s what it means to think clearly about certain issues, here’s what a justified belief is,” and so on. Alternatively, you could attempt a hardware modification.

I think this is [00:16:30] where transhumanism comes in. If you had some way of redesigning the architecture of our minds – easier said than done, of course – perhaps the result could be a more rational individual who’s less likely to succumb to crazy delusions about climate change being a global hoax, or Jesus returning at some point in the future to make everything right. Those are the three big picture strategies that I think could potentially offer a way forward on a long-term basis.

Mike: The space colonization thing, it’s scary to me because [00:17:00] best-case scenario, we’re on a desert planet. We’ve kind of left Eden already to go somewhere that’s 1,000 times worse than what we just left, and it’s so much more difficult to get it started, to terraform, and to get it to any condition even close to what we currently have. If we can’t collectively move up in consciousness before we move up technologically, then we’re in trouble. I’m curious if you’ve thought about this from the perspective of meditation or psychedelics – how do you [00:17:30] improve empathy and improve awareness?

Phil: To be honest, I haven’t given it a whole lot of thought. I am reminded of experiments – I’m sure you guys have heard about this, I think it was from a couple of years ago – where a scientist developed a pill that people just took and suddenly became much more empathetic. I might be misremembering, but I think it was oxytocin; that’s perhaps more of a technological, pharmacological means for achieving this end. As for psychedelics and meditation, ultimately, I guess the issue that I’ve been focusing on is rationality. Not just instrumental rationality, not just the ability to [00:18:00] get what you want, but a moral rationality, and then trying to achieve this through technological means, namely through this transhumanist route of modifying our phenotypes.

I don’t know, you may have a point. I’ve talked to many other people who have done shrooms and, generally speaking, there were oftentimes moments of euphoria – perhaps like when you’ve had a few drinks, or during meditation – moments when the self dissolves and suddenly you are overwhelmed with a feeling [00:18:30] of oneness and camaraderie and so on. Yes, certainly those feelings are important in this increasingly dangerous world, with all sorts of dangerous disagreements.

Mike: There are, interestingly, some theories about different prophets who have gotten their insights and communication with God through some sort of sacrament. It’s often not talked about, but there have been researchers looking into those types of sacraments who have found a lot of evidence of psychoactive chemicals. [00:19:00] Moses and the burning bush could have been some sort of DMT-containing bush. If you’ve ever experienced a DMT trip – if you didn’t know better, and if you didn’t know that that’s what you were getting into, I could so, so easily see how you could pull messages from this experience and say, “God was speaking to me.”

At the core of some of these experiences that we’ve had with psychedelics, it’s been a unity consciousness, it’s been more empathy that has come out of it. I’m not sure that that happens for everybody but [00:19:30] it’s interesting actually to think about the prophecies from the perspective of, “Well, what if it was inspired by psychedelics?”

Phil: Yeah, definitely. I don’t know if you guys have ever read the Book of Revelation.

Mike: Nope.

Phil: It is a really trippy narrative. It’s full of seven-headed beasts and just the most bizarre characters and bizarre events. It really does read as somebody’s real-time diary as they’re under the influence of some [00:20:00] pretty heavy drug. I know scientists have also talked about this possibility for various prophets in the past. Muhammad, as I understand it, is a candidate here, and perhaps even John, who wrote the Book of Revelation – or perhaps he wasn’t on a drug. Instead, these individuals might have been suffering from a neurological disorder called temporal lobe epilepsy.

Mike: Interesting.

Phil: Yeah, it’s very interesting. I don’t want to overstate the case. I think it’s a contentious [00:20:30] view, but part of the reason that they’ve suggested this is that people who suffer from this neurological pathology tend to have exactly these sorts of experiences, which are perhaps best described as spiritual. I’m sure you’ve read as well that when people have epilepsy or something and they’re going in for brain surgery, oftentimes surgeons will electrically stimulate various parts of the neocortex to make sure that they don’t remove something really essential. You can induce all sorts of extraordinary experiences in these people [00:21:00] just by stimulating the brain. It is quite plausible that hallucinogenics or just a brain disease is behind some of these really extraordinary visions of the future.

Euvie: Yeah, there’s a documentary that touches on this called God on the Brain and it studies patients with epilepsy who have had these kinds of revelatory religious experiences.

Phil: Yeah, that’s really fascinating.

Mike: I’d like to actually go back to when you were talking about alarm versus alarmists. Can you talk a bit more about that?

Phil: Sure. [00:21:30] This gets back to an issue that we were talking about before, namely, epistemology. There are many aspects of contemporary philosophy that are so arcane, and they deal with such esoteric issues – issues that are very far removed from anything that would affect quotidian life. But I think epistemology could make a genuinely significant contribution to the public discourse, to the overall reasonableness of society. As Sam Harris once said – [00:22:00] it’s a very good line – there’s no society in history, and no individual, that’s ever been harmed by being too reasonable, by being too much of a stickler about having their beliefs proportioned to the best available evidence.

The idea with being alarmed versus being an alarmist is just that sometimes the end presentation could be identical. You could have a climatologist who’s saying, “We’re screwed, this is really bad. I’m not going to have kids.” You could go talk to a [00:22:30] prepper who believes that there’s going to be some major earthquake in the heartland of America that’s going to result in the collapse of American society. Their degree of alarm is comparable. One would count as alarmism for purely epistemological reasons; it’s not a reasonable view to be worried about a zombie apocalypse, or worried about Jesus returning, or about events in the Islamic narrative.

But it is very, very reasonable to be worried about [00:23:00] climate change, biodiversity loss and then, again, from a more futurological perspective, to be worried about the possibility of a lone wolf or a small terrorist group getting a hold of Ebola and then weaponizing it in various ways. Those are rational concerns to have. In that sense, although an existential riskologist may sometimes appear alarmist, I would strongly argue that they’re really not. It’s all about the reasons, it’s all about the arguments. It’s all about the rationality of these various beliefs. [00:23:30] Another way to put it is that beliefs are the destinations of a journey toward truth, and that journey should always be guided by the evidence.

Beliefs should always be the destinations and never the points of departure. I think religion is perhaps the best example we have in the world of systems that begin with their beliefs – the beliefs are the points of departure, and then they spend all of their time trying to defend those beliefs against new research that comes in, and so on and so on. You could have two people on the corner of a street waving their arms, going, “The end is near.” They look perhaps indiscernible; [00:24:00] they’re engaged in the exact same activity. But if you look a little bit closer, perhaps one’s fears are based on legitimate science, while the other has various concerns that are based on faith in revealed statements about the future.

One is not an alarmist, even though he or she may be quite alarmed, and the other very much is an alarmist. The reason I mention alarmism is because that term is used a lot in the media, particularly in talking about climate change, [00:24:30] particularly in right-wing media – “the climate alarmists” and so on and so on. I just want to provide some kind of terminological clarity. The climatologists are not alarmists, even though they are quite worried about the future.

Mike: If both people on the street are sitting there waving their arms, they’re kind of saying the same thing in a way, and the problem is no one’s taking action on this, or not enough people are taking action. I’m interested to know why you believe religion is dangerous if it’s somewhat saying the same thing with different motivations.

Phil: One, because the things [00:25:00] that it’s saying, even though they resemble what scientists might say, really are quite different. It’s about very supernatural events happening in the future, it’s about ignoring reality. It could certainly put us in danger if you have an apocalyptic group that believes, as many in the past have, that the world must be destroyed in order to be saved. Then you empower them, perhaps in the future, with not just nuclear weapons but biotechnology, synthetic biology, nanotechnology and so on. They could be motivated, for reasons that are [00:25:30] completely unhinged from reality, once again, to induce global catastrophes that result in profound human suffering and misery, and maybe even, from a more transhumanist perspective, prevent us from realizing all of the good possibilities of future technologies: indefinite lifespans, the elimination of disease, things of that nature.

Really, the issue is epistemology – what is the epistemological basis of different claims? When one’s beliefs are not properly anchored to the evidence, [00:26:00] chances are that’s going to be bad. You’re trying to navigate objective reality with beliefs that have nothing to do with objective reality.

Euvie: Why don’t we circle back to the three solutions that you proposed for possibly preventing these things from happening. One we touched on is space colonization, and another one is superintelligent AI. The third one is transhumanism. How do you believe that this could be a solution?

Phil: Transhumanism?

Euvie: Yeah.

Phil: [00:26:30] The idea is that if you peer back across human history, there’s a long record of humans behaving irrationally, behaving dangerously, being motivated by various religious beliefs to engage in wars and other sorts of conflicts. For me – and I think for others, as well – the conclusion is that if you took nuclear weapons and transplanted them back into the Middle Ages, I’m not sure we would be here right now. The West has gone through an enlightenment period but, as a matter of fact, religion is growing worldwide right now. People who are sceptical of supernatural [00:27:00] agents – that demographic is actually shrinking.

Ultimately, the point is: can humans be trusted with technologies that, by virtually all accounts, are going to be unprecedentedly powerful? Again, in certain circumstances, they will probably be incredibly accessible as well, unlike nuclear weapons. A malicious biohacker can set up a lab for $700 in his or her basement. Can humans be trusted? I’m not so sure. There’s a pretty good case to be made that this is a [00:27:30] recipe for disaster. The whole idea with transhumanism is using technology to modify our mental hardware and bodies in various ways to improve ourselves – make ourselves more rational, wiser, more moral, and so on.

I suspect that a world cluttered with, if you will, doomsday machines and full of humans is a much more dangerous world than one cluttered with doomsday machines and full of posthumans – if we get it right. The idea is that perhaps, to survive, we need to go extinct, [00:28:00] where the extinction here is an instance not of our lineage terminating like the dodo or the dinosaurs, but rather of it being replaced by a new species of human or [inaudible [0:28:10]] or something like that, one that is ultimately cyborgish in physical constitution and, essentially, more responsible and therefore less likely to destroy the world.

Euvie: I’m going to play devil’s advocate here a little bit. Let’s say we go with this route: what if, A, certain people don’t want to become transhuman – [00:28:30] do we force them? And B, what if that backfires somehow?

Phil: On the first issue, I absolutely do not believe a compulsory approach is morally right. It has to be voluntary; I very strongly believe that. That being said, I’m reminded of a fake conversation that Ray Kurzweil has in his 2005 book The Singularity Is Near, between him and Ned Ludd – it’s from him that we get the term Luddite. Ned says in this little ersatz conversation, “Maybe I don’t want to enhance myself in various ways.” Kurzweil says, “[00:29:00] That’s totally fine with me, but you’re really not going to last that long to participate in the conversation.”

I think there probably will be a move towards post-humanity. Once the process gets off the ground, I think probably a lot of people are going to follow it. They’re going to realize they’re not as capable in school; their parents are going to want their kids to be enhanced in various ways. I suspect there will be societal momentum. Importantly, that momentum will be voluntary.

Euvie: Even in this scenario, we still have biological people being born today whose [00:29:30] life expectancy is going to be probably 80 or 90 years. That’s 80 or 90 years that we still have. Even if we had this transhumanist option today, there’s still that period of biological people living and being capable of doing all of these things.

Phil: The destructive things?

Euvie: Yes.

Phil: Yeah, I think that’s a good point. It’s a bit of an urgent situation. The situation is dangerous precisely because there’s this overlap: the human form has hardly changed in at least 30,000 years – I don’t know, maybe 100,000 years – [00:30:00] yet, all of a sudden, we’re in this environment where there are levers within reach that, if we pull them, could obliterate society. If the instrumental rationality of our means – the ability to produce change in the world – far surpasses the moral rationality of our ends, that’s a really bad situation. That’s exactly what you’re talking about.

I should add also that not many people have written about these issues. Existential risk studies is in a pre-paradigmatic state; we’re all feeling around in the dark and, to borrow a line from Newton, there just really aren’t many [00:30:30] shoulders to stand on to see what lies ahead. As for the whole idea of transhumanism, with respect to your other point, I completely agree that it could utterly backfire. An enhanced Hitler could be absolutely horrific. How do we ensure that the enhancements are used for good and not for destructive ends? I don’t know. Perhaps when the technologies are developed a little bit more and, therefore, come into view a bit more, maybe we can see more clearly how exactly to navigate this particular issue.

Mike: Do you mention anywhere in this book about [00:31:00] technologies of empathy?

Phil: I don’t really discuss that, no.

Mike: Okay.

Phil: I do in passing mention what I had gestured at earlier, which is the pill that people could take to enhance their empathy. No, I don’t really go into that.

Mike: It seems you focused quite a bit on…

Euvie: Rationality.

Mike: Yeah, exactly. The rational solutions. Yeah, I’m curious as to why that is.

Phil: I think it’s a great question. I would actually say that it’s not the case that I’m not interested in that – I’m very, very interested in it. I think the issue, in terms of writing the book, is that there is [00:31:30] such a profusion of ideas in the literature that trying to bring these together in a coherent manner is a task in and of itself. Some possibilities are just going to slip through. I do think that empathy is quite an interesting possibility. As far as I’m concerned, I don’t believe many transhumanists have talked a lot about it. I think some have discussed it a bit, but I don’t feel like that possibility has a large representation in the literature. Am I wrong about that?

Mike: I think you’re right actually, I never hear crossover. You probably know if you [00:32:00] are familiar with our podcast, we talk a lot about meditation – now recently psychedelic – but we’re also very fascinated in the technology and the possibilities of the future. We found that nobody crosses over in these two fields.

Euvie: Very few.

Mike: It’s very strange because it seems to me that it’s a severe lack of empathy that is at the heart of a lot of our problems on this planet, which increases the risk. We’re kind of seeking these clever solutions to world-level threats, but we’re not really looking at the individual [00:32:30] in terms of how do we enter this global family.

Phil: Yeah, I think that’s very interesting. It reminds me that right now there’s a burgeoning, if you will, of global terrorism, particularly in the Islamic world. A lot of terrorism scholars believe that a root cause of this is globalization. With globalization there are communities that feel threatened, their identity is challenged. Their dignity, as they perceive it, is compromised. I take it that empathy is very relevant to this. Globalization [00:33:00] leaves these communities feeling pushed to the brink, traditions being expunged and so on. I suspect empathy has something to do with why it is that globalization is fuelling this sort of violence.

The existential risk scholars focus almost entirely on the tools and not the agents. On the other hand, you have the new atheists, whose fundamental insight, perhaps, is that religion is not only wrong in a philosophical sense but is dangerous or [00:33:30] is inimical to cultural flourishing and the prosperity of civilization. I think there’s plenty of data that backs that up and I’d be happy to talk about it. I felt like there was extraordinary cross-pollination that could happen between these two fields, but nobody has talked about this so far. It’s exactly the same situation that you’re describing. There seems to be this lacuna in the field – why hasn’t anybody talked about empathy, for example, and the possible benefits, in terms of making the world a safer place, of enhancing empathy through meditation [00:34:00] or psychedelics or whatever. I rushed in to try to start a conversation between the new atheists and the existential risk scholars. Perhaps you should explore the empathy possibility.

Mike: I sometimes jokingly consider myself a recovered psychopath, so it’s an interesting topic for me.

Euvie: Learning how to be more empathetic.

Mike: Yeah, exactly. So much experience and knowledge has come from the use of psychedelics, but then [00:34:30] also that has fuelled my rational exploration of different topics – of technologies, of empathy and that sort of thing. I don’t mean to try and pull you out of your wheelhouse here with those sorts of questions.

Phil: No, no, not at all. I think it’s a fantastic idea and I really don’t have a good answer to why I’ve ignored it. There’s a line at the beginning of Bostrom’s most recent book Superintelligence where he acknowledges that everything in his book could be wrong. Again, he’s in a similar situation; he’s talking about [00:35:00] issues where there just aren’t a lot of shoulders to stand on to make insightful and accurate claims about what we might confront in the future.

Mike: Yeah, we’re all dedicating our lives to making a pin prick in advancement of human intelligence and we have to become so specialized to make any kind of effect. I think you’re totally right; everything’s new, there’s no shoulders to stand on here. Anything we do, if we dive deeply enough into it, we’re just so specialized, it’s hard to pull from other disciplines without watering [00:35:30] down your thesis.

Phil: Yeah, I think that’s a huge challenge. Also – and I don’t think this is a hyperbolic claim either – I don’t think there’s been a time in human history when it’s been more important to have big picture thinkers. We live in a time of specialization, fragmentation, but this is really when we need somebody to get a bird’s eye view of the whole situation and then draw reasonable conclusions based on that particular perspective. It is really hard because the world is super complicated, technology’s changing all the time, there are many factors. Yeah, [00:36:00] there’s a depth-breadth trade-off.

Given the limitations of time and memory and things of that nature, the deeper you go the less you’re able to see how your sub-sub-subfield fits into the larger field. Then the broader you go, you’re missing all sorts of details on a topic like empathy and how it might play a role in ensuring a safe future. This is actually one of the reasons that I would argue for cognitive enhancement – to increase our capacity to juggle [00:36:30] and consolidate and integrate large amounts of information from a multiplicity of different domains, thereby enabling us to, perhaps, get a better view.

Euvie: Yeah, I think increasingly there’s more and more need and more value in bridging different fields.

Phil: Yeah. I try my best in the book and the aim of the book is simply to inspire other people, like, “Oh, it’s actually perhaps important to understand religion,” religion being one of, perhaps, the most important cultural phenomena in the world. [00:37:00] To give an example, 2076: that is a year that means nothing to the vast majority of people who study existential risk technologies or even terrorism. But that is a year when, on the one hand, there are probably going to be immensely powerful destructive technologies, and, on the other, it will probably be particularly dangerous for humanity. The reason is that it roughly corresponds to 1500 in the Islamic calendar. Historically, the turn of the century is a period of increased apocalyptic [00:37:30] fervour.

For example, there was the Iranian revolution, which was widely seen as an apocalyptic occurrence, and there was also an incredible event called the Grand Mosque Siege in Mecca. The Grand Mosque is the biggest mosque in the world and it’s built around the Kaaba, which is the most sacred site in Islam. It is a huge mosque. The first time I saw a picture, I couldn’t believe that I had never heard of this before because it’s just so massive. In that same year as the Islamic revolution, 1979, [00:38:00] a group of apocalyptic nutcases took over the Grand Mosque and held something like 100,000 people hostage for a while, claiming that they had the end of days Messianic figure with them.

Later, many of the members were beheaded, because it’s Saudi Arabia and that’s how they execute people to this day. Point being, that was the year 1400 in the Islamic calendar, which corresponded to 1979. 1500 is roughly 2076. If you don’t study [00:38:30] religion, if you’re just looking at the tools like a lot of existential riskologists are doing, you’re going to miss that. In the years leading up to 2076, you’re very, very likely going to have an increase in end times enthusiasm within the Islamic community. There’s a poll that projects religion is on the rise, in particular Islam. Islam is the fastest growing religion in the world. By 2076, it could be that 50 percent of the world is Muslim or something.
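The calendar arithmetic behind those dates can be sketched with a standard approximation: the Islamic (Hijri) year is lunar, about 354.37 days, so one Hijri year is roughly 0.970224 Gregorian years, counted from the 622 CE epoch. A minimal sketch (the function name is just illustrative):

```python
# Approximate conversion from an Islamic (Hijri) calendar year to a
# Gregorian year. The Hijri year is lunar (~354.37 days), so one Hijri
# year is about 0.970224 Gregorian years, counted from the 622 CE epoch.
def hijri_to_gregorian(ah: int) -> int:
    return round(ah * 0.970224 + 621.57)

print(hijri_to_gregorian(1400))  # ~1980: the Grand Mosque Siege began in late 1979
print(hijri_to_gregorian(1500))  # ~2077: the "roughly 2076" date discussed above
```

Because the two calendars drift against each other, this only pins the year to within about one Gregorian year, which is why 1500 AH is cited as "roughly 2076."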

That means, just from an absolute numbers perspective, the chance that [00:39:00] somebody is going to grab hold of a very powerful advanced technology and try to use it to destroy the world to save it might be quite high. Big picture thinking, I feel, is very important but also, it’s so hard to do well.

Mike: What are some of the potential risks that scare you the most?

Phil: I think it really depends on the time scale. I mentioned earlier that AI probably poses the greatest threat to the long-term future of humanity. I think Bostrom and Stephen Hawking, these individuals would concur. More immediately, I worry a lot about [00:39:30] climate change and biodiversity loss. Not because I think that those are going to result in some sort of extinction event – I think that’s unlikely, although it could happen. I think the biggest danger posed by these two phenomena is that they are conflict multipliers that will probably change the conflict threshold between state and nonstate actors around the world. As resources become more scarce and tensions rise, the chance of terrorism is going to increase, the probability of a war breaking out [00:40:00] will increase.

There was actually a study published in the Proceedings of the National Academy of Sciences last year that essentially drew a causal line from anthropogenic climate change to the rise of the Islamic State in Syria. It’s quite an extraordinary claim but it’s also very plausible. One event clearly led to the other. There was probably a concatenation of five events, one leading to the next, and at the end was the Islamic State, the biggest and best-funded terrorist organization in human history. The Department of Defense and [00:40:30] the current CIA director, John Brennan, have all talked about climate change as something that is going to fuel terrorism in the future.

Again, you can imagine a world with a healthy biosphere that contains humans and extremely dangerous technologies. That is clearly going to be safer than a world that contains extremely dangerous technologies and humans and is undergoing extraordinary environmental degradation. That’s certainly a much more risky situation. [00:41:00] I really worry about biodiversity loss and climate change. I think mitigating those two phenomena is one of the best things we could be doing right now.

Euvie: Yeah, I can see what you mean. From what I understand, the Middle East wasn’t always a desert and then at some point it became a desert and that put pressure on the region. It makes sense with what you’re saying how it could actually exacerbate conflict.

Phil: Yeah. There was another study that came out – I believe it was last year, as well – that was totally misreported by the popular media, unfortunately. Their claim [00:41:30] was that, on a time scale of decades, there will be heatwaves throughout regions of Iran – that general region, I can’t remember in detail where else – but there will be heatwaves that are essentially not survivable by humans. They’ll reach what is called the wet-bulb temperature of, I think, 95 degrees – that’s basically the point where your natural thermoregulatory process of sweating is no longer effective. You stand outside, you sweat all you want, you’re not going to cool down, your body is going to overheat.

Euvie: [00:42:00] Cook.

Phil: It’s quite an extraordinary study. Parts of the region will be unliveable for certain amounts of time by the end of the 21st century.
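For reference, the survivability threshold in the study being described is usually stated as a wet-bulb temperature of 35 degrees Celsius; the "95 degrees" figure is that same value expressed in Fahrenheit. The conversion is simple arithmetic:

```python
# The 35 °C wet-bulb survivability threshold expressed in Fahrenheit,
# which is where the "95 degrees" figure in the conversation comes from.
def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

print(celsius_to_fahrenheit(35))  # 95.0
```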

Mike: What gives you the most optimism about the future?

Phil: Good question. To be honest, when I participate in the existential risk community and read others’ work, I see that there are other people who are very smart working on these very difficult issues, who really care about the future of humanity, about future lives, about realizing all of the potential marvels of future technology. I feel like my book aims to be [00:42:30] something like the counter to Peter Diamandis’ Abundance. I don’t know if you’ve read that before.

Euvie: Yes.

Phil: Yeah. I have no quarrels with anything he says there; I think the future could be amazing, I just really want to make sure we get there. Again, I feel like people have neglected the agential aspect of the agent-tool coupling, so I wanted to focus on the tools, the scenarios, and then also the various agents, so we can maximize the probability of an okay outcome.

Mike: It’s almost like [00:43:00] you’re defining variables in an equation here before we enter the game.

Phil: Yes, to some extent. For the purpose of conceptual clarity. I wouldn’t even argue that anything I’ve said is necessarily right. I believe it’s reasonable and worth considering moving forward. Again, nobody has really talked about the agent stuff, about the nature of religion, and how end times beliefs held by a huge, huge number – literally billions by 2050 – are going to interact with all of these technologies in the future. Yeah, doing my best [00:43:30] to feel around in the dark and, hopefully, there will be another generation that is inspired by it and does much better work, which is how progress happens.

Euvie: I would like to touch on the superintelligence as being a potential solution. What do you think are the possibilities here?

Phil: Yeah, superintelligence could – if it’s friendly, which is certainly a very difficult task to accomplish – could help us solve climate change, biodiversity loss, could help us to enhance ourselves in an effective manner. Even more theoretically, I’ve talked about there’s a [00:44:00] difference between a quantitative superintelligence and a qualitative one. Quantitative is, as the name suggests, it’s one that basically has our capacities but can do everything we do much faster. Or, alternatively, simply has the exact same type of memory, except expanded – it can remember more things and it can process that information at a high rate. There’s also a qualitative one that has access to concepts that are inaccessible to us.

From a philosophical perspective, we have [00:44:30] these minds that are embedded in this mind-independent reality. Within our minds, we represent aspects of reality via concepts. It follows that, if our concept-generating mechanisms are limited, if they can’t generate certain concepts, then the aspects of reality that those concepts correspond to will be forever unknowable to us. Does that make sense?

Euvie: Yeah, totally.

Phil: Just as a dog, no matter how hard it tries, is not going to comprehend what a boson is. It just does not have [00:45:00] the necessary concept-generating mechanisms. We’re part of the animal kingdom, the difference between us and a dog is a matter of degree; there’s every reason to think that our minds are limited and, in principle, there are just some concepts we cannot grasp. [inaudible 0:45:15] call it cognitive closure or epistemic boundedness. It could be the case. If you have a qualitative superintelligence, it could potentially gain access to a whole new library of concepts and, therefore, devise theories that will forever elude us.

It could [00:45:30] even, from a risk perspective, identify risks in the universe that exist right now but that we not only don’t know about but can never know about. Just as a dog is forever ignorant of the risk of an asteroid impact – you can’t explain this risk to the dog even though the risk haunts the dog no less than any other organism on earth. There could be cosmic risks right now – perhaps all around us, who knows – that pose a very serious danger to our future. If we were to succeed in developing a qualitative superintelligence, [00:46:00] it could go, “I see there are these five risks that you guys have totally ignored,” and it would try to explain the risks to us and we’d have no idea, we just couldn’t comprehend what it’s saying.

It could even potentially protect us from cosmic phenomena like that. It’s quite theoretical but I think there’s actually an argument to be made for these possibilities. The idea with the qualitative thing is that you might have phenomena out there that are incredibly simple – they don’t have to be complicated at all, they could be as simple as it gets – [00:46:30] yet we just don’t have the concepts for them because we evolved at essentially this mesoscopic level of reality. We’re really good at understanding mesoscopic phenomena, but when it comes to the very big stuff, or the very small stuff, our conceptual abilities are really strained. To take it one step further, again, there could be cosmic phenomena that are super simple but we just don’t have the concept for them. They are to us what a boson is to a dog.

Euvie: I’m wondering what can an average person actually [00:47:00] do about this? Obviously, scientists are studying this stuff and philosophers are thinking about this stuff, but what can an average person do to actually help the situation or educate themselves?

Phil: Good question. I think educating yourself is something you could do. I think voting for the right politicians is most certainly something that is within the individual’s power, then I would say living a more carbon-neutral life. Maybe another thing to do is to donate, insofar as one [00:47:30] can, to support institutions like the Future of Humanity Institute and the Centre for the Study of Existential Risk, and various others – there are a handful out there. Study basic epistemology, understand what it means to have a justified belief, then do your best to proportion your fears to the best available evidence considered in totality.

Mike: I think that’s all the time we’ve got for this episode, Phil. I’m really glad you came on and talked about all of this.

Phil: Yeah, I’m very, very thankful that you asked me. This is really wonderful.

From global climate change to nuclear war, there are many things threatening our existence at any given time. Existential risks are those that endanger the existence of the entire human species. These are threats that can potentially lead to human extinction, or cause irreversible damage to the human civilization.

End-times worries and predictions are nothing new: philosophers and religious figures have been proclaiming that “the end is near” for thousands of years throughout human history.

There is even a special field of theology focused on studying end-times narratives, called eschatology. Traditionally, this field of study was mainly a religious one. However, in modern times scientists have also taken an interest in end-times scenarios, and a new field of existential risk studies was born.

Existential Risk Studies

The field of existential riskology attempts to understand and weigh these threats, and to come up with ways to lessen or mitigate them. It became a serious scientific field of study after the Second World War – not surprisingly. In 1947, the Bulletin of the Atomic Scientists created a “Doomsday Clock” to symbolize how close the world is to global disaster – the closer the hand of the clock is to midnight, the greater the threat.

In 2015, the hand of the clock was moved to 3 minutes until midnight – the second closest we have ever been to potential global disaster. Concerns over lack of action about global climate change and the stockpiling of nuclear weapons by many countries were the main reasons for this. Since the establishment of the doomsday clock, the only time the hand of the clock was closer to midnight was in 1953, when the United States and Soviet Union both detonated thermonuclear bombs.

Future Technologies and Existential Risks

As we progress through the 21st century and develop new technologies like nanotechnology, biotechnology, and artificial intelligence, we will be faced with new existential risks the likes of which we have never seen before. As these technologies become more advanced and more accessible to the average person, the risk of abuse or misuse will also increase.

It is important to study the scientific data that bears on the major existential risks we are facing today. Risks like climate change can no longer be denied, considering how much data we have showing that it is real and man-made. It is also important to look ahead and understand what risks we will be facing in 10, 20, 30 years.

Phil Torres and the Religious Extremism Factor

Phil Torres is a researcher and author who has been studying existential risks for a number of years. He has worked with the Institute for Ethics and Emerging Technologies for nearly 10 years, where he is an affiliate scholar. He has recently published a book called The End: What Science and Religion Tell Us about the Apocalypse. In this episode, we speak with Phil about his research and views on the main existential risks humanity will face in the 21st century, and the possible solutions to them.

Phil’s research focuses on the interplay of future technologies, and those who could abuse or misuse them. His book puts special attention on religions that have end-times narratives, and the people who may be interested in bringing those predictions into reality.

Quotes:

“When one’s beliefs are not properly anchored to the evidence, chances are that’s going to be bad. You’re trying to navigate objective reality with beliefs that have nothing to do with objective reality.” – Phil Torres

In This Episode of Future Thinkers Podcast:

  • [03:05] – What existential risks are we facing today?
  • [05:00] – Existential risks of the future
  • [07:10] – Why religious narratives are an important thing to study
  • [12:31] – Understanding the relationship between the “agents” and the “tools”
  • [14:24] – Potential solutions to future existential risks
  • [21:20] – Alarm vs. Alarmists
  • [31:59] – How can increasing empathy help?
  • [42:04] – Optimism for the future

