
Existential risks aren't cumulative; we can only die once.

Bostrom, 11 [Nick Bostrom, Professor in the Faculty of Philosophy & Oxford Martin School, "The Concept of Existential Risk," http://www.existential-risk.org/ // AC]

Finally, when considering existential-risk probabilities, we must recognize that one existential catastrophe can preempt another. If a meteor wipes us out next year, the existential risk from future machine superintelligence drops to zero. The sum of all-things-considered probabilities of disjoint (mutually exclusive) existential risks cannot exceed 100%. Yet conditional probabilities of disjoint existential risks (conditional, that is to say, on no other existential disaster occurring preemptively) could well add up to more than 100%. For example, some pessimist might coherently assign an 80% probability to humanity being destroyed by machine superintelligence, and a 70% conditional probability to humanity being destroyed by nanotechnological warfare given that humanity is not destroyed by machine superintelligence. However, if the unconditional (all-things-considered) probability of our being eradicated by superintelligence is 80%, then the unconditional probability of our being eradicated by nanotech war must be no greater than 20%, since we can only be eradicated once.[9]
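
To make the card's arithmetic concrete, here is a minimal sketch of the probability math. It assumes only the standard decomposition P(nanotech) = P(no superintelligence catastrophe) × P(nanotech | no superintelligence catastrophe); the 80% and 70% figures are the card's own example numbers, and the variable names are illustrative, not Bostrom's.

```python
# Worked example of Bostrom's point: conditional probabilities of disjoint
# existential risks can sum past 100%, but unconditional probabilities cannot.

p_si = 0.80                # unconditional P(destroyed by machine superintelligence)
p_nano_given_no_si = 0.70  # P(destroyed by nanotech war | not destroyed by superintelligence)

# Nanotech war can only eradicate humanity in the futures where
# superintelligence has not already done so (the remaining 20%).
p_nano = (1 - p_si) * p_nano_given_no_si  # 0.2 * 0.7 = 0.14

print(f"Unconditional P(nanotech war) = {p_nano:.2f}")                     # 0.14, no greater than 0.20
print(f"Sum of conditional probabilities = {p_si + p_nano_given_no_si:.2f}")  # 1.50, coherently above 1
print(f"Sum of unconditional probabilities = {p_si + p_nano:.2f}")            # 0.94, at most 1
```

The sketch shows why the pessimist's assignments are coherent: the conditional figures sum to 150%, but once preemption is accounted for, the unconditional figures stay within 100%.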
