"I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals." --Bill Joy, Why the Future Doesn't Need Us
"In the game of life and evolution there are three players at the table: human beings, nature, and machines. I am firmly on the side of nature. But nature, I suspect, is on the side of the machines." --George Dyson, Darwin Among the Machines
To my mind, the only rational, responsible, forward-thinking philosophy to adopt, given that I find myself on a planet dominated by a violent, primitive species of primates capable of building hydrogen bombs, is radical transhumanism.
What do I mean by radical transhumanism? I mean the belief that Homo sapiens must be technologically modified to survive in a technological civilization, just as we were biologically modified to survive as hunter-gatherers on the African plain. Mutation and natural selection obviously won't cut it anymore; we need to take control of our evolution now, on the accelerated time scale of our technological development, before it's too late.
Too late for what, you ask? Too late to survive this century with our civilization intact; too late to avoid nuking ourselves into oblivion, exhausting our vital natural resources, laying waste to our planetary environment, or unleashing self-replicating super-viruses, nanobots or robot armies that wipe us out. The scenarios for self-destruction are well-known, and they won't go away as long as our technological civilization survives. Our current status as "rapacious, tribal monkeys with nukes" is inherently unstable, and is unlikely to last for long. We either go all-out for posthumanity by altering our biology, developing cybernetic forms of intelligence, and evolving into a more controllable super-organism, or we perish.
Ultimately I believe the Borg model is the one we will have to adopt. Individual minds which can destroy the world at will simply won’t be permissible in a post-Singularity future. This is the paradox that no one seems to be addressing: we create technology to empower ourselves, so that we may, as Stephen Wolfram promises, “be able to do anything we want to do.” But how is this a viable goal so long as even one pathological individual can decide she wants to annihilate everything? Current notions of individuality and freedom will therefore have no place in a truly super-empowered technological future. There will have to be strict limits on thought and behavior, enforced presumably by the architecture of the super-intelligence itself.
If you need further proof of the utter inadequacy of our species, recall the story of Germany from the last century. The same great nation that produced the sublime genius of Nietzsche, Heisenberg, Schrödinger and Zuse gave us the genocidal barbarism of the Nazis. What more needs to be said? Our destructive power grows in proportion to our technological prowess, and all of human history is one long demonstration that there are no political, religious or cultural solutions to the inherent flaws of our species. The only solution that hasn't yet been tried is radical self-modification via technology -- and the means to do so are now at hand.
Martin Rees calls the 21st century our final hour, and I tend to agree. This century will probably be the swan song for our species in one way or another, but I would prefer that our demise be voluntary, and that something better follow after us. So we must be bold, brave and brilliant as we create the posthuman future. The risks of pursuing the radical transhumanist agenda are great, and there are many unknowns, but they are risks I believe we must take if we value the long-term survival of Earth-based intelligent life.