Recently, the submission deadline passed for Google's Project 10 to the 100th contest, with a stupendous $10 million prize pool. More than 100,000 ideas to change the world were received. My entry has the extraordinary goal of saving the human race from extinction. Here's my idea.
As background, consider the Fermi Paradox. It asks why we haven't had contact with alien lifeforms, despite our belief that alien civilizations are common in the universe. One proposed explanation is that a Great Filter stands in the way of the evolution of civilizations, annihilating them before they reach a stage where they can communicate with other civilizations. Such a filter could be either behind us or in our future. If it is in our future, we need to mitigate its risks and prepare for it.
Regardless of whether you believe in the Filter, the fact is that a number of existential risks threaten our survival as a species. Some philosophers estimate the likelihood of humankind surviving this century at no better than even odds. A significant and increasingly potent class of threats comes from new technologies. Today, we are struggling to manage the threat of nuclear weapons. Looking back on our track record in that struggle, we see times when we were at the brink of launching nuclear weapons on a scale that could have led to a nuclear winter. This is the first technological threat in our history that put our very existence at risk. We were lucky, but we did not handle it well.
Future technologies (some already here) have the potential to be far more destructive. They can be deployed not only by nation-states but by faceless organizations and individual terrorists, and their effects can be global. I am talking about biotechnology, nanotechnology, and strong artificial intelligence. For example, the genomes of pathogens behind extinct epidemics are already in the public domain, waiting to be resurrected in a bio-attack.
It is of supreme importance that access to and usage of new technologies be globally managed and regulated. We cannot take any risks when it comes to humankind's survival; there are no lessons to be learned from a fatal mistake. I would expect the United Nations (UN) to be more alert to these issues than it currently is. If the UN doesn't step up here, another organization has to.
In this perilous vacuum, I want to see a Council for the Survival of Humankind established: a global organization whose agenda is mitigating existential risk.