
End of Humanity?




Oxford, England—The Global Catastrophic Risks conference sponsored by the Future of Humanity Institute concluded on Sunday. Participants were treated to a series of presentations describing how billions of people could potentially bite the dust over the next century. The possible megadeath tolls of both natural and biotech pandemics were considered. The chances that asteroids, comets, or gamma ray bursts from a nearby supernova could wipe out humanity were calculated. The old neo-Malthusian threats of overpopulation, resource depletion, and famine were trotted out. But these risks to future human well-being paled in comparison to one main menace—malicious human ingenuity.

 

Human ingenuity forged the still-massive arsenals of nuclear weapons held by the United States and Russia. And, as conference participants argued, human ingenuity is on track to craft nanotech fabricators that can make essentially any product, including weapons of mass destruction, at essentially no cost, not to mention a self-improving artificial intelligence possessing god-like powers to pursue its own goals.

 

First, let's consider the nuclear threat. Joseph Cirincione of the Ploughshares Fund pointed out the good news that the world's nuclear arsenals have been cut in half, down from 65,000 weapons at the height of the Cold War to 26,000 today. However, the U.S. retains 10,685 nuclear bombs and Russia is estimated to have around 14,000. Of those, 4,275 in the U.S. and 5,192 in Russia are active. Both countries maintain 2,000 weapons on hair-trigger alert, ready for launching in 15 minutes or so. Cirincione offered a couple of scary scenarios, including one in which there is an unauthorized launch of all 12 missiles from a Russian submarine, carrying 48 warheads with about 5 megatons of total destructive power. Such an attack would kill 7 million Americans immediately. A retaliatory American attack aimed at several hundred Russian military assets would kill between 8 and 12 million Russians.

 

With regard to the possibility of an accidental nuclear war, Cirincione pointed to the near miss that occurred in 1995, when Norway launched a weather satellite and Russian military officials mistook it for a submarine-launched ballistic missile aimed at producing an electromagnetic pulse to disable a Russian military response. Russian nuclear defense officials opened the Russian "football" in front of President Boris Yeltsin, urging him to order an immediate strike against the West. Fortunately, Yeltsin held off, arguing that it must be a mistake.

 

A global nuclear war scenario in which most of both Russian and American arsenals were fired off would result in 105 to 230 million immediate American deaths and 28 to 56 million immediate Russian deaths. One of the effects of such an attack would be a rapid cooling of global temperatures as sunlight was blocked by dust and smoke. Cirincione argued that even a nuclear war limited just to bitter enemies India and Pakistan could produce enough dust and smoke to lower global temperatures by one half to two degrees Celsius, plunging the world back to the Little Ice Age.

 

The good news is that Cirincione sees an opening for negotiations to totally eliminate nuclear weapons. He pointed to an initiative by the "Four Horsemen of the Un-Apocalypse": former Secretaries of State Henry Kissinger and George Shultz, former Sen. Sam Nunn (D-Ga.), and former Secretary of Defense William Perry, which aims to eliminate nuclear weapons completely. In fact, both of the presumptive major party presidential candidates, Sen. John McCain (R-Ariz.) and Sen. Barack Obama (D-Ill.), have explicitly endorsed the idea of global nuclear disarmament. Cirincione argued that such a commitment by the declared nuclear powers would help persuade countries like Iran that they do not need to become nuclear powers themselves.

 

Cirincione danced around the question of what to do about Iran's pursuit of nuclear weapons, pointing out that its nuclear facilities are hardened, dispersed, and defended. Cirincione asserted that the U.S. has 5-day and 10-day plans for taking out Iran's nuclear facilities, but he noted that such plans wouldn't end the matter. Iran has moves too, including trying to block oil shipments through the Strait of Hormuz, revving up terrorist attacks in Iraq, and even aiding terrorist attacks in the U.S. Cirincione claimed that the Iranians are still five to ten years away from making a nuclear bomb. On a side note, Cirincione admitted that he initially did not believe that the Syrians had constructed a nuclear weapons facility, but he is now convinced that they did. The Syrians hid it away in a desert gully, disguising it as an ancient Byzantine building.

 

Terrorism expert Gary Ackerman from the University of Maryland and William Potter from the Monterey Institute of International Studies evaluated the risks from two types of nuclear terrorism: the theft of nuclear material to construct a crude bomb, and the theft of an intact nuclear weapon. They set aside two lower-consequence attacks: the dispersal of radiological material by means of a conventional explosion and the sabotage of nuclear facilities. Could non-state actors, that is, a terrorist group, actually build a nuclear bomb? Potter cited an article by Peter Zimmerman estimating that a team of 19 terrorists (the same number that pulled off the September 11 atrocities) could build such a bomb for around $6 million. Their most challenging task would be to acquire 40 kilograms of highly enriched uranium (HEU). There are 1,700 tons of HEU in the world, including 50 tons stored at civilian sites. Potter acknowledged that intact weapons are probably more secure than fissile material.

 

Ackerman noted that only a small subset of terrorists has the motivation to use nuclear terrorism. "So far as we know only Jihadists want these weapons," said Ackerman. Specifically, Al Qaeda has made ten different efforts to get hold of fissile material. Ackerman told me that Al Qaeda had been defrauded several times by would-be vendors of nuclear materials. Just before the September 11 atrocities, two Pakistani nuclear experts visited Osama bin Laden in Afghanistan, apparently to advise Al Qaeda on nuclear matters. One possibility is that if Pakistan becomes more unstable, intact weapons could fall into terrorist hands. Still, the good news is that intercepted fissile material smugglers have actually been carrying very small amounts. Less reassuringly, Potter noted that prison sentences for smugglers dealing in weapons-grade nuclear material have been lighter than those meted out for drunk driving.

 

One cautionary case: two groups invaded and seized the control room of the Pelindaba nuclear facility in South Africa in November 2007. The intruders were briefly detained and then released without further consequence. Both Ackerman and Potter agreed that it is in no state's interest to supply terrorists with nuclear bombs or fissile material; the material could easily be traced back to the supplier, which would suffer the consequences. Ackerman cited one expert estimate that there is a 50 percent chance of a nuclear terrorist attack in the next ten years.
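As a rough back-of-the-envelope translation (my arithmetic, not the expert's): if each year carries the same independent chance of an attack, a 50 percent chance over ten years works out to roughly 6.7 percent per year. A minimal Python sketch, assuming independence across years:

    # If each year has independent probability p of an attack, then
    # P(at least one attack in 10 years) = 1 - (1 - p)**10.
    # Setting that equal to 0.5 and solving for p:
    p_annual = 1 - 0.5 ** (1 / 10)
    print(f"Implied annual probability: {p_annual:.3f}")  # ~0.067, about 6.7% per year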

 

While nuclear war and nuclear terrorism would be catastrophic, the presenters acknowledged that neither constitutes an existential risk; that is, a risk that could cause the extinction of humanity. But the next two risks, self-improving artificial intelligence and nanotechnology, would.

 

The artificial intelligence explosion?

 

Singularity Institute for Artificial Intelligence research fellow Eliezer Yudkowsky began his presentation with a diagram of the space of possible minds. Within that vast space, a small dot represented human minds. His point was that two artificial intelligences (AIs) could be far more different from one another than we are from chimpanzees. Yudkowsky then described the relatively slow processing speeds of human brains, the difficulty we have in reprogramming ourselves, and other limitations. An AI could run 1 million times faster, meaning that it could get a year's worth of thinking done in about 31 seconds. An "intelligence explosion" would result because an AI would have access to its own source code and could rapidly modify and optimize itself. It would be hard to make an AI that didn't want to improve itself in order to better achieve its goals.
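The speed claim is simple arithmetic to check. A minimal Python sketch of the conversion (the millionfold factor is Yudkowsky's illustrative assumption, not a measurement):

    # A year is roughly 31.6 million seconds; at a millionfold speedup,
    # a subjective year of thinking fits into that many / 1e6 wall-clock seconds.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~31,557,600 s
    SPEEDUP = 1_000_000                     # assumed speed advantage over a human brain
    wall_clock = SECONDS_PER_YEAR / SPEEDUP
    print(f"A subjective year passes in {wall_clock:.1f} s")  # ~31.6 s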

 

Can an intelligence explosion be avoided? No. But a unique feature of AI is that it could also be a "global catastrophic opportunity": success in creating a friendly AI would give humanity access to vast intelligence that could be used to mitigate other risks. Picking a friendly AI out of the space of all possible minds, however, is a hard and unsolved problem. According to Yudkowsky, what makes a superintelligent AI unique as a global catastrophic risk is this: there is no final battle, since an unfriendly AI would simply kill off humanity; there is nowhere to hide, because the AI could find you wherever you are; and there is no learning curve, since we get only one chance to produce a friendly AI. But will it happen? Yudkowsky pointed out that there is no way to control the proliferation of "raw materials," i.e., computers, so the creation of an AI is essentially inevitable. In fact, Yudkowsky believes that current computers are sufficient to instantiate an AI; researchers just don't know how to do it yet.

 

What can we do? "You cannot throw money or regulations at this problem for an easy solution," insisted Yudkowsky. His chief (and somewhat self-serving) recommendation is to support a lot of mathematical research on how to create a friendly AI. Of course, former Sun Microsystems chief scientist Bill Joy proposed another solution: relinquishment. That is, humanity has to agree to some kind of compact to never try to build an AI. "Success mitigates lots of risks," said Yudkowsky. "Failure kills you immediately." As a side note, bioethicist James Hughes, head of the Institute for Ethics and Emerging Technologies, mused about how much longer it would be before we would see Sarah Connor Brigades gunning down AI researchers to prevent the Terminator future. (Note to self: perhaps reconsider covering future Singularity Institute conferences.)

 

The menace of molecular manufacturing?

 

Next up were Michael Treder and Chris Phoenix from the Center for Responsible Nanotechnology. They cannily opened with a series of quotations claiming that science would never be able to solve this or that problem. Two of my favorites: "Pasteur's theory of germs is a ridiculous fiction" (Pierre Pachet, 1872) and "Space travel is utter bilge" (Astronomer Royal Richard Woolley, 1956). The point, of course, is that arguments that molecular manufacturing is impossible are likely to suffer the same predictive failures. Their vision of molecular manufacturing involves using trillions of very small machines to make something larger. They envision desktop nanofactories into which people feed simple raw inputs and get out nearly any product they desire. The proliferation of such nanofactories would end scarcity forever. "We can't expect to have only positive outcomes without mitigating negative outcomes," cautioned Treder.

 

What kind of negative outcomes? Nanofactories could produce not only hugely beneficial products such as water filters, solar cells, and houses, but also weapons of any sort. Such nanofabricated weapons would be vastly more powerful than today's. Since these weapons are so powerful, there is a strong incentive for a first strike. In addition, an age of nanotech abundance would eliminate the majority of jobs, possibly leading to massive social disruptions. Social disruption creates the opportunity for a charismatic personality to take hold. "Nanotechnology could lead to some form of world dictatorship," said Treder. "There is a global catastrophic risk that we could all be enslaved."

 

On the other hand, individuals with access to nanofactories could wield great destructive power. Phoenix and Treder's chief advice is more research into how to handle nanotech when it becomes a reality in the next couple of decades. In particular, Phoenix thinks it's urgent to study whether offense or defense would be favored. To Phoenix, offense looks a lot easier: there are a lot more ways to destroy things than to defend them. If that's true, it would narrow our future policy options.

 

This conclusion left me musing on British historian Arnold Toynbee's observation: "The human race's prospects of survival were considerably better when we were defenseless against tigers than they are today when we have become defenseless against ourselves." I don't think that's right, but it's worth thinking about.

 

Source: http://www.reason.com/news/show/127676.html

 

Are all of these scenarios probable?

Which of these would be the greatest threat?

And of course, could any other threats that might cause an end to humanity be added to this list?


I think the whole "pandemic" thing is vastly overblown. Look at the hype over bird flu. Live Science had an article based on WHO data showing that on 26 March 2006 the global death toll crossed 100. Presumably it's gone higher since then, but since more people die on Queensland roads in 6 months than Avian Flu has killed worldwide in 11 years, I really can't consider it a threat in any meaningful sense.

 

SARS.

Singapore (from a newspaper article):

Sars has killed 28 people there, with the country's defence minister, Tony Tan, describing it as "Singapore's September 11".

Oh wow, 28 out of a population of 4.6 million.

As of 10:17 AM, 127 people have died from heart disease in the US in 2009.

 

A sense of perspective seems to be badly lacking when it comes to "pandemics".


While those were overblown, this sort of thing has happened before, and fairly recently, too: Remember the 1918 flu pandemic that killed over 20 million people. Sadly, our health system hasn't exactly improved by leaps and bounds - we still have shortages of vaccines for the regular flu.

 

A sense of perspective seems to be badly lacking when it comes to "pandemics".

 

I'd say the same for just about every doomsday scenario. The one difference is that, unlike nuclear weapons or asteroids, pathogens evolve.


Sadly, our health system hasn't exactly improved by leaps and bounds - we still have shortages of vaccines for the regular flu.

 

Not only that, but a good chunk of the populace now distrusts vaccines because of ridiculous misinformation on the internet claiming that they cause autism and the growth of third tits and other nonsense. So, not only do we lack availability of vaccines, but we lack a populace intelligent enough to take them when available. Yay, humanity! Go us! :doh:


I think the whole "pandemic" thing is vastly overblown. Look at the hype over bird flu. Live Science had an article based on WHO data showing that on 26 March 2006 the global death toll crossed 100. Presumably it's gone higher since then, but since more people die on Queensland roads in 6 months than Avian Flu has killed worldwide in 11 years, I really can't consider it a threat in any meaningful sense.

 

SARS.

Singapore (from a newspaper article):

 

Oh wow, 28 out of a population of 4.6 million.

As of 10:17 AM, 127 people have died from heart disease in the US in 2009.

 

A sense of perspective seems to be badly lacking when it comes to "pandemics".

 

Surely you don't doubt the very possibility of a pandemic? Because I know some plague-infected rats who beg to differ. Perhaps bird flu, SARS, etc. had negligible effect precisely because they were taken so seriously: cases were isolated and appropriate restrictions were put in place. Maybe if, say, HIV had been taken as seriously when it first emerged, it could have been eradicated relatively easily.


Also, about bird flu - if I recall correctly it wasn't the bird flu itself that everyone was freaking out about, it was that it had a very good possibility of going from avian -> human to human -> human transmission with minor mutations, at which point it would have very high pandemic potential.

 

What scared everyone was that the virus had a large enough breeding pool for those mutations to emerge with a high enough probability and cause a pandemic.


  • 3 months later...

I read that the decision to use nuclear weapons on civilian cities during WW2 was made by the "thanatocracy."

 

What is that? Only such a power could decide on that kind of physical end of humanity. Or the end could come from disease and hunger on an Earth overcrowded by humans. (For that, there is the Brave New World scenario: societies where all humans are sterilized at birth and only a chosen few are selected for reproduction.)

 

But we could also see a logical end of humanity: through evolution, future humans will become so different that the name will change, and we will no longer call them humans but something else, just as humans are no longer called monkeys even though, in some theories, they evolved from them.


But we could also see a logical end of humanity: through evolution, future humans will become so different that the name will change, and we will no longer call them humans but something else, just as humans are no longer called monkeys even though, in some theories, they evolved from them.

 

I appreciate your imagination!


But we could also see a logical end of humanity: through evolution, future humans will become so different that the name will change, and we will no longer call them humans but something else, just as humans are no longer called monkeys even though, in some theories, they evolved from them.

I came across a short story some years ago on this. Mankind had spread through the stars for over 100,000 years without finding anybody else in the Galaxy. Then, on the far side, an explorer ship finds a satellite broadcasting on all wavelengths, with a series of telescopes aimed at a single star. A sort of beacon to attract attention, with the scopes showing where to go.

 

When the ship arrives at the star, it is met by a green-skinned, copper-based alien, and finally mankind is not alone.

 

Until the DNA tests come back and the alien is found to be human. Highly modified and evolved, but still descended from the humans of Earth. As man spread out around the Galaxy along one arm, so he spread in the other direction, until finally the two different branches of the race met on the far side.

 

Both human, but having little in common except their humanity.


A pandemic of bird flu won't wipe out people. Nor will a pandemic of ring-tailed-lemur flu, but it might damage the lemur population.

Of course if we had another pandemic of human flu (like the one in about 1918) then we would have a problem.

Even then you have to remember that we are (largely) the descendants of those who survived that pandemic, so thanks to evolution, we might not have as big a problem as our grandparents did.

Of course the next pandemic might be something totally new, in which case we are really in trouble.

 

I wonder why the death of the honey bees hasn't had a mention.


I came across a short story some years ago on this. Mankind had spread through the stars for over 100,000 years without finding anybody else in the Galaxy. Then, on the far side, an explorer ship finds a satellite broadcasting on all wavelengths, with a series of telescopes aimed at a single star. A sort of beacon to attract attention, with the scopes showing where to go.

 

When the ship arrives at the star, it is met by a green-skinned, copper-based alien, and finally mankind is not alone.

 

Until the DNA tests come back and the alien is found to be human. Highly modified and evolved, but still descended from the humans of Earth. As man spread out around the Galaxy along one arm, so he spread in the other direction, until finally the two different branches of the race met on the far side.

 

Both human, but having little in common except their humanity.

 

That's actually a fairly common theme in science fiction, and one I really like. It's a way to realistically have "aliens" that are similar enough to us that their motivations and ways of thinking overlap with ours enough to be comprehensible characters, but still as different as the author chooses, so those differences and their consequences can be explored.

 

If we do ever manage to establish self-sustaining colonies on other worlds (which would likely mean one way trips for whole populations), "speciation" seems pretty much inevitable.

 

Although, on an even shorter timeframe, with the rise of genetic engineering and artificial intelligence, I'm guessing that what it means to be "human" is about to get a whole lot muddier.


  • 2 weeks later...
1. How bad would an all-out nuclear exchange get?

Read "On the Beach". It scared the bejeezus out of me. City after city going dark as the cloud came south.

Is this accurate? I thought that after the blast much of the radioactive contamination was short-lived and would eventually wash away? When would those dead trees start growing again? Has anyone seen some studies?

I haven't seen any recent ones. It would be fair to say though that an "all out" exchange would be less destructive than 40 years ago. Things would probably settle down again within a few years.

2. Anyone see the Pentagon report that concluded 9 tactical weapons on the right targets in the USA would collapse America? Just 9, out of all those cities? Apparently if one hits the right food and industrial networks, that's the world's only superpower, DOWN! I've heard about this from experts I trust but have not found the report yet.

I've not heard of the report (but I haven't been looking for it either); however, I find it hard to credit. Tactical nukes are quite small in effect radius, and I doubt the US, with its very distributed infrastructure, would have 9 choke points.

 

That's not to say there aren't 9 dams that would cause massive destruction if blown. Which may be the point of the report. You don't have to take out a city, just deprive it of food and water for a few days and wait.

 

Frankly though, I think an exchange is highly improbable these days.

