
Gray Goo


Jim


I worry more about grey goo than nukes for the reason stated in the portion I've bolded below:

 

The enabling breakthrough to assemblers seems quite likely within the next 20 years. Molecular electronics - the new subfield of nanotechnology where individual molecules are circuit elements - should mature quickly and become enormously lucrative within this decade, causing a large incremental investment in all nanotechnologies.

 

Unfortunately, as with nuclear technology, it is far easier to create destructive uses for nanotechnology than constructive ones. Nanotechnology has clear military and terrorist uses, and you need not be suicidal to release a massively destructive nanotechnological device - such devices can be built to be selectively destructive, affecting, for example, only a certain geographical area or a group of people who are genetically distinct.

 

An immediate consequence of the Faustian bargain in obtaining the great power of nanotechnology is that we run a grave risk - the risk that we might destroy the biosphere on which all life depends.

 

As Drexler explained:

 

"Plants" with "leaves" no more efficient than today's solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough omnivorous "bacteria" could out-compete real bacteria: They could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop - at least if we make no preparation. We have trouble enough controlling viruses and fruit flies.

 

Among the cognoscenti of nanotechnology, this threat has become known as the "gray goo problem." Though masses of uncontrolled replicators need not be gray or gooey, the term "gray goo" emphasizes that replicators able to obliterate life might be less inspiring than a single species of crabgrass. They might be superior in an evolutionary sense, but this need not make them valuable.

 

The gray goo threat makes one thing perfectly clear: We cannot afford certain kinds of accidents with replicating assemblers.

 

Gray goo would surely be a depressing ending to our human adventure on Earth, far worse than mere fire or ice, and one that could stem from a simple laboratory accident.6 Oops.

 

It is most of all the power of destructive self-replication in genetics, nanotechnology, and robotics (GNR) that should give us pause. Self-replication is the modus operandi of genetic engineering, which uses the machinery of the cell to replicate its designs, and the prime danger underlying gray goo in nanotechnology. Stories of run-amok robots like the Borg, replicating or mutating to escape from the ethical constraints imposed on them by their creators, are well established in our science fiction books and movies. It is even possible that self-replication may be more fundamental than we thought, and hence harder - or even impossible - to control. A recent article by Stuart Kauffman in Nature titled "Self-Replication: Even Peptides Do It" discusses the discovery that a 32-amino-acid peptide can "autocatalyse its own synthesis." We don't know how widespread this ability is, but Kauffman notes that it may hint at "a route to self-reproducing molecular systems on a basis far wider than Watson-Crick base-pairing."7

 

In truth, we have had in hand for years clear warnings of the dangers inherent in widespread knowledge of GNR technologies - of the possibility of knowledge alone enabling mass destruction. But these warnings haven't been widely publicized; the public discussions have been clearly inadequate. There is no profit in publicizing the dangers.

 

The nuclear, biological, and chemical (NBC) technologies used in 20th-century weapons of mass destruction were and are largely military, developed in government laboratories. In sharp contrast, the 21st-century GNR technologies have clear commercial uses and are being developed almost exclusively by corporate enterprises. In this age of triumphant commercialism, technology - with science as its handmaiden - is delivering a series of almost magical inventions that are the most phenomenally lucrative ever seen. We are aggressively pursuing the promises of these new technologies within the now-unchallenged system of global capitalism and its manifold financial incentives and competitive pressures.

 

This is the first moment in the history of our planet when any species, by its own voluntary actions, has become a danger to itself - as well as to vast numbers of others.

 

It might be a familiar progression, transpiring on many worlds - a planet, newly formed, placidly revolves around its star; life slowly forms; a kaleidoscopic procession of creatures evolves; intelligence emerges which, at least up to a point, confers enormous survival value; and then technology is invented. It dawns on them that there are such things as laws of Nature, that these laws can be revealed by experiment, and that knowledge of these laws can be made both to save and to take lives, both on unprecedented scales. Science, they recognize, grants immense powers. In a flash, they create world-altering contrivances. Some planetary civilizations see their way through, place limits on what may and what must not be done, and safely pass through the time of perils. Others, not so lucky or so prudent, perish.

 

That is Carl Sagan, writing in 1994, in Pale Blue Dot, a book describing his vision of the human future in space. I am only now realizing how deep his insight was, and how sorely I miss, and will miss, his voice.

 

http://www.wired.com/wired/archive/8.04/joy.html?pg=7&topic=&topic_set=
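The "matter of days" claim in the Drexler quote above sounds like hyperbole, but it falls straight out of the arithmetic of unchecked exponential replication. Here's a quick back-of-envelope sketch in Python; all the numbers (replicator mass, doubling time, biomass) are illustrative assumptions of mine, not figures from the article:

```python
import math

# Illustrative assumptions, not measured values:
replicator_mass_kg = 1e-15   # assumed mass of a single replicator
biosphere_mass_kg = 1e15     # rough order of magnitude for Earth's dry biomass
doubling_time_min = 15       # assumed time for one replication cycle

# Doublings needed for total replicator mass to equal the biosphere,
# assuming nothing slows the growth down (no resource or heat limits).
doublings = math.log2(biosphere_mass_kg / replicator_mass_kg)
elapsed_days = doublings * doubling_time_min / (60 * 24)

print(f"~{doublings:.0f} doublings, ~{elapsed_days:.1f} days")
# -> ~100 doublings, ~1.0 days
```

Even if you make the doubling time ten times slower, the answer only moves out to about ten days; the exponent, not the starting point, does all the work.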


That would be something: naturally evolved life brings about a new genesis of "intelligently designed" life, which is superior to it and brings about its destruction. Then, who knows? A self-replicating machine would be capable of evolution... Maybe our earliest ancestors were actually a gray goo accident of a clumsy alien colonist... ;)

 

But why is this in politics?


Yeah, with the Singularity comes rapid discoveries of better and better ways of destroying ourselves. Singularitarians classify all of these scenarios as existential risks to the human species... or in the case of grey goo, all life on earth...

 

Eric Drexler's proposed solution (in Engines of Creation) of an intelligently controlled nanorobot "shield" (i.e. utility fog) to stop out-of-control replicators seems like the only real solution...

 

Of course, nukes can kill us now.


That would be something: naturally evolved life brings about a new genesis of "intelligently designed" life, which is superior to it and brings about its destruction. Then, who knows? A self-replicating machine would be capable of evolution... Maybe our earliest ancestors were actually a gray goo accident of a clumsy alien colonist... ;)

 

But why is this in politics?

 

For the same reasons nuke issues are in politics - to discuss related policies.


Of course, nukes can kill us now.

 

Yes, but nukes have already been invented and there is nothing really to be done about them. I'm more concerned with whether this kind of emerging risk is likewise inevitable. Playing defense against nanorobots seems even less plausible than an SDI shield.


I don't think it's possible to prevent the technology, since it will be developed somewhere, eventually, if not in the nation or nations where you outlaw it. Further, attempting to prevent it is not merely foolish but actually quite dangerous, since it means it will be in the hands of those perhaps not so careful or benevolent as us, and we will have had no time to figure out what to do about it, analogous to trying to develop a vaccine for a disease that doesn't exist yet.


I don't think it's possible to prevent the technology, since it will be developed somewhere, eventually, if not in the nation or nations where you outlaw it. Further, attempting to prevent it is not merely foolish but actually quite dangerous, since it means it will be in the hands of those perhaps not so careful or benevolent as us, and we will have had no time to figure out what to do about it, analogous to trying to develop a vaccine for a disease that doesn't exist yet.

 

It would be like saying the US is at fault for inventing nukes. The species is very fortunate that we did.


It would be like saying the US is at fault for inventing nukes. The species is very fortunate that we did.

 

One little mistake could really change that fortune. (I am optimistic, though, and frankly I think if just about any other nation had been first it could have been, um, "bad" overall.)

 

 

Regarding the goo, wouldn't they be susceptible to a massive EM pulse? Anything that small can't have a lot of EM shielding.

 

I know, I know, that's what they always do to fight evil nanos in sci-fi shows, but wouldn't it likely work?

 

As for built-in protections to the bots - the whole thing about adaptive bots is that they adapt and overcome obstacles, including intentionally engineered limitations.


Regarding the goo, wouldn't they be susceptible to a massive EM pulse?

 

Depends how they're designed.

 

Anything that small can't have a lot of EM shielding.

 

Neither does the eukaryotic cell and it seems to handle EM pulses just fine...

 

Obscurantism does not provide a lasting solution. I think Eric Drexler's approach is the only sound one.

 

The UN really needs to be more aggressive in terms of diagnosing and planning real solutions for existential risks.


In the shorter term (and as a more immediate policy concern) there are questions about the safety of the nanomaterials themselves, mainly the health effects of particles, say from machined nanotube materials.


Also nanomachines have a nasty habit of getting inhaled and lodged in the lungs and bloodstream. So don't sneeze if you see a gray goo coming.

 

And bascule, theoretically, anything electronic is susceptible to an EM pulse. You just can't make enough shielding in a tiny little device, and unless it has no electronics at all, it'll be vulnerable in one way or another. Eukaryotic cells may not be all that vulnerable, but they're organic, not electronic.
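To put a rough number on the shielding point: a conductive shell attenuates an incoming field by roughly exp(-t/delta), where t is the shell thickness and delta is the skin depth of the material at the pulse's frequency. A minimal sketch using standard values for copper (the 10 nm shell thickness is my assumption about what a nanoscale device could plausibly carry):

```python
import math

RHO_CU = 1.68e-8       # resistivity of copper, ohm-metres
MU_0 = 4e-7 * math.pi  # permeability of free space, H/m

def skin_depth_m(freq_hz):
    """Skin depth for a non-magnetic conductor: sqrt(rho / (pi * f * mu0))."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0))

shell_t = 10e-9  # assumed 10 nm copper shell around a nanobot
for f in (1e6, 1e9, 1e12):  # 1 MHz, 1 GHz, 1 THz
    delta = skin_depth_m(f)
    passed = math.exp(-shell_t / delta)  # fraction of field getting through
    print(f"{f:8.0e} Hz: skin depth {delta * 1e6:8.3f} um, "
          f"~{passed:.3f} of the field penetrates")
```

Even at 1 THz the skin depth of copper (~65 nm) is several times a 10 nm shell, so the shell attenuates almost nothing, and at EMP-like frequencies (MHz and below) it is effectively transparent. That cuts both ways in this debate: the bots can't shield themselves, but the pulse only matters if they rely on conventional electronics in the first place.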


Also nanomachines have a nasty habit of getting inhaled and lodged in the lungs and bloodstream. So don't sneeze if you see a gray goo coming.

 

And bascule, theoretically, anything electronic is susceptible to an EM pulse. You just can't make enough shielding in a tiny little device, and unless it has no electronics at all, it'll be vulnerable in one way or another. Eukaryotic cells may not be all that vulnerable, but they're organic, not electronic.

 

It does sound like a bad sci fi/horror flick: A self-replicating nanomachine escapes from a laboratory and multiplies exponentially like a cancer. Its escape triggers an EM pulse at the lab, but a few "cells" survive. Detectors throughout the city of Tulsa start clanging, but a venal Mayor worried about reelection is loath to fry the city's electronics. Enter our intrepid and far-seeing hero, who takes matters into his own hands and sets off a massive EMP, and all seems well until it is discovered that the goo has eaten downward, into the center of the earth.

 

I'm just having fun here but it's difficult to play 100% effective defense.


It would be like saying the US is at fault for inventing nukes. The species is very fortunate that we did.

 

That is pure codswallop.

 

1. The U.S. did not invent nukes. It developed the technology, entirely different.

 

2. Had the Japanese won the war, or had it gone on longer with greater loss of life, "the species" would have continued, just under marginally different conditions.


I wouldn't call an Axis victory "marginally" different, seeing as how victory conditions were basically global empire and genocide.

 

I carefully said Japanese, conveniently separated from US interests at the time, and devoid of American soldiers. Which part of Germany would the US have considered nuking without collateral damage to the allies, especially the Russkies?


That is pure codswallop.

 

Why so emotional?

 

1. The U.S. did not invent nukes. It developed the technology, entirely different.

 

I'm not sure how this subpoint is responsive to my argument but isn't a nuclear bomb a type of technology?

 

2. Had the Japanese won the war, or had it gone on longer with greater loss of life, "the species" would have continued, just under marginally different conditions.

 

My point was that some nation, at some point, would have developed the technology. Had Germany developed nukes (or, if you prefer, the technology for nukes), my 1/32nd Cherokee self wouldn't be here today.

 

As it was, the first country to develop the bomb used it to end a world war. It could have been much, much worse.


I carefully said Japanese, conveniently separated from US interests at the time, and devoid of American soldiers. Which part of Germany would the US have considered nuking without collateral damage to the allies, especially the Russkies?

 

I'm not sure it would have been that much better if Japan had dominated the globe with the bomb.


JIM

 

You are right, I may have seemed harsh; you had a good basic argument, but it would have been better to have chosen an example less easy to shoot holes in. Lost my fags. Bad mood.


JIM

 

You are right, I may have seemed harsh; you had a good basic argument, but it would have been better to have chosen an example less easy to shoot holes in. Lost my fags. Bad mood.

 

No problem but I still do not see where you shot any holes in the example. *scratches head*


Any kind of out-of-control chain reaction has the ability to wipe out the human species, or possibly all life on earth.

 

Singularitarians generally discuss this in terms of "GNR" technologies, referring to genetic engineering, nanotechnology, and robotics (i.e. strong AI), discussing replicators in the biological, nanotechnological, and informational worlds.


Any kind of out-of-control chain reaction has the ability to wipe out the human species, or possibly all life on earth.

 

Singularitarians generally discuss this in terms of "GNR" technologies, referring to genetic engineering, nanotechnology, and robotics (i.e. strong AI), discussing replicators in the biological, nanotechnological, and informational worlds.

 

How concerned are you about GNR, Bascule? You seem to be fairly optimistic and, at least, more concerned about nukes as a risk than GNR.


What other risks do you have in mind? Plagues? Asteroids? Mass starvation after failure of GE crops? Dark age brought on by religious fundamentalism?

 

(Gee, this is fun...)

 

I worry about cataclysmic risks in the following order of likelihood: Grey Goo (GNR), nukes, chem/bio terrorism, China, pandemics, asteroid collision, fundamentalist influences of a variety of forms, a coup d'etat in America and Jimmy Carter being elected President for a second term.

