When Eric Drexler published Engines of Creation, he included a section on runaway nanotech replicators. The notion is that if replicators got loose that could use almost anything as feedstock, they would buzz through the biosphere converting everything into copies of themselves, leaving behind an ocean of replicators. This is called the "gray goo scenario".
This idea has been popularized by Bill Joy and Michael Crichton and, along with the toxicity of present-day nanoparticles, is considered by many to be one of the grave risks we face in pursuing nanotechnology. Bill Joy's article in Wired recommends that we (the U.S.? the developed nations? Bill and the mouse in his pocket?) simply refrain from developing the technology.
There's a problem: we might refrain, but others won't. The research is difficult but not impossible. The people who pursued it would, by definition, be those who hadn't agreed to refrain, and this dangerous new technology would end up in the hands of exactly the people we'd be most afraid to see have it.
Drexler and the Foresight Institute, seeking to educate the public and help reason win out over panic, have struggled with these worries for years now. Drexler has argued that building a free-ranging, eat-anything replicator is a very difficult engineering problem, comparable to building a car that can forage in the woods for fuel when it runs out of gasoline. Nobody designs a foraging car by accident. A gray-goo replicator may not be possible at all, and if it is possible, it would be the design work of many years.
A dangerous replicator might also evolve from nanomachines that started out safe; many troublesome viruses probably began as mutations of innocent snippets of DNA. Human-designed nanomachines should not be permitted to evolve. There has been keen interest in recent years in nanofactories in which the instructions are kept clearly separate from the assembly machinery (what Ralph Merkle calls a broadcast architecture, and computer folks call a SIMD architecture). The instructions are only put to use when the human user decides to push the button to make stuff happen. Autonomous self-replication cannot occur, and therefore evolution cannot occur.
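Here is a minimal sketch of what the broadcast idea looks like in software terms, assuming dumb assembly units that execute instructions streamed from a human-operated controller and retain no copy of them. The class and method names are invented for this illustration, not taken from Merkle's proposals.

```python
# Sketch of a broadcast (SIMD-style) architecture: the assembly machinery
# holds no blueprint of its own and only acts while instructions are being
# broadcast by a human-operated controller.

class AssemblyUnit:
    """Dumb machinery: executes one instruction at a time, stores nothing."""
    def execute(self, instruction: str) -> None:
        print(f"performing step: {instruction}")

class BroadcastController:
    """Human-operated source of instructions (the 'button')."""
    def __init__(self, units: list[AssemblyUnit]):
        self.units = units

    def run_job(self, instructions: list[str]) -> None:
        # Each step is sent to every unit and then discarded; with no stored
        # copy of the instructions, the units cannot replicate on their own.
        for step in instructions:
            for unit in self.units:
                unit.execute(step)

if __name__ == "__main__":
    controller = BroadcastController([AssemblyUnit() for _ in range(3)])
    controller.run_job(["fetch feedstock", "place molecule", "bond", "release product"])
```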
Another way to prevent evolution is to ensure that every mutation is fatal. Suppose the replicator's only copy of its blueprint is stored encrypted, and replication requires decrypting the blueprint each time. Change even one bit of an encrypted document and the decryption comes out as meaningless garbage, so any mutation to the encrypted blueprint leaves the machine unable to build anything.
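As a rough illustration of the idea, here is a sketch using an authenticated cipher (AES-GCM via the Python cryptography library), which goes one step further than "garbage out": a tampered blueprint is rejected outright. The cipher and library are my choices for the example, not anything specified in the original proposal.

```python
# Sketch: the blueprint exists only in encrypted form, and a single-bit
# "mutation" makes it impossible to recover anything buildable.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
blueprint = b"assembly instructions for one replicator"

aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, blueprint, None)

# Normal replication: decrypt the blueprint, then build from it.
assert aesgcm.decrypt(nonce, ciphertext, None) == blueprint

# A single-bit "mutation" in the stored, encrypted blueprint...
mutated = bytearray(ciphertext)
mutated[5] ^= 0x01

# ...and decryption refuses to produce anything at all.
try:
    aesgcm.decrypt(nonce, bytes(mutated), None)
except Exception:
    print("mutated blueprint rejected; nothing gets built")
```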
Finally, there are different ways to pull the plug. One is to stop supplying the energy required for replication. Another is to require that replication depend upon some exotic "vitamin" available only in a controlled environment.
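A toy sketch of the vitamin idea, with invented names: each replication cycle must draw down a unit of a compound that exists only inside the controlled environment, so outside that environment replication simply stops.

```python
# Sketch of "vitamin" dependence as a kill switch: no vitamin, no new copies.

class ControlledEnvironment:
    def __init__(self, vitamin_units: int):
        self.vitamin_units = vitamin_units

    def draw_vitamin(self) -> bool:
        # One unit of the exotic compound is consumed per new copy.
        if self.vitamin_units > 0:
            self.vitamin_units -= 1
            return True
        return False

def replicate(env: ControlledEnvironment, population: int) -> int:
    # Each existing machine attempts one copy; copies happen only while
    # the environment can still supply the vitamin.
    new_copies = sum(1 for _ in range(population) if env.draw_vitamin())
    return population + new_copies

lab = ControlledEnvironment(vitamin_units=5)
wild = ControlledEnvironment(vitamin_units=0)
print(replicate(lab, 4))   # 8: growth while the vitamin lasts
print(replicate(wild, 4))  # 4: no growth outside the controlled setting
```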