However, there are a few truly scary prospects for organic humankind if it ever happens.
Here's one, a techno-futuristic cautionary tale:
WARNING: Reading this article may commit you to an eternity of suffering and torment.
Slender Man. Smile Dog. Goatse. These are some of the urban legends spawned by the Internet. Yet none is as all-powerful and threatening as Roko’s Basilisk. For Roko’s Basilisk is an evil, godlike form of artificial intelligence, so dangerous that if you see it, or even think about it too hard, you will spend the rest of eternity screaming in its torture chamber. It's like the videotape in The Ring. Even death is no escape, for if you die, Roko’s Basilisk will resurrect you and begin the torture again.
Are you sure you want to keep reading? Because the worst part is that Roko’s Basilisk already exists. Or at least, it already will have existed—which is just as bad. ...
The idea is that if you do anything NOW that reduces the odds of this particular demon-intelligence coming into existence, then once the super-being arrives and gains the ability to retroactively punish those who didn't help bring it about, it will wreak its revenge on you.
http://www.slate.com/articles/technology/bitwise/2014/07/roko_s_basilisk_the_most_terrifying_thought_experiment_of_all_time.html

Very interesting article, umair. However, the author has to make up his mind as to whether the supercomputer
"which knows just about everything,"
or
has always been right in the past.
If the former, I'm a two-boxes guy; but if the latter, then a one-box guy.
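Put in expected-value terms, here's a rough sketch of that one-box/two-box choice. The $1,000 and $1,000,000 payoffs are the standard Newcomb ones; the predictor-accuracy parameter p is just my own illustrative knob, not anything from the article:

```python
# Illustrative expected-value comparison for Newcomb's problem
# (the "one box vs. two boxes" choice mentioned above).
# Payoffs are the standard ones: $1,000 in the visible box,
# $1,000,000 in the opaque box. The accuracy p is an assumed parameter.

def expected_value(p, one_box):
    """Expected payoff given predictor accuracy p and the chosen strategy."""
    if one_box:
        # The opaque box is full only if the predictor correctly foresaw one-boxing.
        return p * 1_000_000
    # Two-boxing: always get the $1,000; the $1,000,000 shows up only
    # when the predictor mistakenly expected one-boxing.
    return 1_000 + (1 - p) * 1_000_000

for p in (0.5, 0.9, 0.999, 1.0):
    print(f"p={p}: one box = {expected_value(p, True):>12,.0f}   "
          f"two boxes = {expected_value(p, False):>12,.0f}")
```

On those made-up numbers, a predictor that merely "knows just about everything" but can still be wrong leaves two-boxing looking fine, while a predictor that "has always been right in the past" (p near 1) makes one-boxing the clear winner, which is roughly the distinction I'm drawing above.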
Yudkowsky is a moral utilitarian: He believes that the greatest good for the greatest number of people is always ethically justified, even if a few people have to die or suffer along the way. He has explicitly argued that, given the choice, it is preferable to torture a single person for 50 years than for a sufficient number of people (to be fair, a lot of people) to get dust specks in their eyes.
I don't know how many is "sufficient", but I can see a situation in which, were enough people to get dust specks in their eyes, at least some people would die as a consequence: maybe a speck causes a car accident that kills an entire family, or some angry person lashes out at a child. To me, "sufficient" would be about 100,000. But yes, at that number, I'd consider that enough harm would probably ensue to justify the torture of that single person.
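Just to make my own back-of-the-envelope reasoning explicit, here is a toy expected-harm calculation. Apart from my 100,000 figure above, every number in it is a made-up assumption for illustration, not anything from the article or from Yudkowsky's original argument (which used astronomically larger numbers of specks):

```python
# Toy expected-harm comparison for the dust-speck scenario above.
# All parameters are illustrative assumptions only.

people_with_specks = 100_000     # my own threshold from the paragraph above
p_fatal_mishap = 1e-5            # assumed chance one speck triggers a deadly accident
harm_of_one_death = 1.0          # normalize one death to one unit of harm
harm_of_50yr_torture = 0.5       # assumed: 50 years of torture scored as half that

expected_speck_harm = people_with_specks * p_fatal_mishap * harm_of_one_death

print(f"expected deaths caused by the specks: {expected_speck_harm:.2f}")
if harm_of_50yr_torture < expected_speck_harm:
    print("on these made-up numbers, torturing the one person is the lesser harm")
else:
    print("on these made-up numbers, the dust specks are the lesser harm")
```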
How could anyone think differently?