TiagoTiago (OP)
|
|
June 26, 2011, 09:05:01 PM |
|
If we ever end up with an AI controlling the world, an intelligence that makes decisions based on logic and not emotions, are we safe from a situation like the one in the movie I, Robot (the one where Will Smith has an artificial arm), where the AI concludes that it needs to protect humans from ourselves and tries to impose a totalitarian dictatorship, cutting into our liberties among other things?
|
(I dont always get new reply notifications, pls send a pm when you think it has happened) Wanna gimme some BTC/BCH for any or no reason? 1FmvtS66LFh6ycrXDwKRQTexGJw4UWiqDX The more you believe in Bitcoin, and the more you show you do to other people, the faster the real value will soar!
|
|
|
NghtRppr
|
|
June 26, 2011, 09:11:59 PM |
|
If we ever end up with an AI controlling the world, an intelligence that makes decisions based on logic and not emotions, are we safe from a situation like the one in the movie I, Robot (the one where Will Smith has an artificial arm), where the AI concludes that it needs to protect humans from ourselves and tries to impose a totalitarian dictatorship, cutting into our liberties among other things?
The only connection between logic and morality is that logic can help you see the implications of moral beliefs. If you believe killing is wrong and shooting someone with a gun will kill them, then you can deduce that shooting someone with a gun is wrong. Aside from that, though, the premise that "killing is wrong" is based on emotion, not logic. If you were completely logical, with no emotions, you wouldn't even move, feed yourself, etc., because there's nothing about logic that says you should prefer pleasure over pain or living over dying, unless you have some emotional predispositions built in. If an intelligence has no emotions, we could do whatever we wanted and it would have no preference about whatever we did.
|
|
|
|
Tawsix
Full Member
Offline
Activity: 210
Merit: 100
I have always been afraid of banks.
|
|
June 26, 2011, 09:17:45 PM |
|
If we ever end up with an AI controlling the world, an intelligence that makes decisions based on logic and not emotions, are we safe from a situation like the one in the movie I, Robot (the one where Will Smith has an artificial arm), where the AI concludes that it needs to protect humans from ourselves and tries to impose a totalitarian dictatorship, cutting into our liberties among other things?
The only connection between logic and morality is that logic can help you see the implications of moral beliefs. If you believe killing is wrong and shooting someone with a gun will kill them, then you can deduce that shooting someone with a gun is wrong. Aside from that, though, the premise that "killing is wrong" is based on emotion, not logic. If you were completely logical, with no emotions, you wouldn't even move, feed yourself, etc., because there's nothing about logic that says you should prefer pleasure over pain or living over dying, unless you have some emotional predispositions built in. If an intelligence has no emotions, we could do whatever we wanted and it would have no preference about whatever we did.
Hmm... I don't think survival is emotional; it's instinctual. The logic has to be based on something, and to answer the OP's question, it would depend on what the logic was based on: is slavery OK or not OK? Take the first to its logical conclusion and you have a communist utopia; take the second to its logical conclusion and you have an anarchist utopia.
|
|
|
|
ploum
|
|
June 26, 2011, 09:28:32 PM |
|
Morality is only a very logical survival instinct at the scale of a society.
For example, killing is wrong because if you were allowed to kill others, others would be allowed to kill you, and the society would not be a good society for your survival.
This was not decided out of nowhere: there was a lot of trial and error over human history, and we are still living in a society that is not perfect for our survival. But we are improving (that's why we have an overpopulation problem).
The way a society enforces this so-called "morality" is simply through education. I hope that most of you consider that killing a black man or a Jew is wrong. But, not really far from us, we've seen societies where it was completely acceptable and there was nothing wrong in killing a black man or a Jew (or a witch, or any enemy).
Because education could be called into question, most societies developed a very efficient tool called "religion". That tool, which arose spontaneously, used superstition and the non-understanding of nature to brainwash people with a fixed morality.
Of course, leaders immediately saw the benefit of exploiting such a tool, and instead of teaching a morality for the good of the whole society, they perverted religion to teach a morality good for themselves alone.
At some point, some leaders even reached the point where they believed in their own religion, which is a direct downward spiral into madness.
As a consequence, society realized that religion was no longer a good tool for teaching morality. Currently, religion is fighting back in a desperate attempt at survival and will probably disappear in the coming centuries. Optimistic people think that education will now teach morality to people.
|
|
|
|
NghtRppr
|
|
June 26, 2011, 09:28:55 PM |
|
Hmm... I don't think survival is emotional; it's instinctual.
Instincts trigger emotions.
|
|
|
|
Tawsix
Full Member
Offline
Activity: 210
Merit: 100
I have always been afraid of banks.
|
|
June 26, 2011, 09:32:20 PM |
|
Hmm... I don't think survival is emotional; it's instinctual.
Instincts trigger emotions.
Earthquakes trigger tsunamis; that doesn't mean they're the same.
|
|
|
|
NghtRppr
|
|
June 26, 2011, 09:43:34 PM |
|
Hmm... I don't think survival is emotional; it's instinctual.
Instincts trigger emotions.
Earthquakes trigger tsunamis; that doesn't mean they're the same.
So, would you say people were drowned by an earthquake? I don't think so. Instincts may trigger emotions, but the desire to survive is still emotional. Anyway, this is a pointless argument. The relevant point is that there's nothing dictated by logic that says you should desire to survive. It's irrelevant whether you wish to argue that the desire is emotional, instinctual, or whatever else. That's a red herring.
|
|
|
|
TiagoTiago (OP)
|
|
June 26, 2011, 09:56:22 PM |
|
Actually, over the ages, creatures that had the tendency to desire to live coded in their DNA were more successful at producing offspring. Having descended from them, logically you should also have a desire to live, assuming you're not a mutation or a descendant of the rarer ones with a death wish that somehow managed to live long enough to produce offspring anyway.
|
|
|
|
ploum
|
|
June 26, 2011, 09:56:53 PM |
|
The relevant point is that there's nothing dictated by logic that you should desire to survive.
I disagree. It is perfectly logical. Any life form is simply a tool used by genes to ensure the survival of those genes. Humans thus want to be sure that their genes survive; that's why they have babies. But in order to have as many babies as possible and to raise them until they are autonomous, you want to protect yourself and survive. This is so tied to our human nature that you feel it even when you don't have babies (because the genes don't accept the fact that you don't have babies).
|
|
|
|
Tawsix
Full Member
Offline
Activity: 210
Merit: 100
I have always been afraid of banks.
|
|
June 26, 2011, 10:00:11 PM |
|
Hmm... I don't think survival is emotional; it's instinctual.
Instincts trigger emotions.
Earthquakes trigger tsunamis; that doesn't mean they're the same.
So, would you say people were drowned by an earthquake? I don't think so. Instincts may trigger emotions, but the desire to survive is still emotional. Anyway, this is a pointless argument. The relevant point is that there's nothing dictated by logic that says you should desire to survive. It's irrelevant whether you wish to argue that the desire is emotional, instinctual, or whatever else. That's a red herring.
I wasn't arguing with you that survival isn't a logical thing. My point is that logic is a tool; what you use it on will give you different results.
|
|
|
|
NghtRppr
|
|
June 26, 2011, 10:18:01 PM Last edit: June 26, 2011, 10:30:31 PM by bitcoin2cash |
|
You're moving the goalposts here. You were asking about an intelligence that operated purely on logic and nothing else, and that's what I was responding to. An intelligence that operates purely on logic would have no motivation to survive, care about anything, or even wish to apply logic. You need to understand why humans act. Humans act because they are dissatisfied with their current state of affairs, the state of affairs that would obtain if they didn't act, or the state of affairs that would obtain if they stopped their current action. Dissatisfaction, however, has nothing to do with logic. A purely logical creature wouldn't act. There would have to be some non-logical underpinnings, such as emotions, instinct, hardwired programming, or whatever else you wish to call it.
|
|
|
|
TiagoTiago (OP)
|
|
June 26, 2011, 11:24:22 PM |
|
Alright, assume the AI was given the goal of taking care of humanity (in the parenting/pet sense, not in the criminal sense), and was programmed so it would try to reach its goals.
|
|
|
|
Tawsix
Full Member
Offline
Activity: 210
Merit: 100
I have always been afraid of banks.
|
|
June 27, 2011, 12:06:47 AM |
|
Alright, assume the AI was given the goal of taking care of humanity (in the parenting/pet sense, not in the criminal sense), and was programmed so it would try to reach its goals.
What is its view on slavery?
|
|
|
|
NghtRppr
|
|
June 27, 2011, 12:23:14 AM |
|
Alright, assume the AI was given the goal of taking care of humanity (in the parenting/pet sense, not in the criminal sense), and was programmed so it would try to reach its goals.
Define "taking care of". Maximizing our happiness? Sounds like a utilitarian nightmare to me.
|
|
|
|
Anonymous
Guest
|
|
June 27, 2011, 12:27:23 AM |
|
Objective logic is non-existent. Any formulation of logic created by humans will be subject to human emotional needs, which tend to be subjective, and in the end results only in whims and desires. An AI created to take care of humanity will only act as a tool doing the bidding of its creator. Hence, it will serve only the whims and desires of said creator, whether through supposedly 'objective' logic or otherwise.
TL;DR: There is no truth.
|
|
|
|
NghtRppr
|
|
June 27, 2011, 03:06:10 AM |
|
There is no truth.
Is that true?
|
|
|
|
Anonymous
Guest
|
|
June 27, 2011, 03:09:34 AM |
|
There is no truth.
Is that true?
Depends on the individual.
|
|
|
|
FreeMoney
Legendary
Offline
Activity: 1246
Merit: 1016
Strength in numbers
|
|
June 27, 2011, 03:12:19 AM |
|
Once you guys figure this out let me know if calculus and number theory are compatible with morality too.
|
Play Bitcoin Poker at sealswithclubs.eu. We're active and open to everyone.
|
|
|
smellyBobby
Member
Offline
Activity: 112
Merit: 10
|
|
June 27, 2011, 03:21:39 AM |
|
Morality and ethics stem from emotion. Emotions are a set of purposeless thoughts/processes within an intelligence. I imagine emotions as something like a while loop.
while (alive) { getFood(); getMate(); }
They serve no logical purpose, except to keep the machine going.
So given that morality and ethics stem from such things, the question you raise is ill-defined.
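The loop above can be fleshed out into a runnable sketch (hypothetical names and thresholds, not from any real system): the "drives" are hard-coded constants, and the logic only chooses how to serve them, never whether to have them.

```python
# Hypothetical sketch of the idea above: the "emotional" drives are
# hard-coded priorities; logic only decides how to satisfy them,
# never whether to have them in the first place.

def run_agent(energy=3, steps=5):
    """Run the drive loop for a few steps, logging each choice."""
    log = []
    for _ in range(steps):
        if energy <= 0:       # "alive" check: the loop's built-in stop
            break
        if energy < 3:        # hunger drive takes priority
            energy += 2       # getFood()
            log.append("eat")
        else:
            energy -= 1       # getMate() / any other goal costs energy
            log.append("act")
    return log

print(run_agent())  # ['act', 'eat', 'act', 'act', 'eat']
```

Without the built-in preference (the `energy < 3` threshold), the loop would have no reason to pick either branch, which is the point being made.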
|
|
|
|
Anonymous
Guest
|
|
June 27, 2011, 03:24:54 AM |
|
while (alive) { getFood(); getMate(); }
This is life to you. Heh.
|
|
|
|
|