Bitcoin Forum
Author Topic: Is pure logic compatible with morality and ethic?  (Read 2151 times)
TiagoTiago (OP)
Hero Member
Activity: 616
Merit: 500
Firstbits.com/1fg4i :)
June 26, 2011, 09:05:01 PM
#1

If we ever end up with an AI controlling the world, an intelligence that makes decisions based on logic rather than emotions, are we safe from a situation like the one in the movie I, Robot (the one where Will Smith has an artificial arm), where the AI concludes that it needs to protect humans from ourselves and tries to impose a totalitarian dictatorship, cutting into our liberties among other things?

(I dont always get new reply notifications, pls send a pm when you think it has happened)

Wanna gimme some BTC/BCH for any or no reason? 1FmvtS66LFh6ycrXDwKRQTexGJw4UWiqDX Smiley

The more you believe in Bitcoin, and the more you show you do to other people, the faster the real value will soar!

Do you like mmmBananas?!
NghtRppr
Sr. Member
Activity: 504
Merit: 252
Elder Crypto God
June 26, 2011, 09:11:59 PM
#2

Quote from: TiagoTiago
If we ever end up with an AI controlling the world, an intelligence that makes decisions based on logic rather than emotions, are we safe from a situation like the one in the movie I, Robot (the one where Will Smith has an artificial arm), where the AI concludes that it needs to protect humans from ourselves and tries to impose a totalitarian dictatorship, cutting into our liberties among other things?

The only connection between logic and morality is that logic can help you see the implications of moral beliefs. If you believe killing is wrong, and shooting someone with a gun will kill them, then you can deduce that shooting someone with a gun is wrong. Aside from that, though, the premise that "killing is wrong" is based on emotion, not logic. If you were completely logical, with no emotions, you wouldn't even move or feed yourself, because there's nothing in logic that says you should prefer pleasure over pain or living over dying, unless you have some emotional predispositions built in. If an intelligence has no emotions, we could do whatever we wanted and it would have no preference about it.
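A minimal sketch of that point (my own toy illustration, not anything from this thread): logic forward-chains implications from whatever premises you feed it, and supplies none of its own. The rule representation here is a made-up simplification.

```python
# Toy illustration: logic only derives consequences of premises you supply.

def implications(premises, rules):
    """Forward-chain: derive everything the premises entail via the rules."""
    derived = set(premises)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in rules:
            if antecedent in derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

rules = [("killing is wrong", "shooting someone is wrong")]

# With the emotional premise supplied, logic derives the conclusion...
print(implications({"killing is wrong"}, rules))
# ...but given no premises at all, logic alone derives nothing.
print(implications(set(), rules))
```

The moral content lives entirely in the premises; the deduction step is morally empty.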
Tawsix
Full Member
Activity: 210
Merit: 100
I have always been afraid of banks.
June 26, 2011, 09:17:45 PM
#3

Quote from: TiagoTiago
If we ever end up with an AI controlling the world, an intelligence that makes decisions based on logic rather than emotions, are we safe from a situation like the one in the movie I, Robot (the one where Will Smith has an artificial arm), where the AI concludes that it needs to protect humans from ourselves and tries to impose a totalitarian dictatorship, cutting into our liberties among other things?

Quote from: NghtRppr
The only connection between logic and morality is that logic can help you see the implications of moral beliefs. If you believe killing is wrong, and shooting someone with a gun will kill them, then you can deduce that shooting someone with a gun is wrong. Aside from that, though, the premise that "killing is wrong" is based on emotion, not logic. If you were completely logical, with no emotions, you wouldn't even move or feed yourself, because there's nothing in logic that says you should prefer pleasure over pain or living over dying, unless you have some emotional predispositions built in. If an intelligence has no emotions, we could do whatever we wanted and it would have no preference about it.

Hmm... I don't think survival is emotional; it's instinctual. The logic has to be based on something, and to answer the OP's question, it would depend on what the logic was based on: is slavery acceptable or not? Take the first to its logical conclusion and you have a communist utopia; take the second to its logical conclusion and you have an anarchist utopia.

ploum
Sr. Member
Activity: 428
Merit: 253
June 26, 2011, 09:28:32 PM
#4

Morality is just a survival instinct, applied logically at the scale of a society.

For example, killing is wrong because if you were allowed to kill others, others would be allowed to kill you, and the society would not be a good one for your survival.

This was not decided out of nowhere: there was a lot of trial and error over human history, and we are still living in a society that is not perfect for our survival. But we are improving (that's why we have an overpopulation problem).

The way a society enforces this so-called "morality" is simply through education. I hope most of you consider killing a black man or a Jew to be wrong. But, not really far from us, we've seen societies where it was completely acceptable, and there was nothing wrong in killing a black man or a Jew (or a witch, or any enemy).

Because education can be called into question, most societies developed a very efficient tool called "religion". That tool, which arose spontaneously, used superstition and the lack of understanding of nature to brainwash people with a fixed morality.

Of course, leaders immediately saw the benefit of exploiting such a tool, and instead of teaching a morality for the good of the whole society, they perverted religion to teach a morality good for themselves alone.

At some point, some leaders even reached the point where they believed in their own religion, which is a direct downward spiral into madness.

As a consequence, society realized that religion was no longer a good tool for teaching morality. Currently, religion is fighting back in a desperate attempt and will probably disappear in the coming centuries. Optimistic people think that education will now teach morality to people.

NghtRppr
Sr. Member
Activity: 504
Merit: 252
June 26, 2011, 09:28:55 PM
#5

Quote from: Tawsix
Hmm... I don't think survival is emotional; it's instinctual.

Instincts trigger emotions.
Tawsix
Full Member
Activity: 210
Merit: 100
June 26, 2011, 09:32:20 PM
#6

Quote from: Tawsix
Hmm... I don't think survival is emotional; it's instinctual.

Quote from: NghtRppr
Instincts trigger emotions.

Earthquakes trigger tsunamis; that doesn't mean they're the same.

NghtRppr
Sr. Member
Activity: 504
Merit: 252
June 26, 2011, 09:43:34 PM
#7

Quote from: Tawsix
Hmm... I don't think survival is emotional; it's instinctual.

Quote from: NghtRppr
Instincts trigger emotions.

Quote from: Tawsix
Earthquakes trigger tsunamis; that doesn't mean they're the same.

So, would you say people were drowned by an earthquake? I don't think so.

Instincts may trigger emotions, but the desire to survive is still emotional.

Anyway, this is a pointless argument. The relevant point is that nothing dictated by logic says you should desire to survive. It's irrelevant whether you wish to argue that the desire is emotional, instinctual, or whatever else. That's a red herring.
TiagoTiago (OP)
Hero Member
Activity: 616
Merit: 500
June 26, 2011, 09:56:22 PM
#8

Actually, over the ages, creatures that had the desire to live coded into their DNA were more successful at producing offspring. So, having descended from them, logically you should also have a desire to live, assuming you're not a mutation, or a descendant of the rarer ones with a death wish that somehow managed to live long enough to produce offspring anyway.

ploum
Sr. Member
Activity: 428
Merit: 253
June 26, 2011, 09:56:53 PM
#9

Quote from: NghtRppr
The relevant point is that there's nothing dictated by logic that you should desire to survive.

I disagree. This is perfectly logical. Any life form is simply a tool used by genes to ensure the survival of those genes. Humans thus want to be sure that their genes survive; that's why they have babies. But in order to ensure that you have as many babies as possible, and raise them until they are autonomous, you want to protect yourself and survive.

This is so tied to our human nature that you feel it even when you don't have babies (because the genes don't accept the fact that you don't have babies).

Tawsix
Full Member
Activity: 210
Merit: 100
June 26, 2011, 10:00:11 PM
#10

Quote from: Tawsix
Hmm... I don't think survival is emotional; it's instinctual.

Quote from: NghtRppr
Instincts trigger emotions.

Quote from: Tawsix
Earthquakes trigger tsunamis; that doesn't mean they're the same.

Quote from: NghtRppr
So, would you say people were drowned by an earthquake? I don't think so.

Instincts may trigger emotions, but the desire to survive is still emotional.

Anyway, this is a pointless argument. The relevant point is that nothing dictated by logic says you should desire to survive. It's irrelevant whether you wish to argue that the desire is emotional, instinctual, or whatever else. That's a red herring.

I wasn't arguing with you that survival isn't a logical thing Smiley. My point is that logic is a tool: what you use it on will give you different results.

NghtRppr
Sr. Member
Activity: 504
Merit: 252
June 26, 2011, 10:18:01 PM
Last edit: June 26, 2011, 10:30:31 PM by bitcoin2cash
#11

You're moving the goalposts here. You were asking about an intelligence that operated purely on logic and nothing else, and that's what I was responding to. An intelligence that operates purely on logic would have no motivation to survive, to care about anything, or even to apply logic. You need to understand why humans act. Humans act because they are dissatisfied with their current state of affairs, with the state of affairs that would obtain if they didn't act, or with the state of affairs that would obtain if they stopped their current action. Dissatisfaction, however, has nothing to do with logic. A purely logical creature wouldn't act. There would have to be some non-logical underpinnings, such as emotions, instinct, hardwired programming, or whatever else you wish to call it.
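That argument can be put in toy form (a hypothetical sketch of my own, not anyone's actual AI design): an agent that selects actions by ranking outcomes simply does nothing when handed no preferences, because logic alone gives it no ranking.

```python
# Toy agent: action selection needs a preference ordering that logic
# itself does not supply. The action names and weights are made up.

def choose_action(actions, preference):
    """Pick the most preferred action, if the agent prefers anything at all."""
    if not preference:          # no dissatisfaction, no ranking of outcomes
        return None             # nothing motivates any action
    return max(actions, key=lambda a: preference.get(a, 0))

actions = ["eat", "sleep", "do nothing"]
print(choose_action(actions, {}))                      # purely logical agent: None
print(choose_action(actions, {"eat": 2, "sleep": 1}))  # with emotional weights: "eat"
```

The deduction machinery is intact in both calls; only the non-logical weights make the second agent act.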
TiagoTiago (OP)
Hero Member
Activity: 616
Merit: 500
June 26, 2011, 11:24:22 PM
#12

Alright, assume the AI was given the goal of taking care of humanity (in the parenting/pet sense, not in the criminal sense), and was programmed so it would try to reach its goals.

Tawsix
Full Member
Activity: 210
Merit: 100
June 27, 2011, 12:06:47 AM
#13

Quote from: TiagoTiago
Alright, assume the AI was given the goal of taking care of humanity (in the parenting/pet sense, not in the criminal sense), and was programmed so it would try to reach its goals.

What is its view on slavery?

NghtRppr
Sr. Member
Activity: 504
Merit: 252
June 27, 2011, 12:23:14 AM
#14

Quote from: TiagoTiago
Alright, assume the AI was given the goal of taking care of humanity (in the parenting/pet sense, not in the criminal sense), and was programmed so it would try to reach its goals.

Define "taking care of". Maximizing our happiness? Sounds like a utilitarian nightmare to me.
Anonymous
Guest

June 27, 2011, 12:27:23 AM
 #15

Objective logic is non-existent. A formulation of logic created by humans will be subject to human emotional needs, and those needs tend to be subjective, resulting in nothing but whims and desires. An AI created to take care of humanity will only act as a tool doing the bidding of its creator. Hence, it will only serve the whims and desires of that creator, whether through supposedly 'objective' logic or otherwise.

TL;DR: There is no truth.
NghtRppr
Sr. Member
Activity: 504
Merit: 252
June 27, 2011, 03:06:10 AM
#16

Quote from: Anonymous
There is no truth.

Is that true?
Anonymous
Guest

June 27, 2011, 03:09:34 AM
 #17

Depends on the individual.
FreeMoney
Legendary
Activity: 1246
Merit: 1014
Strength in numbers
June 27, 2011, 03:12:19 AM
#18

Once you guys figure this out let me know if calculus and number theory are compatible with morality too.

Play Bitcoin Poker at sealswithclubs.eu. We're active and open to everyone.
smellyBobby
Member
Activity: 112
Merit: 10
June 27, 2011, 03:21:39 AM
#19

Morality and ethics stem from emotion. Emotions are a set of purposeless thoughts/processes within an intelligence. I imagine emotions as something like a while loop:

while (alive) {
    getFood();
    getMate();
}

They serve no logical purpose, except to keep the machine going.

So, given that morality and ethics stem from such things, the question you raise is ill-defined.
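A runnable elaboration of that loop (my own hypothetical sketch, with drive names echoing the getFood/getMate pseudocode above): the "emotions" are just weights that keep the loop going, and with no drives the machine halts immediately.

```python
# Hypothetical drive loop: acting satisfies the strongest drive a little;
# once no drive remains, nothing keeps the machine going.

def run_machine(drives, steps=10):
    """Repeat whichever action the strongest drive demands; stop when none remain."""
    log = []
    for _ in range(steps):
        if not drives:              # no drives: the machine simply stops
            break
        strongest = max(drives, key=drives.get)
        log.append(strongest)       # e.g. getFood / getMate
        drives[strongest] -= 1      # acting partially satisfies the drive
        drives = {d: v for d, v in drives.items() if v > 0}
    return log

print(run_machine({"getFood": 2, "getMate": 1}))  # acts while drives persist
print(run_machine({}))                            # no drives: empty log
```

None of the steps inside the loop are illogical; it's only the drive weights, supplied from outside the logic, that make it run at all.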

I need a job!!!!

Justice Dragons: http://forum.bitcoin.org/index.php?topic=16351.msg267881#msg267881

Help me buy deodorant!!! 17bmVSoD8QNBLaPDRAXkFdapBPdgA72YjB
Anonymous
Guest

June 27, 2011, 03:24:54 AM
 #20

Quote from: smellyBobby
while (alive) {
    getFood();
    getMate();
}

This is life to you. Heh.