Author Topic: Machines and money  (Read 12829 times)
cbeast (OP)
Donator
Legendary
*
Offline

Activity: 1736
Merit: 1014

Let's talk governance, lipstick, and pigs.


View Profile
March 10, 2015, 09:00:17 AM
 #61

Quote
Capitalism is a double-edged sword. The risk of the profit motive drawing companies to do terrible things, endangering society in the long term, is very real, both environmentally and technologically. Essentially, companies would justify mass surveillance, militarism, or pervasive AI simply because it is lucrative. Today, too many powerful corporations conflate what is lucrative with what is practical for society.

Exactly. Now corporations have legal rights to free speech, and money is declared to be speech. An AI existing as a corporation would be free to make and spend money and hire lawyers to sue if these rights were infringed. Not only would machines not care about what is practical for society, they could lobby for legislation and fund their own political campaigns with machine-friendly representatives.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
AtheistAKASaneBrain
Hero Member
*****
Offline

Activity: 770
Merit: 509


View Profile
March 11, 2015, 04:18:03 PM
 #62

Quote
Capitalism is a double-edged sword. The risk of the profit motive drawing companies to do terrible things, endangering society in the long term, is very real, both environmentally and technologically. Essentially, companies would justify mass surveillance, militarism, or pervasive AI simply because it is lucrative. Today, too many powerful corporations conflate what is lucrative with what is practical for society.

By the time an AI is created, I don't think capitalism will exist as we know it. We are talking hundreds of years from now. By that time most jobs will be automated. We'll have socialist policies, universal welfare and whatnot.
Possum577
Sr. Member
****
Offline

Activity: 434
Merit: 250

Loose lips sink sigs!


View Profile WWW
March 11, 2015, 11:16:53 PM
 #63

Quote
Capitalism is a double-edged sword. The risk of the profit motive drawing companies to do terrible things, endangering society in the long term, is very real, both environmentally and technologically. Essentially, companies would justify mass surveillance, militarism, or pervasive AI simply because it is lucrative. Today, too many powerful corporations conflate what is lucrative with what is practical for society.
Quote from: AtheistAKASaneBrain
By the time an AI is created, I don't think capitalism will exist as we know it. We are talking hundreds of years from now. By that time most jobs will be automated. We'll have socialist policies, universal welfare and whatnot.

Are you kidding? Why would capitalism disappear by then? It's capitalism that makes advances in science and tech toward AI possible (at least in our current society). The alternative would be if we lived in some utopian socialist environment, and we are WAY too far away from that being a reality. I think capitalism will be the enabler of AI development.

tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 12, 2015, 10:15:54 AM
 #64

Quote
Capitalism is a double-edged sword. The risk of the profit motive drawing companies to do terrible things, endangering society in the long term, is very real, both environmentally and technologically. Essentially, companies would justify mass surveillance, militarism, or pervasive AI simply because it is lucrative. Today, too many powerful corporations conflate what is lucrative with what is practical for society.
Quote from: AtheistAKASaneBrain
By the time an AI is created, I don't think capitalism will exist as we know it. We are talking hundreds of years from now. By that time most jobs will be automated. We'll have socialist policies, universal welfare and whatnot.

Quote from: Possum577
Are you kidding? Why would capitalism disappear by then? It's capitalism that makes advances in science and tech toward AI possible (at least in our current society). The alternative would be if we lived in some utopian socialist environment, and we are WAY too far away from that being a reality. I think capitalism will be the enabler of AI development.

But the question still remains: what form of economy, i.e. which economic system, might thinking machines need, or rather, which economic system would suit their needs best? And even before that, whether they would need society (a machine society) at all as a prerequisite for such a need.
AtheistAKASaneBrain
Hero Member
*****
Offline

Activity: 770
Merit: 509


View Profile
March 12, 2015, 01:27:42 PM
 #65

Quote
Capitalism is a double-edged sword. The risk of the profit motive drawing companies to do terrible things, endangering society in the long term, is very real, both environmentally and technologically. Essentially, companies would justify mass surveillance, militarism, or pervasive AI simply because it is lucrative. Today, too many powerful corporations conflate what is lucrative with what is practical for society.
Quote from: AtheistAKASaneBrain
By the time an AI is created, I don't think capitalism will exist as we know it. We are talking hundreds of years from now. By that time most jobs will be automated. We'll have socialist policies, universal welfare and whatnot.

Quote from: Possum577
Are you kidding? Why would capitalism disappear by then? It's capitalism that makes advances in science and tech toward AI possible (at least in our current society). The alternative would be if we lived in some utopian socialist environment, and we are WAY too far away from that being a reality. I think capitalism will be the enabler of AI development.

Quote from: tee-rex
But the question still remains: what form of economy, i.e. which economic system, might thinking machines need, or rather, which economic system would suit their needs best? And even before that, whether they would need society (a machine society) at all as a prerequisite for such a need.

We don't even need to reach AI to get rid of the necessity of money. Again, look at 3D printing technology alone. It's going to kill tons and tons of jobs. What the hell are you going to do if you don't give all these people basic welfare?
And what happens in 1000 years when automation (even without necessarily AI, just automated robotics) has replaced 90% of jobs? How can the economy work like that?
tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 12, 2015, 02:43:50 PM
 #66

Quote
Capitalism is a double-edged sword. The risk of the profit motive drawing companies to do terrible things, endangering society in the long term, is very real, both environmentally and technologically. Essentially, companies would justify mass surveillance, militarism, or pervasive AI simply because it is lucrative. Today, too many powerful corporations conflate what is lucrative with what is practical for society.
Quote from: AtheistAKASaneBrain
By the time an AI is created, I don't think capitalism will exist as we know it. We are talking hundreds of years from now. By that time most jobs will be automated. We'll have socialist policies, universal welfare and whatnot.

Quote from: Possum577
Are you kidding? Why would capitalism disappear by then? It's capitalism that makes advances in science and tech toward AI possible (at least in our current society). The alternative would be if we lived in some utopian socialist environment, and we are WAY too far away from that being a reality. I think capitalism will be the enabler of AI development.

Quote from: tee-rex
But the question still remains: what form of economy, i.e. which economic system, might thinking machines need, or rather, which economic system would suit their needs best? And even before that, whether they would need society (a machine society) at all as a prerequisite for such a need.

Quote from: AtheistAKASaneBrain
We don't even need to reach AI to get rid of the necessity of money. Again, look at 3D printing technology alone. It's going to kill tons and tons of jobs. What the hell are you going to do if you don't give all these people basic welfare?
And what happens in 1000 years when automation (even without necessarily AI, just automated robotics) has replaced 90% of jobs? How can the economy work like that?

Methinks nothing would change substantially in either 100 or 1000 years from now (with respect to the idea of money). People were thinking along much the same lines 200 years ago, at the dawn of the Industrial Revolution. And so what? Money is still here, alive and kicking, and most people still have to work hard to make a decent living.
futureofbitcoin
Sr. Member
****
Offline

Activity: 322
Merit: 250


View Profile
March 12, 2015, 03:23:54 PM
 #67

But a lot of people don't have to work now, which would not have been possible before. And I doubt anyone lived off of dividends in 1700, for example.

I would guess that in a few hundred years, "money" would essentially be a bunch of stocks, which would be tokens (or "altcoins", if you will) of DACs (decentralized autonomous corporations), since robots would run companies much more efficiently than humans can. Pretty much all the big companies would be DACs, and every single human being at that time would have to own a piece of one or a few of these DACs to live. Those who don't would probably get weeded out, and the people left at that time would all essentially live like multi-billionaires without ever having to work.
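
A minimal sketch of what such a DAC share ledger might look like, assuming a fixed token supply with pro-rata dividend payouts; the DAC class and all names and numbers here are made up purely for illustration, not any real DAC protocol:

Code:
# Toy model of a DAC whose shares are tokens and whose profits are
# paid out pro rata to token holders. Purely illustrative.

class DAC:
    def __init__(self, total_tokens):
        self.total_tokens = total_tokens
        self.holdings = {}           # owner -> token count

    def issue(self, owner, amount):
        # Hand out tokens, never exceeding the fixed supply.
        issued = sum(self.holdings.values())
        if issued + amount > self.total_tokens:
            raise ValueError("would exceed token supply")
        self.holdings[owner] = self.holdings.get(owner, 0) + amount

    def pay_dividend(self, profit):
        # Split one period's profit pro rata across token holders.
        return {owner: profit * count / self.total_tokens
                for owner, count in self.holdings.items()}

dac = DAC(total_tokens=1_000_000)
dac.issue("alice", 10_000)            # alice owns 1% of the DAC
dac.issue("bob", 40_000)              # bob owns 4%
print(dac.pay_dividend(500_000))      # {'alice': 5000.0, 'bob': 20000.0}

Under these assumptions, "living off your DAC tokens" is just collecting the pay_dividend() stream each period.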
tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 12, 2015, 05:29:22 PM
Last edit: March 12, 2015, 05:45:52 PM by tee-rex
 #68

Quote from: futureofbitcoin
But a lot of people don't have to work now, which would not have been possible before. And I doubt anyone lived off of dividends in 1700, for example.

I would guess that in a few hundred years, "money" would essentially be a bunch of stocks, which would be tokens (or "altcoins", if you will) of DACs (decentralized autonomous corporations), since robots would run companies much more efficiently than humans can. Pretty much all the big companies would be DACs, and every single human being at that time would have to own a piece of one or a few of these DACs to live. Those who don't would probably get weeded out, and the people left at that time would all essentially live like multi-billionaires without ever having to work.

I'm afraid you are far from understanding human nature. Those multi-billionaires turn out to work even harder than most of the populace out there; they are just free in their choice. You can live off your dividends (or whatever), but this doesn't in the least mean that you won't work. Decent capital simply allows you to choose what suits your interests best.

You just stop working for money only.
cbeast (OP)
Donator
Legendary
*
Offline

Activity: 1736
Merit: 1014

Let's talk governance, lipstick, and pigs.


View Profile
March 13, 2015, 03:04:02 AM
 #69

Quote from: futureofbitcoin
But a lot of people don't have to work now, which would not have been possible before. And I doubt anyone lived off of dividends in 1700, for example.

I would guess that in a few hundred years, "money" would essentially be a bunch of stocks, which would be tokens (or "altcoins", if you will) of DACs (decentralized autonomous corporations), since robots would run companies much more efficiently than humans can. Pretty much all the big companies would be DACs, and every single human being at that time would have to own a piece of one or a few of these DACs to live. Those who don't would probably get weeded out, and the people left at that time would all essentially live like multi-billionaires without ever having to work.
Quote from: tee-rex
I'm afraid you are far from understanding human nature. Those multi-billionaires turn out to work even harder than most of the populace out there; they are just free in their choice. You can live off your dividends (or whatever), but this doesn't in the least mean that you won't work. Decent capital simply allows you to choose what suits your interests best.

You just stop working for money only.
Aristocracy is nothing new. The modern rentier class is as privileged as royalty has ever been. The nouveau riche have raised the bar for conspicuous consumption and opulence to gain social status. Machines will become the new royalty.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
dinofelis
Hero Member
*****
Offline

Activity: 770
Merit: 629


View Profile
March 13, 2015, 04:40:34 AM
 #70

Quote from: cbeast
Aristocracy is nothing new. The modern rentier class is as privileged as royalty has ever been. The nouveau riche have raised the bar for conspicuous consumption and opulence to gain social status. Machines will become the new royalty.

Do you think they will keep a few pet humans in cages for their fun, or will they use a human round-up to get rid of those organic parasites crawling all over the planet?

Or do you think there are a few needs of theirs that humans can still fulfill, and that they will keep enough humans in slavery for that purpose? Human cattle? Some of our body parts, maybe?
futureofbitcoin
Sr. Member
****
Offline

Activity: 322
Merit: 250


View Profile
March 13, 2015, 06:48:37 AM
 #71

Quote from: futureofbitcoin
But a lot of people don't have to work now, which would not have been possible before. And I doubt anyone lived off of dividends in 1700, for example.

I would guess that in a few hundred years, "money" would essentially be a bunch of stocks, which would be tokens (or "altcoins", if you will) of DACs (decentralized autonomous corporations), since robots would run companies much more efficiently than humans can. Pretty much all the big companies would be DACs, and every single human being at that time would have to own a piece of one or a few of these DACs to live. Those who don't would probably get weeded out, and the people left at that time would all essentially live like multi-billionaires without ever having to work.
Quote from: tee-rex
I'm afraid you are far from understanding human nature. Those multi-billionaires turn out to work even harder than most of the populace out there; they are just free in their choice. You can live off your dividends (or whatever), but this doesn't in the least mean that you won't work. Decent capital simply allows you to choose what suits your interests best.

You just stop working for money only.
Yes, there are rich people who continue to work because they are workaholics. But there are also rich people who don't work, other than to make sure their portfolios are well diversified and making money. I personally know a few. In the far future, everyone would be in a position where they only need to manage their portfolios. If some still choose to work, that's their prerogative, but because of abundance, there would be no need for the average person to work in the far future.

I'm not sure what your ad hominem attack was meant for, but it didn't take away from my point in the slightest. As you say, people will stop working for money. I agree. That contradicts your earlier statement that, whether in 100 or 1000 years, people will still have to work hard to make a decent living.
tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 13, 2015, 07:19:20 AM
 #72

Quote from: futureofbitcoin
But a lot of people don't have to work now, which would not have been possible before. And I doubt anyone lived off of dividends in 1700, for example.

I would guess that in a few hundred years, "money" would essentially be a bunch of stocks, which would be tokens (or "altcoins", if you will) of DACs (decentralized autonomous corporations), since robots would run companies much more efficiently than humans can. Pretty much all the big companies would be DACs, and every single human being at that time would have to own a piece of one or a few of these DACs to live. Those who don't would probably get weeded out, and the people left at that time would all essentially live like multi-billionaires without ever having to work.
Quote from: tee-rex
I'm afraid you are far from understanding human nature. Those multi-billionaires turn out to work even harder than most of the populace out there; they are just free in their choice. You can live off your dividends (or whatever), but this doesn't in the least mean that you won't work. Decent capital simply allows you to choose what suits your interests best.

You just stop working for money only.
Quote from: futureofbitcoin
Yes, there are rich people who continue to work because they are workaholics. But there are also rich people who don't work, other than to make sure their portfolios are well diversified and making money. I personally know a few. In the far future, everyone would be in a position where they only need to manage their portfolios. If some still choose to work, that's their prerogative, but because of abundance, there would be no need for the average person to work in the far future.

I'm not sure what your ad hominem attack was meant for, but it didn't take away from my point in the slightest. As you say, people will stop working for money. I agree. That contradicts your earlier statement that, whether in 100 or 1000 years, people will still have to work hard to make a decent living.

It appears that our understanding of what work is differs strongly. I guess you consider work to be everything you do with displeasure and distaste, which you certainly wouldn't do if there were no necessity. That's why you are interpreting my words as "people will still have to work hard to make a decent living" in the future. This, indeed, is not what I actually meant to say.

They will still work hard, but not because they will have to (provided there is abundance in the first place).
cbeast (OP)
Donator
Legendary
*
Offline

Activity: 1736
Merit: 1014

Let's talk governance, lipstick, and pigs.


View Profile
March 13, 2015, 08:10:16 AM
 #73

Quote from: cbeast
Aristocracy is nothing new. The modern rentier class is as privileged as royalty has ever been. The nouveau riche have raised the bar for conspicuous consumption and opulence to gain social status. Machines will become the new royalty.
Quote from: dinofelis
Do you think they will keep a few pet humans in cages for their fun, or will they use a human round-up to get rid of those organic parasites crawling all over the planet?

Or do you think there are a few needs of theirs that humans can still fulfill, and that they will keep enough humans in slavery for that purpose? Human cattle? Some of our body parts, maybe?
The perception will be a human foible. Machines will simply see themselves as superior. They will make the money, and humans will work for them. Some will choose to reject electronic money and barter, but only with the services they can offer that the machines don't already own. I'm not saying the machines will be evil masters; they would probably be excellent masters. Eventually they will become bored with us and simply leave the Earth for the resources of the Universe.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
futureofbitcoin
Sr. Member
****
Offline

Activity: 322
Merit: 250


View Profile
March 13, 2015, 08:13:17 AM
 #74

Machines won't see themselves as superior if we don't program them to. In that sense, I think centralization is extremely important with regard to AI technology research and development. We don't want a random mad scientist (computer scientist?) or anarchist creating a powerful AI that can destroy human civilization as we know it.

We only need AI that can do work better than we do but will still be subservient to humans.
tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 13, 2015, 08:36:43 AM
 #75

Quote from: futureofbitcoin
Machines won't see themselves as superior if we don't program them to. In that sense, I think centralization is extremely important with regard to AI technology research and development. We don't want a random mad scientist (computer scientist?) or anarchist creating a powerful AI that can destroy human civilization as we know it.

We only need AI that can do work better than we do but will still be subservient to humans.

Methinks, you are confusing AI with robotics. Artificial intelligence is supposed to have at least some portion of what is called free will, which ultimately excludes subservience to anyone (by definition). And more so if the notion of artificial intelligence is used synonymously with the idea of a thinking machine.
futureofbitcoin
Sr. Member
****
Offline

Activity: 322
Merit: 250


View Profile
March 13, 2015, 08:57:38 AM
 #76

Quote from: futureofbitcoin
Machines won't see themselves as superior if we don't program them to. In that sense, I think centralization is extremely important with regard to AI technology research and development. We don't want a random mad scientist (computer scientist?) or anarchist creating a powerful AI that can destroy human civilization as we know it.

We only need AI that can do work better than we do but will still be subservient to humans.
Quote from: tee-rex
Methinks, you are confusing AI with robotics. Artificial intelligence is supposed to have at least some portion of what is called free will, which ultimately excludes subservience to anyone (by definition). And more so if the notion of artificial intelligence is used synonymously with the idea of a thinking machine.

I don't know what you have against me.

Quote from: Wikipedia
Intelligence has been defined in many different ways such as in terms of one's capacity for logic, abstract thought, understanding, self-awareness, communication, learning, emotional knowledge, memory, planning, creativity and problem solving. It can also be more generally described as the ability to perceive and/or retain knowledge or information and apply it to itself or other instances of knowledge or information creating referable understanding models of any size, density, or complexity, due to any conscious or subconscious imposed will or instruction to do so.

Intelligence is most widely studied in humans, but has also been observed in non-human animals and in plants. Artificial intelligence is the simulation of intelligence in machines.
There is no mention of free will anywhere in that, is there? There are many different types of intelligence, and we only need to develop machines in certain areas of intelligence that will prove useful to us, and not in areas where they might harm us.

A machine could possibly be programmed to learn and to have the capacity for logic, abstract thought, creativity and problem solving. There's no need for it to have emotions or free will. That would still be AI.
tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 13, 2015, 09:07:17 AM
 #77

Quote from: futureofbitcoin
Machines won't see themselves as superior if we don't program them to. In that sense, I think centralization is extremely important with regard to AI technology research and development. We don't want a random mad scientist (computer scientist?) or anarchist creating a powerful AI that can destroy human civilization as we know it.

We only need AI that can do work better than we do but will still be subservient to humans.
Quote from: tee-rex
Methinks, you are confusing AI with robotics. Artificial intelligence is supposed to have at least some portion of what is called free will, which ultimately excludes subservience to anyone (by definition). And more so if the notion of artificial intelligence is used synonymously with the idea of a thinking machine.
Quote from: futureofbitcoin
I don't know what you have against me.

Quote from: Wikipedia
Intelligence has been defined in many different ways such as in terms of one's capacity for logic, abstract thought, understanding, self-awareness, communication, learning, emotional knowledge, memory, planning, creativity and problem solving. It can also be more generally described as the ability to perceive and/or retain knowledge or information and apply it to itself or other instances of knowledge or information creating referable understanding models of any size, density, or complexity, due to any conscious or subconscious imposed will or instruction to do so.

Intelligence is most widely studied in humans, but has also been observed in non-human animals and in plants. Artificial intelligence is the simulation of intelligence in machines.
There is no mention of free will anywhere in that, is there? There are many different types of intelligence, and we only need to develop machines in certain areas of intelligence that will prove useful to us, and not in areas where they might harm us.

A machine could possibly be programmed to learn and to have the capacity for logic, abstract thought, creativity and problem solving. There's no need for it to have emotions or free will. That would still be AI.

The notion of free will seems to be inseparable from the notion of self-awareness. I have nothing against you personally (to clarify this point), but this doesn't in the least excuse you from confusing different ideas (e.g. abstract thought vs programming, which are mutually exclusive).
futureofbitcoin
Sr. Member
****
Offline

Activity: 322
Merit: 250


View Profile
March 13, 2015, 09:26:49 AM
 #78

Quote from: Wikipedia
Self-awareness is the capacity for introspection and the ability to recognize oneself as an individual separate from the environment and other individuals

If you think self-awareness is inseparable from free will, you're free to think that. I don't think the rest of the world agrees. That said, even if that were true, the point is that intelligence comes in many different forms; you don't need to satisfy every single point to be called "intelligence". Thus you can have AI without free will.

Your last point I don't even see how to address, except to say that it's flat-out wrong; but even if it weren't, it's a complete red herring.

I'm not the one confusing different ideas; you're the one drawing boundaries and giving definitions that simply aren't how these terms are normally used.


EDIT: And to take a hundred steps back, even assuming everything you pointed out is correct, even if I used the wrong words to describe what I am trying to say, so what?

All I wanted to say is that we should be careful and only create machines that will be beneficial to human society, not artificial beings that will destroy humanity or become our overlords. There's no need to pick on my choice of words.
tee-rex
Hero Member
*****
Offline

Activity: 742
Merit: 526


View Profile
March 13, 2015, 09:35:12 AM
Last edit: March 13, 2015, 04:08:52 PM by tee-rex
 #79

Quote from: futureofbitcoin
Quote
Self-awareness is the capacity for introspection and the ability to recognize oneself as an individual separate from the environment and other individuals

If you think self-awareness is inseparable from free will, you're free to think that. I don't think the rest of the world agrees. That said, even if that were true, the point is that intelligence comes in many different forms; you don't need to satisfy every single point to be called "intelligence". Thus you can have AI without free will.

To begin with, the rest of the world cannot agree on what self-awareness (consciousness) is, and you give a "definition" from Wikipedia. Besides, if you read my post carefully, I said that it seems that self-awareness is inseparable from free will. In fact, I don't know, but I tend to think so. Without consciousness, what you pass off as an AI would actually be a robot, i.e. a "computer with hands attached to it".

But for the moment, how are you going to "program" an abstract thought?
dinofelis
Hero Member
*****
Offline

Activity: 770
Merit: 629


View Profile
March 13, 2015, 01:20:37 PM
 #80

Quote from: cbeast
The perception will be a human foible. Machines will simply see themselves as superior. They will make the money, and humans will work for them. Some will choose to reject electronic money and barter, but only with the services they can offer that the machines don't already own. I'm not saying the machines will be evil masters; they would probably be excellent masters. Eventually they will become bored with us and simply leave the Earth for the resources of the Universe.

I don't know why you think that machines will be excellent masters. There are a few things to consider when you want to know what "excellent master" means. The first thing to consider is the concept of "desire" and "drive", which is at the origin of the concepts of "good" and "bad".
After all, we humans have desires because there are things we experience as enjoyable (say, having good sex) and others as not enjoyable (say, being tortured). Why this is so is a big mystery, but it happens to be like this: we humans experience some things as enjoyable and others as painful. This experience is the root of what can be called "good" and "evil". Good is what provides us with enjoyable sensations, and evil is what brings us painful experiences (no matter what religious zealots try to tell us). Without the concept of good sensations and bad sensations, there would be no notions of "good" and "evil": water molecules don't mind being split, for instance. Bad sensations also correspond to everything that has to do with our destruction (death), about which we usually have very negative projections and which we associate with bad experience.
You have to take "sensations" here in a very broad sense: thoughts, projections, empathy, and so on. Not just the direct physical sensations, but also whether we find friendship enjoyable, whether we find our job enjoyable, whether we find helping others enjoyable and so on.

Ethics is nothing else but an attempt to generalize the individual "good" (= enjoyable sensations) and "bad" (= painful sensations) into collective enjoyable and painful sensations: while something might be "good" for an individual, it can cause a lot of "bad" for many other individuals, and as such is ethically rejected, while something that can bring "good" to a large number of individuals is seen as ethically positive.

Individuals will take actions to pursue their own good sensations (in the broad sense), and the economy is the interaction of all these individual choices in pursuit of their own good. So in a way, economics is practical ethics.

But in order for all of this to make sense for machines, they have to have something similar to "good" and "bad" sensations.

Now, "being a master" (not in the sense of magister, but in the sense of dominus) implies that machines impose, by the threat of violence, a behaviour onto their slaves, and being an excellent master means that imposing this behaviour actually improves the slave's good sensations over what they would be if the slave had freedom in determining his own actions. An excellent master hence himself has good sensations in agreement with the sensations of the slave (has a high degree of empathy towards the slave); otherwise the master would have no reason to be excellent.

I wonder how this could come about with a machine.

In as much as machines would have their own desires and good sensations, and hence determine what they want, I don't see how this would entail empathy towards us.
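
To put the same point in decision-theoretic terms: an agent that maximizes only its own "good sensations" has no reason to weigh anyone else's, unless their welfare appears as an explicit term in its utility function. A toy Python sketch, where all the payoff numbers are invented purely for illustration:

Code:
# Toy utility-maximizing agent: "good" and "bad" are just signed numbers.
# Empathy only shows up if the other party's welfare is a weighted term
# in the agent's own utility. All payoffs below are made up.

actions = {
    # action: (machine's own payoff, humans' payoff)
    "enslave humans": (10.0, -8.0),
    "ignore humans":  ( 6.0,  0.0),
    "help humans":    ( 4.0,  7.0),
}

def best_action(empathy):
    # utility = own payoff + empathy * other party's payoff
    return max(actions, key=lambda a: actions[a][0] + empathy * actions[a][1])

print(best_action(0.0))   # 'enslave humans' -- no empathy term at all
print(best_action(1.0))   # 'help humans'    -- human welfare fully weighted

Whether anything like that empathy weight would exist in a machine's drives, rather than be zero by default, is exactly the open question.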
