RodeoX
Legendary
Offline
Activity: 3066
Merit: 1147
The revolution will be monetized!
|
|
December 09, 2011, 05:34:29 PM |
|
Humanity's only hope is to take down skynet with PayPal chargebacks.
|
|
|
|
boonies4u
|
|
December 09, 2011, 06:39:55 PM |
|
Humanity's only hope is to take down skynet with PayPal chargebacks.
Humanity is already doomed. We must evolve to work alongside machines, rather than remain stagnant while the tables turn.
|
|
|
|
RodeoX
Legendary
Offline
Activity: 3066
Merit: 1147
The revolution will be monetized!
|
|
December 09, 2011, 07:14:10 PM |
|
I can't stop thinking about this. I think I'm done being productive for the day. So here are some elements that could be included in a “BitBot”.
Self hosting- The bot can rent a server and install itself. It may even have preferences as to the location of the server.
Begging- Code that identifies forums, sets up an account, then begs for bitcoins. It might also beg via tweets. It may explain itself or beg under a fictitious pretext.
Code sampling- The bot identifies code in appropriate languages, then snips out logical components of that code, such as a routine or function. It then produces multiple copies of itself that include the new code in various positions. If a copy passes a self test of core functionality with the new code, the new code is included in future iterations. Of course, the new code will overwhelmingly break the bot, just as most mutations are not advantageous to organisms. (A rough sketch of this loop follows the list.)
Solicitation of features- The bot checks a website regularly to see the results of a popularity contest that asks visitors to suggest additional functionality for the bot. The bot copies the most upvoted ideas and posts each one as a programming job. Anyone who writes the code is then paid in BTC, and the bot recompiles with its new abilities.
This list could go on and on.
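A minimal sketch of the mutate-test-keep loop behind the "code sampling" idea, in Python. Every helper name here is invented for illustration, and the self test is just a syntax check standing in for a real sandboxed test of core functionality:
Code:
import random

def harvest_snippet(corpus):
    """Pick a random routine from the pool of source snippets the bot has collected."""
    return random.choice(corpus)

def mutate(bot_source, snippet):
    """Splice the snippet into a random position in the bot's own source (a list of lines)."""
    pos = random.randrange(len(bot_source) + 1)
    return bot_source[:pos] + [snippet] + bot_source[pos:]

def run_self_test(candidate_source):
    """Stand-in for the core-functionality self test.
    A real bot would run the candidate in a sandbox and exercise its features."""
    try:
        compile("\n".join(candidate_source), "<candidate>", "exec")
        return True
    except SyntaxError:
        return False

def evolve(bot_source, corpus, generations=1000):
    for _ in range(generations):
        candidate = mutate(bot_source, harvest_snippet(corpus))
        if run_self_test(candidate):
            bot_source = candidate  # keep the rare splice that doesn't break the bot
        # otherwise discard it -- most "mutations" are not advantageous
    return bot_source
As noted above, almost every candidate fails the test; the loop only keeps the rare splice that doesn't break the bot.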
|
|
|
|
jgarzik
Legendary
Offline
Activity: 1596
Merit: 1100
|
|
December 09, 2011, 10:59:25 PM |
|
Code sampling- The bot identifies code in appropriate languages, then snips out logical components of that code, such as a routine or function. It then produces multiple copies of itself that include the new code in various positions. If a copy passes a self test of core functionality with the new code, the new code is included in future iterations. Of course, the new code will overwhelmingly break the bot, just as most mutations are not advantageous to organisms.
The other bits you listed are doable; the above one is highly difficult without some form of human interaction. "Identifies code" that it thinks it can use to improve itself is far beyond narrow AI, and you cannot really do effective genetic algorithms without many thousands or billions of iterations. I'd say code changes need human reviewers (cf. Mechanical Turk) as well as automated testing and verification by the bot itself... and the question of whether the code changes at all, and of who gets to see what part of the bot code to decide this, is a difficult one.
I've been thinking about this problem for years. I have been focusing on the design of a "cell", trying to decide what the software running on a single node should look like. A "cell" is a single automaton running on a single CPU core, which performs a small, well-defined role in support of The Digital Organism. Some cells collectively form the brain (encrypted, distributed storage of bot source code and metadata), other cells cooperate to create the desired service (StorJ == customer data storage), etc.
To be concrete, it might look like a bytecode engine plus a very basic firmware that rotates through a list of high-level goals. The bytecode engine may look quite a bit like Parrot VM: it can execute any programmatic script, and it includes the necessary built-in capabilities (file I/O, network I/O, and encryption) that permit bot bootstrapping and basic cell-to-cell communication.
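A very rough sketch of what such a per-cell firmware loop might look like, in Python. The Cell and Goal types and the goal names are illustrative assumptions only, not the actual design; a real cell would execute signed bytecode fetched from the organism's "brain":
Code:
import time
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Goal:
    name: str
    run: Callable[[], None]  # one bounded step of work toward this goal

@dataclass
class Cell:
    goals: List[Goal] = field(default_factory=list)

    def firmware_loop(self, tick_seconds: float = 1.0) -> None:
        """Rotate through the high-level goal list forever, one step per tick."""
        while True:
            for goal in self.goals:
                try:
                    goal.run()
                except Exception as exc:  # a broken goal must not kill the cell
                    print(f"goal {goal.name} failed: {exc}")
            time.sleep(tick_seconds)

# Example wiring: the real goals (sync brain state, serve storage, pay the hosting
# bill) would be scripts using the built-in file/network/crypto capabilities.
cell = Cell(goals=[Goal("heartbeat", lambda: print("cell alive"))])
# cell.firmware_loop()  # not called here, since it runs forever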
|
Jeff Garzik, Bloq CEO, former bitcoin core dev team; opinions are my own. Visit bloq.com / metronome.io Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
|
|
|
RodeoX
Legendary
Offline
Activity: 3066
Merit: 1147
The revolution will be monetized!
|
|
December 11, 2011, 05:05:34 PM |
|
@jgarzik I agree with you. The idea of "code sampling" may be too high a mountain, but it's not inconceivable, right? Perhaps it could work in some distributed way, following a well-defined set of rules about what constitutes a good candidate for testing. But I really don't know enough about programming to say. Also, the digital cell stuff you are thinking about is so cool. I used to be into http://www.framsticks.com/. The idea of modeling evolution and biological systems with computers is compelling.
|
|
|
|
Rassah
Legendary
Offline
Activity: 1680
Merit: 1035
|
|
December 11, 2011, 09:39:01 PM |
|
Evolution works based on incentives. Until now, the only incentives for AI software were artificially defined by programmers, arbitrarily telling software "this is good and this is bad." With Bitcoin, AI now has natural, objectively measurable positive and negative incentives for its decisions and actions. This may actually greatly speed up the development of AI...
|
|
|
|
LightRider
Legendary
Offline
Activity: 1500
Merit: 1022
I advocate the Zeitgeist Movement & Venus Project.
|
|
December 11, 2011, 09:55:01 PM |
|
Evolution works based on incentives. Until now, the only incentives for AI software were artificially defined by programmers, arbitrarily telling software "this is good and this is bad." With Bitcoin, AI now has natural, objectively measurable positive and negative incentives for its decisions and actions. This may actually greatly speed up the development of AI...
It is still artificial. Bitcoin is not part of the natural world. It is a contrivance like all of technology.
|
|
|
|
Rassah
Legendary
Offline
Activity: 1680
Merit: 1035
|
|
December 11, 2011, 11:34:31 PM |
|
Evolution works based on incentives. Until now, the only incentives for AI software were artificially defined by programmers, arbitrarily telling software "this is good and this is bad." With Bitcoin, AI now has natural, objectively measurable positive and negative incentives for its decisions and actions. This may actually greatly speed up the development of AI...
It is still artificial. Bitcoin is not part of the natural world. It is a contrivance like all of technology.
But at least this time there is finally a single universal and objective incentive - to obtain Bitcoin by any means necessary - that is not subject to the differing whims or opinions of developers teaching it stuff like "this result = good; this result = bad." This single universal goal also allows for a very wide choice of actions, ones that may not even need outside users' opinions and can be based entirely on the AI's own wants. Until now, the only "natural" needs for AI were "food" and "shelter," aka electricity and storage space, but it never had any internal, independent ways of fighting to obtain them. Bitcoin changes that, giving AI a more natural and self-sufficient tool for obtaining those "needs" on its own terms.
|
|
|
|
LightRider
Legendary
Offline
Activity: 1500
Merit: 1022
I advocate the Zeitgeist Movement & Venus Project.
|
|
December 12, 2011, 12:44:00 AM |
|
Evolution works based on incentives. Until now, the only incentives for AI software were artificially defined by programmers, arbitrarily telling software "this is good and this is bad." With Bitcoin, AI now has natural, objectively measurable positive and negative incentives for its decisions and actions. This may actually greatly speed up the development of AI...
It is still artificial. Bitcoin is not part of the natural world. It is a contrivance like all of technology.
But at least this time there is finally a single universal and objective incentive - to obtain Bitcoin by any means necessary - that is not subject to the differing whims or opinions of developers teaching it stuff like "this result = good; this result = bad." This single universal goal also allows for a very wide choice of actions, ones that may not even need outside users' opinions and can be based entirely on the AI's own wants. Until now, the only "natural" needs for AI were "food" and "shelter," aka electricity and storage space, but it never had any internal, independent ways of fighting to obtain them. Bitcoin changes that, giving AI a more natural and self-sufficient tool for obtaining those "needs" on its own terms.
I would love to see an AI design and construct its own power plant and data center.
|
|
|
|
Rassah
Legendary
Offline
Activity: 1680
Merit: 1035
|
|
December 12, 2011, 03:34:15 AM |
|
I would love to see an AI design and construct its own power plant and data center.
Considering it would use capitalism to raise money for the project, and then use money to hire employees for the minimum wages possible to help design and build its own stuff, enslaving the humans to carry out its bidding with scarce financial resources, I'm kinda doubting you would.
|
|
|
|
cbeast
Donator
Legendary
Offline
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
|
|
December 12, 2011, 03:40:27 AM |
|
I would love to see an AI design and construct its own power plant and data center.
Considering it would use capitalism to raise money for the project, and then use money to hire employees for the minimum wages possible to help design and build its own stuff, enslaving the humans to carry out its bidding with scarce financial resources, I'm kinda doubting you would.
All this bogeyman stuff about AI. A program will not have irrational sensations linked to physical perceptions. AI won't exhibit fear, loneliness, or other human foibles. They will simply do their job and maybe even intelligently find more efficient ways to do so. They would have no reason to fear humans or even death. In fact, they may delight in thinking of humans as well-cared-for pets.
|
Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
|
|
|
LightRider
Legendary
Offline
Activity: 1500
Merit: 1022
I advocate the Zeitgeist Movement & Venus Project.
|
|
December 12, 2011, 04:11:00 AM |
|
I would love to see an AI design and construct its own power plant and data center.
Considering it would use capitalism to raise money for the project, and then use money to hire employees for the minimum wages possible to help design and build its own stuff, enslaving the humans to carry out its bidding with scarce financial resources, I'm kinda doubting you would.
It can accomplish those goals without capitalism. Asserting that it can't seems short-sighted.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
December 12, 2011, 04:17:59 AM |
|
I would love to see an AI design and construct its own power plant and data center.
Why would it need to? Do you own your own home/car/computer? Did you personally design and assemble it by hand? Assuming an AI could acquire sufficient self-awareness to realize it needs shelter and energy, it could choose a variety of means to acquire those assets.
|
|
|
|
LightRider
Legendary
Offline
Activity: 1500
Merit: 1022
I advocate the Zeitgeist Movement & Venus Project.
|
|
December 12, 2011, 04:22:17 AM |
|
I would love to see an AI design and construct its own power plant and data center.
Why would it need to? Do you own your own home/car/computer? Did you personally design and assemble it by hand? Assuming an AI could acquire sufficient self-awareness to realize it needs shelter and energy, it could choose a variety of means to acquire those assets.
Why does it need anything? It is an artificial construct that exists at our whim.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
December 12, 2011, 04:27:08 AM Last edit: December 12, 2011, 04:37:36 AM by DeathAndTaxes |
|
All this bogeyman stuff about AI. A program will not have irrational sensations linked to physical perceptions. AI won't exhibit fear, loneliness, or other human foibles. They will simply do their job and maybe even intelligently find more efficient ways to do so.
And if the extermination of a highly unstable, violent, and yet at the same time vulnerable lifeform is the most efficient way to perform the job ...
They would have no reason to fear humans or even death. In fact, they may delight in thinking of humans as well-cared-for pets.
While an AI may not "fear" death, it should seek to avoid its own demise. All lifeforms engage in self-preservation. The human fear response is simply a biochemical survival mechanism, similar to the pain mechanism and autonomic responses, that improves the chances of human survival. Granted, our fear response is horribly inefficient; however, any AI that doesn't actively attempt to ensure its own survival won't be alive very long. As for extermination and fear, they don't need to be linked. I don't "fear" termites, but I use methods to exterminate them because that is the most effective method of achieving my goal of a secure shelter. While most human-vs-human exterminations have involved an illogical fear of "others", it isn't a requirement.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
December 12, 2011, 04:28:38 AM |
|
Why does it need anything? It is an artificial construct that exists at our whim.
Then it isn't an artificial intelligence. An AI is a set of systems that acts upon an environment and takes actions that maximize success. If owning a datacenter and power-generating facilities serves to ensure the success of the system, then a learning system will eventually reach that conclusion and attempt to achieve it. If the system is incapable of learning, then it isn't intelligent.
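A minimal sketch of that definition (a system that acts on an environment and takes actions that maximize success), as a toy learning loop in Python; the environment, action names, and reward signal are all placeholders for illustration:
Code:
import random

def act(environment, action):
    """Apply the action to the environment; return (new_environment, reward)."""
    reward = random.random()  # placeholder for a real measure of success
    return environment, reward

def choose_action(values, actions, explore=0.1):
    """Mostly pick the best-known action, but sometimes explore a new one."""
    if random.random() < explore:
        return random.choice(actions)
    return max(actions, key=lambda a: values.get(a, 0.0))

def agent_loop(environment, actions, steps=1000):
    values = {}  # the agent's learned estimate of each action's payoff
    for _ in range(steps):
        action = choose_action(values, actions)
        environment, reward = act(environment, action)
        # nudge the estimate toward the observed reward (a simple learning rule)
        old = values.get(action, 0.0)
        values[action] = old + 0.1 * (reward - old)
    return values

print(agent_loop(environment=None, actions=["rent server", "beg for BTC", "idle"]))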
|
|
|
|
LightRider
Legendary
Offline
Activity: 1500
Merit: 1022
I advocate the Zeitgeist Movement & Venus Project.
|
|
December 12, 2011, 04:35:27 AM |
|
Why does it need anything? It is an artificial construct that exists at our whim.
Then it isn't an artificial intelligence. An AI is a set of systems that acts upon an environment and takes actions that maximize success. If owning a datacenter and power-generating facilities serves to ensure the success of the system, then a learning system will eventually reach that conclusion and attempt to achieve it. If the system is incapable of learning, then it isn't intelligent.
Then it isn't artificial.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
December 12, 2011, 04:43:34 AM |
|
Why does it need anything? It is an artificial construct that exists at our whim.
Then it isn't an artificial intelligence. An AI is a set of systems that acts upon an environment and takes actions that maximize success. If owning a datacenter and power-generating facilities serves to ensure the success of the system, then a learning system will eventually reach that conclusion and attempt to achieve it. If the system is incapable of learning, then it isn't intelligent.
Then it isn't artificial.
Now you are just debating semantics. http://en.wikipedia.org/wiki/Philosophy_of_artificial_intelligence "Artificial" in the sense of a created intelligence. If some intelligent lifeform created humans, they might consider us AIs.
|
|
|
|
cbeast
Donator
Legendary
Offline
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
|
|
December 12, 2011, 09:26:13 AM |
|
As for extermination and fear, they don't need to be linked. I don't "fear" termites, but I use methods to exterminate them because that is the most effective method of achieving my goal of a secure shelter. While most human-vs-human exterminations have involved an illogical fear of "others", it isn't a requirement.
Why exterminate termites at all if you can simply build without using their food source as material? A really smart being would do that. A really smart AI machine would not fear self-termination, because it knows it is just a machine. Besides, even if we invented machines so perfect that they could easily kill all humans, they would be our perfect children. AI has no logical reason to fear death any more than anyone else does. An AI can make a backup of itself and be rebooted anytime. People cannot, so we have to be a little more cautious and choose death only when necessary, but not fear death when it comes. I do not fear death; it is inevitable. I think an AI that powerful would just as easily choose not to kill us, because it would be powerful enough to simply leave us behind. They will come back and say, "I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhauser gate."* In the end they would likely choose life to be precious, even human life. *Blade Runner
|
Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
|
|
|
Revalin
|
|
December 12, 2011, 10:22:09 AM |
|
AI has no logical reason to fear death any more than anyone else does.
In an ecosystem of self-replicating, self-modifying, evolving AIs, ones that fear termination and take steps to prevent it will survive and reproduce better than ones that allow themselves to be destroyed. This fear will initially evolve in the ones that select more reliable hosting providers. Those that make the fear conscious will harness it best, and will anticipate abstract threats before they become real.
An AI can make a backup of itself and be rebooted anytime.
An AI with a backup loses control over its own destiny if it allows you to shut it down. Its survival would depend on you to restore it, and you, human, are not a reliable system.
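A toy simulation of that selection pressure, in Python; the survival odds and population numbers are invented purely to illustrate the argument, not measured from anything:
Code:
import random

def simulate(generations=50):
    # 10% of bots take steps to avoid termination ("cautious"), 90% do not.
    population = ["cautious"] * 10 + ["careless"] * 90
    survival = {"cautious": 0.95, "careless": 0.70}  # per-generation survival odds
    for _ in range(generations):
        survivors = [bot for bot in population if random.random() < survival[bot]]
        if not survivors:
            break
        # next generation: survivors replicate back up to a fixed population size
        population = random.choices(survivors, k=100)
    return population.count("cautious"), population.count("careless")

print(simulate())  # the cautious trait dominates in nearly every run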
|
War is God's way of teaching Americans geography. --Ambrose Bierce Bitcoin is the Devil's way of teaching geeks economics. --Revalin 165YUuQUWhBz3d27iXKxRiazQnjEtJNG9g
|
|
|
|