waltmarkers (OP)
Member
Offline
Activity: 104
Merit: 10
|
|
August 17, 2012, 09:57:56 PM |
|
Assuming BTCST settles out successfully, which I believe to be quite likely, I'd like to nominate the following group as the charity to win the bet: CHAOS HOMEBREW CLUB, http://test.chaosbrewclub.net
They share some of our values. Homebrew is about community, thumbing your nose at alcohol taxes, art, and self-reliance. We're trying to move into a bigger brew house because our space is too small! We teach people in our community how to brew from scratch, give them a place to brew, and have social and educational events. A bigger space will touch thousands of lives a year between members, those who come to our events and classes, and those who get to have our members' beer. If we get the donation, we are fully committed to writing an article about it and sharing all aspects of the donation with the local press, emphasizing Bitcoin. We've had some great press before, and we will likely have more.
|
|
|
|
Phinnaeus Gage
Legendary
Offline
Activity: 1918
Merit: 1570
Bitcoin: An Idea Worth Spending
|
|
August 17, 2012, 10:55:14 PM |
|
Assuming BTCST settles out successfully, which I believe to be quite likely, I'd like to nominate the following group as the charity to win the bet: CHAOS HOMEBREW CLUB, http://test.chaosbrewclub.net
They share some of our values. Homebrew is about community, thumbing your nose at alcohol taxes, art, and self-reliance. We're trying to move into a bigger brew house because our space is too small! We teach people in our community how to brew from scratch, give them a place to brew, and have social and educational events. A bigger space will touch thousands of lives a year between members, those who come to our events and classes, and those who get to have our members' beer. If we get the donation, we are fully committed to writing an article about it and sharing all aspects of the donation with the local press, emphasizing Bitcoin. We've had some great press before, and we will likely have more.
Are you affiliated with Bill or Lisa Mason? ~Bruno~
|
|
|
|
theymos
Administrator
Legendary
Offline
Activity: 5390
Merit: 13427
|
|
August 17, 2012, 11:05:05 PM |
|
- The Bitcoin development group
- The Seasteading Institute
- The Singularity Institute
|
1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
|
|
|
finkleshnorts
|
|
August 17, 2012, 11:08:04 PM |
|
- The Singularity Institute
Is that Ray Kurzweil/transhumanism stuff? Gives me the heebie jeebies.
|
|
|
|
theymos
Administrator
Legendary
Offline
Activity: 5390
Merit: 13427
|
|
August 17, 2012, 11:18:23 PM |
|
Is that Ray Kurzweil/transhumanism stuff?
Yes, though the Singularity Institute doesn't directly work on transhumanism much as far as I know. They focus mainly on trying to kick-start the singularity (i.e., explosive technological progress) by creating benevolent human-level artificial intelligence that is able to quickly improve itself.
|
1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
|
|
|
RoloTonyBrownTown
|
|
August 17, 2012, 11:20:02 PM |
|
I'd like to see it go to something useful. Some sort of cancer research, Alzheimer's, etc.
|
|
|
|
finkleshnorts
|
|
August 17, 2012, 11:20:16 PM |
|
Is that Ray Kurzweil/transhumanism stuff?
Yes, though the Singularity Institute doesn't directly work on transhumanism much as far as I know. They focus mainly on trying to kick-start the singularity (i.e., explosive technological progress) by creating benevolent human-level artificial intelligence that is able to quickly improve itself.
Very cool.
|
|
|
|
sadpandatech
|
|
August 17, 2012, 11:30:57 PM |
|
Is that Ray Kurzweil/transhumanism stuff?
Yes, though the Singularity Institute doesn't directly work on transhumanism much as far as I know. They focus mainly on trying to kick-start the singularity (i.e., explosive technological progress) by creating benevolent human-level artificial intelligence that is able to quickly improve itself.
Very cool.
Indeed, very cool.
|
If you're not excited by the idea of being an early adopter 'now', then you should come back in three or four years and either tell us "Told you it'd never work!" or join what should, by then, be a much more stable and easier-to-use system. - GA
It is being worked on by smart people. -DamienBlack
|
|
|
Scott J
Legendary
Offline
Activity: 1792
Merit: 1000
|
|
August 17, 2012, 11:32:46 PM |
|
I'd like to see it go to something useful. Some sort of cancer research, Alzheimer's, etc.
+1 for Alzheimer's
|
|
|
|
myrkul
|
|
August 18, 2012, 12:15:21 AM |
|
- The Seasteading Institute
They get my vote. I can't really get behind creating Skynet... there's no guarantee the AI would be benevolent.
|
|
|
|
drlatino999
|
|
August 18, 2012, 12:22:45 AM |
|
Here -
|
Sappers clear the way
|
|
|
riX
|
|
August 18, 2012, 12:24:34 AM |
|
I'd like it to go to something that most mainstream people would appreciate, like medical research or similar. The most important thing for me is that it is accompanied by lots of positive media attention.
|
|
|
|
Bitcoin Oz
|
|
August 18, 2012, 02:53:11 AM |
|
Gamblers Anonymous
|
|
|
|
mb300sd
Legendary
Offline
Activity: 1260
Merit: 1000
Drunk Posts
|
|
August 18, 2012, 05:52:49 AM |
|
Something that contributes to the continued growth and expansion of Bitcoin.
|
1D7FJWRzeKa4SLmTznd3JpeNU13L1ErEco
|
|
|
ShadowAlexey
Donator
Legendary
Offline
Activity: 968
Merit: 1002
|
|
August 18, 2012, 06:03:23 AM |
|
Find one child and pay for his treatment. Or set up a fund and sponsor the development of all sorts of BTC apps and hardware.
|
|
|
|
Bitcoin Oz
|
|
August 18, 2012, 06:05:25 AM |
|
If pirate defaults, why would he pay 5000 BTC to a charity?
|
|
|
|
repentance
|
|
August 18, 2012, 06:12:49 AM |
|
If pirate defaults, why would he pay 5000 BTC to a charity?
BurtW's 10,000 BTC bet is more interesting. He bet that BS&T wouldn't "fail". Does its having to wind up count as "failure" even if everyone receives back their investment plus any interest they're owed?
|
All I can say is that this is Bitcoin. I don't believe it until I see six confirmations.
|
|
|
herzmeister
Legendary
Offline
Activity: 1764
Merit: 1007
|
|
August 19, 2012, 12:21:30 AM |
|
- The Singularity Institute
Luke Muehlhauser, CEO of the Singularity Institute for Artificial Intelligence: "Unfortunately, the singularity may not be what you're hoping for. By default the singularity (intelligence explosion) will go very badly for humans, because what humans want is a very, very specific set of things in the vast space of possible motivations, and it's very hard to translate what we want into sufficiently precise math, so by default superhuman AIs will end up optimizing the world around us for something other than what we want, and using up all our resources to do so.
The AI does not love you, nor does it hate you, but you are made of atoms it can use for something else."
http://www.reddit.com/r/Futurology/comments/y9lm0/i_am_luke_muehlhauser_ceo_of_the_singularity/c5tm6fu
Also sounds much like a Zeitgeist/Venus Project planned and automated economy that I thought most of you guys are against.
|
|
|
|
theymos
Administrator
Legendary
Offline
Activity: 5390
Merit: 13427
|
|
August 20, 2012, 10:58:38 PM |
|
Luke Muehlhauser, CEO of the Singularity Institute for Artificial Intelligence:
The Singularity Institute is meant to create AI that will work for humanity's benefit.
Also sounds much like a Zeitgeist/Venus Project planned and automated economy that I thought most of you guys are against.
If there is no scarcity, anarcho-communism and anarcho-capitalism end up being mostly the same thing.
|
1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
|
|
|
Meni Rosenfeld
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
August 22, 2012, 09:23:23 PM |
|
Luke Muehlhauser, CEO of the Singularity Institute for Artificial Intelligence:
The Singularity Institute is meant to create AI that will work for humanity's benefit.
Exactly. The premise is that, with high likelihood, a recursively self-improving superhuman artificial general intelligence leading to a singularity will eventually happen anyway. Unless a special effort is made, that AGI will be unfriendly to humans. SIAI is making a special effort to make sure that it will be friendly. That statement is about what SIAI is trying to prevent, not what it is trying to create. I believe they'd settle for no AGI of any kind, but as said, that's not really an option.
Is that Ray Kurzweil/transhumanism stuff? Gives me the heebie jeebies.
Eliezer Yudkowsky (the brains behind SIAI) has mentioned in the past that his vision of a singularity is very different from Kurzweil's.
|
|
|
|
|