Bitcoin Forum
April 26, 2024, 10:13:18 AM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
Pages: 36 of 99
Author Topic: $XAI Sapience AIFX - Decentralized AI | 11% PoS | PlumeDB,IBTP on Testnet  (Read 150176 times)
mafort1469
Sr. Member
****
Offline Offline

Activity: 364
Merit: 250



View Profile
February 05, 2015, 04:31:25 PM
 #701

I think so. He did say he was working on it so hopefully!
trader19
Legendary
*
Offline Offline

Activity: 1232
Merit: 1001



View Profile WWW
February 05, 2015, 04:36:13 PM
 #702

Are we supposed to get a new wallet today?
Today, tomorrow, this week; no firm ETA on the update, but it will be a series of updates incoming. Meantime, get your free XAI coins; only a few hours left until it's finished
http://forums.dfx.io/index.php?topic=79

Join the Elastic revolution!  Elastic - The Decentralized Supercomputer
ELASTIC WEBSITE | NEW ANNOUNCEMENT THREAD | ELASTIC SLACK | ELASTIC FORUM
ntsdm
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
February 05, 2015, 04:41:41 PM
 #703

Are we supposed to get a new wallet today?
Today, tomorrow, this week; no firm ETA on the update, but it will be a series of updates incoming. Meantime, get your free XAI coins; only a few hours left until it's finished
http://forums.dfx.io/index.php?topic=79
has anyone received those free coins yet?
myagui
Legendary
*
Offline Offline

Activity: 1154
Merit: 1001



View Profile
February 05, 2015, 04:54:44 PM
 #704

has anyone received those free coins yet?

The giveaway is not closed yet, ntsdm. The coins will only be distributed after the claim period is over.

the $XQN supply is bigger... and it doesn't lack dumpers

There's also the important fact that Quotient/XQN was mostly mined, and miners will dump at whatever value pleases their return rates. On the other hand, for Sapience/XAI there was no mining, and those that invested were individuals that believed in the project. Any serious investor will not part with his XAI coins easily, especially given how little was raised in the crowdfund, making the supply extremely short.

Are we supposed to get a new wallet today?

I recall this next update to be 'testnet' only, but I might be wrong. For the range of features that XAI aims to deliver, a 'testnet' period is always necessary in order to validate the code, check for any bugs, obtain reliable performance statistics, etc. I know that there's more than one update in the pipeline, but I don't know exactly how the releases will be timed.

Cheers!

etoque
Legendary
*
Offline Offline

Activity: 1974
Merit: 1000


View Profile
February 05, 2015, 05:39:43 PM
 #705

Good. So this one is the good one  Cool
etoque
Legendary
*
Offline Offline

Activity: 1974
Merit: 1000


View Profile
February 05, 2015, 05:49:35 PM
 #706

Good. So this one is the good one  Cool

Yes, it's the good one. No other alt right now has the risk/return perspective that XAI can offer.

Recall this time last year, when garbage like MEOW was at a $4M market cap on $500k+ volume. If you're telling me that XAI can't eventually hit $1 a coin, or roughly a $661k market cap, then crypto is broken.

Hell, $10 a XAI is only roughly $6M market cap. Auroracoin peaked at almost $1 BILLION market cap with over $5M in volume. $6M isn't asking for much, and with the right community/development, I'd say it is very possible.

Just a question of PR/marketing
  Smiley
Kruemmelmonster
Sr. Member
****
Offline Offline

Activity: 338
Merit: 250


View Profile
February 05, 2015, 08:10:15 PM
 #707

maybe we can get on poloniex. just submitted a request
trader19
Legendary
*
Offline Offline

Activity: 1232
Merit: 1001



View Profile WWW
February 05, 2015, 08:10:57 PM
 #708

Catch your free XAI while you can; the value is increasing as we speak.  Cool Don't miss out!
http://forums.dfx.io/index.php?topic=79  

its a community giveaway!

Join the Elastic revolution!  Elastic - The Decentralized Supercomputer
ELASTIC WEBSITE | NEW ANNOUNCEMENT THREAD | ELASTIC SLACK | ELASTIC FORUM
etoque
Legendary
*
Offline Offline

Activity: 1974
Merit: 1000


View Profile
February 05, 2015, 08:22:52 PM
Last edit: February 05, 2015, 08:44:24 PM by etoque
 #709

maybe we can get on poloniex. just submitted a request

Good idea, this means more visibility for us; we need more exchanges!! Smiley

Everyone, submit a request here on Polo:

https://poloniex.com/coinRequest
SapienceFan
Newbie
*
Offline Offline

Activity: 12
Merit: 0


View Profile
February 05, 2015, 08:49:49 PM
 #710

Weekly Consolidation Update 2 Containing Communications Direct From XAI Dev:

Previous consolidation (Week 1) : https://bitcointalk.org/index.php?topic=864895.msg10301510#msg10301510


There has been much progress since last week, most notably work towards an update that is due to be released very soon.

The following is a range of information that JoeMoz (the XAI developer) has provided across various sources, including Slack.


In relation to comments about him listing SharePoint on his LinkedIn profile:

SharePoint just happens to be one thing I am an expert in... and focused on for a couple years because frankly its the highest billable rates :wink:


Age old Opensource / Closedsource dilemma continues:

I mean the tech we are doing is sufficiently advanced that anyone who would try to rip it off right out of the gate would probably wind up pushing broken stuff.
From an open source perspective, it would probably be a valuable contribution to the community at large, like DHT. But i remember with XQN, within a few days there were a couple of clone coins with the profit explorer graph feature etc.


Choosing a Name for the Data Layer:

I'm thinking of ditching the treespaces name i came up with, because it generates confusion with the biology stuff when you google it. I think i am going to call the entire data layer PlumeDB, because it is something that can be broken out and marketed for a lot more than just the AI stuff, and it is essentially a full-blown decentralized database engine running on top of the coin network, funneling network communications over the bitcoin p2p protocol.


Where does the name plume come from?

I thought it sounded cool! I was thinking of clouds of data so "databases" are plumes in the cloud.


More notes on fee structure:

The way I am doing fees right now is... it is based on data reservations by kilobyte-hour, e.g. 0.0001 XAI/kbh. When you initialize a new data plume, you put out a request for data reservations based on the estimated size of the database and the lifetime you want. You get responses from volunteer/available slave nodes that are willing to replicate your data index, and you have to pay the total kbh fee to each of the slaves by default  Cool

The actual data itself at a low level is distributed across all nodes via a filesystem abstracted into the DHT, and all nodes participate. For public data sets, you don't pay for data reservations, but consumers pay-per-use.

So, I am thinking maybe there should be a small fee for loading public data, as an anti-spam type of measure. The other thing i am thinking is that maybe all Atom data types (opencog atomspaces), should be public, to keep contributing to a large body-of-knowledge, seems maybe pointless to load Atoms that are private...hmm, so who should get the fee for public data loads... maybe the 1st relay node.
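The kilobyte-hour maths above can be sketched quickly. Only the 0.0001 XAI/kbh rate and the default of paying every replicating slave come from the dev's comments; the function name and example numbers here are illustrative:

```python
# Hypothetical sketch of the kilobyte-hour reservation fee described above.
# Only the 0.0001 XAI/kbh rate and the pay-each-slave default come from the
# post; the function name and example numbers are illustrative.
RATE_XAI_PER_KBH = 0.0001

def reservation_fee(size_kb: float, lifetime_hours: float, num_slaves: int) -> float:
    """Total XAI owed for reserving a data plume across all replicating slaves."""
    per_slave = size_kb * lifetime_hours * RATE_XAI_PER_KBH
    return per_slave * num_slaves

# A 500 KB index reserved for 24 hours across 8 slave nodes:
print(reservation_fee(500, 24, 8))  # roughly 9.6 XAI
```

So the reservation cost scales linearly in data size, lifetime, and replication count, which is why the estimated database size and desired lifetime go into the initial reservation request.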


How is data loaded onto the Sapience distributed AI network?

You load data either through RPC or the console.

More in-depth information related to XAI and PlumeDB:

http://wiki.dfx.io/display/XAI/Sapience+AIFX+Home

http://wiki.dfx.io/display/XAI/PlumeDB


Hint on Potential New Look:

I'm doing something cool with it and doing the UI in QML/QT Quick, eventually i want to rewrite the entire wallet using it and ditch the existing hokey interface.  It'll let us get a "responsive" UI on the android/different devices so stuff isn't sitting off the side of the screen etc., and get a wallet that actually looks like a modern app and not something from 1998.


Further development related comments:

In the latest sapience source i have moved the leveldb dependency out from being code included in the source to being an external dependency, so i can use the latest google leveldb from github + cross-compile for android easily. It just means an extra step: you have to pull and compile leveldb separately and set the include/lib path. I'm trying to get it so i can use 1 .pro file. I suppose on linux you could just install the libleveldb-dev package and do it that way, same as libboost-all-dev etc.

I should note that this is a test/beta build... so there's definitely TODO's etc., and under-the-hood tweaking and stuff that we might want to adjust. For example, by default the low level DHT that is just doing mindless raw data storage will try to get a penetration of 72 nodes for a given record, but i don't know if we even have 72 live nodes on the network, let alone getting them on testnet. And there's rebuild scenarios, like what if all 8 slave nodes go down: having the network detect that, automatically assign replacements, and rebuild trie indexes, etc.

There's really like 4 overlay networks running on top of each other at once... the low level DHT, a mapping & rebuild info DHT, the slave PHTs, and the master/originator. The biggest thing we'll have to play around with in testing is seeing what the latency is like. I know there will be latency, because it is a p2p database, so for some use cases it might not be suitable; for others it might mean just adapting how you work with the data.

The more nodes the better... if we had a couple thousand nodes on the network for instance you can do better load balancing but just in general, basically _everything_ in the entire system is async.

It's just different i guess, from anything i've worked with to date at least :wink:  It should enable new scenarios. Actually a good analogy is that using it is more like hitting web services asynchronously, instead of direct database access... but in exchange, you get the redundancy/massive scale-out/decentralization.


Is there any way for the system to determine which nodes are closest?

Well, there isn't any geolocation/location-based proximity right now... but that is something i've been thinking about.


Uniqueness?

As far as i know, this is the only (DHT) implementation that is running over the bitcoin p2p protocol; even maidsafe's DHT is a parallel/external implementation running over UDP.


Is that useful for anything outside of AI?

There's like a million other things besides AI you can build on top of a decentralized DHT.


Process:

The way i did it is: each key, in addition to a hash, can have 3 attributes, and those get indexed by the slaves in PHTs. That is the value-added service you are getting from the slaves in exchange for the XAI/kilobyte-hour fee. It's key/value storage on the low raw DHT level, but with 3 attribute components in the key. So let's say i want to run a distributed aggregate range query to do a SUM across the data plume where attribute 2 is between A and B... the slaves provide the service of fast lookup to get the subset of infohashes that fall within the criteria. Then you use those against DHT2 to get the possible nodes that have each one, which are then looked up against DHT1 to retrieve the individual records/values from the raw key/value store. So with something like a SUM, your slave can pull the subset of infohashes, chunk them out into groups of say 100, and dole those out as individual compute operations across nodes; the results are then concentrated and returned to the originator.
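As a toy model of that query path: the 3-attribute keys, the slave-side range lookup, and the chunks of ~100 infohashes come from the description above, while everything else, including all names, is a hypothetical sketch with plain dicts standing in for the DHT layers.

```python
# Toy model of the distributed SUM described above. Plain dicts stand in for
# the DHT/PHT layers; only the flow (slave-side range filter over 3-attribute
# keys -> chunk infohashes -> per-chunk partial sums -> concentrate at the
# originator) comes from the post.

def pht_range_lookup(index, attr, lo, hi):
    """Slave-side: infohashes whose key attribute `attr` falls in [lo, hi]."""
    return [h for h, attrs in index.items() if lo <= attrs[attr] <= hi]

def chunked(seq, size):
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def distributed_sum(index, values, attr, lo, hi, chunk_size=100):
    hashes = pht_range_lookup(index, attr, lo, hi)
    # Each chunk would be doled out to a different node as a compute op;
    # here every "node" is just a local partial sum.
    partials = [sum(values[h] for h in chunk) for chunk in chunked(hashes, chunk_size)]
    return sum(partials)  # concentrated and returned to the originator

# Three records keyed by infohash, each with a (a1, a2, a3) attribute tuple:
index = {"h1": (0, 5, 0), "h2": (0, 15, 0), "h3": (0, 25, 0)}
values = {"h1": 10, "h2": 20, "h3": 30}
print(distributed_sum(index, values, attr=1, lo=10, hi=30))  # 50
```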


DHT1 and DHT2 are just levels within the 4(?) DHTs in PlumeDB?

Layered DHTs... yeah, sort of. I mean, for something like a torrent it's pretty trivial, so a basic k/v DHT works fine, but as soon as you want to do anything more involved, you need more metadata etc., so you layer it on top. The raw DHT gives you the redundancy/resilience, etc., and can just focus on getting the values where they need to be.
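One way to picture that layering is a thin metadata index delegating storage to a raw key/value store underneath. This is a hypothetical sketch; only the "hash plus 3 attributes" key shape is taken from the posts, and all class and method names are illustrative:

```python
# Hypothetical sketch of a metadata layer on top of a raw key/value DHT.
# Only the "hash + 3 attributes" key shape is taken from the posts.

class RawDHT:
    """Bottom layer: knows nothing but key -> value (redundancy lives here)."""
    def __init__(self):
        self.store = {}

    def put(self, key, value):
        self.store[key] = value

    def get(self, key):
        return self.store.get(key)

class MetadataLayer:
    """Upper layer: indexes the 3 key attributes, delegates storage below."""
    def __init__(self, raw):
        self.raw = raw
        self.attr_index = {}  # infohash -> (a1, a2, a3)

    def put(self, infohash, attrs, value):
        self.attr_index[infohash] = attrs
        self.raw.put(infohash, value)

    def find(self, attr, lo, hi):
        return [h for h, a in self.attr_index.items() if lo <= a[attr] <= hi]

dht = MetadataLayer(RawDHT())
dht.put("abc123", (1, 7, 0), b"record")
print(dht.find(attr=1, lo=5, hi=10))  # ['abc123']
```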


In response to a question on slaves:

Slaves are responsible for building that multi-rooted PHT; the DHT only knows about hash256 + value... there's TODO's i'm probably not going to get to for the release tonight (today), like being able to configure how much of your free space you want to allocate, so don't go loading gigantic data sets Tongue


locohammerhead
Hero Member
*****
Offline Offline

Activity: 530
Merit: 500



View Profile
February 05, 2015, 08:57:08 PM
 #711

Thank you for the update, SapienceFan Smiley

etoque
Legendary
*
Offline Offline

Activity: 1974
Merit: 1000


View Profile
February 05, 2015, 08:59:29 PM
 #712

this is definitely going over 100k

To the dev: is there a PR guy on the team? More videos, more graphic stuff; the community is growing and we can share stuff to promote Sapience Smiley
trader19
Legendary
*
Offline Offline

Activity: 1232
Merit: 1001



View Profile WWW
February 05, 2015, 09:04:37 PM
 #713

Weekly Consolidation Update 2 Containing Communications Direct From XAI Dev:

[full consolidation quote trimmed; see post #710 above]

Thanks for the update. Reposting.

Join the Elastic revolution!  Elastic - The Decentralized Supercomputer
ELASTIC WEBSITE | NEW ANNOUNCEMENT THREAD | ELASTIC SLACK | ELASTIC FORUM
trader19
Legendary
*
Offline Offline

Activity: 1232
Merit: 1001



View Profile WWW
February 05, 2015, 09:09:17 PM
 #714

retweet pls  Roll Eyes
https://twitter.com/locohammerhead/status/563442060670541824

Quote
Only 6 hrs left in the 3000 @SapienceAIFX giveaway!  Signup at http://forums.dfx.io/index.php?topic=79.0 to get your 20 XAI!  Current price 34k Satoshi each!

Join the Elastic revolution!  Elastic - The Decentralized Supercomputer
ELASTIC WEBSITE | NEW ANNOUNCEMENT THREAD | ELASTIC SLACK | ELASTIC FORUM
trader19
Legendary
*
Offline Offline

Activity: 1232
Merit: 1001



View Profile WWW
February 05, 2015, 09:48:57 PM
 #715

retweet pls  Roll Eyes
https://twitter.com/locohammerhead/status/563442060670541824

Quote
Only 6 hrs left in the 3000 @SapienceAIFX giveaway!  Signup at http://forums.dfx.io/index.php?topic=79.0 to get your 20 XAI!  Current price 34k Satoshi each!
Guys, at the current value of 1 XAI @ 34k satoshis, 20 XAI = 0.0068 BTC ≈ $1.50 worth /// most valuable giveaway in alt history???

Join the Elastic revolution!  Elastic - The Decentralized Supercomputer
ELASTIC WEBSITE | NEW ANNOUNCEMENT THREAD | ELASTIC SLACK | ELASTIC FORUM
myagui
Legendary
*
Offline Offline

Activity: 1154
Merit: 1001



View Profile
February 05, 2015, 10:34:06 PM
 #716

[...] most valuable giveaway in alt history???

It will be, once Sapience/XAI reaches its rightful value.  Grin

locohammerhead
Hero Member
*****
Offline Offline

Activity: 530
Merit: 500



View Profile
February 05, 2015, 11:20:20 PM
 #717

this is definitely going over 100k

To the dev: is there a PR guy on the team? More videos, more graphic stuff; the community is growing and we can share stuff to promote Sapience Smiley

Nobody on the team is a PR guy per se. I've kind of been doing that on occasion, but it's far from my main line of work Smiley. If you know anybody, feel free to let us know!

billotronic
Legendary
*
Offline Offline

Activity: 1610
Merit: 1000


Crackpot Idealist


View Profile
February 06, 2015, 12:20:11 AM
 #718

this is definitely going over 100k

To the dev: is there a PR guy on the team? More videos, more graphic stuff; the community is growing and we can share stuff to promote Sapience Smiley

Nobody on the team is a PR guy per se. I've kind of been doing that on occasion, but it's far from my main line of work Smiley. If you know anybody, feel free to let us know!


aww don't sell yourself short loco, you are doing a hell of a job!

This post sums up why all this bullshit is a scam
Read It. Hate It. Change the facts that it represents.
https://bitcointalk.org/index.php?topic=1606638.msg16139644#msg16139644
zoppp
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
February 06, 2015, 03:11:45 AM
 #719

Why is this syncing so slow??
locohammerhead
Hero Member
*****
Offline Offline

Activity: 530
Merit: 500



View Profile
February 06, 2015, 04:24:00 AM
 #720

this is definitely going over 100k

To the dev: is there a PR guy on the team? More videos, more graphic stuff; the community is growing and we can share stuff to promote Sapience Smiley

Nobody on the team is a PR guy per se. I've kind of been doing that on occasion, but it's far from my main line of work Smiley. If you know anybody, feel free to let us know!


aww don't sell yourself short loco, you are doing a hell of a job!

 Tongue
