Thanks for your answer. There is a lot of information in your report sample, I am surprised! I think your project will make the used car market more transparent.
|
|
|
How will you store your data?
|
|
|
Hi Team,
Simply follow some of the posts of the people giving congrats to this new alleged "Datacoin" Ethereum something-or-other... They post the same comment on many other coin-knockoff threads with Ethereum donation and "airdrop" links... Appears to be some kind of cut/paste/lie scam that's hitting many other coins.
Regards, The AtomSea
|
|
|
Can't help but feel a bit flattered... and wow, what an amalgam of disjointed ideas there. Hope people see the obvious scam.
|
|
|
Max Supply of Datacoin
One question I am being asked is the 2 billion max supply of Datacoin that will be produced.
This is something that we can discuss. For myself I have no strong attachment to the 2 billion.
Since the supply has already passed 28 million, we cannot reduce it to 21 million. But we can certainly talk about 42 million as a possible stop.
-extro
If anyone has a decent technical argument to keep the total supply set at 2 billion, I would like to hear it. 42 million would be a blessing; IMO early investors should be rewarded for holding DTC for the long term, not feel like they're invested in the Dogecoin of cloud storage.

To call out the elephant in the room first: if the total supply is changed, instant whales will be created, HugPuddle being one of them. Full disclosure --> we control 500,000+ Datacoin (currently 1.76 percent of the money supply -- we've been mining on and off for years now). Note that our intentions are not riches, but giving people a tool to immutably store their history and creations.

If there is a reason to tinker with the money supply to protect the Datacoin network, then I can see the need for a change. But if the network is fine as-is, changing the code is a slippery slope, especially changing the money supply... it can damage the network and breed distrust about what developers might do next... make no mistake, this is a big deal. That said, if there is some fundamental flaw in the coin's generation that could harm the Datacoin network, it needs to be looked at. The numbers need to be crunched very carefully.

I'd like the tech-savvy people to weigh in on this: what's the best level of money supply, block times, and data fee to sustain and protect the network? That question needs to be the guiding light. Without a sustained and protected network, our vision of an environment for people to etch ideas/history/creations will not come about, nor will anyone's investment pay off. There should be enough generated in a day to meet user storage needs, but not so much that it could clog/spam up Datacoin and cripple the network. Note that this can be deterred by some kind of variable storage cost/miner fee and block times, not necessarily by the amount of coin...
Also note that Dogecoin's problem wasn't necessarily its high money supply; it was that it offered nothing unique to the ecosystem to make people stay. Plus, the very thing that grew Doge, the miners, became irrelevant with the Scrypt ASIC and the difficulty of getting next-generation Scrypt miners... GPU mining was key to their involvement. I think GPU mining is important, and again, I think our guiding light should be: what's the best level of money supply, block times, and data fee to sustain and protect the network over time? Just some thoughts - I'm loving all the energy around Datacoin.
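For concreteness, the holder-share arithmetic from the post above can be sketched. The figures (500,000+ DTC held, roughly 28 million mined, a hypothetical 42 million cap) come from the discussion; nothing here is a decided protocol parameter:

```python
def supply_share(holdings: float, total_supply: float) -> float:
    """Return a holder's percentage share of a given coin supply."""
    return 100.0 * holdings / total_supply

# Roughly 500,000 DTC held against ~28.4M coins mined so far
# gives the ~1.76% figure quoted in the post.
current_share = supply_share(500_000, 28_400_000)

# Under the hypothetical 42M hard cap, the same stash at full
# emission would still be over 1% of all coins that will ever exist.
capped_share = supply_share(500_000, 42_000_000)
```

This is the "instant whales" concern in numbers: lowering the cap doesn't change anyone's balance, but it permanently inflates every existing holder's fraction of the eventual supply.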
|
|
|
We are currently syncing on CryptoID! WELL DONE! V e r y n i c e !

Hi Team, Big thanks to MarcusDE for letting us know about CryptoID! (Also a thank you to fairglu for CryptoID itself; it appears to be a great resource!) Datacoin appears fully synced! https://chainz.cryptoid.info/dtc/

On that note, if anyone knows of any other helpful tools or resources that would benefit Datacoin, please bring them to the group, and don't hesitate to ask for assistance to make them happen. We at HugPuddle have some limited resources that we're more than willing to use to make things happen. We believe in decentralized entities (especially immutable decentralized archiving) bringing about a greater good for The People... Datacoin clearly fits this mission. Kind Regards, The AtomSea
|
|
|
Hi Team, We are currently syncing on CryptoID! CryptoID is a block explorer, and more, for development and curiosity needz. https://chainz.cryptoid.info/dtc/

Go Datacoin! Kind Regards, The AtomSea [ paid for one year by the not-for-profit archiving entity, HugPuddle - 1HuGpUDDLEhvehXE1P6xeudqAHqKfs1BFM ]
|
|
|
I just etched a high-resolution 50MB image on Datacoin testnet variable storage (DTC-TV) from my home in Long Beach, CA. The bitfossil realtime indexer picked it up in good ole Moorhead, MN and rebuilt it without issue. <3 http://bitfossil.com/42c76b00c883983adc48a21ed305ee5b0105aeb5844067c5db48cbc0385120b6/index.htm

This is now the largest file etched with http://Apertus.io that I am aware of. <3 The cost appears to be about 4.55 DTC per 90K, at a pretty consistent rate of around 250K per minute. It took 3 hrs and 25 minutes to complete. I was seeing a lot of unconfirmed transactions, but they appear to eventually clean themselves up. Thanks for your time.

That's awesome embii, this puts my DOS emulators and 20k game of Snake to shame. I have had a go at streaming copyright-free films using the TorrentTime plugin; I can etch and run the player without issue, but for some reason the player can't find any sources after I have added the magnet link. It's something I need to spend more time on.

whoa whoa whoooooa... hold on. "game of Snake"?!? On Datacoin? Where is this? And how do I play?!? Kinda exxcited, The AtomSea
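A quick back-of-the-envelope estimator based on the rates reported above (4.55 DTC per 90K and roughly 250K per minute). These are one-off observations from a single testnet etch, not protocol constants, so treat the outputs as ballpark figures only:

```python
def etch_estimate(file_bytes: int,
                  dtc_per_90k: float = 4.55,
                  bytes_per_minute: int = 250_000) -> tuple[float, float]:
    """Estimate the cost (in DTC) and time (in minutes) to etch a file,
    using the rough rates reported in the post above."""
    cost = dtc_per_90k * file_bytes / 90_000
    minutes = file_bytes / bytes_per_minute
    return cost, minutes

# The 50MB image: ~200 minutes, which lines up with the
# reported 3 hrs 25 minutes once confirmation lag is included.
cost, minutes = etch_estimate(50_000_000)
```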
|
|
|
If anyone needs Datacoin for development and/or experimental purposes, hit me up, I'll send you some.
Hi Everyone, Just a reminder: we have Datacoin available for free. We also have Datacoin Testnet coins. Just shoot me an addy! You can also get Datacoin at BTCPOP.CO. Kind Regards, The AtomSea
|
|
|
[APERTUS UPDATE]
Good news! HugPuddle.org is currently testing a fix for the bug that was preventing Apertus from rebuilding variable-storage etchings stored in a remote wallet. We will be testing its functionality soon by archiving a message in a wallet located in Moorhead, MN and rebuilding the message via transaction ID in a remote wallet located in Long Beach, CA. If it works as expected, we will release the fix in v0.3.12-beta and make it available for download at Apertus.io shortly.
PS: If you are experimenting with DTC-V (Datacoin Variable Storage), you may begin to see your etchings show up on the bitfossil.org real-time indexer later this evening as we deploy the fix.
Thanks for your time.
Hi Team, Tests successful: "One small step for a block... one giant leap for blockchain" http://bitfossil.org/8587d9f36e8d62202147921a8f0039a9bbcdd75abc11d2b797cf4ce307e8397c/index.htm

"Datacoin native field etch test on a non-HugPuddle network machine to see if a HugPuddle-networked machine picks up the Datacoin native field etching" http://bitfossil.org/2a9e4d8c76a5fed92e3f9c6de7aec98a273751cb9c0dd6b71d385cbe3cd9bd38/index.htm

Native Datacoin variable field in full usage by Apertus, and completely viewable on bitFossil.org. Onwards and upwards. The AtomSea
|
|
|
#Note this is a draft copy, improved version coming soon to Steemit
Datacoin - How to Etch an IBM PCx86 simulator to the Blockchain Pt 2 (Applications)
Windows Tutorial
Continuing from the previous guide, your simulator should be up and running; it's now time to add some software using the links feature included with Apertus. Links allow you to reuse previously etched content stored in the Apertus root index. The following tutorial will demonstrate how to add a copy of VisiCalc, a trailblazing spreadsheet program released for the Apple II in the late 1970s. More information on VisiCalc is available below. ...
Holy crap! Amazing work dtc2017!!! To say I'm exxcited by this is a huge understatement. We love that so many see what can be done with a properly running decentralized cypherpunk blockchain in conjunction with the right tools. It's extremely gratifying, with extra joyness. That said, we want to help the Datacoin ecosystem: We at HugPuddle are willing to pay for another blockchain explorer (send me the BTC address for the CryptoID payment if you cats want to roll with them). Also, we will pay a bounty to anyone who gets a second Datacoin mining pool going. In addition, let us know how else we can be of service to make the Datacoin ecosystem stronger. Decentralize all the Things! Kind Regards, The AtomSea
|
|
|
Hi Team!
More Mining Pools.
MarcusDE's dedication to Datacoin cannot be denied, considering he's kept a mining pool going for years despite the Datacoin usage doldrums that killed off everyone else... A massive ongoing thank-you to him.
That being said, it's key we get more.
I think I need not give a cypherpunk blockchain 101 decentralization lesson to make my point . . .
We all know.
Hope some can take up the noble cause!
Kind Regards, The AtomSea
|
|
|
I asked CryptoID about adding DTC; here's the response: "Hi, I am asking for a contribution to hosting costs of $109 for a year (or $59 for 6 months), via PayPal, BTC or LTC. Let me know if you are interested!" Thank you for the info MarcusDE. What does everyone think? Are these cats legit? HugPuddle will gladly pay for 6 months; perhaps by then we'll have so many Datacoin gurus making nifty tools that we won't need to pay.
|
|
|
Hi embii, Thanks a lot, especially for writing this summary for everyone. Consider getting your testnet network operational, as it can be used as a temporary data store for things you do not want to persist forever, like group chats.
Just to do a bit of double-checking, and to make certain we've all converged on the same testnet: is your testnet swarm at height 4521 presently, with a block hash at the tip of d61407a09506df6ef440abaa4d6392c20e30b72470e77bf40ae68e9008f302a6? Best Regards, -Chicago

Amazing news on the Datacoin Testnet... What peers are active on her? Can we get a list? I'll add them, open the port to be a full node, and put a CPU core on her to help move her along.

Hello AtomSea, Well, since I was just asking a bit ago to see if everybody had converged on the same testnet swarm, I'll just share what I know, and then hopefully if there are separate p2p swarms for testnet we'll get them all joined.
Here are the full node addresses I see. - 144.76.64.49
- 119.9.108.125
- [2401:1800:7800:104:be76:4eff:fe1c:5d6]
Best Regards, -Chicago

Datacoin Testnet nodes added -- hung at height 4521. setgenerate true 1. Testnet now live and active. funkadelic
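The tip check Chicago describes (same height, same block hash on every node) can be scripted against a node's JSON-RPC interface. A minimal sketch, assuming a standard bitcoind-style getblockhash method; the RPC URL, port, and credentials are placeholders you'd replace with your own datacoin.conf values:

```python
import json
import urllib.request

def get_block_hash(height: int, rpc_url: str, fetch=None) -> str:
    """Ask a bitcoind-style node for the block hash at `height`.
    `fetch(url, body)` may be injected for testing; by default it
    POSTs a JSON-RPC request to the node."""
    payload = json.dumps({"id": 1, "method": "getblockhash",
                          "params": [height]}).encode()
    if fetch is None:
        def fetch(url, body):
            req = urllib.request.Request(
                url, body, {"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return resp.read()
    return json.loads(fetch(rpc_url, payload))["result"]

def on_same_testnet(height: int, expected: str,
                    rpc_url: str, fetch=None) -> bool:
    """True if this node's hash at `height` matches the expected tip."""
    return get_block_hash(height, rpc_url, fetch) == expected
```

Usage would look like `on_same_testnet(4521, "d61407a0...", "http://user:pass@127.0.0.1:PORT")` against each peer; any node returning a different hash is on a separate swarm.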
|
|
|
If anyone needs Datacoin for development and/or experimental purposes, hit me up, I'll send you some.
|
|
|
Anyone else having trouble syncing? I'm running geth --rpc as usual, which starts fine, loading 256 blocks at a time quickly (every second or so), and I'm watching the \chaindata folder adding files with current timestamps, so all seems fine. Then after a few minutes the geth process stops printing new lines in the console and shows "Synchronisation failed: no peers to keep download active". I can see \chaindata is still adding new files, but after a few minutes more it also stops. So I close geth and restart, and the whole process repeats.
This used to work just fine -- even if I had a lot of catch-up to do -- I'd just let it run for a few hours and it would catch up and then increment by 1 block every few minutes to keep in sync. The same pattern has been happening for the past several days. I have 50/5 internet and a good wifi connection.
Anyone know what might be causing this?
I'm having very similar issues. I'm also looking for answers or a troubleshooting page for this. {Windows 7 machine, all updated. Fast HD, fast internet, strong processor}
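Until the root cause is found, the close-and-restart loop described above can be automated. A rough sketch, not a fix: the chaindata path, the 5-minute stall threshold, and the geth command line are all assumptions you'd adjust for your own setup:

```python
import os
import subprocess
import time

STALL_SECONDS = 300  # assumed threshold: restart if no writes for 5 min

def newest_mtime(path: str) -> float:
    """Most recent modification time of any file under `path`."""
    times = [os.path.getmtime(os.path.join(root, name))
             for root, _, files in os.walk(path) for name in files]
    return max(times, default=0.0)

def is_stalled(last_change: float, now: float,
               threshold: float = STALL_SECONDS) -> bool:
    """True once the folder has gone quiet for longer than `threshold`."""
    return now - last_change > threshold

def watchdog(chaindata: str, geth_cmd=("geth", "--rpc")):
    """Restart geth whenever the chaindata folder stops being written,
    mirroring the manual workaround described in the post."""
    proc = subprocess.Popen(geth_cmd)
    while True:
        time.sleep(60)
        if is_stalled(newest_mtime(chaindata), time.time()):
            proc.terminate()
            proc.wait()
            proc = subprocess.Popen(geth_cmd)
```

This only papers over the symptom; if the node genuinely has no peers, checking connectivity and peer configuration is the real fix.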
|
|
|
~ "check storj": Storj is a complementary service to Datacoin. Datacoin is immutable; Storj isn't. Example of a complementary setup: one could store a full file on Storj, then the hash on Datacoin. That way one could verify that the file on Storj is what it claims to be by using the unchangeable hash stored on Datacoin.

~ "Is it started by the original dev of DTC?": No; one can go through this thread and deduce with reasonable certainty that it wasn't...

-----

Exploration of known and unknown uses of blockchain tech is important. I feel it's wise to maintain quality blockchains for experimentation; Datacoin easily qualifies, and the more the better. That's why I'm putting hashing power into Datacoin, supporting full nodes, and offering Datacoin for development. DM me for some Datacoin if you wanna play around with it.
|
|
|
|