Bitcoin Forum
May 03, 2024, 11:13:10 AM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
  Show Posts
1041  Alternate cryptocurrencies / Altcoin Discussion / Re: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? on: April 12, 2016, 05:16:03 PM
@AlexGR I'm still puzzled at what the hell you are going off about. It seems almost entirely strawman to me.

Very few new applications are being written in C afaik (and I do have somewhat of an idea). It is only being done by a few fetishists such as jl777.

Technically you may be correct about new applications; I don't know. But what I'm seeing in Linux repositories is a flood of C/C++ programs. Python is probably close behind. I'm not really seeing much JavaScript.
1042  Alternate cryptocurrencies / Altcoin Discussion / Re: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? on: April 12, 2016, 04:45:58 AM
AlexGR, it is possible that what you were trying to say is that C is too low-level to reduce higher-level, semantics-driven bloat, and that C doesn't produce the most optimal assembly code, so one might as well use assembly for the low-level work and a higher-level language otherwise.

In that case, I would still disagree that we should use assembly in every case where we need lower-level expression, rather than a low-level language that sits above assembly. Every situation is different, and we should use the semantics that best fit the scenario. I would agree with using higher-level languages for scenarios where higher-level semantics are the main priority.

I would also agree that the higher-level languages we have today do lead to bloat because they are all lacking what I have explained upthread. That is why I am and have been for the past 6 years contemplating designing my own programming language. I also would like to unify low-level and higher-level capabilities in the same language. Rust has advanced the art, but is not entirely what I want.

What I'm trying to say is something to that effect: We have to differentiate between
a) C as a language in terms of what we write as source (the input so to speak)
and
b) the efficiency of binaries that come out from the compiler.

The (b) component changes throughout time and is not constant (I'm talking about the ratio of performance efficiency versus the hardware available at each given moment).

When C was created back in the 1970s, every byte and every clock cycle were extremely important. If you wanted to write an operating system, you wouldn't do it in a new language if it was n times slower (in binary execution) than writing it in assembly. So the implementation, in order to stand on merit, *HAD* to compile to efficient code. It was a requirement, because inefficiency was simply not an option. 1970s processing power was minimal; you couldn't waste it.

Fast forward 45 years. We have similar syntax and such in C, but we are now leaving a lot of performance on the table. We are practically unable to produce speed-critical programs (binaries) from C alone. A hello-world program might be similar in speed to a 1980s hello world, but that's not the issue anymore... Processors now have multiple cores, hyperthreading, multiple math coprocessors, multiple SIMD units, etc. Are we properly taking advantage of these? No.

Imagine trying to create a new coin with a new hash or hash variant written in C. You can be almost certain that someone else is getting at least 2-5x your speed through asm optimizations, and you are thus exposed to various economic and mining attack vectors. Why? Because the compilers can't produce efficient vectorized code that properly uses the embedded SIMD instruction sets. Someone has to do it by hand. That's a failure right there.

I'm not really proposing to go assembly all the way. No. That's not an option in our era. What we need is far better compilers but I don't think we are going to get them - at least not from human programmers.

So I'm actually saying the opposite: C (which depends on its compilers in order to be "fast") is now fucked up by compilers that are inefficient at properly exploiting modern hardware (as opposed to the hardware of 3-4 decades ago, which they handled well), and it is also "abrasive" in its user-facing syntax and the mentality of its constructs. So perhaps using friendlier, simpler, more functional languages, while making speed concessions, may not be such a disaster, especially for non-speed-critical apps.
1043  Alternate cryptocurrencies / Altcoin Discussion / Re: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? on: April 11, 2016, 04:16:44 PM
That is the propaganda but it isn't true. There are many scenarios where optimization is still critical, e.g. block chain data structures, JIT compilation so that web pages (or in my goal Apps) can start immediately upon download of device-independent code, cryptography algorithms such as the C on my github.

It's not propaganda. It's, let's say, the mainstream practice. Yes, there are things which are constantly getting optimized more and more, but every time I see serious optimization needed, it all has to go down to asm. Why? Because C, as source, is just a text file. The actual power of the language is the compiler. And the compilers we have suck badly - even the corporate ones like Intel's icc, which is better at vectorizing, and whose generated code is often worth cribbing from, but still.

Quote
We almost never should ignore optimization for mobile Apps because otherwise it consumes the battery faster.

True. But sometimes the problem is at the kernel level. If you do a make menuconfig for the kernel and go through the options one by one, you realize a very problematic truth: one piece of software may be deciding when to preempt, another routine may be evaluating cache usage and cache flushing, another may be evaluating idleness, another may be checking for untrusted code execution, another may be checking for buffer overflows in order to reboot as a precaution, another is checking whether there is stuff to debug, another is logging everything, another is doing real-time performance analysis, etc. It's as if, for every process doing work, there are 20 processes watching over the one that actually does something - each promising either "low overhead" or a "performance improvement". Combined, they create a bloated feel.

Quote
There are still people in this world who can't afford 32 GB of RAM and if we forsake optimization, then a power user such as myself with 200 tabs open on my browser will need 256 GB of RAM. A typical mobile phone only has 1 GB.

I have to reboot my Linux box daily because I only have 16 GB of RAM and the garbage collector of the browser totally freezes my machine.


I have 4 GB on my desktop and 1 GB on my laptop, which runs a Pentium-M at 2.1 GHz, single core (over 10 years old).

"Amazingly", running a 2012 32-bit PC Linux, the 1 GB laptop can run stuff like Chromium, OpenOffice, torrents, etc., all at the same time and without much problem.

My quad-core 4 GB / 64-bit desktop is so bloated that it's not even funny. And I also run zram to compress RAM in order to avoid swapping out to the SSD. In any case, thinking about this anomaly, I tried 32-bit browser packages on the 64-bit desktop... RAM usage was suddenly much better, and I was doing the exact same things Roll Eyes

Quote
I am thinking you are just spouting off without actually knowing the technical facts. The reason is ostensibly because for example Javacript forces the use of garbage collection, which is an example of algorithm which is not as accurate and performant as expert human designed memory deallocation. And because memory allocation is a hard problem, which is ostensibly why Mozilla funded the creation of Rust with its new statically compiled memory deallocation model to aid the human with compiler enforced rules.

1) Let's just say that even the packages of the distribution are garbage compared to the Mozilla builds (PGO-optimized). Download a build from Mozilla and then download a precompiled binary from your distro. Run a JavaScript benchmark and tell me the results. Same C language, very different results for the user, depending on where he actually got his binary.

2) Even loading a single blank page, Mozilla wants >200 MB.

3) Loading 5 tabs with 5 different sites, where none is running any JS (NoScript activated) and there is no Flash involved, I go to ~400 MB. This is bullshit. I doubt their problems lie in the language.

4) Seriously, wtf are you loading that you need a reboot every day with 16 GB? Roll Eyes


Man, haven't you noticed that the capabilities of web pages have increased? Back then you had no DHTML, thus nearly no JavaScript or Flash on the page.

With firefox I don't have flash, I have ipc container off, I block javascript (and the pages look like half-downloaded shit) and still the bloat is there.

Quote
Above you argued that bloat doesn't matter and now you argue it does.   Roll Eyes

My argument is that using C hasn't done wonders for speed, due to the existence of bloat and bad practices.

We are immersed in junk software that abuses hardware resources, and we are doing so with software written in languages that are supposedly very efficient and fast. I believe we can allow people to write software that is slower if the languages are much simpler. Microsoft took a step in that direction with Visual Basic in the 90s... I used it for my first GUI programs... an interesting experience. And I'd bet that the programs I made back then are much less bloated than today's junk, even junk written in C.

Quote
Agreed you do not appear to have the natural inclination for C. But C is not bullshit. It was a very elegant portable abstraction of assembly that radically increased productivity over assembly.

It was useful for writing a Unix back in 1970. It may not be useful today for coding everything.

Quote
Your talent probably lies else where. I don't know why you try to force on yourself something that your mind is not structured to do well.

I don't have the same issue with the structures of basic or pascal, so clearly it's not an issue of talent. More like a c-oriented or c-non-oriented way of thinking.

Quote
I strongly suggest you stop blaming your handicaps and talents on others:

Nice article. I can relate to what he's writing. However what I'm saying here is different. For example eating with chopsticks is not a talent/dexterity I'm envious of. I prefer forks because I consider them superior and I would not spend any of my time to learn eating with chopsticks. It's efficiency-oriented thinking. Coding C is ...chopsticks. We need forks. I can't say it any more simply than that.

You are not an expert programmer, ostensibly never will be, and should not be commenting on the future of programming languages. Period. Sorry. Programming will never be reduced to art form for people who hate abstractions, details, and complexity.

My problem is with unnecessary complexity. In any case, the problem is not to dumb down programming languages. I explicitly said that you can let the language be tunable to anything the programmer designs but you can also make it work out of the box with concessions in terms of speed.

In a way, even C is acting that way because where it is slow you have to go down to asm. But that's 2 layers of complexity instead of 1 layer of simplicity and 1 of elevated complexity.

Quote
Sorry never! You and Ray Kurzweil are wrong and always will be. Sorry, but frankly. I don't care if you disagree, because you don't have the capacity to understand.

Kurzweil's problem is not his "small" opinions on subject A or B. It's his overall problematic vision of what humanity should be, or, to put it better, what humanity should turn into (transhumanism / human+machine integration). This is fucked up.
1044  Alternate cryptocurrencies / Altcoin Discussion / Re: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? on: April 11, 2016, 06:24:33 AM
Stuff like integers overflowing, variables not having the precision they need, programs crashing due to idiotic stuff like divisions by zero, having buffers overflow etc etc - things like that are ridiculous and shouldn't even exist.

Impossible, unless you want to forsake performance and degrees of freedom. There is a tension here: pushing toward unlimited safety drives both performance and degrees of freedom toward zero.

Sorry, some of the details of programming that you wish would disappear, can't.

The time when programmers tried to save every possible byte to increase performance is long gone, because bytes and clock cycles aren't that scarce anymore. So we are in the age of routinely forsaking performance just because we can, and we do so despite using theoretically very efficient languages. The bloat even cancels out hardware increases - and abuses the hardware more than previous generations of software did.

You see programs like, say, Firefox, Chrome, etc., that should supposedly be very efficient as they are written in C (or C++), right? Yet they are bloated like hell, consuming gigabytes of RAM for the lolz. And while some features, like sandboxing processes so that the browser won't crash, do justify their RAM cost, the rest of the functionality doesn't make any sense in terms of wasted resources. I remember in the Win2k days, I could open 40-50 Netscape windows without any issue whatsoever.

Same bloat situation for my Linux desktop (KDE Plasma)... it's a piece of bloated junk. KDE 4 was far better. Same for Windows... Win7 wants 1 GB of RAM just to run. 1 GB? For what? The OS? Are they even thinking when they write the specs?

Same for our cryptocurrency wallets. Slow as hell and very resource-hungry.

In any case, if better languages don't appear, the code that should be written won't be written, because there will be a lack of developers to write it. Applications that are never written don't run fast or slow... they don't exist at all!

Now that's a prime corporate problem. "We need able coders to develop X, Y, Z". Yeah well, there are only so many of them and the corporations won't be able to hire as many as they want. So...

In the 80s they thought that if they trained more coders they'd solve the problem. In the 90s they discovered that coders are born, not educated - which was very counterintuitive. How can you have a class of 100 people with 10 good coders, and then a class of 1000 people with 20 good coders instead of 100? Why don't they increase linearly? Why does the number of good coders seem related to some kind of "talent", like... music? Well, that's the million-dollar question, isn't it?

I searched for that answer myself. I'm of the opinion that knowledge is teachable, so it didn't make any sense. My mind couldn't grasp why I was rejecting C when I could write asm. Well, after some research I found the answer. The whole structure of C, which was somehow elevated to the most popular programming language for applications, originates from a certain mindset its creators had. There are minds that think like those who made it, and minds that don't - minds that reject this structure as bullshit.

The minds that reject it may have issues with what they perceive as unnecessary complexity, the ordering of things, counterintuitive syntax, etc. In my case, I could read a line of asm and know what it did (like moving a value to a register or calling an IRQ), then read a line of C and have three question marks over what that fuckin line did. I've also reached the conclusion that it is impossible to be aligned with C-thinking (preferring it as a native environment) without having anomalies in the way one thinks in life in general. It's similar to the autism drawback of some super-intelligent people, though much milder and not necessarily at the level of autism. The inability to accept the rules of the language as they are - to just get on with it without the internal mind-chatter of "what the fuck is this shit" - is, in a way, the root cause of why so few people globally use it at any competent level. If more people had rejected it, we'd probably have something far better by now, with more popular adoption.

There are so many languages springing up, but they are all trying to be the next C instead of being unique. Sure, the programmer's convenience in transitioning is a benefit, but it should be sacrificed in favor of much friendlier languages.

While all this is nice in theory, the problem will eventually be solved by AI, not language authors. It will ask what we want in terms of execution, and we'll get it automatically from the AI programming bot, which will then send the best possible code for execution.

AI-Generic interface: "What do you want today Alex?"
Alex: "I want to find the correlation data of soil composition, soil appearance, minerals and vegetation, by combining every known pattern recognition, in the data set I will be uploading to you in a while".
AI-Generic interface: "OOOOOOK, coming right up" (coding it from the specs given)
AI-Generic interface: "You are ready... do you want me to run the program with the data in the flash drive you just connected?"
Alex: "Sure".

This means that the only thing we need to program is a self-learning AI. Once this is done, it will be able to do everything that a coder can do, only better and faster. It will be able to do what an if-then-else idiotic compiler does today, but far better in terms of optimizations and hardware exploitation. Most of the optimizations that aren't done are skipped because the if-then-else compiler doesn't recognize the logic behind the program, so it can't tell whether an optimization would be safe. But if programmer and compiler are the same "being", then things can start ...flying in terms of efficiency.

This got futuristic very fast, but these ideas aren't so futuristic anymore. They might have been 10 years ago, but now it's getting increasingly close.
1045  Alternate cryptocurrencies / Altcoin Discussion / Re: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? on: April 11, 2016, 03:00:04 AM
Personally, I'm absolutely amazed that such a weird language as C, which was intended for writing ...OPERATING SYSTEMS, is used to this day to write ...applications.

Are you a programmer AlexGR? I understood from the other thread discussion we had that you mostly aren't, but not sure.

It depends on the definition. I do not consider myself one but I've written some simple programs, mainly for my own use, in the past.

I still occasionally code things I need. For example, I recently started (an attempt) to implement some neural network training to find patterns in SHA256, to see if it might be feasible to create a counterbalance to ASIC mining by pooling CPU and GPU resources in a distributed-processing neural network with tens of thousands of nodes (the equivalent of a mining pool). Anyway, while I think the idea has merit, I didn't get very far. Now I'm contemplating whether neural networks can find weaknesses in hashing functions like CryptoNight, X11, etc., and leverage this to shortcut the hashing. In theory all hashing should be pretty (pseudo)random, but since there is no such thing as true random, it's just a question of finding the pattern... and where "bad crypto" is involved, it could be far easier to do so.

Anyway, these are a mess in terms of complexity (for my skills) and I'm in way over my head due to my multifaceted ignorance. My background wasn't much to begin with: I started with ZX Spectrum BASIC back in the 80s, then did some BASIC and Pascal for the PC, then some assembly... and then the Internet came and I kind of lost interest in "singular" programming. In the "offline" days it was easy to just stick to a screen, with not much distraction, and "burn" yourself to exhaustion, staring at the screen for 15 hours to write a program. In the post-Internet era that was impossible... ICQ, MSN, forums - all these gradually reduced my attention span and my will to focus.

I always hated C with a passion, while Pascal was much better aligned with how I think. It helped enormously that Turbo Pascal had a great IDE for DOS, a nice F1 help reference, etc. Now I use Free Pascal on Linux, which is similar, but not for anything serious. As for C, I try to understand what programs do so that I can change a few lines to make them work differently.

In theory, both languages (or most languages?) do the same job... you call the library you want, use the part you want as the "command" and voila. Where they may differ is code execution speed and other more subtle things. Library support is obviously not the same, although I think fpc supports some c libs as well.

Still, despite my preference for the way pascal was designed as a more human-friendly language, it falls waaaaaaaaaay short of what I think a language should actually be.

Syntax should be very simple and understandable - kind of like pseudocode. Stuff like integers overflowing, variables not having the precision they need, programs crashing due to idiotic stuff like divisions by zero, having buffers overflow etc etc - things like that are ridiculous and shouldn't even exist.

By default, a language should allow the maximum sizes a user could fit in vars and consts, but also let the user fine-tune them if he wants to increase speed (although a good compiler should be able to tell which sizes can be downsized safely and do so by itself anyway). Compilers should be infinitely better at taking PROPER advantage of our hardware. How can we be over 15 years past SSE2 and still not have it properly used? You write a password cracker and you have to manually insert ...SSE intrinsics. I mean, wtf? Is this for real? It's not an instruction set that was deployed last month - it's been there for ages. Same for SSE3/SSSE3/SSE4.1/4.2, which are 8 or more years old and go unused. How a language can pretend to be "fast" while throwing away SIMD acceleration is beyond me. But that's not the fault of the language (=> the compiler is at fault). Taking advantage of GPU resources should also be a more automated process by now.

Quote
I think, though I'm not sure, that most C coding these days (which still seems to be quite popular) is indeed system programming, firmware for devices, performance-critical callouts from other languages (Python for example), etc. But I have few if any sources for that, and it may be completely wrong.

OSes are definitely C, but the majority of OS apps are also C, especially on Linux.
1046  Alternate cryptocurrencies / Altcoin Discussion / Re: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? on: April 11, 2016, 02:06:46 AM
Personally, I'm absolutely amazed that such a weird language as C, which was intended for writing ...OPERATING SYSTEMS, is used to this day to write ...applications.

There's too much code that needs to be written, but there won't be enough programmers to write it with all those piece-of-shit languages that turn people away from coding. There needs to be a breakthrough in what a language is - not the same old, same old with slightly different syntax and features.
1047  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 01:53:12 AM
Legal and illegal are extremely relative terms, depending on one's geographical coordinates.

Bitcoin may be legal somewhere, and somewhere else it may not be.

Mining coins may be legal somewhere, illegal somewhere else, taxable or non-taxable.

Coins could range from legally irrelevant, to currency (legal or illegal), to commodity or something else.

Encryption can also be very relative in terms of its legal status.
1048  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 01:21:03 AM
I see, so if you benefit from a scam, it's not a scam.

In game theory there are zero and non-zero sum games.

Ripping people off = zero-sum-game. Someone wins at the expense of another. There are hundreds of coins that fall under this pattern.

Giving value to a coin that begins its life as worthless, and further adding functionality (and by extension value) = non-zero-sum-game, as everyone benefits from increased functionality, network effect and value.

Does this mean that there are no uncertainties? Nope. Crypto remains a high-risk / high-reward type of investment, whether we are talking bitcoin or altcoins (even bigger uncertainties and risks).
1049  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 12:53:18 AM

The victims you require are every miner that was scammed. Do you get it yet?

Thanks for confirming that.

Now we know whose noses are out of joint and who's making all the noise  Wink

I just bought mine in the market, where they were available in abundance. Maybe you should have too, if you had valued them that much.

#fauxOutrage


NO i will not ever buy and support scams. They should be crushed not bought into.

Oh really, thought it was monero only.

Now it's people (miners that were scammed). Sorry, but anyone is able to complain if they choose.

Viewing the FACTS upthread, the entire board should always be kept informed about darkcoin's past.

Dash will serve as an example so others are not Dashed in the future.

2 year old thread: https://bitcointalk.org/index.php?topic=560138.0 "darkcoin scam scam scam"

Price back then 0.001.
Price now 0.016.

I guess people that "avoided" the "scam", weren't "ripped off" by 16x gains. Good for them.
1050  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 12:44:16 AM
We may not know what the value of the scam is (?), we may not be able to tell how people were scammed, we may not know of any victims, but heck, if we repeat it is a scam a million times, it must be. Surely.
1051  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 12:31:54 AM
From $6M to $13M in 5 hours - Edufield did some lucrative work here!

Two years ago:

"https://bitcointalk.org/index.php?topic=560138.msg6107610#msg6107610"

"Blatant Scam Darkcoin - Instamine 2 millions drk = 1.2 million USD !!!"

Now it's 6m - 13m?

Wow... Next year it could be 100mn Grin
1052  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 12:26:44 AM
You can't keep your story straight.

First you say it was a "problematic launch" (your words).

Now you claim that statements being made about a "fair and transparent" launch are helpful disclosure.

/facepalm.

=>

Smooth, what you are saying is not a "scam", it's a problematic launch. A scam requires VICTIMS.

/doublefacepalm.
1053  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 11, 2016, 12:16:49 AM
The Dash insiders are well aware that investors still care about the instamine

This issue has been settled with finality by the community over 2 years ago: https://bitcointalk.org/index.php?topic=559932.0

Apparently not, because if it was settled with finality, the Dash community would still not be attempting to cover it up by editing the OP and putting out new Official Statements, both done much less than 2 years ago. These are actions of the Dash community, not my actions. I'm just reporting them.

If dash doesn't have the relevant info on their pages => OH NO IT'S A SCAM
If dash has the relevant info on their pages => OHHHH NOOO IT'S A SCAM

You can't win, ever.

In another thread some of you were complaining about lack of transparency, now when some info is updated => "ohhhh noooo".

Ffs, grow up.
1054  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 11:25:25 PM
The Dash insiders are well aware that investors still care about the instamine

This issue has been settled with finality by the community over 2 years ago: https://bitcointalk.org/index.php?topic=559932.0

The only people "caring" are Monero trolls. And that's a fact.

Why is it that everywhere you look a "dash scam" thread, there is a Monero troll behind it? Competitive reasons.
1055  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 11:15:32 PM
Smooth, what you are saying is not a "scam", it's a problematic launch.

Does it describe a "fair and transparent" launch to you?

Was bitcoin "Fair and transparent"? I never heard about it.

Was it "fair" that Satoshi was solomining for a year?

Ah well, who cares by now.
1056  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 11:13:52 PM
Does the Official Statement about the Instamine claim that Litecoin's difficulty adjustment is responsible for the extra coins, when in fact that is not the case for most of the extra coins?

Can you explain why Litecoin instamined half a million coins?

Because the slow Litecoin difficulty adjustment causes a small instamine.

Only if the hashrate is large Wink

And remember, XCO's hashrate was large, but it was also on a new and different algorithm.
1057  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 09:52:24 PM
Smooth, what you are saying is not a "scam", it's a problematic launch. A scam requires VICTIMS.

Let's say you missed the launch for any of the reasons you mention.

The scam is where exactly?

In that those who missed the launch didn't get to profit to the same degree? The coins were WORTHLESS. It's not like acquiring them by buying them would be expensive, as happens with ICOs.

You see, even that rationale is bogus. 10k XCOs were sold for 0.25 BTC (current valuation ~150 BTC).

The price was stable at 0.000025/xco or drk for the next 2+ weeks even if you missed the launch.

And even if you missed the first month or two, you could buy at 0.001.

It's at 0.016-17 right now.

Does the Official Statement about the Instamine claim that Litecoin's difficulty adjustment is responsible for the extra coins, when in fact that is not the case for most of the extra coins?

Can you explain why Litecoin instamined half a million coins?
1058  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 09:42:21 PM
How many BTC were mined in the first 24 hours?

How many XMR were mined in the first 24 hours?

How many dash were mined in the first 24 hours?

Is this the only thing you are interested in? A blockchain explorer can tell you that; you don't need me, or anyone else, to tell you the number of coins.

I'm just amazed that a dozen or so dashers can't answer a simple question. Will you melt or something? Why is it such a big deal to answer?

The answer is ~2 million for XCO. I haven't looked into XMR or BTC, but I suspect BTC would be close to 7,200.
1059  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 09:36:07 PM
a) Your "facts" are conjecture and presented as evidence.

No, my facts are what was stated by whom and when, along with things like what block rewards were generated when. Specific sources are given, and it is all (or nearly all) fully verifiable and objective.

It's bullshit, that's what it is.

Let me give you an analogy of what you are doing:

Let's say you accuse Evan for example.

I'll play "Thomas the unbeliever", aka Smooth, with the roles reversed:

Please tell me the precise number of coins he instamined, or GTFO.
Please tell me the precise number of coins his associates instamined, or GTFO.
Please tell me the precise number of coins they dumped and when, or GTFO.
Please tell me the precise number of coins they bought and when, or GTFO.
Please tell me the precise number of coins they have right now, or GTFO.
Please tell me the addresses of where they hold these coins, or GTFO.

I could be asking things in this fashion all day long, putting the burden of proof on your shoulders, like:

Please PROVE to me, that the instamine redistribution never happened and that all coins that were instamined remained with their holders.
Please PROVE to me the number of initial miners on the first day and who got what.
Please PROVE to me that these people are currently the masternode owners of ~4000 MNs.
Please PROVE to me that all the people who were buying and selling XCOs and DRKs at the start, whether by PMs, in ccex or poloniex, were "sockpuppets".

(the list could be extremely extensive and you'd come up empty handed every single time)

Quote
I haven't seen any such facts. All I have seen is conjecture as to motives, unobservable and/or unverifiable actions, etc.

The bar I was talking about earlier.

Quote
I've seen no such evidence that is verifiable proof of anything useful.

Of course you haven't.
1060  Alternate cryptocurrencies / Altcoin Discussion / Re: Why the darkcoin/dash instamine matters on: April 10, 2016, 09:27:54 PM
How many BTC were mined in the first 24 hours?

How many XMR were mined in the first 24 hours?

How many dash were mined in the first 24 hours?

Is this the only thing you are interested in? A blockchain explorer can tell you that; you don't need me, or anyone else, to tell you the number of coins.