Thx hashappliance. On the chart... I think people are putting total watts used for Total Dissipated Power. TDP is a % of total power used, not total watts used. We need a column that says total power used in watts. Thx. EDIT: Ok I see... you're going by the manufacturer's TDP. In Linux it's a WATTS cap, not a percentage, when using nvidia-smi to control power, mate. That is all NVIDIA's fault for allowing two different standards to take hold, where the Windows percentage standard is completely wrong. The total WATTage used should ALWAYS be the common method of measure, as it is true and precise, whereas the percentage way will ALWAYS be a misleading method of measurement, because the starting point differs from card to card. The applications in Windows also use this method and made it standard, which leads me to believe that these Windows developers have a knack for screwing things up even at the most modest, tiny levels, like measurement standards. Always a non-standard, when they could just be simple and accurate. Either way, TDP is quite an accurate basis for comparison IF you have the ACTUAL figure, and not SOME percentage over SOME starting point that SOME card started from. #crysx Yes, I see and agree. The GPU-Z app should change its TDP % to actual wattage used too. That would be nice.
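The percentage-vs-watts confusion above boils down to a one-line conversion: a Windows-style "TDP %" reading only tells you the actual draw if you also know the card's rated board power (the watt figure a Linux `nvidia-smi` power cap uses directly). A minimal sketch, with hypothetical numbers:

```python
# Convert a Windows-style "TDP %" reading into actual watts.
# board_power_w is the card's rated board power from the spec sheet
# (hypothetical example values below, not from any post in this thread).

def tdp_percent_to_watts(board_power_w: float, tdp_percent: float) -> float:
    """Return the actual power draw implied by a percentage reading."""
    return board_power_w * tdp_percent / 100.0

# A card rated at 180 W reading "75% TDP" is drawing 135 W:
print(tdp_percent_to_watts(180, 75))  # 135.0
```

This is exactly why the percentage method is misleading across cards: "75%" means 135 W on a 180 W card but 187.5 W on a 250 W card.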
|
|
|
My 570 4GB cards are better than my 570 8GB ones, is that normal? Same core clock but higher memory frequency on the 8GB. 2.3-2.4 MH/s on the 4GB, 1.9-2.1 MH/s on the 8GB. 256x0 on both. Tried 384x0 on the 8GB, very little change in hashrate.
EDIT: well, it seems I have big differences between my different x70 4GB cards, and my first test was on my best ones...
Your mining speed is so fast! Can you tell me what algorithm you are mining? The Scrypt algo.
|
|
|
Did you try 512x0, 1024x0, or 512x512?
|
|
|
Did this coin just jump 1000% on YoBit, or am I imagining it?
YoBit does not really have those coins... I mean, they keep dead coins with no blockchain... It is funny to see unsuccessful projects I was part of rise on YoBit even without a blockchain! Is its blockchain active? Can it be transferred? No and no. Exactly because of those old coins YoBit holds, I don't consider them legit, because there are no chains. Dead. Is anyone working on the original XDECoin? Thx
|
|
|
I see Chaincoin is still hanging on.
|
|
|
|
|
|
Anyone mining with a GTX 1050? Please, what hashrate do you get? Thx.
|
|
|
Yes, the project has protection from asics.
But not from FPGAs, which are basically the same thing as ASICs, since big farms have FPGA cards and so do private miners. Changing the algo to something like ProgPoW would be the only solution IMO. ProgPoW, like Ethash, also runs on FPGAs... The only real solution is algo changes on a regular basis. Yes, exactly ocminer.
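The "algo changes on a regular basis" idea above can be sketched as a rotation keyed to block height. This is purely illustrative: the algorithm names and rotation interval below are hypothetical, not from any real coin's spec.

```python
# Illustrative sketch: rotate the PoW algorithm on a fixed block-height
# schedule, so fixed-function hardware targeting one algo goes stale.

ALGOS = ["x22i", "x16r", "progpow"]   # hypothetical rotation set
ROTATION_INTERVAL = 100_000           # hypothetical blocks between switches

def algo_for_height(height: int) -> str:
    """Pick the active algorithm for a given block height."""
    return ALGOS[(height // ROTATION_INTERVAL) % len(ALGOS)]

print(algo_for_height(0))        # x22i
print(algo_for_height(150_000))  # x16r
print(algo_for_height(250_000))  # progpow
```

In practice every node and miner has to agree on the schedule, which is why such switches are usually hard-coded consensus rules rather than runtime options.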
|
|
|
Is Raven still resistant to ASICs and FPGAs?
As long as the hashrate growth just tracks the price growth, then it's definitely not ASICs. Exactly my opinion too.
|
|
|
What is this voting going on plz? EDIT:
|
|
|
Voted
|
|
|
We really need to see the hashes per watt too.
|
|
|
When will it be listed on CoinMarketCap?
Ask the dev. I'd like to see it there too.
|
|
|
Thank you for adding the x22i algo. I want to share my hashrates using the T-Rex miner:
Nvidia GTX 1070 - 6.4 MH/s (Gigabyte GTX 1070 OC)
Nvidia GTX 1070 Ti - 7.5 MH/s (Gigabyte GTX 1070 Ti OC)
Nvidia GTX 1080 Ti - 11.5 MH/s (MSI GTX 1080 Ti)
+100 MHz core OC only.
How many kH/W? Here's my 750 Ti's hashrate, core at 1.345 GHz and mem at 1.620 GHz:
20181027 15:10:46 x22i block 21744, diff 17162.719
20181027 15:10:45 WARN: GPU #0: Gigabyte GeForce GTX 750 Ti, intensity set to 15
20181027 15:10:45 WARN: GPU #1: Gigabyte GeForce GTX 750 Ti, intensity set to 18
20181027 15:13:06 GPU #0: Gigabyte GTX 750 Ti - 1518.29 kH/s, [T:52C, P:31W, F:56%, E:51kH/W]
20181027 15:13:06 GPU #1: Gigabyte GTX 750 Ti - 1619.46 kH/s, [T:50C, P:29W, F:39%, E:52kH/W]
EDIT: I set my memory clocks back to stock... OC'ing the memory didn't make much difference.
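Log lines like the ones above can be parsed to pull out hashrate, power, and efficiency per GPU. This is a sketch only; the regex assumes the line format shown in the sample output, not any documented miner log spec.

```python
import re

# Extract hashrate (kH/s), temperature, and power (W) from a miner log
# line, then compute kH/W. Format assumed from the sample lines above.
LINE_RE = re.compile(
    r"GPU #(?P<gpu>\d+): .*? - (?P<khs>[\d.]+) kH/s, "
    r"\[T:(?P<temp>\d+)C, P:(?P<watts>\d+)W"
)

def parse_line(line: str):
    """Return a dict of per-GPU stats, or None if the line doesn't match."""
    m = LINE_RE.search(line)
    if not m:
        return None
    khs = float(m.group("khs"))
    watts = int(m.group("watts"))
    return {"gpu": int(m.group("gpu")), "khs": khs,
            "watts": watts, "kh_per_w": khs / watts}

sample = ("20181027 15:13:06 GPU #0: Gigabyte GTX 750 Ti - "
          "1518.29 kH/s, [T:52C, P:31W, F:56%, E:51kH/W]")
print(parse_line(sample))
```

Note that 1518.29 kH/s at 31 W works out to about 49 kH/W, slightly below the 51 kH/W the miner itself reports; the miner may average power over a different window.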
|
|
|
What about the mining algo?
|
|
|
I agree that there probably won't be an official wallet made by the ETN team, but I remember there was an unofficial GUI wallet made by a community member. The link was somewhere in this thread or the earlier one (first ANN); I can't find it right now, will try to dig it up later.
Thx.. appreciate that. Damn, search on bitcointalk is so broken I gave up, but I found a reddit post (probably from the same user I was looking for here) and you can find the wallet releases on his GitHub page - https://github.com/XzenTorXz/ElectroneumGUIWallet/releases . Since I wasn't using it myself I'm not sure how stable it is, but you can test it. Thx sud. EDIT: That GUI wallet is broken.
|
|
|
|