Bitcoin Forum
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 [17] 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 »
Author Topic: Wolf's XMR/BCN/DSH CPUMiner - 2x speed compared to LucasJones' - NEW 06/20/2014  (Read 398624 times)
Wolf0
Legendary
*
Offline

Activity: 1694


Miner Developer


View Profile
June 21, 2014, 02:54:39 PM
 #321

Any updated miners? GPUs now seem to have a massive advantage over CPUs on Monero.

GPUs don't have a massive advantage yet - and when they do, there will be nothing I can do about it, seeing as GPUs are just better at this sort of thing.

Code:
Donations: BTC: 1WoLFdwcfNEg64fTYsX1P25KUzzSjtEZC -- XMR: 45SLUTzk7UXYHmzJ7bFN6FPfzTusdUVAZjPRgmEDw7G3SeimWM2kCdnDQXwDBYGUWaBtZNgjYtEYA22aMQT4t8KfU3vHLHG
azhago
Full Member
***
Offline

Activity: 182


View Profile
June 21, 2014, 07:58:36 PM
 #322

Any updated miners? GPUs now seem to have a massive advantage over CPUs on Monero.

Massive advantage how? Claymore reports 600 H/s on a 290X, which is about double what you get on a fast desktop CPU, but it also costs about twice as much. I see no major advantage to GPUs at this point; they are just another reasonable option.


A 280X does ~400 H/s.
If you are a miner, you surely have one or more rigs, with several GPUs in each.
Having multiple CPUs is a bit more complicated/expensive.

My rig with 5 280Xs performs at ~2300 H/s.
My dual Xeon 2687W does ~740 H/s with Wolf's CPU miner and costs a lot more than the rig.

smooth
Legendary
*
Online

Activity: 1540



View Profile
June 22, 2014, 01:09:37 AM
 #323

My rig with 5 280Xs performs at ~2300 H/s.
My dual Xeon 2687W does ~740 H/s with Wolf's CPU miner and costs a lot more than the rig.

DP Xeons are really, really expensive compared to desktop CPUs. I don't know the most efficient recipe for a CPU mining rig, but that's certainly not it. Also consider power usage: the most power-hungry CPUs are 130 W, and most are less, while every high-end GPU draws more.
azhago
Full Member
***
Offline

Activity: 182


View Profile
June 22, 2014, 08:11:07 AM
 #324

If you consider CPU mining, you should consider the whole PC's consumption, not just the CPU.
Making a "traditional" desktop computer with a 4770K will cost more than a GPU.

Agamemnus
Member
**
Offline

Activity: 72



View Profile
June 22, 2014, 08:49:10 AM
 #325

If you consider CPU mining, you should consider the whole PC's consumption, not just the CPU.
Making a "traditional" desktop computer with a 4770K will cost more than a GPU.

You're thinking from a single-minded perspective. You are actually seeing the INTENTIONAL limitation of this algorithm.

My kids have a 2500K each, and they get 110 H/s with the CPU at 50% the whole time they're on it. They use this miner on Windows. Measured AT THE WALL, power consumption goes up by 30 W when the miner starts if the PC was idle; with Hearthstone running in windowed mode, it goes up by only 20 W with the miner.

So effectively, the regular crappy $300 computers I bought for my kids are getting me 110 H/s each for somewhere between 20 W and 30 W, depending on what they're doing. An R9 280X draws around 300 W from the wall at full power; if Claymore's miner is only using half that, it would be 150 W.

To break even in H/s per watt you'd need to be getting closer to 660 H/s per card; your results show 460 per card.

This means that people can't just buy a crap tonne of equipment and own the coin. It was intentionally made to be this way.

EDIT: Forgot to mention that the kids don't think it affects their gameplay. They play mostly Hearthstone, Path of Exile, Diablo 3, League of Legends and DotA 2.
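The break-even figure in this post follows from a simple hashes-per-watt calculation. A quick sketch, using only the poster's own estimates (110 H/s at an assumed midpoint of 25 W for the 2500K; 460 H/s at an assumed 150 W for the 280X), not independent measurements:

```shell
# Back-of-envelope efficiency comparison using the numbers quoted above.
awk 'BEGIN {
  cpu_eff = 110 / 25;          # desktop CPU: H/s per wall watt (~4.4)
  gpu_eff = 460 / 150;         # R9 280X at assumed half power (~3.1)
  breakeven = 150 * cpu_eff;   # H/s a 150 W card needs to match the CPU
  printf "CPU: %.1f H/s/W  GPU: %.1f H/s/W  break-even: %.0f H/s\n",
         cpu_eff, gpu_eff, breakeven
}'
# prints: CPU: 4.4 H/s/W  GPU: 3.1 H/s/W  break-even: 660 H/s
```

That reproduces the 660 H/s per card claimed in the post; change the wattage assumptions and the break-even point moves proportionally.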

smooth
Legendary
*
Online

Activity: 1540



View Profile
June 22, 2014, 09:04:04 AM
 #326

Quote from: azhago (#324)
Quote from: Agamemnus (#325)

Well said. As I have explained before, there is a role for GPU mining, which is why I am currently the largest individual contributor to the bounty for an open source GPU miner. However, GPU mining is not dominant for this algorithm the way it is for most others, merely competitive (as you correctly explain, by design).

dewdeded
Legendary
*
Offline

Activity: 980


Monero Evangelist


View Profile WWW
June 22, 2014, 02:38:03 PM
 #327

How to compile under OS X:

1.) ./autogen.sh
2.) ./configure CFLAGS="-march=native -mno-avx"
3.) Edit the Makefile: search for "-fuse-linker-plugin" and delete that option from the "AM_CFLAGS" setting
(e.g. "AM_CFLAGS = -Ofast -flto -fuse-linker-plugin -funroll-loops \" becomes "AM_CFLAGS = -Ofast -flto -funroll-loops \")
4.) make
5.) ???
6.) Mine Monero, take money
7.) Profit!
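Step 3 above can be scripted instead of done by hand. A minimal sketch with sed (assuming BSD sed as shipped with OS X, where -i takes an explicit backup suffix; GNU sed accepts the same form), run from the source directory after ./configure:

```shell
# Step 3, scripted: strip -fuse-linker-plugin from the generated Makefile.
# Writes a Makefile.bak backup before editing in place.
sed -i.bak 's/ -fuse-linker-plugin//g' Makefile

# Sanity check: the AM_CFLAGS line should no longer contain the flag.
grep -n 'AM_CFLAGS' Makefile
```

After that, step 4 (make) proceeds as usual.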

AceCobra1
Sr. Member
****
Offline

Activity: 294



View Profile
June 22, 2014, 05:08:52 PM
 #328

What is the stratum address for moneropool.com?

Casu
Full Member
***
Offline

Activity: 154


View Profile
June 22, 2014, 05:53:30 PM
 #329

What is the stratum address for moneropool.com?

How about mining on a different pool? They took it off the site for a reason.
3xpl0r3r
Member
**
Offline

Activity: 90


View Profile
June 22, 2014, 08:29:22 PM
 #330

I keep getting "Stratum authentication failed... retry in 10 secs". Anybody around? On the XMR pool.
5w00p
Hero Member
*****
Offline

Activity: 630



View Profile
June 22, 2014, 08:34:13 PM
 #331

I keep getting "Stratum authentication failed... retry in 10 secs". Anybody around? On the XMR pool.

Some pools got DDoSed.

Use Wolf's pool; it's up, with only a 1% fee.

http://pool.cryptoescrow.eu/

Sample command to run minerd: minerd -o stratum+tcp://mine.cryptoescrow.eu:3333 -u YOUR-WALLET-ADDRESS -p x -t 4
3xpl0r3r
Member
**
Offline

Activity: 90


View Profile
June 22, 2014, 08:37:32 PM
 #332

Already doing that.  Must be something on my end.
azhago
Full Member
***
Offline

Activity: 182


View Profile
June 23, 2014, 12:43:00 AM
 #333

Quote from: azhago (#324)
Quote from: Agamemnus (#325)

The CryptoNight algo was not designed to be more CPU-friendly than GPU-friendly; in practice, though, it is more CPU-friendly.
I'm not complaining. I have some CPUs at home (a dual Xeon 2687W and a 3930K @ 4.8 GHz); I'm neither for nor against CPU mining. I have a few GPUs and some CPUs.
But just measuring the difference in draw between your kids' computers mining and not mining is not, well, a good measure.
Such a computer should draw ~250 W while mining (measured at the wall). Maybe I'm wrong; I'll let you take the measurement.
A simple rig designed for GPU mining, with a little CPU (ga2016/2020), draws 80 W idle, and 250 W when mining XMR with one R9 280X.
With the 5 R9 280Xs: 2300 H/s, 1000 W measured at the wall.
OK, my dual Xeon gives me 960 H/s for less power, but I think we will see a lot of optimization (for both, I hope) in the near future.

Quote
This means that people can't just buy a crap tonne of equipment and own the coin. It was intentionally made to be this way.
Why do you think it was intentionally designed this way? To be fair?
GPU-friendly coins bring GPU farms and multipools; CPU-only or CPU-friendly coins bring botnets and Amazon EC2 instances (see the Boolberry thread: DGA talks about 200 EC2 instances for himself, and he is far from the biggest one). In either case I'm still a very small miner.

EDIT:
A GPU miner is coming for NVIDIA cards (not released yet):
https://bitcointalk.org/index.php?topic=167229.msg7458872#msg7458872

First test, with 6 x 750 Ti: 270 W at the wall (something like 35 W per card), ~160 H/s per card.
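The per-watt comparison implied here can be made explicit. A quick sketch using only the figures the poster gives (the dual Xeon's wall draw isn't stated, so it is left out):

```shell
# Hashes per wall watt for the two fully specified setups above.
awk 'BEGIN {
  printf "5 x R9 280X : %.2f H/s per W\n", 2300 / 1000;      # 1000 W at the wall
  printf "6 x 750 Ti  : %.2f H/s per W\n", (6 * 160) / 270;  # 270 W at the wall
}'
# prints:
# 5 x R9 280X : 2.30 H/s per W
# 6 x 750 Ti  : 3.56 H/s per W
```

On these numbers the 750 Ti rig is roughly 1.5x more power-efficient than the 280X rig, though hash-per-dollar is a separate question.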

AceCobra1
Sr. Member
****
Offline

Activity: 294



View Profile
June 23, 2014, 12:43:11 PM
 #334

Quote from: azhago (#324)
Quote from: Agamemnus (#325)
Quote from: azhago (#333)

I think your estimate of the power draw is incorrect. I have 3 rigs plus my desktop attached to a single socket through power-usage meters. With 1 x 280X, 3 x 7970, 7 x 7950, and 1 x 7950 alongside 1 x 750 Ti and 1 x 650 GTX, plus my overclocked 2500K mining, the total draw is 1550 to 1650 W from the wall.

Wolf0
Legendary
*
Offline

Activity: 1694


Miner Developer


View Profile
June 24, 2014, 06:22:35 AM
 #335

Quote from: azhago (#324)
Quote from: Agamemnus (#325)
Quote from: azhago (#333)

It's not released, and it's not going to be released. Trust me.

Onicle
Newbie
*
Offline

Activity: 1


View Profile
June 24, 2014, 08:29:05 AM
 #336

I'm getting around 180 H/s from my Xeon E5-2620 using this miner on Windows 7 64-bit. My hashrate went up from 150 H/s by switching to this miner. That seems a bit low, since I'm under the impression this processor should be rather good? I'm new to CPU mining; I did some mining before with my Quadro K4000, but that was just a waste of time, so I decided to try the CPU.

OT: I guess you are the same Wolf that runs the pool. I'm seeing only 70 H/s in the pool statistics; is that normal?
Wolf0
Legendary
*
Offline

Activity: 1694


Miner Developer


View Profile
June 24, 2014, 08:32:33 AM
 #337

Quote from: Onicle (#336)

If it's only been a short time, yes, that's normal; it takes ~10 minutes for the reported rate to become accurate.

sparks2013
Jr. Member
*
Offline

Activity: 55


View Profile
June 24, 2014, 02:11:35 PM
 #338

Quote from: azhago (#324)
Quote from: Agamemnus (#325)
Quote from: azhago (#333)
Quote from: Wolf0 (#335)


It was released early this morning:
https://github.com/tsiv/ccminer-cryptonight

cryptonite: CcPSVicrABUeNVMJ8AtsghxfKtBPFS9aPg
antonio8
Legendary
*
Offline

Activity: 1190


View Profile
June 24, 2014, 02:35:40 PM
 #339

Quote from: azhago (#324)
Quote from: Agamemnus (#325)
Quote from: azhago (#333)
Quote from: Wolf0 (#335)
Quote from: sparks2013 (#338)

I am getting 1,030 H/s with five 750 Tis.

If you are going to leave your BTC on an exchange please send it to this address instead 1GH3ub3UUHbU5qDJW5u3E9jZ96ZEmzaXtG, I will at least use the money better than someone who steals it from the exchange. Thanks Wink
Wolf0
Legendary
*
Offline

Activity: 1694


Miner Developer


View Profile
June 25, 2014, 10:03:16 AM
 #340

Quote from: azhago (#324)
Quote from: Agamemnus (#325)
Quote from: azhago (#333)
Quote from: Wolf0 (#335)
Quote from: sparks2013 (#338)

Well, I stand corrected. But it sucks on AWS - I assumed it'd be great there. Probably why it was released.
