sveetsnelda
|
|
January 22, 2012, 09:34:51 PM |
|
Because on some cards you cannot downclock the memory far enough on Linux (6970). On Windows, my rigs with 6970s can have the memory downclocked to 300; on Linux they cannot!
If it's a dedicated mining rig, just flash the BIOS. Yeah, it's lame that the ATI driver on Linux won't let you change the memclock, but a quick BIOS flash completely takes care of the issue.
|
14u2rp4AqFtN5jkwK944nn741FnfF714m7
|
|
|
os2sam
Legendary
Offline
Activity: 3583
Merit: 1094
Think for yourself
|
|
January 22, 2012, 09:56:59 PM |
|
Why do people keep torturing themselves with windows? Just use plain old debian and save yourself tons of time and headaches.
Well, much to your, and my, chagrin, the computing world has decided to use M$ Windoze. And those of us who support vertical market applications need to have multiple machines with the various versions of M$ OS's on them. So when I can, I mine on them, which is why I really like stand-alone miners like CGMiner and Ufasoft. But when I need to do my job on them, they are ready to go for work as well. So diverting machines to dedicated use with some *nix variant doesn't work for me. So I appreciate the effort of doing the Windoze builds as well. Sam
|
A: Because it messes up the order in which people normally read text. Q: Why is top-posting such a bad thing? A: Top-posting. Q: What is the most annoying thing on usenet and in e-mail?
|
|
|
Eveofwar
|
|
January 23, 2012, 08:29:34 AM |
|
cgminer version 2.1.2 - Started: [2012-01-16 18:14:21]
--------------------------------------------------------------------------------
 (5s):1664.2 (avg):1656.5 Mh/s | Q:198847  A:200504  R:5649  HW:0  E:101%  U:22.
 TQ: 4  ST: 5  SS: 10  DW: 9250  NB: 938  LW: 75021  GF: 531  RF: 16
 Connected to <mypool> with LP as user <myuser>
 Block: 00000b6e89bb00a92d1f75b327f1573f...  Started: [00:20:12]
--------------------------------------------------------------------------------
 [P]ool management [G]PU management [S]ettings [D]isplay options [Q]uit
 GPU 0: | 373.3/372.4Mh/s | A:44859 R:1137 HW:0 U: 4.98/m I: 8
 GPU 1: | 373.9/372.3Mh/s | A:45492 R:1214 HW:0 U: 5.05/m I: 8
 GPU 2: | 303.0/302.6Mh/s | A:36805 R:1074 HW:0 U: 4.08/m I: 8
 GPU 3: | 304.6/304.2Mh/s | A:36709 R:1051 HW:0 U: 4.07/m I: 8
 GPU 4: | 305.6/304.9Mh/s | A:36640 R:1173 HW:0 U: 4.06/m I: 8
--------------------------------------------------------------------------------
 <shares being accepted>

Hmmm... where did my temps and fan speeds go?
|
|
|
|
jake262144
|
|
January 23, 2012, 09:29:33 AM Last edit: January 23, 2012, 10:30:23 AM by jake262144 |
|
Sounds like you compiled without ADL support, no?
EDIT: I just verified that the most recent git version compiles as expected. What Eveofwar posted is NOT a bug in cgminer itself.
|
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
January 23, 2012, 09:51:10 AM |
|
Hmmm... where did my temps and fan speeds go?
It obviously shouldn't look like that. However: did you either compile a non-ADL binary, or (if you use Linux) run without export DISPLAY=:0? It would be good to know if anyone else has the same issue on the same OS.
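For Linux users, kano's point about DISPLAY can be sketched as a small launch script. This is only an illustration: the pool URL, worker name, and password below are placeholders, not real values.

```shell
#!/bin/sh
# cgminer's ADL-based monitoring (temps, fan speeds, clocks) talks to the
# fglrx driver through the X server, so DISPLAY must point at the local
# X session even when launching over SSH or from a boot script.
export DISPLAY=:0

# Placeholder pool and worker credentials -- substitute your own.
./cgminer -o http://pool.example.com:8332 -u worker -p pass -I 8 --auto-fan
```

Without the export, cgminer still mines, but the per-GPU temperature and fan columns come up empty, exactly as in the screenshot above.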
|
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
January 23, 2012, 09:58:32 AM |
|
I have been using phoenix for some time, but cgminer has the most powerful and beautiful interface I have ever seen - a masterpiece!
I really love it and run it through screen in an auto-start script, but sometimes that screen session just disappears and I have to restart the machine to get it back to work. Is it due to overclocking of the card or a network problem? Has anyone had the same experience?
|
|
|
|
sharky112065
|
|
January 23, 2012, 10:51:36 AM |
|
cgminer version 2.1.2 - Started: [2012-01-16 18:14:21] -------------------------------------------------------------------------------- (5s):1664.2 (avg):1656.5 Mh/s | Q:198847 A:200504 R:5649 HW:0 E:101% U:22. TQ: 4 ST: 5 SS: 10 DW: 9250 NB: 938 LW: 75021 GF: 531 RF: 16 Connected to <mypool> with LP as user <myuser> Block: 00000b6e89bb00a92d1f75b327f1573f... Started: [00:20:12] -------------------------------------------------------------------------------- [P]ool management [G]PU management [S]ettings [D]isplay options [Q]uit GPU 0: | 373.3/372.4Mh/s | A:44859 R:1137 HW:0 U: 4.98/m I: 8 GPU 1: | 373.9/372.3Mh/s | A:45492 R:1214 HW:0 U: 5.05/m I: 8 GPU 2: | 303.0/302.6Mh/s | A:36805 R:1074 HW:0 U: 4.08/m I: 8 GPU 3: | 304.6/304.2Mh/s | A:36709 R:1051 HW:0 U: 4.07/m I: 8 GPU 4: | 305.6/304.9Mh/s | A:36640 R:1173 HW:0 U: 4.06/m I: 8 -------------------------------------------------------------------------------- <shares being accepted> Hmmm...where did my temps and fan speeds go ? This happened to two of my rigs a couple days ago. I quit Cgminer and started it back up and temps and fan speeds showed up again. It was the same for me. It had been running for days before that info stopped displaying. Those two rigs I mentioned are running on Windows 7 using the compiled version provided by ckolivas.
|
Donations welcome: 12KaKtrK52iQjPdtsJq7fJ7smC32tXWbWr
|
|
|
jake262144
|
|
January 23, 2012, 11:21:02 AM Last edit: January 23, 2012, 11:31:19 AM by jake262144 |
|
This joke is getting old but what the hell... lolwindows: reboot || reinstallation || defenestration
The REAL joke is that the above is actually quite a powerful troubleshooting guide for Redmond-based OSes.
|
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
January 23, 2012, 01:04:52 PM |
|
Really loved it and run it through screen in auto-start script, but sometimes that screen session just disappeared and I have to restart the machine to get it back to work, is it due to overclocking of the card or network problem?
The screen session shouldn't just disappear on its own no matter what was running in it - so it is most likely related to overclocking and the computer itself locking up. Also note that if cgminer does get any of the ATI cards running too hard, it can lock up the computer (as reported here by many people), and thus network access will appear unresponsive as well.
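One way to tell a whole-machine lockup apart from the screen session itself dying is a restart-and-log wrapper inside screen. A sketch only: the pool details and log path are placeholders.

```shell
#!/bin/sh
# Start cgminer in a detached screen session named "miner", wrapped in a
# loop that restarts it if it ever exits, logging each event with a
# timestamp.  If the log just stops dead (no "exited" line), suspect a
# hard lockup -- e.g. from overclocking -- rather than screen or cgminer.
screen -dmS miner sh -c '
  while true; do
    echo "$(date): starting cgminer" >> /var/log/miner-restarts.log
    ./cgminer -o http://pool.example.com:8332 -u worker -p pass
    echo "$(date): cgminer exited, restarting in 5s" >> /var/log/miner-restarts.log
    sleep 5
  done'
```

Reattach with `screen -r miner`; if the session is gone but the log survives, the last line tells you how it ended.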
|
|
|
|
jjiimm_64
Legendary
Offline
Activity: 1876
Merit: 1000
|
|
January 23, 2012, 02:38:34 PM |
|
Hmmm... where did my temps and fan speeds go?
This happened to two of my rigs a couple days ago. I quit cgminer and started it back up, and temps and fan speeds showed up again.
This will happen if I remote into a box; it has to do with DISPLAY, as kano suggested.
|
1jimbitm6hAKTjKX4qurCNQubbnk2YsFw
|
|
|
jjiimm_64
Legendary
Offline
Activity: 1876
Merit: 1000
|
|
January 23, 2012, 02:43:41 PM |
|
This is all now in ckolivas' git (and in the README). It will be part of cgminer 2.2.0 when it's ready, or you can get it from the git directly as usual. I updated the API version to "1.0".
A few differences in names (see the README)... also, of course, you need to specify the GPU number for GPU commands... and one difference in functionality for gpufan: just a single value. If you want 'fan' to also work like some of the cgminer options, then specify what you want using the cgminer option names (I've made it like the screen interface, but can change it of course).
While I also think the new "switchpool", "config", "gpuintensity" and "save" are useful, I will say, however, that changing the GPU/fan values doesn't really make a lot of sense. If you have auto-fan and auto-gpu on, then changing them won't work very well (cgminer will change them back soon enough). The same issue applies if you have an external program looking after them. If you don't have auto-fan and auto-gpu on (or something checking them), then you had better make sure your API program pays close attention to the settings and adjusts them well.
5 btc sent to 1LPbuDSPT4DdYbwiqAVWDJm2sHHuh6PnqB Thank you.
|
1jimbitm6hAKTjKX4qurCNQubbnk2YsFw
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
January 23, 2012, 03:40:41 PM |
|
5 btc sent to 1LPbuDSPT4DdYbwiqAVWDJm2sHHuh6PnqB Thank you.
Received - thanks very much. Meanwhile, two things:
1) I don't really see the point of changing the strategy - wouldn't you normally have that correct to start with? But if you really want it as an API option, just say so (no, I'm not trying to get more BTC from you).
2) The API fans act like the screen interface (one value - change the fan), not like the command-line arguments (one value OR a value range). I looked at the code for both (screen and arguments) and they are different, so I matched the screen, since that's all you can do when cgminer is running from the screen interface. If there's no reason for the difference (I'll check with ckolivas), then if you want the value-range option I should also change the screen input code to match. Let me know.
|
|
|
|
gnar1ta$
Donator
Hero Member
Offline
Activity: 798
Merit: 500
|
|
January 23, 2012, 04:27:04 PM |
|
Is U a measure of shares processed per minute, shares submitted per minute, or shares accepted per minute? What I'm really looking for is: does the miner know if the pool declares a submitted/accepted share stale? Or is it just counted as rejected?
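For what it's worth, cgminer's README defines U (utility) as accepted shares per elapsed minute since startup, so it only counts shares the pool actually accepted. A quick sanity check of the arithmetic, with made-up numbers:

```shell
#!/bin/sh
# Utility (U) = accepted shares / elapsed minutes.
# Example: 300 accepted shares over a 60-minute run -> 5.00/m.
accepted=300
elapsed_minutes=60
awk -v a="$accepted" -v m="$elapsed_minutes" \
    'BEGIN { printf "U: %.2f/m\n", a / m }'
```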
|
Losing hundreds of Bitcoins with the best scammers in the business - BFL, Avalon, KNC, HashFast.
|
|
|
bitlane
Internet detective
Sr. Member
Offline
Activity: 462
Merit: 250
I heart thebaron
|
|
January 23, 2012, 05:01:12 PM |
|
Not sure if this has already been covered... but I have a quick question about ADL, 5xxx cards, 6xxx cards, GPU-Z and, of course, CGMiner.

NOW, all of my 5000 series cards (5770, 5830s) show my LOWERED MEMORY FREQUENCY correctly in everything I throw at them (GPU-Z, CGMiner, CCC, etc).

Unfortunately, there is a discrepancy with my 6000 series cards (6870, 6950, etc). When lowering the memory frequency to 300MHz using CGMiner (ADL), GPU-Z reports the lowered value. I was surprised to see that CGMiner reports the 6000 series memory speeds as stock, even when lowered and confirmed by GPU-Z. Which program is reporting the wrong memory frequency?

Here is a quick HTML snapshot of a monitor I am using, using the CGMiner API: http://members.shaw.ca/bitlane/bit/cgminerweb.htm The miners in the link are named for the cards that are in them (although 0000MINER is a mixed pair of 6870s plus a single 5830). You will notice that the hosts shown as 5770MINER and 5830MINER all have 300MHz memory frequencies shown, while the host listed as 6950MINER (even though set using CGMiner to 300MHz) shows memory @ 1250MHz. On the 0000MINER host (top machine on the page), the 2x 6870s show a memory frequency of 1050MHz, while the single 5830 in that box shows the correct 300MHz.

I use a single clock and memory frequency setting in my BAT files for all cards, not separate ones (ie. ...-I 8 --auto-fan --gpu-engine 930 --gpu-memclock 300 --temp-target 69...). As I said, GPU-Z shows 300MHz for EVERY card, but CGMiner tells a different story. Which displays the correct memory frequency? GPU-Z or CGMiner?
|
|
|
|
jjiimm_64
Legendary
Offline
Activity: 1876
Merit: 1000
|
|
January 23, 2012, 06:10:26 PM |
|
Which displays the correct memory frequency? GPU-Z or CGMiner?
I can answer this very easily: on the 6 series, at least the ones I have, you cannot lower the memory clock more than 100MHz below the core clock. Trust cgminer's numbers.
|
1jimbitm6hAKTjKX4qurCNQubbnk2YsFw
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
January 23, 2012, 06:29:25 PM Last edit: January 23, 2012, 08:24:59 PM by DeathAndTaxes |
|
Not sure if this has already been covered.....but I have a quick question about the ADL/5xxx Cards/6xxx Cards/GPU-Z and of course....CGMiner.
NOW, all of my 5000 Series Cards (5770, 5830's) show my LOWERED MEMORY FREQUENCY correctly in everything I throw at them (GPU-Z, CGMiner, CCC etc).
Unfortunately, there is a discrepancy with my 6000 series cards (6870, 6950 etc). When lowering memory frequency to 300Mhz using CGMiner (ADL), GPU-Z reports the lowered value. I was surprised to see that CGMiner reports the 6000 series memory speeds as stock, even when lowered and confirmed by GPU-Z.
Which program is reporting the wrong memory frequency? Neither. You SET the memclock to 300 but it is RUNNING at stock. GPU-Z shows what the card is SET to, not what it is running at; cgminer shows what the card is actually RUNNING at. In GPU-Z, if you go to the sensors tab, it will show you what the card is actually RUNNING at, and the value will match cgminer. 6000 series cards "suck" because there is a limit on how far you can push the memclock down. No idea why. 5000 series cards don't have this problem, and neither do the 7000 series. Worse, it doesn't give you any error; it simply says "ok, 300MHz" and then runs at stock speed. On edit: if you are hardcore, flashing the card with a custom BIOS will let it run at whatever you want.
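On Linux the same set-versus-running distinction can be checked from the shell. A sketch under two assumptions: aticonfig ships with the fglrx driver, and the second query needs cgminer's API to be enabled and listening on the default port.

```shell
#!/bin/sh
# Compare the driver's view of the clocks the card is actually running
# against cgminer's view of the same thing.
export DISPLAY=:0

# --odgc = OverDrive "get clocks": prints the current and peak
# engine/memory clocks per adapter under the fglrx driver.
aticonfig --odgc --adapter=all

# cgminer's API report for GPU 0; the reply includes the memory clock
# the card is actually running at, matching the curses display.
echo -n "gpu|0" | nc 127.0.0.1 4028
```

If the two running values agree but differ from what you requested, the card silently refused the setting, exactly as described above.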
|
|
|
|
bitlane
Internet detective
Sr. Member
Offline
Activity: 462
Merit: 250
I heart thebaron
|
|
January 23, 2012, 06:57:33 PM |
|
Thanks guys. That makes sense. I will pay closer attention to GPU-Z's sensors tab in the future. ....and here all this time I thought I was fighting power consumption/heat production from lowering the memory on those 6000 series POWER PIGS...lol
|
|
|
|
miscreanity
Legendary
Offline
Activity: 1316
Merit: 1005
|
|
January 23, 2012, 07:12:38 PM |
|
Thanks guys. That makes sense. I will pay closer attention to GPU-Z's sensors tab in the future. ....and here all this time I thought I was fighting power consumption/heat production from lowering the memory on those 6000 series POWER PIGS...lol
The limit is strange - pegged at a 125MHz gap between the core and memory rates, so 900 core won't allow anything lower than 775 memory, or it'll kick the memory clock back up to stock settings. Earlier, sveetsnelda suggested flashing the GPU BIOS. This works well, and I have some 69xx series cards running at 920 core and 300 memory. It's fairly easy to boot from a FreeDOS USB stick with the AMD/ATI flash tools on it. Just pull the current BIOS using GPU-Z, and then flash the same cards with the updated settings. After that, changing the core speed doesn't automatically cause a reset to high memory clock rates. My card temperatures dropped by at least 10C.
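The FreeDOS procedure described above looks roughly like this with the ATIFlash tool. A sketch only: the adapter index and filenames are placeholders, the `#` comments are annotations (not valid DOS syntax), the exact flags should be checked against your atiflash version's help output, and flashing the wrong ROM can brick a card, so always save a backup first.

```shell
# Commands typed at the FreeDOS prompt from the bootable USB stick:
atiflash -i               # list adapters and note their index numbers
atiflash -s 0 backup.rom  # save adapter 0's current BIOS before anything else
atiflash -p 0 modded.rom  # program the edited ROM back to adapter 0, then reboot
```

The clock-table edit itself (e.g. lowering the stock memory clock) is done on the saved ROM with a BIOS editor on another machine before copying `modded.rom` onto the stick.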
|
|
|
|
bitlane
Internet detective
Sr. Member
Offline
Activity: 462
Merit: 250
I heart thebaron
|
|
January 23, 2012, 07:22:39 PM Last edit: January 23, 2012, 07:39:20 PM by bitlane |
|
I was able to lower my reference 6870s from 1050MHz to 850MHz and saw an immediate 5C drop in temps - amazing, as these cards run hot normally. The 6950s are a bit more of a challenge and seem to crash once I start playing with lowering mem clocks.

[EDIT] After some trial and error (mostly ERROR... read: LOCKUPS... lol), I was able to get my 6950s running at 900MHz GPU & 900MHz mem (stock being 1250MHz mem). Anything lower than that seemed to lock up the machine while starting CGMiner. My four 6950s are non-reference (2x Asus DCUII 1GB @ 810/1250 stock & 2x XFX 1GB 6950s @ 800/1250 stock). The immediate drop in temps is a welcome change.
|
|
|
|
jake262144
|
|
January 23, 2012, 09:05:31 PM |
|
The 6950's are a bit more of a challenge and seem to crash once I start playing with lowering mem clocks. [EDIT] After some trial and error (mostly ERROR...read: LOCKUPS...lol)... I was able to get my 6950's running at 900 Mhz GPU & 900Mhz Mem (stock speed being 1250Mhz Mem). My 4 6950's are non reference (2x Asus DCUII 1GB @ 810/1250 stock...& 2x XfX 1GB 6950's @ 800/1250Mhz stock).
That must be a Windows-based machine, right? My only Asus 6950 DCII 1GB has been merrily running at 942/820 MHz on my Debian machine since September. I haven't sold it because of the decent cooling system and passable energy efficiency (140W while mining, resulting in 2.70 MHash/W). Even so, I'll most probably ditch it when more of the new 7xxx cards come out. The 7970 has been a bit underwhelming, to say the very least.
|
|
|
|
|