So I played around with optimum settings at different clock rates on SDK 2.4, and here's my table of hashrate versus changes in -v and -w for a stock 6990 on DiabloMiner (each row averaged over 20 minutes to 2 hours):
v    w    hashrate (khash/s)  invalid  GPU1 temp (C)  GPU2 temp (C)  fan (RPM)
1    256  657015              0.7%     74.0           78.0           3905
1    128  659598              0.9%     74.0           77.5           3931
1    64   658721              2.7%     74.5           78.0           3940
2    256  589505              1.6%     73.0           77.0           3880
2    128  653134              0.0%     75.5           79.0           3980
2    64   661726              0.0%     75.5           79.0           3980
3    256  649722              1.1%     76.0           80.0           4070
3    128  661191              2.4%     76.0           80.0           4080
3    64   570522              0.0%     73.0           77.0           3880
4    256  518821              2.3%     70.5           74.5           3670
4    128  523139              9.1%     70.5           74.5           3700
4    64   660914              0.6%     76.0           79.5           3959
5    256  518460              2.0%     70.5           74.0           3670
5    128  522742              3.1%     70.0           73.5           3670
5    64   490119              3.4%     69.5           73.0           3620
6    256  522556              5.6%     70.0           73.5           3670
6    128  525496              2.4%     70.0           74.0           3670
6    64   525356              0.0%     70.5           74.0           3670
18   256  505655              1.6%     69.0           72.5           3600
18   128  511409              0.8%     69.0           73.0           3600
18   64   584084              0.0%     72.0           75.5           3770
19   256  498097              1.7%     69.0           71.0           3606
19   128  512743              1.2%     69.2           71.8           3628
19   64   587775              4.0%     72.0           75.5           3746
20   256  133321              1.4%     65.5           68.8           3294
20   128  131673              1.9%     65.7           68.7           3346
20   64   134202              0.0%     65.1           69.0           3359
21   256  143255              0.0%     66.2           69.0           3348
21   128  142799              1.2%     66.1           69.6           3358
21   64   143776              2.2%     66.5           69.7           3370
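If anyone wants to rank these runs themselves, here's a minimal Python sketch using the first dozen rows copied from the table above. The 1% invalid-share cutoff is my own arbitrary choice for illustration, not anything DiabloMiner enforces:

```python
# (v, w, hashrate, % invalid) for the first few rows, copied from the table above
rows = [
    (1, 256, 657015, 0.7), (1, 128, 659598, 0.9), (1, 64, 658721, 2.7),
    (2, 256, 589505, 1.6), (2, 128, 653134, 0.0), (2, 64, 661726, 0.0),
    (3, 256, 649722, 1.1), (3, 128, 661191, 2.4), (3, 64, 570522, 0.0),
    (4, 256, 518821, 2.3), (4, 128, 523139, 9.1), (4, 64, 660914, 0.6),
]

# Drop runs with 1% or more invalid shares, then rank the rest by hashrate
good = sorted((r for r in rows if r[3] < 1.0), key=lambda r: r[2], reverse=True)
for v, w, rate, bad in good[:3]:
    print(f"-v {v} -w {w}: {rate} ({bad}% invalid)")
```

On that subset it puts -v 2 -w 64 on top, which matches eyeballing the table.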
So the default flags (-v 1 -w 256) work pretty well, and of the settings that give the highest rate, v1 with w128 or w256 runs the lowest temps. It's a shame I can't post my chart porn, since I have some really nice density distribution plots showing bands of hashrate at roughly 660000, 582000, 520000, 495000 and 132000. Odd. Nearly multiples of 11.
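You don't need the plots to see the banding; a quick sketch that assigns every hashrate in the table to the nearest of the band centres quoted above (nearest-centre assignment is my own rough stand-in for the density plots):

```python
from collections import Counter

# Band centres quoted in the post, and all 30 hashrates from the table
bands = [660000, 582000, 520000, 495000, 132000]
rates = [657015, 659598, 658721, 589505, 653134, 661726, 649722, 661191,
         570522, 518821, 523139, 660914, 518460, 522742, 490119, 522556,
         525496, 525356, 505655, 511409, 584084, 498097, 512743, 587775,
         133321, 131673, 134202, 143255, 142799, 143776]

# Count how many runs land nearest each band centre
nearest = Counter(min(bands, key=lambda b: abs(b - r)) for r in rates)
for b in sorted(bands, reverse=True):
    print(f"~{b}: {nearest[b]} runs")
```

Every one of the 30 runs sits close to one of the five centres, so the bands are real and not just an artefact of the plotting.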
Also, the second GPU consistently runs hotter than the first one - dud core?
Cheers for the miner - that averager is the only way you can really get an accurate idea of what changing the flags actually does.
Next: changing clock speeds for the highest-scoring flags, then finally changing -f.