themike5000
Member
Offline
Activity: 99
Merit: 10
|
|
July 07, 2011, 08:37:02 PM |
|
What software are you using to mine?
GUIMiner 0701.
But is GUIMiner taking advantage of the GPU core?
Yes. You can't get 65 MHash/s out of a normal CPU.
|
Vertcoin: VdHjU3L2dcHCR3uQmqpM6mf4LCvp2678wh
|
|
|
CanaryInTheMine
Donator
Legendary
Offline
Activity: 2352
Merit: 1060
between a rock and a block!
|
|
July 29, 2011, 11:09:33 PM |
|
I have an A8-3850 box,
Running all 4 cores. Getting 65MHash/s
How are you cooling it? Stock sink/fan or something else? Have you tried overclocking?
|
|
|
|
KKAtan
Newbie
Offline
Activity: 50
Merit: 0
|
|
July 30, 2011, 04:59:50 PM |
|
So, the 2.9GHz will do 65 MHash/s...
What about the 2.6GHz version?
The CPU speed is irrelevant. What you look at is how many shaders there are, and how fast the shaders run. 400 shaders at 600 MHz is the strongest Llano you can get.
Running all 4 cores. Getting 65 MHash/s
Best to deactivate the CPU mining and mine only with the graphics shaders. Overclocking the shaders from 600 MHz should boost your hash rate far more than the CPUs ever could.
If they make a power-efficient
It's VLIW5 at 32 nm SOI. It's far more power efficient than any other graphics card currently in existence.
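As a rough rule of thumb, SHA-256 throughput on these chips scales with shader count times shader clock. This is my own back-of-envelope sketch, calibrated only to the 400-shader / 600 MHz / 65 MH/s figure reported in this thread; the linear-scaling assumption is mine, not something measured here:

```python
# Hypothetical estimator: assumes hash rate scales linearly with
# shaders * clock. The single calibration point comes from this thread.
CAL_MHS = 65.0        # reported rate for the A8-3850's HD 6550D GPU
CAL_SHADERS = 400
CAL_CLOCK_MHZ = 600

def estimate_mhs(shaders, clock_mhz):
    """Estimate SHA-256 MH/s assuming linear scaling in shaders * clock."""
    k = CAL_MHS / (CAL_SHADERS * CAL_CLOCK_MHZ)   # MH/s per shader-MHz
    return shaders * clock_mhz * k

print(estimate_mhs(400, 800))   # shader overclock to 800 MHz -> ~86.7 MH/s
```

By this estimate, the shader overclock suggested above buys roughly a third more hash rate, which CPU cores alone could never deliver.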
|
|
|
|
GenTarkin
Legendary
Offline
Activity: 2450
Merit: 1002
|
|
July 30, 2011, 08:19:33 PM |
|
I have set one of these up, and using cgminer the A8 can pull off around 68 MH/s. OS is Win7 x64. Not too bad!
|
|
|
|
caston
|
|
October 05, 2011, 12:24:21 PM |
|
Could you use this with something like Tenebrix working with both the CPU and the shaders?
|
bitcoin BTC: 1MikVUu1DauWB33T5diyforbQjTWJ9D4RF bitcoin cash: 1JdkCGuW4LSgqYiM6QS7zTzAttD9MNAsiK
-updated 3rd December 2017
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 05, 2011, 12:39:15 PM |
|
The shader core is a standard GPU (called the HD 6550D); its specs are similar to the GPU on the HD 6570 video card. It has 400 shaders vs. the 6570's 480, running at a higher clock, and it is more efficient thanks to the 32 nm process, but it is for all intents and purposes a GPU.
It has all the advantages and disadvantages of a GPU. Namely, the GPU only has 64 KB of cache, which is insufficient to run scrypt efficiently. The CPU "side" of the chip is a rather pedestrian CPU. It should perform similarly to any other modern CPU.
So while you "could" use it to mine "CPU friendly" coins, you won't get a massive boost over other CPUs, as the GPU is "crippled". A higher-clocked multi-core chip (like a 6-core Phenom II) would still be superior.
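The cache argument is easy to check on paper. Assuming the scrypt parameters Tenebrix-style coins used (N=1024, r=1, p=1 — my assumption, not stated in this thread), each hash in flight needs a scratchpad of 128 * r * N bytes:

```python
def scrypt_scratchpad_bytes(n, r):
    """Memory needed by scrypt's V array: N blocks of 128*r bytes each."""
    return 128 * r * n

pad = scrypt_scratchpad_bytes(1024, 1)
print(pad // 1024)   # 128 (KB) -- double the 64 KB of GPU cache mentioned above
```

With only 64 KB of fast on-chip memory, every scrypt scratchpad access spills to slow external memory, which is exactly why the GPU side gives no big scrypt advantage.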
|
|
|
|
caston
|
|
October 05, 2011, 01:22:33 PM |
|
The shader core is a standard GPU (called the HD 6550D); its specs are similar to the GPU on the HD 6570 video card. It has 400 shaders vs. the 6570's 480, running at a higher clock, and it is more efficient thanks to the 32 nm process, but it is for all intents and purposes a GPU.
It has all the advantages and disadvantages of a GPU. Namely, the GPU only has 64 KB of cache, which is insufficient to run scrypt efficiently. The CPU "side" of the chip is a rather pedestrian CPU. It should perform similarly to any other modern CPU.
So while you "could" use it to mine "CPU friendly" coins, you won't get a massive boost over other CPUs, as the GPU is "crippled". A higher-clocked multi-core chip (like a 6-core Phenom II) would still be superior.
Thank you. I have been reading up a bit on them and it is quite confusing. You'd think that in AMD's whole fusion strategy they could allow the GPU and the CPU to access the same cache. Perhaps this is further down the pipeline.
|
bitcoin BTC: 1MikVUu1DauWB33T5diyforbQjTWJ9D4RF bitcoin cash: 1JdkCGuW4LSgqYiM6QS7zTzAttD9MNAsiK
-updated 3rd December 2017
|
|
|
Shevek
|
|
October 05, 2011, 01:58:05 PM |
|
It'll be about 92% of a 5570, or about 55 MH/s.
I have a 5570 and can do 200+ MH/s... Unless 92% means something different now, there is a problem somewhere. OT: I'm curious about your settings...
|
Proposals for improving bitcoin are like asses: everybody has one 1SheveKuPHpzpLqSvPSavik9wnC51voBa
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 05, 2011, 02:54:55 PM |
|
Thank you. I have been reading up a bit on them and it is quite confusing. You'd think that in AMD's whole Fusion strategy they could allow the GPU and the CPU to access the same cache. Perhaps this is further down the pipeline.
Possibly someday. One thing is for sure: we can expect tighter and tighter integration of CPU and GPU cores. Currently the GPU doesn't have access to the CPU cache, though. A high-end "fusion" could be interesting for high performance computing. Imagine a chip w/ 4 or 8 x86 cores and a much larger shader core, one with internal high-speed cache and the ability to share the L2 caches of the "traditional" cores. In a 4-socket server that would give you pretty amazing computing densities. Of course, AMD's whole blending of the lines between CPU & GPU will eventually kill the "CPU friendly" block chains. It is a futile endeavor.
|
|
|
|
MadHacker
|
|
October 05, 2011, 04:03:51 PM |
|
If AMD would just make GPU chips stackable into arrays, that would be something.
That would be awesome: get 10 HD6970 chips stacked together... it could dump 3000 watts of heat in a square inch. Just have to figure out how to cool it.
|
|
|
|
CanaryInTheMine
Donator
Legendary
Offline
Activity: 2352
Merit: 1060
between a rock and a block!
|
|
October 05, 2011, 04:05:36 PM |
|
If AMD would just make GPU chips stackable into arrays, that would be something.
That would be awesome: get 10 HD6970 chips stacked together... it could dump 3000 watts of heat in a square inch. Just have to figure out how to cool it.
With a very cold liquid
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 05, 2011, 04:23:29 PM |
|
If AMD would just make GPU chips stackable into arrays, that would be something.
That would be awesome: get 10 HD6970 chips stacked together... it could dump 3000 watts of heat in a square inch. Just have to figure out how to cool it.
Liquid cooling could handle it. It takes about 4.4 watt-hours to raise 1 gallon of water by 1 deg C. Thus to keep the coolant for a 3000 watt chip within 10 deg of ambient would require roughly 68 gallons per hour. Sounds like a lot, but that is just over 1 gallon per minute; a good water cooling pump has many times that capacity. Now, to avoid the water heating up you would need a pretty large radiator to dissipate 3 kW of heat, but it wouldn't require anything exotic. To correct what someone above said: the temperature of the water is nearly irrelevant; water is a very effective conductor of heat. "Warm" water cools just as well as "ice cold" water if your goal is just to keep the temps below, say, 60C.
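The coolant-flow arithmetic can be sketched directly. The constants here are standard water properties (specific heat ~4186 J/(kg·K), 1 US gallon ~3.785 kg), not figures from the thread:

```python
# Flow rate needed to carry a given heat load at a given coolant temperature rise.
def coolant_flow_gph(heat_w, delta_t_c):
    """Gallons per hour of water needed to absorb heat_w watts with a
    delta_t_c (deg C) rise in coolant temperature."""
    joules_per_gallon_per_c = 4186 * 3.785          # ~15,840 J, i.e. ~4.4 Wh
    wh_per_gallon_per_c = joules_per_gallon_per_c / 3600.0
    return heat_w / (wh_per_gallon_per_c * delta_t_c)

print(coolant_flow_gph(3000, 10))   # ~68 gallons/hour, just over 1 GPM
```

Even a modest hobbyist pump moves several gallons per minute, so the pump is never the bottleneck; the 3 kW radiator is the hard part.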
|
|
|
|
MadHacker
|
|
October 05, 2011, 04:54:54 PM |
|
If AMD would just make GPU chips stackable into arrays, that would be something.
That would be awesome: get 10 HD6970 chips stacked together... it could dump 3000 watts of heat in a square inch. Just have to figure out how to cool it.
Liquid cooling could handle it. It takes about 4.4 watt-hours to raise 1 gallon of water by 1 deg C. Thus to keep the coolant for a 3000 watt chip within 10 deg of ambient would require roughly 68 gallons per hour. Sounds like a lot, but that is just over 1 gallon per minute; a good water cooling pump has many times that capacity. Now, to avoid the water heating up you would need a pretty large radiator to dissipate 3 kW of heat, but it wouldn't require anything exotic. To correct what someone above said: the temperature of the water is nearly irrelevant; water is a very effective conductor of heat. "Warm" water cools just as well as "ice cold" water if your goal is just to keep the temps below, say, 60C.
I still think it would be hard to create a water block able to absorb 3000 W of heat from a square inch. I think you would then need micro-channels or tubes through the GPU chip itself to remove the heat.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 05, 2011, 05:07:21 PM |
|
I still think it would be hard to create a water block able to absorb 3000 W of heat from a square inch. I think you would then need micro-channels or tubes through the GPU chip itself to remove the heat.
No reason for it to be 3000 W from one square inch. Even if you could do that, the outer chip layers would act as an insulator and "cook" the inner chips. You could simply have a chip-waterblock sandwich:
waterblock
chip
waterblock
chip
waterblock
chip
waterblock
chip
waterblock
Or maybe something more like a stacked grid array (4x4 chips under a waterblock, and then stacked).
|
|
|
|
CanaryInTheMine
Donator
Legendary
Offline
Activity: 2352
Merit: 1060
between a rock and a block!
|
|
October 05, 2011, 05:13:42 PM |
|
I still think it would be hard to create a water block able to absorb 3000 W of heat from a square inch. I think you would then need micro-channels or tubes through the GPU chip itself to remove the heat.
No reason for it to be 3000 W from one square inch. Even if you could do that, the outer chip layers would act as an insulator and "cook" the inner chips. You could simply have a chip-waterblock sandwich: waterblock / chip / waterblock / chip / waterblock / chip / waterblock / chip / waterblock. Or maybe something more like a stacked grid array (4x4 chips under a waterblock, and then stacked).
I think I saw something about interleaving chips with liquid cooling pathways... I think it was IBM?
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 05, 2011, 05:15:27 PM |
|
It was IBM (and others); however, those designs are at micro scale: water cooling INSIDE the chip, in tiny channels, pumped via electrical impulses within the chip itself. We are likely some years away from that being commercially viable.
IBM does currently sell some watercooled servers. Some of their high performance servers use a 32-core Power6 chip. With up to 44 chips per 44U rack (2 per 2U server), that's roughly 1,400 cores per standard datacenter rack. To dissipate that kind of thermal load, IBM sells an enterprise-rated watercooling kit.
Many people don't know that most early computers were watercooled. Most mainframes were (and some still are) liquid cooled. It was only when power densities dropped that air cooling became viable. In the datacenter, servers are getting smaller, packed more per rack, and pulling more power. Using forced air to remove that heat is horribly inefficient and noisy. As power density climbs again, we will see more enterprise-grade watercooling.
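The rack arithmetic above works out as follows (all figures taken from the post itself; the per-chip core count is the post's claim, not mine):

```python
# 44U rack filled with 2U servers, 2 chips per server, 32 cores per chip.
servers_per_rack = 44 // 2              # 22 servers
chips_per_rack = servers_per_rack * 2   # 44 chips
cores_per_rack = chips_per_rack * 32
print(cores_per_rack)   # 1408, i.e. the "~1,400 cores" quoted above
```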
|
|
|
|
kokojie
Legendary
Offline
Activity: 1806
Merit: 1003
|
|
October 05, 2011, 05:25:02 PM |
|
It'll be about 92% of a 5570, or about 55 MH/s. I have a 5570 and can do 200+ MH/s... Unless 92% means something different now, there is a problem somewhere.
Really? Share your magic then. I have a 5570 and can't even reach 100 MH/s.
|
btc: 15sFnThw58hiGHYXyUAasgfauifTEB1ZF6
|
|
|
epenue
|
|
October 21, 2012, 09:23:00 PM |
|
Trying to mine with one of these (3870K) and it's just impossible;
I get 3 MH/s; obviously the GPU is doing nothing.
The problem, I think, is that I mine on Windows XP64. I have installed the latest SDK available, 2.3, but with guiminer-opencl I get only 3 MH/s, and trying to mine from the command line I get this error:
C:\guiminer>poclbm
Traceback (most recent call last):
  File "poclbm.py", line 48, in <module>
pyopencl.LogicError: clGetPlatformIDs failed: invalid/unknown error code
All the help I can find about that error is for Linux-based systems.
Any help?
Thanks,
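For what it's worth, clGetPlatformIDs failing like that usually means no OpenCL driver/ICD is registered on the system at all (i.e. the SDK installed but the GPU driver's OpenCL runtime did not). A minimal pyopencl check, sketched here to distinguish the failure modes rather than crash like poclbm does:

```python
def check_opencl():
    """Return a short diagnostic string about the OpenCL setup."""
    try:
        import pyopencl as cl
    except ImportError:
        return "pyopencl is not installed"
    try:
        platforms = cl.get_platforms()
    except cl.LogicError as exc:
        # Same failure as the traceback above: no OpenCL platform registered.
        return "clGetPlatformIDs failed: %s" % exc
    if not platforms:
        return "no OpenCL platforms found"
    return ", ".join(p.name for p in platforms)

print(check_opencl())
```

If this prints a platform name (e.g. the AMD APP platform) the SDK is wired up and the problem is elsewhere; if it reports the same LogicError, reinstalling the Catalyst driver with its OpenCL runtime would be the first thing to try.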
|
|
|
|
legolouman
|
|
October 21, 2012, 09:25:50 PM |
|
Are we still debating?
The A4-3400 gets about 100 MHash/s. I have firsthand experience using them.
|
If you love me, you'd give me a Satoshi! BTC - 1MSzGKh5znbrcEF2qTrtrWBm4ydH5eT49f LTC - LYeJrmYQQvt6gRQxrDz66XTwtkdodx9udz
|
|
|
superfastkyle
|
|
October 22, 2012, 03:54:41 AM |
|
Has anyone got an A10 yet to test? I believe on the SHA-256 benchmark tests it was about 3 times as fast as the A8, so I was hoping it may be worth trying if I build new GPU-based systems.
|
|
|
|
|