Bitcoin Forum
Author Topic: Using the GPU in an A-Series "Trinity" APU  (Read 647 times)
dialingwand (OP)
Newbie
Activity: 8
Merit: 0
April 21, 2013, 01:40:00 AM
 #1

Hi there, long time lurker, first post.

I have an A-series APU in my first little rig and I was wondering if anyone had managed to get the GPU therein, limited as it may be, to work alongside bigger discrete GPUs.

I had to disable it to get the system to POST with two discrete GPUs installed, and I have no idea if such a thing could work, but it would be nice to put the little guy to work, even if we're only talking a few extra Mh/s.
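
For what it's worth, one way to check whether the APU's GPU is even visible to the mining software is to list the OpenCL GPU devices the system exposes; if the on-board GPU is disabled in the BIOS it simply won't be listed. A rough sketch, assuming pyopencl and AMD's OpenCL drivers are installed:

Code:
import pyopencl as cl

# List every OpenCL GPU device the system exposes. Mining software can only
# use devices that show up here; an iGPU disabled in the BIOS won't appear.
for platform in cl.get_platforms():
    for dev in platform.get_devices(device_type=cl.device_type.GPU):
        print("%s: %s (%d compute units)" % (platform.name, dev.name, dev.max_compute_units))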
thesnoo23
Newbie
Activity: 56
Merit: 0
April 21, 2013, 01:42:04 AM
 #2

I've poked around with that as well, but haven't been able to get anything worthwhile out of it. I would be very interested to hear from anyone who HAS, though. I have an ungodly number of computers sitting in this room with me, and would absolutely plug them in and put them to work if it were economically viable.
dialingwand (OP)
Newbie
Activity: 8
Merit: 0
April 21, 2013, 02:02:21 AM
 #3

I can't seem to find even a BIOS/UEFI option to allow both, even though there seems to be a lot of dual-GPU marketing rhetoric.

I do imagine if you had an array of machines and could tap into the unused on-board GPU, a lot of the "little" workers could add up with no noticeable effect on the end user!

Either way, I'll keep digging!
thesnoo23
Newbie
Activity: 56
Merit: 0
April 21, 2013, 02:05:39 AM
 #4

Actually, my main concern isn't how much is done per processor, but rather the power efficiency. Right now, for instance, if I run a good processor I might get as high as ~20 kH/s with Litecoin, at a cost of ~125 watts, which is not viable at all. That and initial price are my main concerns when looking at mining equipment.
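
The back-of-the-envelope math, with the electricity price as an assumed example:

Code:
# Rough running-cost check for CPU scrypt mining. The ~20 kH/s and ~125 W
# figures are from the post above; the electricity price is just an assumption.
hash_rate_khs = 20.0
power_watts = 125.0
usd_per_kwh = 0.12           # assumed electricity price, plug in your own

kwh_per_day = power_watts / 1000.0 * 24
cost_per_day = kwh_per_day * usd_per_kwh
print("%.1f kWh/day -> $%.2f/day for %.0f kH/s" % (kwh_per_day, cost_per_day, hash_rate_khs))
print("%.2f kH/s per watt" % (hash_rate_khs / power_watts))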
dialingwand (OP)
Newbie
Activity: 8
Merit: 0
April 21, 2013, 02:16:17 AM
 #5

Good point. I tried hashing for litecoins on a quad-core Xeon and the power requirements were through the roof for what was a very poor rate.
thesnoo23
Newbie
Activity: 56
Merit: 0
April 21, 2013, 02:23:36 AM
 #6

Quote from: dialingwand on April 21, 2013, 02:16:17 AM
Good point. I tried hashing for litecoins on a quad-core Xeon and the power requirements were through the roof for what was a very poor rate.

AMD has some multi-core processors that are more power-efficient than previous versions, and certainly better than a Xeon, but they still aren't viable for making a profit.
dialingwand (OP)
Newbie
Activity: 8
Merit: 0
April 21, 2013, 02:30:58 AM
 #7

Regarding the issue at hand, was I mistaken to believe that the GPU in the A-series was somehow separate from the CPU?

I sort of imagined that I could run the GPU at full tilt without using the CPU cycles and therefore reduce heat/power consumption. I might as well tackle my assumption before tackling the physical problem itself. ;)
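
For what it's worth, with AMD's OpenCL runtime the x86 cores and the APU's on-die GPU show up as separate devices, so in principle the GPU can be loaded while the CPU mostly just queues work and collects results. A rough check, assuming pyopencl is installed:

Code:
import pyopencl as cl

# On an APU the CPU cores and the on-die GPU appear as distinct OpenCL
# devices; a miner targets the GPU device and leaves the CPU largely idle.
for platform in cl.get_platforms():
    for dev in platform.get_devices():
        kind = "GPU" if dev.type & cl.device_type.GPU else "CPU/other"
        print("%s | %s | %s | %d compute units" % (platform.name, kind, dev.name, dev.max_compute_units))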
shakezula
Sr. Member
Activity: 308
Merit: 250
April 21, 2013, 02:44:49 AM
 #8

I tried this with the 6410 in my new F55 board, no such luck. When the 7950's not on the board it can be used in Bitminter, but that's about it... 11 Mh/s.
dialingwand (OP)
Newbie
Activity: 8
Merit: 0
April 21, 2013, 04:27:49 PM
 #9

Thanks for sharing. I thought it would be low, but not that low. It isn't really surprising, I guess.
saneczki
Newbie
Activity: 57
Merit: 0
April 21, 2013, 04:41:50 PM
 #10

I've got an A10-5800K and with Bitminter I can't mine on it (as an extra) alongside my 7970s :(
Miner_LTC_BTC
Newbie
Activity: 9
Merit: 0
June 18, 2013, 05:55:46 AM
 #11

Quote from: saneczki on April 21, 2013, 04:41:50 PM
I've got an A10-5800K and with Bitminter I can't mine on it (as an extra) alongside my 7970s :(

Why? :) What's wrong? Theoretically it's only an extra GPU. What system do you use?