Bitcoin Forum
December 08, 2016, 04:04:12 AM
News: Latest stable version of Bitcoin Core: 0.13.1  [Torrent].
 
Author Topic: Cutdown Graphics Card  (Read 1425 times)
Wildy (Jr. Member - Activity: 31)
September 02, 2011, 11:09:23 AM  #1

So there's a lot of talk going around about various projects to fund custom FPGA and ASIC miners, the latter of which has development costs in the millions (i.e. far out of reach for the average user here). A while back someone mentioned they couldn't understand why ATI doesn't produce GPGPU cards like Nvidia's Tesla range, and I think they have a fair point: for mining, the two essential factors are stream processor count and core frequency. Standard consumer GPUs go halfway on this by meeting the stream processor and frequency requirements, but they carry enormous VRAM chips, which are pointless for mining.

As people are prepared to invest a total of millions in an ASIC, would it not be cheaper and more worthwhile (for your average user here) to start taking reference ATI designs and cutting all the unnecessary extras which drive the price up but don't improve mining rates? I don't pretend to really know anything about this field, but from what I understand, graphics card manufacturers take a reference design, tweak it by changing routing and components to meet a specification, buy in the components and then assemble them onto a PCB - all of which is within reach for a well-organized community project.

So by taking a reference design, cutting the bus width (would this decrease costs?), reducing the memory capacity and effective speed (DDR2/DDR3 are still readily available and cheap) and using a fan chosen purely for airflow, I reckon you could shave off a fair amount of the cost.

Now obviously economies of scale apply here, but even still - is this realistic?
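The cost argument above can be put into rough numbers. A minimal sketch, where every price, hashrate and saving is an assumption chosen for illustration - none of these are real bill-of-materials figures:

```python
# Back-of-envelope MH/s-per-dollar comparison of a stock consumer card vs. a
# hypothetical cut-down mining variant. Every figure here is an assumption
# chosen for illustration, not a real price or measured hashrate.

def mh_per_dollar(hashrate_mhs: float, price_usd: float) -> float:
    """The metric the thread cares about: hashrate per dollar of hardware."""
    return hashrate_mhs / price_usd

stock_price = 150.0      # USD - assumed retail price of a consumer card
stock_hashrate = 300.0   # MH/s - assumed, roughly HD 5850-class

# Hypothetical savings from slower/smaller RAM, narrower bus, basic cooler
assumed_savings = 40.0   # USD - pure guess
cutdown_price = stock_price - assumed_savings
cutdown_hashrate = stock_hashrate  # SHA-256 mining barely touches VRAM

print(f"stock:   {mh_per_dollar(stock_hashrate, stock_price):.2f} MH/s per $")
print(f"cutdown: {mh_per_dollar(cutdown_hashrate, cutdown_price):.2f} MH/s per $")
```

Under these made-up numbers the cut-down card improves MH/$ by the same fraction as the saving, since the hashrate stays put.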


This is really just a thought I had the other day, so I'd be interested to hear what you guys think.

Cheers,
Mike

Dinoking (Newbie - Activity: 17)
September 02, 2011, 03:34:29 PM  #2

I agree that this sounds like a better option than the FPGAs, and in theory it would be much cheaper. But ATI would rather have us buy their 7000 series cards at a premium, lol ;)

Bitcoin address :) 12C6RczmS4DVDh5uJAWXkU6aHxvfFMdNHQ
Sabi (Jr. Member - Activity: 56)
September 02, 2011, 04:43:30 PM  #3

Simple answer - you're assuming that AMD wants to produce cards for Bitcoin mining. They do not, and I agree with their decision.

Please contribute to help get us out of the system: 14ab6EktVtyA8dABhXQjYkH1f8xxJRLmyB
Wildy (Jr. Member - Activity: 31)
September 02, 2011, 04:57:59 PM  #4

I don't think you understood me correctly. Obviously no company wants to produce cards specifically for mining (why would they?). AMD/ATI don't produce the cards anyway; they design the chips and a reference card, and that's it - they couldn't give a monkey's what manufacturers choose to do with their design.

So: take the reference design => cut out a load of stuff that isn't needed => the final result is a "mining" card with a much better MH/$.

AngelusWebDesign (Sr. Member - Activity: 392)
September 02, 2011, 05:06:27 PM  #5

I understand what you're suggesting, and it isn't totally crazy (unless you need "thousands" of customers for a graphics card manufacturing run instead of "hundreds").

All it would do is raise the difficulty :)

Seriously, though, this would have been more feasible several months ago. Now, with BTC at $8 each and most cards making 1/8th of a BTC per day (after electricity costs), most people - even miners - aren't exactly expanding their mining farms with both hands.
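The figures in this post pin down a rough payback calculation. A sketch using the numbers as stated above, plus an assumed card price that is not from the post:

```python
# Payback period implied by the figures above: BTC at $8 and a card earning
# 1/8 BTC per day after electricity. The card price is an assumption.
btc_price = 8.0          # USD per BTC, as stated in the post
btc_per_day = 1.0 / 8.0  # BTC per day per card after electricity, as stated
card_cost = 150.0        # USD - assumed purchase price, not from the post

usd_per_day = btc_price * btc_per_day       # works out to $1/day per card
payback_days = card_cost / usd_per_day

print(f"${usd_per_day:.2f}/day per card -> hardware pays for itself in "
      f"{payback_days:.0f} days")
```

At a dollar a day, even a modestly priced card takes around five months to pay for itself, which explains the reluctance to expand.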
geek-trader (Sr. Member - Activity: 294)
September 02, 2011, 05:21:57 PM  #6

To me, the huge advantage of FPGAs is their very low power usage, which lowers electricity costs, heat and noise.

What you are proposing could lower the cost of the card, but the power usage would stay pretty much the same.

If FPGAs get just a little cheaper, I'm getting a few.
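The MH/W argument here can be made concrete. A sketch where both the GPU and FPGA figures are assumptions for illustration (the 15 W dual-FPGA figure echoes numbers quoted later in the thread):

```python
# Electricity cost per day and efficiency (MH/W) for an assumed GPU vs. an
# assumed FPGA board. All hardware figures are illustrative, not measured.

def daily_power_cost(watts: float, usd_per_kwh: float = 0.10) -> float:
    """Cost of running a device 24 hours at a flat electricity rate."""
    return watts / 1000.0 * 24.0 * usd_per_kwh

gpu_watts, gpu_mhs = 180.0, 300.0    # assumed consumer GPU under mining load
fpga_watts, fpga_mhs = 15.0, 200.0   # assumed dual-FPGA board

print(f"GPU:  {gpu_mhs / gpu_watts:5.2f} MH/W, "
      f"${daily_power_cost(gpu_watts):.3f}/day in power")
print(f"FPGA: {fpga_mhs / fpga_watts:5.2f} MH/W, "
      f"${daily_power_cost(fpga_watts):.3f}/day in power")
```

Even with these rough guesses, the FPGA comes out several times more efficient per watt - cutting the card's purchase price does nothing about that gap.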

Make 1 deposit and earn BTC for life! http://bitcoinpyramid.com/r/345
Play my FREE HTML5 games at: http://magigames.org  BTC donations accepted.
Wildy (Jr. Member - Activity: 31)
September 02, 2011, 05:27:31 PM  #7

Hmm, you both have a fair point. I guess over the next year or so it's really MH/W that takes precedence over MH/$. The recently released FPGAs are untouchable for power consumption, but they don't half cost a lot!
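The MH/$ vs. MH/W tension reduces to a crossover question: how long before the efficient-but-expensive option overtakes the cheap-but-hungry one on total cost? A sketch assuming equal hashrates, with made-up prices and wattages:

```python
# Total cost of ownership crossover between a cheap power-hungry GPU and an
# expensive efficient FPGA, assuming both produce the same hashrate.
# Every number below is an assumption for illustration.

def total_cost(price: float, watts: float, days: int,
               usd_per_kwh: float = 0.10) -> float:
    """Purchase price plus cumulative electricity for running 24h/day."""
    return price + watts / 1000.0 * 24.0 * days * usd_per_kwh

gpu_price, gpu_watts = 150.0, 180.0    # assumed
fpga_price, fpga_watts = 600.0, 15.0   # assumed - FPGAs carried a big premium

days = 0
while total_cost(fpga_price, fpga_watts, days) > total_cost(gpu_price, gpu_watts, days):
    days += 1

print(f"FPGA total cost falls below the GPU's after ~{days} days "
      f"(~{days / 365:.1f} years)")
```

Under these made-up figures the crossover takes on the order of three years, which is exactly the "they don't half cost a lot" complaint.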

Knighty (Newbie - Activity: 15)
September 02, 2011, 05:35:37 PM  #8

If all the claims of the 7000 series having "conservatively" double the stream processors are true, then it's entirely possible we'll end up with cards in the $100 range capable of hitting quite a high MH/s, which turns this into a different ballgame altogether.

That being said, if it ever got off the ground, I would definitely be interested in buying a card that was designed for mining.
bcpokey (Sr. Member - Activity: 462)
September 02, 2011, 08:45:23 PM  #9



Quote from: Wildy
"Now obviously economies of scale apply here, but even still - is this realistic?"

No, it isn't. Guess how much an Nvidia Tesla costs?

Spoiler: typically over $2,000, for not that much more card than a typical gaming card at a quarter of the price. An ATI-style mining card would end up the same way.
Wildy (Jr. Member - Activity: 31)
September 02, 2011, 09:21:59 PM  #10

I am talking about taking a standard gaming card (not a workstation card) and cutting the features down. I didn't suggest an ATI-esque Tesla for mining; I was referring to a comment someone else made about the lack of ATI GPGPU cards.

bcpokey (Sr. Member - Activity: 462)
September 02, 2011, 10:30:19 PM  #11

Quote from: Wildy
"I am talking about taking a standard gaming card (not a workstation card) and cutting the features down. [...]"

Do you know what a Tesla is? It's based on the Fermi architecture, just like any other GTX 5xx gaming card. A GPGPU from ATI would be more or less the same idea: you are revamping an existing architecture for a much smaller audience, so you are going to pay out the butt for it without necessarily getting much improvement.
Wildy (Jr. Member - Activity: 31)
September 02, 2011, 10:41:05 PM  #12

Yes, I do know that the Tesla is based on the Fermi architecture - but are you aware they carry 6 GB of memory, which would be totally useless for mining?

You're talking about improving performance; I'm talking about cutting costs. It would be pointless making an ATI "super GPU" equivalent to the Tesla. I'm saying it would be better to take a cheap consumer card (an HD 5850, for example) and cut the parts which aren't used. I'm pretty sure performance wouldn't budge, but it's got to have a fairly serious impact on the price.

Please don't take such an offensive stance; I am merely bouncing an idea off people.

deslok (Sr. Member - Activity: 448)
"It's all about the game, and how you play it"
September 03, 2011, 12:24:26 AM  #13

Has anyone tried to contact AMD to simply try and purchase a GPU BGA package? I have to assume that connecting it to known factors is simpler than designing a brand-new ASIC.

"If we don't hang together, by Heavens we shall hang separately." - Benjamin Franklin

If you found that funny or something i said useful i always appreciate spare change
1PczDQHfEj3dJgp6wN3CXPft1bGB23TzTM
fcmatt (Legendary - Activity: 1106)
September 03, 2011, 03:38:19 AM  #14

When people discuss FPGA boards, like the ones recently released, are they adding in the power usage of the host computer that connects to them over USB? Buying just one means having a PC on all the time too. And buying a low-power PC, while not that expensive, has to be factored in - let's say $200 for something small and just as efficient as the FPGA board. And shall we triple the power usage?

Thoughts?
geek-trader (Sr. Member - Activity: 294)
September 03, 2011, 04:01:59 AM  #15

Quote from: fcmatt
"When people discuss fpga boards, like the ones recently released, are they adding in the power usage of the host computer which has a usb port to connect it to? [...]"

I guess I'm a geek, but I assumed everyone had a computer on all the time anyway?  I mean, where are all your files stored?  What do you stream to your TV from?  What does all your music play off of if you don't have a computer on 24/7?

Make 1 deposit and earn BTC for life! http://bitcoinpyramid.com/r/345
Play my FREE HTML5 games at: http://magigames.org  BTC donations accepted.
fcmatt (Legendary - Activity: 1106)
September 03, 2011, 04:29:10 AM  #16

Quote from: fcmatt
"When people discuss fpga boards, like the ones recently released, are they adding in the power usage of the host computer which has a usb port to connect it to? [...]"

Quote from: geek-trader
"I guess I'm a geek, but I assumed everyone had a computer on all the time anyway? I mean, where are all your files stored? [...]"

I really cannot imagine having 10 of these FPGA boards laying out in the living room or in some type of tray... They use a Molex connector, right? They all need a USB port, so some type of hub comes into play. What good is a laptop in that case when you need ten Molex connectors? Some type of standalone PSU with an old laptop?

In my case, my main PC is in the living room, connected to an A/V receiver which then goes to the TV. I really do not want a bunch of these cards sitting around for the cat to sleep on to stay warm. So down in the basement they would go, sitting on top of an old 1U half-size rackmount server... which isn't exactly power-efficient if it contains a 350-watt power supply.

I am just trying to envision the most efficient way to hook up 10 of these boards without doubling the power usage compared to the boards themselves. Let's say you get the dual-FPGA boards that draw about 15 watts each - that's 150 watts for about 2 GH/s. Does one basically have to double or triple that power usage to make them actually useful?
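The worry in this post can be quantified. A sketch using the per-board wattage and total hashrate figures stated above; the range of host-PC power draws tried is an assumption:

```python
# Overall efficiency of a 10-board FPGA setup as host-PC overhead grows,
# using the 15 W/board and 2 GH/s figures from the post above. The host
# wattages tried below are assumptions for illustration.
boards = 10
watts_per_board = 15.0   # as stated for a dual-FPGA board
total_ghs = 2.0          # as stated for the whole setup

for host_watts in (0, 50, 150, 300):
    total_watts = boards * watts_per_board + host_watts
    efficiency = total_ghs * 1000.0 / total_watts   # MH/W overall
    print(f"host draw {host_watts:3d} W -> {efficiency:5.2f} MH/W overall")
```

A 150 W host exactly doubles the total draw and halves the overall MH/W, which is precisely the doubling the post is worried about; a small sub-50 W host keeps most of the boards' efficiency advantage.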
bcpokey (Sr. Member - Activity: 462)
September 03, 2011, 04:55:18 AM  #17

Quote from: Wildy
"Yes I do know that Tesla is based on the Fermi architecture, are you aware that they have 6GB of memory on them which would be totally useless? [...]"

An offensive stance is replying in a way that you don't like? That's sort of life.

As for the RAM: it's a slight cost, certainly, but it's not remotely the main thing that makes a Tesla a >$2k card. Creating a custom layout to be printed for a tiny market is the main cost; tens of thousands of units compared to tens of millions makes a huge difference.
Wildy (Jr. Member - Activity: 31)
September 03, 2011, 07:32:16 AM  #18

My apologies - I was quite tired, and it just seemed like a bit of a snide comment, that's all. I think the amount I can contribute here is now limited, as this is going beyond the extent of my knowledge. But if you did the things I mentioned (cut memory speed/size, cut bus width, use cheaper non-solid-state caps, use a cheaper high-airflow fan), how much would the reference PCB design need to change? The latter two, I don't think, would require any change at all (maybe a few slight changes for the caps). The bus width I'm not sure about, but I think the only major change is the memory. Even so, it's not as if you're drastically redesigning the whole thing - just a few changes here and there.

CrazyGuy (Legendary - Activity: 1806)
September 03, 2011, 08:08:34 AM  #19

Damn, I was really hoping to see some sawed-off 6990 pics when I entered this thread :)

Now, if AMD made a cheap and energy-efficient mining card, every serious Bitcoin miner would snatch them up, in turn driving up the difficulty and defeating the purpose.

Also, I don't believe the rumors about the 7000 series having twice the stream processor count. If AMD's intended audience of gamers has to fight with large-scale Bitcoin mining operations just to get their hands on the latest cards, they may move to Nvidia...

ASICPuppy.net ASIC Mining Hardware and Accessories - GekkoScience DL580 Breakout Boards are In Stock!