Pages: « 1 2 3 [4] 5 6 7 »  All
Author Topic: Minimalist Spartan6-LX150 board  (Read 48431 times)
ultraimports (Full Member, Activity: 187)
September 27, 2011, 01:45:56 PM  #61

Once there are enough FPGAs on the network, difficulty will increase and GPUs will become unprofitable or barely
profitable for anyone paying for cooling + electricity [probably most people with more than 4-5 GPUs]. It's a self-fulfilling prophecy.

I often see this quoted, but it is nonsense.  Higher difficulty makes FPGAs, even @ $2 per MH, even MORE prohibitively expensive.  Higher difficulty benefits those w/ efficient GPUs (like the 5970 & 7xxx series) and moderate to low cost electricity the most.

I think you will see that a difficulty spike kills demand for new FPGAs rather than driving it.

Take a hypothetical FPGA miner @ $2 per MH.  150 MH/s = $300 in cost.  Running 24/7/365 @ 15W.
Break even @ current difficulty is 25 months.
Break even @ 30% difficulty increase is 33 months.
Break even @ 50% difficulty increase is 40 months.

Today one could buy a 5970 for <$500.  Say 3x5970 + power supply + other components = 2.8 GH/s for $2800.  Running 24/7/365 @ 1000W.

Break even @ current difficulty is 17 months.
Break even @ 30% difficulty increase is 25 months.
Break even @ 50% difficulty increase is 32 months.
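A rough sketch of the break-even arithmetic above (the difficulty, BTC price and electricity rate are assumed placeholder inputs, not necessarily the exact figures behind these numbers):

Code:
def breakeven_months(capital_usd, mhash_per_s, watts,
                     difficulty, btc_price_usd,
                     usd_per_kwh=0.10, block_reward=50.0):
    # Expected BTC per day at the given difficulty.
    btc_per_day = mhash_per_s * 1e6 * 86400 / (difficulty * 2**32) * block_reward
    revenue_per_day = btc_per_day * btc_price_usd
    power_cost_per_day = watts / 1000 * 24 * usd_per_kwh
    net_per_day = revenue_per_day - power_cost_per_day
    return float('inf') if net_per_day <= 0 else capital_usd / net_per_day / 30

DIFFICULTY, PRICE = 1_800_000, 5.0    # assumed values, roughly late-2011
print(breakeven_months(300, 150, 15, DIFFICULTY, PRICE))        # FPGA example
print(breakeven_months(2800, 2800, 1000, DIFFICULTY, PRICE))    # 3x5970 rig
print(breakeven_months(300, 150, 15, DIFFICULTY * 1.3, PRICE))  # +30% difficulty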

Difficulty increases close the gap, but $2 per MH is still beaten by anyone w/ $0.10/kWh electricity costs (or less).  I am interested in FPGAs but these dire predictions of them killing GPUs are simply unwarranted unless cost is closer to $1 per MH installed.

Remember GPU performance per watt won't be static.  The 7xxx series looks to almost double performance per watt (cutting electrical cost in half for GPU miners).  A break-even of 40+ months is highly dangerous: one risks being undercut by the next-next-gen video cards.  4 years is long enough for 2 product cycles and we will be looking @ 20nm chips (and another doubling of performance per watt).

Very good info. I guess I would ask the FPGA experts: how often do the FPGA chips increase in performance as well? Do they move as fast as GPUs? Do they follow Moore's law, essentially?
P4man (Hero Member, Activity: 504)
September 27, 2011, 02:38:46 PM  #62

I'm not an FPGA expert by any stretch, but they definitely follow Moore's law, in fact arguably more easily than CPUs, as they are much simpler and you can just up the number of units as your process gets smaller.  Similar to how, because CPU designs hit an IPC and clock-scaling brick wall, most of the extra transistor budget is simply spent on going from single to dual, quad and octal cores.

That said, I'm not sure I agree with the above math; the market dynamic of mining leads to difficulty gravitating towards the break-even point for the average miner. I suspect most miners see their hardware investment as sunk cost, leaving the electricity bill. FPGAs already have better MH/W and I suspect that gap will grow as the software matures. It's true it's a risky investment, certainly at this point, but once this starts generating sufficient volume I can see prices tumbling. After all, an FPGA is likely cheaper to produce than our high-end gaming GPUs.

DeathAndTaxes (Gerald Davis; Donator, Legendary, Activity: 1218)
September 27, 2011, 03:17:44 PM  #63

It's true it's a risky investment, certainly at this point, but once this starts generating sufficient volume I can see prices tumbling. After all, an FPGA is likely cheaper to produce than our high-end gaming GPUs.

Most of the cost (60%+) comes from the actual FPGA.  It is unlikely prices will tumble.  FPGAs already have economies of scale: 20 million are sold each year.  Another 1K (or even 10K) miners using FPGAs isn't going to cause a massive drop in price.  Maybe if one of the FPGA developers gets a massive buy order they could cut FPGA & assembly costs by 30%.  Software improvements might squeeze another 10%-20% out of current-gen FPGAs, but that still only gets us to ~$2/MH installed.

Yes, FPGAs benefit from Moore's law, but so do GPUs.  GPUs are almost perfectly scalable.  When the process size gets cut in half you simply double the number of shaders, get roughly 2x the performance, and the die size (and thus cost & power) remains the same.

I have derailed this thread enough as it is.  To the OP: very good work, it looks promising.  I just find the false hope of people (not you) pretending away the economic issues of FPGAs (and claims of the death of GPU mining) to be naive & frustrating.
vx609e (Newbie, Activity: 29)
September 27, 2011, 05:44:53 PM  #64

Following
rph (Full Member, Activity: 176)
September 28, 2011, 03:08:36 AM  #65

I just find .. claims of the death of GPU mining to be naive & frustrating.

I've invested a lot of time into FPGA mining. Here is my thinking:

If GPUs remain the dominant technology, difficulty will adjust to make them barely profitable in average-electricity-cost areas.
I don't think anyone really disagrees with that, it's an intentional design decision in Bitcoin.
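For reference, a minimal sketch of the mechanism behind that design decision (illustrative Python, not Bitcoin Core's actual code, with a placeholder difficulty value): the protocol retargets every 2016 blocks to keep blocks roughly 10 minutes apart, so difficulty tracks total hashpower and the "barely profitable" state is the economic equilibrium that falls out of it.

Code:
TARGET_TIMESPAN = 2016 * 10 * 60  # two weeks, in seconds

def retarget(old_difficulty, actual_timespan_s):
    # Clamp to a 4x swing per period, as the protocol does.
    actual_timespan_s = min(max(actual_timespan_s, TARGET_TIMESPAN / 4),
                            TARGET_TIMESPAN * 4)
    return old_difficulty * TARGET_TIMESPAN / actual_timespan_s

# Example: if the last 2016 blocks arrived in one week, difficulty doubles.
print(retarget(1_500_000, TARGET_TIMESPAN / 2))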

Once that happens: GPUs in high-elec-cost areas (like me) will be unprofitable. FPGAs will be profitable everywhere operationally
[in terms of BTC produced minus electricity/cooling/maintenance costs]. So they will eventually pay for themselves,
unless Bitcoin collapses entirely first, screwing over all miners. The payoff might take 2 years, but that is still a pretty decent ROI compared
to FDIC savings accounts, the stock market, treasuries, etc. My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit.

If/when 28nm or 20nm or 16nm GPUs are driving the difficulty, my 45nm FPGAs will still have better MH per watt, so they will
still be profitable operationally. And anyway I will then be adding 28nm or 20nm or 16nm FPGAs.

If FPGAs become the dominant technology, difficulty will adjust to make them barely profitable in average-power-cost areas of the world.
GPUs will then be wildly unprofitable everywhere, except for people that somehow have free electricity
[which I think is a tiny fraction of the network]. Then we'll see $50 5830s on eBay as lots of people rush to the exits.

I actually hope that GPUs remain the dominant technology, while I mine on FPGAs, with a nice, high profit margin.

If a very high-end ASIC becomes the dominant technology then both GPUs + FPGAs will be unprofitable operationally.
I seriously doubt this will happen. The people with the skills and capital to make it happen could make a lot more money
with less risk building something else. [I'm not talking about a Mosis 250nm ASIC; I'm talking 90nm or better]

that still only gets FPGAs to ~$2/MH installed.

$1/MH is possible today if you build the boards yourself, or professionally in qty 100+

I suspect most miners see their hardware investment as sunk cost, leaving the electricity bill. FPGAs already have better MH/W and I suspect that gap will grow as the software matures.

Exactly. The decision to keep a GPU running, or shut it off, is not based on some breakeven calculation you did when you bought it.
It's based on whether it's making or losing money, today, based on current difficulty + elec/cooling costs.
I stand by my statement that if FPGAs take off, they will certainly put most GPU miners out of business,
and capture a large percentage of the coins left to be mined.

-rph

Ultra-Low-Cost DIY FPGA Miner: https://bitcointalk.org/index.php?topic=44891
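A small sketch of that operate-or-shut-off test (all figures are placeholders, not anyone's actual rig): sunk hardware cost is ignored, and only today's expected revenue versus today's running cost matters.

Code:
def keep_mining(mhash_per_s, watts, difficulty, btc_price_usd,
                usd_per_kwh, block_reward=50.0):
    btc_per_day = mhash_per_s * 1e6 * 86400 / (difficulty * 2**32) * block_reward
    revenue_per_day = btc_per_day * btc_price_usd
    running_cost_per_day = watts / 1000 * 24 * usd_per_kwh
    return revenue_per_day > running_cost_per_day

# e.g. a 300 MH/s GPU at 250 W vs a 150 MH/s FPGA at 15 W, in a $0.25/kWh area
print(keep_mining(300, 250, 1_800_000, 5.0, 0.25))  # False at these numbers
print(keep_mining(150, 15, 1_800_000, 5.0, 0.25))   # True at these numbers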
ngzhang (Hero Member, Activity: 592, "We will stand and fight.")
September 28, 2011, 03:46:53 AM  #66

Quote from: rph on September 28, 2011, 03:08:36 AM
[rph's post #65, quoted in full above]


In a short time, $1.50/MH will come true from some "low-manufacturing-cost area" of the world, by my troth.
Because FPGA mining has not yet become a big business, I think professional groups have still not joined this game. In fact, miners are still a very small group. At this time, most of the people working hard on FPGA mining systems are probably doing it for "love". Their skills and effort could surely make more money in another field.

CEO of Canaan-creative, Founder of Avalon project.
https://canaan.io/
Business contact: love@canaan.io
All PMs will be unread.
fivebells (Sr. Member, Activity: 462)
September 28, 2011, 01:19:39 PM  #67

ngzhang, I am pretty much clueless about hardware, so I am interested in your views.  Don't you think that if a large-scale enterprise were to get into this, they would be more interested in making a custom ASIC than an FPGA?  How substantial do you imagine the power/speed gains could be for an ASIC over a GPU?
ngzhang (Hero Member, Activity: 592, "We will stand and fight.")
September 28, 2011, 01:43:47 PM  #68

ngzhang, I am pretty much clueless about hardware, so I am interested in your views.  Don't you think that if a large-scale enterprise were to get into this, they would be more interested in making a custom ASIC than an FPGA?  How substantial do you imagine the power/speed gains could be for an ASIC over a GPU?

In my opinion, no company will take part in any mining ASICs. If they have enough resources to tape out an ASIC, I'm sure they will design another product, not a miner. As individuals we can act on our interests, but a real company can't.

And to answer your question: 1/10 the cost, 10X the performance, 1/10 the energy consumption, on a single ASIC. AT LEAST.

CEO of Canaan-creative, Founder of Avalon project.
https://canaan.io/
Business contact: love@canaan.io
All PMs will be unread.
P4man (Hero Member, Activity: 504)
September 28, 2011, 02:33:46 PM  #69


In my opinion, no company will take part in any mining ASICs.

Not specific to bitcoin, but SHA256 has other uses. VIA has CPUs with hardware-accelerated encryption functions, and I thought recent (or upcoming?) Intel chips do too. They are no match for GPUs, but it shows it's already been done. Also, when I google "SHA256 chip" you find, among others, this:
http://www.s2cinc.com/product/pd.asp?id=278

I have no clue how that performs compared to our GPUs, or even if it's usable for bitcoin mining, but I would be surprised if there weren't chips out there or coming that could be used for bitcoin, even if they are not designed for bitcoin.

ngzhang (Hero Member, Activity: 592, "We will stand and fight.")
September 28, 2011, 02:57:43 PM  #70


Quote from: P4man on September 28, 2011, 02:33:46 PM
[P4man's post #69, quoted in full above]

There are some tiny differences between standard SHA256 hashing and bitcoin hashing. So ...
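For readers wondering what those differences are, a minimal illustrative sketch (not mining code): bitcoin hashes an 80-byte block header with double SHA256 and compares the digest, read as a little-endian integer, against a target, so a generic single-pass SHA256 core is not directly usable as a miner.

Code:
import hashlib

def bitcoin_block_hash(header80: bytes) -> bytes:
    # Standard usage would be one SHA256 pass; bitcoin applies it twice.
    return hashlib.sha256(hashlib.sha256(header80).digest()).digest()

def meets_target(header80: bytes, target: int) -> bool:
    # The 32-byte digest is interpreted as a little-endian integer.
    return int.from_bytes(bitcoin_block_hash(header80), "little") <= target

# Dummy 80-byte header; a real one packs version, prev-hash, merkle root,
# time, bits and nonce.  Dedicated miners also reuse the fixed midstate of
# the first 64 bytes, another departure from a generic SHA256 pipeline.
print(meets_target(bytes(80), 2**224))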

CEO of Canaan-creative, Founder of Avalon project.
https://canaan.io/
Business contact: love@canaan.io
All PMs will be unread.
eldentyrell (Donator, Legendary, Activity: 966, "felonious vagrancy, personified")
September 30, 2011, 11:47:08 PM  #71

"My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit."

I am tempted to make this my new .signature

The printing press heralded the end of the Dark Ages and made the Enlightenment possible, but it took another three centuries before any country managed to put freedom of the press beyond the reach of legislators.  So it may take a while before cryptocurrencies are free of the AML-NSA-KYC surveillance plague.
gopher (Full Member, Activity: 135)
October 11, 2011, 02:40:25 PM  #72

I need a kit i can plug in and mine. The total package. When you can offer that (with enough hash rate), i'm sure that people will buy.
Indeed.
I am with these guys..  I am no electrical engineer..  but I can plug in a psu Smiley

You have a good point.  But to flip that around, on the GPU side it's taken as a given that you're going to buy your hardware from a company (ATI) that does not provide the mining software (or even admit it knows what bitcoin is).  But I understand that while gamers have seen GPUs before, most bitcoiners are encountering FPGAs for the first time.  They aren't scary; they're just obscenely flexible... "enough rope to hang yourself with."

I could, perhaps, put together a turn-key solution, although it would involve a lot of effort.  My two major concerns are:

1. HDL developer guilt.  It makes me slightly ill to see posts like ngzhang's "hey you lazy-ass HDL developers make your code faster so I can make MOAR PROFITZ!!!".  I'd feel queasy about selling a "solution" that bundled in somebody else's hard work.  I don't know the exact details of the fpgaminer/ztex dispute, but I can certainly empathize with the initial reaction from fpgaminer.  It would make me really happy to be providing low-cost boards to people who are interested in tweaking/tuning/improving the HDL code, but I think I've figured out now that there aren't as many of those people as I'd thought.

2. Support.  I'm happy to help out here in a casual message-board-member way.  But I'm kinda worried about lazy users buying a "turn-key" solution from me and then demanding that I hand-hold them through the whole process of configuring Xilinx's crapware drivers on their Windows host box (I haven't used Windows in almost a decade) under threat of posting negative reviews of my product ("did not work for me").  I definitely can't sell the boards for $250 if I have to budget in my own time spent on extensive tech support work.

Anyways.  Looks like the first run will be small personal-use-only, but there may be another batch of boards in November after I've figured out if it's worth taking this to the next level.

Hi big-chip-small-board,

I have been a lurker here for some time, more importantly a huge fan of your work.

May I offer you my opinion on this tricky matter.

I think you should turn your HDL developer's quitting to your advantage and streamline/optimise your work; in other words, re-position it elegantly so it is better accepted among the broader bitcoin community.

The problems I see arise from the fact that FPGA developers try to tackle both parts of this project - design the hardware as well as develop the core functionality.

You know that this particular project requires two fundamentally different resources; someone who is only averagely good in both areas will not do.

Hence, what about revisiting your marvellous hardware concept, perhaps giving it a final touch and ensuring that it is 100% compatible with the open-source core functionality or any other firmware out there, so the people who buy your hardware can decide for themselves what to run on it.

Instead of trying to excel in two areas, you then need to excel in one - design and produce great hardware - and let others develop the software and add value as they can.

This approach is very similar to when GPU mining started - everyone would agree that the choices of hardware and software are interlinked, but people got them from different sources.

A lot of people will say NVIDIA is capable of GPU mining, but there is not much CUDA code written - because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

But the result is clear - at the end of the day, ATI does not write hashing code and the bitcoin script writers do not develop highly-integrated hardware - and irrespective of this division of functionalities, the bitcoin community has no problem locating the required bits and pieces and building their rigs.

My 2 cents.
DeathAndTaxes (Gerald Davis; Donator, Legendary, Activity: 1218)
October 11, 2011, 03:40:57 PM  #73

A lot of people will say NVIDIA is capable of GPU mining, but there is not much CUDA code written - because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

Please tell me you are kidding.
1) You are aware that many of the software developers do this full time as their day job.  I am sure someone with a $50K to $120K salary can afford an NVidia card.
2) That has absolutely nothing to do with why Nvidia performance is so poor.
gopher (Full Member, Activity: 135)
October 11, 2011, 03:58:27 PM  #74

A lot of people will say NVIDIA is capable of GPU mining, but there is not much CUDA code written - because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

Please tell me you are kidding.
1) You are aware that many of the software developers do this full time as their day job.  I am sure someone with a $50K to $120K salary can afford an NVidia card.
2) That has absolutely nothing to do with why Nvidia performance is so poor.

You are missing my point - I have no idea why the developers have not developed good CUDA code - I was only speculating about one of the possible reasons.

P4man (Hero Member, Activity: 504)
October 11, 2011, 04:14:42 PM  #75

You are missing my point - I have no idea why the developers have not developed good CUDA code - I was only speculating about one of the possible reasons.

I have no reason to believe the cuda miners aren't any good. It's the nvidia cards that aren't as suited to bitcoin mining as amd cards, due to fundamentally different architectures. Since that makes the cards uncompetitive, it stands to reason few people will invest heavily in cuda apps that can only work on this (for bitcoin) very uncompetitive hardware. No amount of software optimization is going to turn a 140 MH/s nVidia card into a 400 MH/s one. There is probably less than 10% untapped potential.

sirky (Sr. Member, Activity: 407)
October 11, 2011, 04:28:10 PM  #76

This is all true. Just look at the shader counts between NVidia and AMD cards and you have your answer. The processors (shaders) have to do the work, and NVidia cards don't have as many.
DeathAndTaxes (Gerald Davis; Donator, Legendary, Activity: 1218)
October 11, 2011, 05:12:40 PM  #77


You are missing my point - I have no idea why the developers have not developed good CUDA code - I was only speculating about one of the possible reasons.


You are missing the point.  There are VERY GOOD CUDA miners.  It is unlikely any future CUDA miner would get more than 10% more performance out of existing cards.

Nvidia hardware just happens to be ill-suited for integer math (the math used in hashing).
catfish (Sr. Member, Activity: 270, "teh giant catfesh")
October 31, 2011, 08:31:43 AM  #78

"My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit."

I am tempted to make this my new .signature
GS *own* the US treasury and whilst the USD is accepted as global reserve ccy (and energy aka oil is priced in said dollars), your FPGAs could be made utterly *useless* if GS decided the world's financial system needed revolutionary change... don't underestimate 'em.

Anyway that was entirely off-topic. I'm running an inefficient-ish but awfully good fun 7 GH/s system made from DIY store £12 flat-packed shelving units. I have not needed to turn on the central heating boiler in my UK house because the mining rigs are behaving like large-format fan heaters Cheesy

I could fit a LOT of your FPGA boards onto one of my shelf rigs. I doubt I could afford to - looks like I could get 120 of the FPGA units plus power and cooling done elegantly on the shelf unit!

But I am *more* than interested in acquiring a board filled with FPGAs (i.e. 5 daughterboards in the backplane?) - under the conditions that:

a) The kit is assembled to the point where the end-user (i.e. me) doesn't need to do any soldering more complicated than, say, splicing a custom connector to a PC standard PSU. I'm not an EE, not even an electronics hobbyist, and do NOT want to fuck up $1k with a clumsy soldering iron;
b) Getting the FPGAs mining away (pool or solo) is easy enough for a general-purpose software hacker and doesn't require EE knowledge. I mainly run Macs (because they're Unix with MS Office and work well) but all my mining rigs are Linux. I'd like to have my FPGA rig controlled by a Mac Mini or my old G4 Cube (CPU arch may cause problems if x86 libs are needed, though). I've only got 30 years coding experience but the lowest level code I know is C - unrolling loops and VHDL are *well* outside my skillset and I don't have time to learn;
c) Apart from the peripheral software, everything is pre-loaded and coded. I am not familiar with FPGAs but know that the dev kit for the units talked about here costs a fortune. I won't be tuning the code and re-loading it onto a set of 5 FPGAs, so I don't want or need that cost, but I need it to run as soon as I plug in a cable and ping some control commands down the cable;
d) The code loaded onto the FPGAs is *reasonably* efficient and not hugely sub-optimal. I don't want to spend a grand, and then find out in a couple of months about new bitstreams for the FPGAs I own... which would double my hashrate if I could re-program the things. I don't know how to do that, and I assume the SDK is needed too. From what I've read, this will not be a problem as the FOSS logic and all the proprietary optimisations aren't miles away from each other in speed?
e) ALL necessary cables are included - if they're custom then I'm happy to make them, but you HAVE to include the plugs / sockets because they may not be easily available to me in the UK (and if the connectors have 10+ pins then I'd prefer to pay for pre-made cables);
f) You are happy to ship to the UK. I will assume trust once I've spoken to you via email so am happy to provide payment up-front so long as I feel everything is legit. I won't waste your time.


I can see how this technology may be a bit of a ball-ache to sell to a 16-yr-old Windows PC 'extreme gaming' enthusiast (no offence to said group, of course) due to the level of support required. However, if it's 'plug and play' to the extent that a reasonably old hacker can get it working without ever getting into electronics, please let me know the price.

If running a grid of these FPGAs on your cool backplane (with gold anodised heatsinks, or anything that takes my fancy) gets a respectable hashrate (let's be very pessimistic and say 100 MH/s per FPGA, so half a gig for the rig) then I want one purely for the cool-factor...


Incidentally, whilst this will get the real EEs sneering at me here, what made me post up a firm request for quote (and if you want to sell me one, because you think I'll be able to get it running without drowning you in support emails, then I am a serious buyer) was how the design LOOKS. Yes, a competitor has questioned one aspect of the design on technical terms. I'm not qualified to comment, but the board looks tidy, elegant and with that heatsink, just really cool.

The ultra-low-cost FPGA solution (bake your own in a skillet!) thread impressed me hugely, but the complete solution is a mess of boards and wires. At the prices being quoted for these kits (you're all stuck by the cost of one major component), elegance is a massive value-add for anyone who considers industrial design important.

Hell, I'd put one board horizontally in the viewable area underneath my G4 Cube if I could cool the whole thing (and the Cube is souped up).


The only questions still vexing me are whether you'd sell one to the UK, whether that damn SDK is required (I can't call myself an academic, unless you consider professional financial qualifications 'academia'), and whether it really is just a case of plugging everything together, sticking a USB cable into a spare Mac or Linux box, and writing some code to send commands down the USB cable. If so, I'm in.

(and if I get stuck, my ex-VHDL-consultant mate would probably help, he's got a Stratix 3 dev board at home for teh lulz)

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
catfish (Sr. Member, Activity: 270, "teh giant catfesh")
October 31, 2011, 04:51:17 PM  #79


You are missing my point - I have no idea why the developers have not developed good CUDA code - I was only speculating about one of the possible reasons.


You are missing the point.  There are VERY GOOD CUDA miners.  It is unlikely any future CUDA miner would get more than 10% more performance out of existing cards.

Nvidia hardware just happens to be ill-suited for integer math (the math used in hashing).
Quite. The CUDA developers *have* developed good code. The hardware architecture is simply not as well-suited to the application as the ATI hardware architecture.

It's a bit like saying back in the pre-GPU days that my old quad G5 was feck-off fast at the FFTs done by the Seti project because only the best programmers could afford PowerMac G5 Quads. Yeah, those machines were silly-money, but the best programmers go where the best pay is (unless they already have enough and work for fun), and optimising code for voluntary projects on minority platforms like the old pre-Intel Mac is *not* where the big money was...

It's off-topic and potentially flamebait, but it appears that people who know more about code and hardware architecture than I do rate Nvidia more highly (elegance, quality drivers, etc.) than ATI/AMD. Given the appalling issues I've had with ATI GPUs in building my little bitcoin farm (one vendor, in three purchases totalling 10 cards, managed to send one DOA card each purchase, and one of the originally-working cards has now died. These were *all* XFX brand, so perhaps the XFX versions of Nvidia GPUs may be of similarly poor quality), I really can't tell whether ATI have bad driver code and poor hardware design, or whether OEMs are making a sow's ear out of a silk purse. I don't have any Nvidia kit - even my many Macs use ATI GPUs now.

The nightmare of ATI's Linux drivers and the 6950 cards showed that there's some funny business in the drivers - funny business that eats developer time better spent on fixing bugs and increasing reliability. But the AMD hardware approach is **SO** much more appropriate to bitcoin mining OpenCL kernels that the whole ATI/Nvidia thing boils down to one thing.

Luck. Bitcoin mining is the 'killer app' for ATI's stream processor approach (at least in the 5xxx and 6xxx cards). That's just luck - there's nowhere NEAR that disparity in performance between the two platforms on their intended applications - games - otherwise Nvidia would be out of business. And if AMD's new 7xxx cards move away from simple-but-plentiful massively-parallel stream processors, you'll find that the older cards are STILL faster than the new ones. So far, I'm getting better performance from my 'outdated' 5850 cards than even the fastest 6950 I own, and the 6950 required jumping through LOADS of hoops. Oddly enough, the 'obsolete' 5850 cards are still being sold new in the UK for well over £200 - that's 'new release' pricing...

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
DeathAndTaxes (Gerald Davis; Donator, Legendary, Activity: 1218)
October 31, 2011, 05:41:46 PM  #80

Not speaking to the architectural differences between NVidia and AMD, but XFX is generally a lower-cost OEM.  Higher DOA rates don't surprise me.  They have good warranties though.  My impression (via dead cards and sometimes illogical BIOSes) is they are the Kia Motors of video cards.