ultraimports
|
|
September 27, 2011, 01:45:56 PM |
|
Once there are enough FPGAs on the network, difficulty will increase and GPUs will become unprofitable or barely profitable for anyone paying for cooling + electricity [probably most people with more than 4-5 GPUs]. It's a self-fulfilling prophecy.
I often see this quoted, but it is nonsense. Higher difficulty would make FPGAs, even at $2 per MH, even MORE prohibitively expensive. Higher difficulty benefits those with efficient GPUs (like the 5970 & 7xxx series) and moderate-to-low-cost electricity the most. I think a difficulty spike would kill demand for new FPGAs, not drive it.

Take a hypothetical FPGA miner at $2 per MH. 150 MH/s = $300 in cost, running 24/7/365 at 15 W:
Break-even at current difficulty: 25 months.
Break-even at a 30% difficulty increase: 33 months.
Break-even at a 50% difficulty increase: 40 months.

Today one could buy a 5970 for <$500. Say 3x 5970 + power supply + other components = 2.8 GH/s for $2,800, running 24/7/365 at 1,000 W:
Break-even at current difficulty: 17 months.
Break-even at a 30% difficulty increase: 25 months.
Break-even at a 50% difficulty increase: 32 months.

Difficulty increases close the gap, but $2 per MH is still beaten by anyone with $0.10 electricity (or less). I am interested in FPGAs, but these dire predictions of them killing GPUs are simply unwarranted unless the cost gets closer to $1 per MH installed. Remember, GPU performance per watt won't be static: the 7xxx series looks to almost double performance per watt (cutting electrical costs in half for GPU miners). A break-even of 40+ months is highly dangerous; one risks being undercut by the next next-gen video cards. Four years is long enough for two product cycles, and by then we will be looking at 20nm chips (and another doubling of performance per watt).

Very good info. I guess I would ask the FPGA experts: how often do the FPGA chips increase in performance as well? Do they move as fast as GPUs? Do they follow Moore's law, essentially?
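The arithmetic behind those break-even figures is easy to reproduce. Here is a minimal sketch in Python; the difficulty and BTC price are placeholder values of roughly the right late-September-2011 magnitude (assumptions, not figures taken from the post), while the 50 BTC block reward and the hashrate-to-blocks formula are standard Bitcoin:

```python
def breakeven_months(hw_cost_usd, hashrate_mhs, power_w, difficulty,
                     btc_price_usd, elec_usd_per_kwh=0.10, block_reward=50.0):
    """Months until cumulative mining profit covers the hardware cost."""
    # Expected BTC/day = hashes/sec / (difficulty * 2^32) blocks/sec * reward
    btc_per_day = hashrate_mhs * 1e6 * 86400 / (difficulty * 2**32) * block_reward
    profit_per_month = (btc_per_day * btc_price_usd * 30            # revenue
                        - power_w / 1000 * 720 * elec_usd_per_kwh)  # 720 h/month
    return hw_cost_usd / profit_per_month if profit_per_month > 0 else float("inf")

D, P = 1_700_000, 5.00   # assumed difficulty and USD/BTC, roughly Sept 2011
print(breakeven_months(300, 150, 15, D, P))      # FPGA rig: ~25 months
print(breakeven_months(2800, 2800, 1000, D, P))  # 3x5970 rig: ~16 months
```

With those placeholder inputs the function lands close to the 25- and 17-month figures quoted above, and multiplying the difficulty by 1.3 or 1.5 reproduces the longer pay-back periods.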
|
|
|
|
P4man
|
|
September 27, 2011, 02:38:46 PM |
|
I'm not an FPGA expert by any stretch, but they definitely follow Moore's law, in fact arguably more easily than CPUs, as they are much simpler and you can just increase the number of units as your process gets smaller. Similar to how, because CPU designs hit an IPC and clock-scaling brick wall, most of the extra transistor budget is simply spent on going from single to dual, quad and octal cores.
That said, I'm not sure I agree with the above math; the market dynamics of mining lead to difficulty gravitating towards the break-even point for the average miner. I suspect most miners see their hardware investment as a sunk cost, leaving only the electricity bill. FPGAs already have better MH/watt, and I suspect that gap will grow as the software matures. It's true it's a risky investment, certainly at this point, but once this starts generating sufficient volume I can see prices tumbling. After all, an FPGA is likely cheaper to produce than our high-end gaming GPUs.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
September 27, 2011, 03:17:44 PM Last edit: September 27, 2011, 06:03:27 PM by DeathAndTaxes |
|
It's true it's a risky investment, certainly at this point, but once this starts generating sufficient volume I can see prices tumbling. After all, an FPGA is likely cheaper to produce than our high-end gaming GPUs.

Most of the cost (60%+) comes from the actual FPGA. It is unlikely prices will tumble. FPGAs already have economies of scale: 20 million are sold each year. Another 1K (or even 10K) miners using FPGAs isn't going to cause a massive drop in price. Maybe if one of the FPGA developers gets a massive buy order they could cut FPGA & assembly costs by 30%. Software improvements might squeeze another 10%-20% out of current-gen FPGAs, but that still only gets us to ~$2/MH installed.

Yes, FPGAs benefit from Moore's law, but so will GPUs. GPUs are almost perfectly scalable: when the process size gets cut in half, you simply double the number of shaders, get roughly 2x the performance, and the die size (and thus cost & power) remains the same.

I have derailed this thread enough as it is. To the OP: very good work, it looks promising. I just find the false hope of people (not you) pretending away the economic issues of FPGAs (and the claims of the death of GPU mining) to be naive & frustrating.
|
|
|
|
vx609e
Newbie
Offline
Activity: 29
Merit: 0
|
|
September 27, 2011, 05:44:53 PM |
|
Following
|
|
|
|
rph
|
|
September 28, 2011, 03:08:36 AM Last edit: September 28, 2011, 03:52:20 AM by rph |
|
I just find ... claims of the death of GPU mining to be naive & frustrating.
I've invested a lot of time into FPGA mining. Here is my thinking:

If GPUs remain the dominant technology, difficulty will adjust to make them barely profitable in average-electricity-cost areas. I don't think anyone really disagrees with that; it's an intentional design decision in Bitcoin. Once that happens, GPUs in high-elec-cost areas (like me) will be unprofitable. FPGAs will be profitable everywhere operationally [in terms of BTC produced minus electricity/cooling/maintenance costs]. So they will eventually pay for themselves, unless Bitcoin collapses entirely first, screwing over all miners. The payoff might take 2 years, but that is still a pretty decent ROI compared to FDIC savings accounts, the stock market, treasuries, etc. My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit. If/when 28nm or 20nm or 16nm GPUs are driving the difficulty, my 45nm FPGAs will still have better MH per watt, so they will still be profitable operationally. And anyway, I will then be adding 28nm or 20nm or 16nm FPGAs.

If FPGAs become the dominant technology, difficulty will adjust to make them barely profitable in average-power-cost areas of the world. GPUs will then be wildly unprofitable everywhere, except for people that somehow have free electricity [which I think is a tiny fraction of the network]. Then we'll see $50 5830s on eBay as lots of people rush to the exits. I actually hope that GPUs remain the dominant technology, while I mine on FPGAs, with a nice, high profit margin.

If a very high-end ASIC becomes the dominant technology, then both GPUs + FPGAs will be unprofitable operationally. I seriously doubt this will happen. The people with the skills and capital to make it happen could make a lot more money with less risk building something else. [I'm not talking about a MOSIS 250nm ASIC; I'm talking 90nm or better.]

that still only gets FPGAs to ~$2/MH installed.
$1/MH is possible today if you build the boards yourself, or professionally in qty 100+.

I suspect most miners see their hardware investment as a sunk cost, leaving only the electricity bill. FPGAs already have better MH/watt, and I suspect that gap will grow as the software matures.
Exactly. The decision to keep a GPU running, or shut it off, is not based on some break-even calculation you did when you bought it. It's based on whether it's making or losing money today, given current difficulty + elec/cooling costs. I stand by my statement that if FPGAs take off, they will certainly put most GPU miners out of business and capture a large percentage of the coins left to be mined. -rph
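That sunk-cost rule can be written down in one line: ignore what the hardware cost, and mine only while today's revenue beats today's power bill. A minimal sketch, reusing the same assumed difficulty/price placeholders as the break-even example above:

```python
def keep_mining(hashrate_mhs, power_w, difficulty, btc_price_usd,
                elec_usd_per_kwh, block_reward=50.0):
    """Sunk-cost rule: hardware cost is irrelevant once bought; compare
    today's revenue against today's electricity bill only."""
    btc_per_day = hashrate_mhs * 1e6 * 86400 / (difficulty * 2**32) * block_reward
    revenue_per_day = btc_per_day * btc_price_usd
    elec_per_day = power_w / 1000 * 24 * elec_usd_per_kwh
    return revenue_per_day > elec_per_day

# A ~400 MH/s, 250 W GPU at $0.15/kWh and the assumed difficulty/price:
print(keep_mining(400, 250, 1_700_000, 5.00, 0.15))  # True -> keep it running
```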
|
|
|
|
ngzhang
|
|
September 28, 2011, 03:46:53 AM |
|
I just find ... claims of the death of GPU mining to be naive & frustrating.
I've invested a lot of time into FPGA mining. Here is my thinking: [...] I stand by my statement that if FPGAs take off, they will certainly put most GPU miners out of business and capture a large percentage of the coins left to be mined. -rph

In a short time, $1.50/MH will become reality out of some low-manufacturing-cost area of the world, by my troth. FPGA mining really hasn't become a big business yet, so I think professional groups still haven't joined this game. In fact, miners are still a very small group. Right now, most of the people working hard on FPGA mining systems are probably doing it for "love": their skills and effort could surely make more money in other fields.
|
|
|
|
fivebells
|
|
September 28, 2011, 01:19:39 PM |
|
ngzhang, I am pretty much clueless about hardware, so I am interested in your views. Don't you think that if a large-scale enterprise were to get into this, they would be more interested in making a custom ASIC than an FPGA? How substantial do you imagine the power/speed gains could be for an ASIC over a GPU?
|
|
|
|
ngzhang
|
|
September 28, 2011, 01:43:47 PM |
|
ngzhang, I am pretty much clueless about hardware, so I am interested in your views. Don't you think that if a large-scale enterprise were to get into this, they would be more interested in making a custom ASIC than an FPGA? How substantial do you imagine the power/speed gains could be for an ASIC over a GPU?
In my opinion, no company will take part in mining ASICs. If they have enough resources to tape out an ASIC, I'm sure they will design another product, not a miner. As individuals we can act on our interests, but a real company can't. And to answer your question: 1/10 the cost, 10x the performance, 1/10 the energy consumption, on a single ASIC. AT LEAST.
|
|
|
|
P4man
|
|
September 28, 2011, 02:33:46 PM |
|
In my opinion, no company will take part in mining ASICs.
Not specific to Bitcoin, but SHA-256 has other uses. VIA has CPUs with hardware-accelerated encryption functions, and I thought recent (or upcoming?) Intel chips do too. They are no match for GPUs, but it shows it's already been done. Also, when I google "SHA256 chip" I find, among others, this: http://www.s2cinc.com/product/pd.asp?id=278 I have no clue how that performs compared to our GPUs, or even whether it's usable for bitcoin mining, but I would be surprised if there weren't chips out there, or coming, that could be used for bitcoin, even if they were not designed for it.
|
|
|
|
ngzhang
|
|
September 28, 2011, 02:57:43 PM |
|
In my opinion, no company will take part in mining ASICs.
Not specific to Bitcoin, but SHA-256 has other uses. [...] I would be surprised if there weren't chips out there, or coming, that could be used for bitcoin, even if they were not designed for it.

There are some small differences between standard SHA-256 hashing and Bitcoin hashing. So...
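For anyone wondering what those differences are: Bitcoin applies SHA-256 twice to a fixed 80-byte block header rather than once to arbitrary-length data, and a miner only ever varies the header's trailing nonce field, which is why dedicated mining cores can hard-wire the midstate of the first chunk and why a generic streaming SHA-256 core is a poor fit. A minimal illustration (the header bytes below are dummy values, not a real block):

```python
import hashlib

def bitcoin_hash(header80: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice to the header."""
    assert len(header80) == 80
    return hashlib.sha256(hashlib.sha256(header80).digest()).digest()

# A miner sweeps the 4-byte little-endian nonce at the end of the header;
# only the second 64-byte chunk of the first SHA-256 pass changes, so
# hardware can precompute (and hard-wire) the midstate of the first chunk.
header = bytes(76) + (42).to_bytes(4, "little")   # dummy header, nonce = 42
print(bitcoin_hash(header)[::-1].hex())           # big-endian display convention
```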
|
|
|
|
eldentyrell (OP)
Donator
Legendary
Offline
Activity: 980
Merit: 1004
felonious vagrancy, personified
|
|
September 30, 2011, 11:47:08 PM |
|
"My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit."
I am tempted to make this my new .signature
|
The printing press heralded the end of the Dark Ages and made the Enlightenment possible, but it took another three centuries before any country managed to put freedom of the press beyond the reach of legislators. So it may take a while before cryptocurrencies are free of the AML-NSA-KYC surveillance plague.
|
|
|
gopher
|
|
October 11, 2011, 02:40:25 PM |
|
I need a kit I can plug in and mine. The total package. When you can offer that (with enough hash rate), I'm sure that people will buy.
Indeed, I am with these guys... I am no electrical engineer... but I can plug in a PSU.

You have a good point. But to flip that around: on the GPU side it's taken as a given that you're going to buy your hardware from a company (ATI) that does not provide the mining software (or even admit it knows what bitcoin is). But I understand that while gamers have seen GPUs before, most bitcoiners are encountering FPGAs for the first time. They aren't scary; they're just obscenely flexible... "enough rope to hang yourself with." I could, perhaps, put together a turn-key solution, although it would involve a lot of effort. My two major concerns are:

1. HDL developer guilt. It makes me slightly ill to see posts like ngzhang's "hey you lazy-ass HDL developers make your code faster so I can make MOAR PROFITZ!!!". I'd feel queasy about selling a "solution" that bundled in somebody else's hard work. I don't know the exact details of the fpgaminer/ztex dispute, but I can certainly empathize with the initial reaction from fpgaminer. It would make me really happy to be providing low-cost boards to people who are interested in tweaking/tuning/improving the HDL code, but I think I've figured out by now that there aren't as many of those people as I'd thought.

2. Support. I'm happy to help out here in a casual message-board-member way. But I'm kinda worried about lazy users buying a "turn-key" solution from me and then demanding that I hand-hold them through the whole process of configuring Xilinx's crapware drivers on their Windows host box (I haven't used Windows in almost a decade) under threat of posting negative reviews of my product ("did not work for me"). I definitely can't sell the boards for $250 if I have to budget in my own time spent on extensive tech support work.

Anyways. Looks like the first run will be small, personal-use-only, but there may be another batch of boards in November after I've figured out whether it's worth taking this to the next level.

Hi big-chip-small-board, I have been a lurker here for some time and, more importantly, a huge fan of your work. May I offer my opinion on this tricky matter.

I think you should treat your HDL developer quitting as an opportunity to streamline and reposition your work, so that it is better accepted among the broader bitcoin community. The problems I see arise from the fact that FPGA developers try to tackle both parts of this project: designing the hardware as well as developing the core functionality. This particular project requires two fundamentally different skill sets, and someone who is only averagely good in both areas will not do. Hence, what about revisiting your marvellous hardware concept, perhaps giving it a final touch and ensuring that it is 100% compatible with the open-source core functionality (or any other firmware there is), so the people who buy your hardware can decide for themselves what to run on it? Instead of trying to excel in two areas, you then need to excel in one: design and produce great hardware, and let others develop the software and add value as they can. This approach is very similar to how GPU mining started. Everyone would agree that the selection of hardware and software are interlinked, but they got them from different sources.

A lot of people will say NVIDIA is capable of GPU mining, yet not much CUDA code has been written, maybe because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on. But the result is clear: at the end of the day, ATI does not write hashing code and the bitcoin script writers do not develop highly-integrated hardware, and irrespective of this division of labour, the bitcoin community has no problem locating the required bits and pieces and building their rigs. My 2 cents.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 11, 2011, 03:40:57 PM |
|
A lot of people will say NVIDIA is capable of GPU mining, yet not much CUDA code has been written, maybe because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

Please tell me you are kidding. 1) You are aware that many of the software developers do this full-time as their day job? I am sure someone on a $50K to $120K salary can afford an NVidia card. 2) That has absolutely nothing to do with why Nvidia performance is so poor.
|
|
|
|
gopher
|
|
October 11, 2011, 03:58:27 PM |
|
Please tell me you are kidding. [...] That has absolutely nothing to do with why Nvidia performance is so poor.

You are missing my point. I have no idea why the developers have not developed good CUDA code; I was only speculating about one of the possible reasons.
|
|
|
|
P4man
|
|
October 11, 2011, 04:14:42 PM |
|
You are missing my point. I have no idea why the developers have not developed good CUDA code; I was only speculating about one of the possible reasons.
I have no reason to believe the CUDA miners aren't any good. It's the Nvidia cards that aren't as suited to bitcoin mining as AMD cards, due to fundamentally different architectures. Since that makes the cards uncompetitive, it stands to reason that few people will invest heavily in CUDA apps that can only work on this (for bitcoin) very uncompetitive hardware. No amount of software optimization is going to turn a 140 MH/s Nvidia card into a 400 MH/s one. There is probably less than 10% untapped potential.
|
|
|
|
sirky
|
|
October 11, 2011, 04:28:10 PM |
|
This is all true. Just look at the shader counts of NVidia and AMD cards and you have your answer. The processors (shaders) have to do the work, and NVidia cards don't have as many.
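A rough back-of-the-envelope version of that argument, using published shader counts and clocks for two contemporary cards (treat the exact figures as assumptions; real mining speed also depends on per-operation costs such as rotates):

```python
# Raw 32-bit ALU operations per second, from published card specs.
hd5870 = 1600 * 0.850e9  # Radeon HD 5870: 1600 stream processors @ 850 MHz
gtx480 = 480 * 1.401e9   # GeForce GTX 480: 480 CUDA cores @ 1401 MHz shader clock
print(hd5870 / gtx480)   # ~2.0x raw throughput advantage for the 5870
```

The observed mining gap (~400 MH/s vs ~140 MH/s, as P4man noted above) is even wider than the raw ALU ratio, which is why the architecture, rather than miner software quality, gets the blame.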
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 11, 2011, 05:12:40 PM |
|
You are missing my point. I have no idea why the developers have not developed good CUDA code; I was only speculating about one of the possible reasons.
You are missing the point. There are VERY GOOD CUDA miners. It is unlikely any future CUDA miner would get more than 10% more performance out of existing cards. Nvidia hardware just happens to be ill-suited for the integer math used in hashing.
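That integer math is concrete: every SHA-256 round is built almost entirely from 32-bit rotates, shifts, XORs, and additions. For illustration, here are the two "small sigma" functions from the SHA-256 message schedule, per FIPS 180-4; a GPU that needs several instructions to emulate a 32-bit rotate is at a disadvantage against one that does it in fewer:

```python
MASK32 = 0xFFFFFFFF

def rotr(x, n):
    """32-bit rotate right, the dominant primitive in SHA-256."""
    return ((x >> n) | (x << (32 - n))) & MASK32

def small_sigma0(x):  # message-schedule function sigma_0 (FIPS 180-4)
    return rotr(x, 7) ^ rotr(x, 18) ^ (x >> 3)

def small_sigma1(x):  # message-schedule function sigma_1 (FIPS 180-4)
    return rotr(x, 17) ^ rotr(x, 19) ^ (x >> 10)

print(hex(small_sigma0(0xDEADBEEF)), hex(small_sigma1(0xDEADBEEF)))
```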
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
October 31, 2011, 05:41:46 PM |
|
Not speaking to the architectural differences between NVidia and AMD, but XFX is generally a lower-cost OEM, so higher DOA rates don't surprise me. They have good warranties, though. My impression (via dead cards and sometimes illogical BIOSes) is that they are the Kia Motors of video cards.
|
|
|
|
ztex
Donator
Sr. Member
Offline
Activity: 367
Merit: 250
ZTEX FPGA Boards
|
|
November 03, 2011, 02:33:58 PM |
|
But I am *more* than interested in acquiring a board filled with FPGAs (i.e. 5 daughterboards in the backplane?) - under the conditions that:

Maybe this is what you are searching for: https://bitcointalk.org/index.php?topic=49180.0

a) The kit is assembled to the point where the end-user (i.e. me) doesn't need to do any soldering more complicated than, say, splicing a custom connector to a standard PC PSU. I'm not an EE, not even an electronics hobbyist, and do NOT want to fuck up $1k with a clumsy soldering iron;

No soldering is required. A description of how a standard ATX PSU can be modified (without soldering) to power a rig can be found in the initial post of the topic mentioned above.

b) Getting the FPGAs mining away (pool or solo) is easy enough for a general-purpose software hacker and doesn't require EE knowledge. I mainly run Macs (because they're Unix with MS Office and work well) but all my mining rigs are Linux. I'd like to have my FPGA rig controlled by a Mac Mini or my old G4 Cube (the CPU arch may cause problems if x86 libs are needed, though). I've got 30 years of coding experience, but the lowest-level language I know is C - unrolling loops and VHDL are *well* outside my skillset and I don't have time to learn;

The software (see http://www.ztex.de/btcminer) is ready-to-use and runs on Linux. Rigs can be controlled by a single instance using the cluster mode. Hot-plugging is supported too.

c) Apart from the peripheral software, everything is pre-loaded and coded. I am not familiar with FPGAs but know that the dev kit for the units talked about here costs a fortune. I won't be tuning the code and re-loading it onto a set of 5 FPGAs, so I don't want or need that cost, but I need it to run as soon as I plug in a cable and ping some control commands down it;

The firmware and bitstream are compiled and ready-to-use, and are uploaded by the software through USB. No JTAG programming cable or the like is required.

d) The code loaded onto the FPGAs is *reasonably* efficient and not hugely sub-optimal. I don't want to spend a grand and then find out in a couple of months about new bitstreams which would double my hashrate if I could re-program the things. I don't know how to do that, and I assume the SDK is needed too. From what I've read this will not be a problem, as the FOSS logic and the proprietary optimisations aren't miles apart in speed?

The software typically achieves 190 MH/s per XC6SLX150-3 FPGA.

e) ALL necessary cables are included - if they're custom then I'm happy to make them, but you HAVE to include the plugs / sockets because they may not be easily available to me in the UK (and if the connectors have 10+ pins then I'd prefer to pay for pre-made cables);

Only standard cables (which can be purchased on the internet) are required.

f) You are happy to ship to the UK. I will assume trust once I've spoken to you via email, so I am happy to provide payment up-front as long as I feel everything is legit. I won't waste your time.

The items ship from Germany, i.e. unless you have a valid VATIN you have to pay 19% German VAT. (But if you import from outside the EU you also have to pay UK import VAT.)
|
|
|
|
rph
|
|
November 04, 2011, 03:13:57 AM |
|
You know FPGA mining is becoming legit when 2-3 vendors are trying to snipe customers from each other's threads. -rph
|
|
|
|
|