Bitcoin Forum
December 05, 2016, 08:34:30 AM
Pages: « 1 2 3 4 [5] 6 7 »  All
Author Topic: Minimalist Spartan6-LX150 board  (Read 48411 times)
catfish
Sr. Member
Activity: 270

teh giant catfesh
November 01, 2011, 12:12:37 AM
 #81

Not speaking to the architectural differences between NVidia and AMD, but XFX is generally a lower-cost OEM. Higher DOA rates don't surprise me. They have good warranties, though. My impression (via dead cards and sometimes illogical BIOSes) is that they are the Kia Motors of video cards.
Feck. I'm getting rid of them then. I'm a petrolhead, and the idea of the GPU equivalent of a Kia in my rigs? Huh. Arrrrgh.

 Grin Shocked Tongue Grin

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
ztex
Donator
Sr. Member
Activity: 367

ZTEX FPGA Boards
November 03, 2011, 02:33:58 PM
 #82

But I am *more* than interested in acquiring a board filled with FPGAs (i.e. 5 daughterboards in the backplane?) - under the conditions that:

Maybe this is what you are searching for: https://bitcointalk.org/index.php?topic=49180.0

Quote
a) The kit is assembled to the point where the end-user (i.e. me) doesn't need to do any soldering more complicated than, say, splicing a custom connector to a PC standard PSU. I'm not an EE, not even an electronics hobbyist, and do NOT want to fuck up $1k with a clumsy soldering iron;

No soldering is required. A description of how a standard ATX PSU can be modified (without soldering Smiley) to power a rig can be found in the initial post of the topic mentioned above.

Quote
b) Getting the FPGAs mining away (pool or solo) is easy enough for a general-purpose software hacker and doesn't require EE knowledge. I mainly run Macs (because they're Unix with MS Office and work well) but all my mining rigs are Linux. I'd like to have my FPGA rig controlled by a Mac Mini or my old G4 Cube (CPU arch may cause problems if x86 libs are needed, though). I've only got 30 years coding experience but the lowest level code I know is C - unrolling loops and VHDL are *well* outside my skillset and I don't have time to learn;

The software (see http://www.ztex.de/btcminer) is ready-to-use and runs on Linux. Rigs can be controlled by a single instance using the cluster mode. Hot-plugging is supported too.

Quote
c) Apart from the peripheral software, everything is pre-loaded and coded. I am not familiar with FPGAs but know that the dev kit for the units talked about here costs a fortune. I won't be tuning the code and re-loading it onto a set of 5 FPGAs, so I don't want or need that cost, but I need it to run as soon as I plug in a cable and ping some control commands down the cable;

The bitstream (and firmware) is compiled and ready to use. Firmware and bitstream are uploaded by the software over USB. No JTAG programming cables or the like are required.

Quote
d) The code loaded onto the FPGAs is *reasonably* efficient and not hugely sub-optimal. I don't want to spend a grand, and then find out in a couple of months about new bitstreams for the FPGAs I own... which would double my hashrate if I could re-program the things. I don't know how to do that, and I assume the SDK is needed too. From what I've read, this will not be a problem as the FOSS logic and all the proprietary optimisations aren't miles away from each other in speed?

The software typically achieves 190MH/s per XC6SLX150-3 FPGA.

Quote
d) ALL necessary cables are included - if they're custom then I'm happy to make them, but you HAVE to include the plugs / sockets because they may not be easily available to me in the UK (and if the connectors have 10+ pins then I'd prefer to pay for pre-made cables);

Only standard cables (which can be purchased on the internet) are required.

Quote
e) You are happy to ship to the UK. I will assume trust once I've spoken to you via email so am happy to provide payment up-front so long I feel everything is legit. I won't waste your time.

The goods ship from Germany, i.e. unless you have a valid VATIN you have to pay 19% German VAT. (But if you import from outside the EU you also have to pay UK import VAT.)

rph
Full Member
Activity: 176
November 04, 2011, 03:13:57 AM
 #83

You know FPGA mining is becoming legit when 2-3 vendors are trying to snipe customers from each other's threads.  Roll Eyes

-rph

Ultra-Low-Cost DIY FPGA Miner: https://bitcointalk.org/index.php?topic=44891
fizzisist
Hero Member
Activity: 720
November 04, 2011, 03:40:10 AM
 #84

You know FPGA mining is becoming legit, when 2-3 vendors are trying to snipe customers from each others' threads.  Roll Eyes

Haha, very true! Catfish, the truth is that all of the FPGA mining products you see here can be run by anyone who has managed to mine on a GPU. In fact, I think they are even easier to use (less complicated driver installs, overclocking, fan speeds, etc., and no need to even open up your tower to install it).

rph
Full Member
Activity: 176
November 04, 2011, 08:39:22 AM
 #85

Haha, very true! Catfish, the truth is that all of the FPGA mining products you see here can be run by anyone who has managed to mine on a GPU. In fact, I think they are even easier to use (less complicated driver installs, overclocking, fan speeds, etc., and no need to even open up your tower to install it).

Yeah, I'd agree with that. I just hope you guys don't sell so many that the difficulty becomes driven by FPGAs instead of GPUs.
Create some OPEC-style quotas or something..

-rph

Ultra-Low-Cost DIY FPGA Miner: https://bitcointalk.org/index.php?topic=44891
DeathAndTaxes
Donator
Legendary
Activity: 1218

Gerald Davis
November 04, 2011, 12:55:45 PM
 #86

Haha, very true! Catfish, the truth is that all of the FPGA mining products you see here can be run by anyone who has managed to mine on a GPU. In fact, I think they are even easier to use (less complicated driver installs, overclocking, fan speeds, etc., and no need to even open up your tower to install it).

Yeah, I'd agree with that. I just hope you guys don't sell so many that the difficulty becomes driven by FPGAs instead of GPUs.
Create some OPEC-style quotas or something..

-rph


Well, given that FPGAs have a long hardware payoff period, I don't see them putting much downward pressure on prices.

It will, however, put a floor under hashing power. GPUs are very electricity-dependent: $3 per BTC at current difficulty translates into a break-even of roughly $0.15 per kWh on even the most efficient GPUs. Thus people whose electricity price is above break-even tend to quit, pushing hashing power and difficulty down.

FPGAs, however, are a sunk cost once bought, and have an electricity cost of <$0.50 per BTC, meaning they will likely continue to run no matter what. The FPGA portion of hashing power already purchased (currently ~0%) will be "immune" to price changes. What that means is that as the FPGA portion grows, the relationship between price/difficulty and hashing power will become less linear.

Even if prices spike I don't see a massive rush to buy FPGAs, but rather a slow, continual rollout. The long hardware payback period will make miners more cautious. As an example, when BTC prices hit $30 at 1.5M difficulty, it became a no-brainer to buy more GPUs, even as unsustainable as that was. The payback period was about 40 days: mine for 40 days and you could pay off a card. FPGAs, however, would still need a significant period of time to pay off the hardware, so price spikes will have less influence on sales.
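The break-even and payback arithmetic above can be sketched in a few lines. This is only a back-of-the-envelope check under stated assumptions: the standard relation that a block takes an expected difficulty × 2^32 hashes, the 50 BTC block subsidy of the era, and hypothetical card specs (a ~700 MH/s, 300W, $600 GPU standing in for the $30-per-BTC, 1.5M-difficulty example):

```python
def breakeven_price_per_kwh(btc_price, difficulty, mh_per_watt, reward=50.0):
    """Electricity price ($/kWh) at which mining revenue equals the power bill."""
    hashes_per_block = difficulty * 2**32            # expected hashes per block
    btc_per_day_per_watt = mh_per_watt * 1e6 * 86400 / hashes_per_block * reward
    return btc_per_day_per_watt * btc_price / 0.024  # one watt-day = 0.024 kWh

def payback_days(hw_cost, btc_price, difficulty, mh_s, watts, elec_price):
    """Days of mining needed for profit to cover the hardware cost."""
    btc_per_day = mh_s * 1e6 * 86400 / (difficulty * 2**32) * 50.0
    daily_profit = btc_per_day * btc_price - watts * 24 / 1000 * elec_price
    return hw_cost / daily_profit

# At $3/BTC and ~1.5M difficulty, a 2 MH/s/W GPU breaks even near $0.17/kWh,
# in the same ballpark as the $0.15 figure quoted above.
print(round(breakeven_price_per_kwh(3, 1_500_000, 2), 3))

# The $30/1.5M-difficulty era: the hypothetical 700 MH/s, 300W, $600 card
# pays for itself in roughly 45 days, matching the "~40 days" recollection.
print(round(payback_days(600, 30, 1_500_000, 700, 300, 0.10)))
```

The exact break-even depends on the difficulty assumed, which moved week to week; the point is that the quoted numbers are internally consistent.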


It will be an interesting dynamic to watch because I am sure 2012 will be the year of the FPGA.
gmaxwell
Moderator
Legendary
Activity: 2016
November 04, 2011, 01:15:03 PM
 #87

You know FPGA mining is becoming legit, when 2-3 vendors are trying to snipe customers from each others' threads.  Roll Eyes

I prefer the kind of evidence where the margins get down to 20% over COGS. Wink
DeathAndTaxes
Donator
Legendary
Activity: 1218

Gerald Davis
November 04, 2011, 01:22:23 PM
 #88

So what is the release date?  You may want to update the title so people don't think the project is dead.

Got a question about the backplane.  You say it has a 48W power supply, but it also has a SATA power connector?  By PSU do you mean it steps the voltage from the SATA power connector down to what is required for each board?

Any discounts if someone buys a backplane + 6 boards?
How heavy are the boards?  Your photo has the backplane "down" and the cards plugged into it.  Would there be an issue if the backplane was mounted vertically, with the cards acting like "shelves", or would that put too much pressure on the connector?  Just getting some ideas on density and mounting.  I would want to have them enclosed in a case.
eldentyrell
Donator
Legendary
Activity: 966

felonious vagrancy, personified
November 05, 2011, 06:54:11 PM
 #89

So what is release date?

Sorry, I wasn't aware that the title could be altered!  I've updated it now.

Got a quesiton about the backplane.  You say it has a 48W powersupply

I've switched to a 72W supply.

but also it has a SATA power connector?  By PSU do you mean it steps down the voltage from SATA power connector what is required for each board?

Exactly.  Technically it is a "DC-to-DC point of load switching regulator."

Nice part is that the 72W supply can feed from either +5V or +12V (the old 48W supply could only use +12V).

Any discounts if someone buys a backplane + 6 boards?

At the moment I am neither taking orders nor announcing a ship date nor guaranteeing that either of these things will happen.  If you have immediate need for an FPGA mining device I suggest you look into the fine offerings by fpgaminer/ngzhang or ztex (or rph although I think he said he's not selling his).

How heavy are the boards?  Your photo has the backplane "down" and the cards plugged into it.  Would there be an issue if the backplane was mounted vertically and the cards acting like "shelves" or would that put too much pressure on the connector.

That works fine.  Those connectors are seriously heavy-duty stuff.  Unfortunately they're expensive too: even in qty50 I still had to pay $2.60 per board for each pair of connectors (male+female).  But there's almost no voltage drop across the gigantic pins and they can carry more current than I'll ever need.


The printing press heralded the end of the Dark Ages and made the Enlightenment possible, but it took another three centuries before any country managed to put freedom of the press beyond the reach of legislators.  So it may take a while before cryptocurrencies are free of the AML-NSA-KYC surveillance plague.
DeathAndTaxes
Donator
Legendary
Activity: 1218

Gerald Davis
November 05, 2011, 07:30:52 PM
 #90

Well, I am not looking to buy until January.  I am disappointed you aren't looking to make a commercial run.  Still, hopefully you make your personal run, learn some things, and come back with a "round 2" offering.

When it comes to mining I always think big.  I replaced my hodgepodge collection of GPUs with nothing but 5970s because I like the density (2.2GH/rig without extenders or dual power supplies).  That kind of thinking let me get 10GH/s in my garage.  I like your board design because, once again... density.

10GH would be ~50 FPGA boards.  Now, I have no intention of buying 50 boards all at once, but I also like to plan for the "end game".  50 boards lying around, connected by a rat's nest of USB cables, doesn't seem appealing to me.  Maybe it is my time working in a datacenter, or maybe it is just OCD, but I like to see everything in its place.

Your design seems to have higher density and provide for more efficient racking.  One backplane w/ 6 cards ~= 1.2GH/s.  If you ever offered a larger backplane of, say, 10 cards powered by a single PCIe connector, well, that is even more interesting.  Take a 4U server case, put 2 backplanes and a tiny ITX motherboard in it.  A low-end 500/600W power supply w/ 2 PCIe power connectors could power the entire thing.  20 cards, or ~4GH/s, in a single 4U case.  Power density would even be low enough to rack these things up in a standard datacenter rack.

Anyways, even if you don't sell any in the near future, I hope you keep on with the project.  If you make changes for "round 2", think about density.  It is the advantage you have over other designs, an advantage some would be willing to pay a slight premium for.  Open-air rigs (either GPU or FPGA) are fine for now, but the "end game" goal would be high hashing density in standardized server cases.  Nobody wants a garage or office full of whirling, noisy open-air rigs; they just happen to be the most efficient.  GPUs will likely never work in a standard case due to the high thermal load, but FPGAs... might.
catfish
Sr. Member
Activity: 270

teh giant catfesh
November 07, 2011, 02:33:25 PM
 #91


At the moment I am neither taking orders nor announcing a ship date nor guaranteeing that either of these things will happen.  If you have immediate need for an FPGA mining device I suggest you look into the fine offerings by fpgaminer/ngzhang or ztex (or rph although I think he said he's not selling his).
Cheers, I got your message eventually as well, thanks for that.

Shame that your design isn't available, I rather liked it.

However ztex seems keen for the business so initially I'll have a look at his products.

Please do keep my email on file if you do consider building a short production run - just to ping me to see if I'm still interested, or whether I've spent my entire allowance on ztex boards Wink

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
bulanula
Hero Member
Activity: 518
November 07, 2011, 09:30:36 PM
 #92

FPGAs are much better for me than GPUs because of less heat and noise, but right now the price and performance leave much to be desired.

Guess you can say the GPUs are old, inefficient, powerful gas-guzzler motors while the FPGAs are new electric vehicles, to keep the car analogy going Smiley
DeathAndTaxes
Donator
Legendary
Activity: 1218

Gerald Davis
November 07, 2011, 09:45:31 PM
 #93

FPGA is much better for me than GPUs because less heat and noise right now but the price and performance leaves much to be desired.

Guess you can say the GPUs are old, inefficient, powerful gas guzzler motors while the FPGAs are new electric vehicles to keep the car analogy going Smiley

However, they are getting close.  While GPUs may be cheap, there is a limit on how many can be powered by a single rig, and high-efficiency power supplies aren't cheap either.  A good price point for a GPU rig is $1 per MH/s.  FPGAs are getting closer every day.

GPU rig ($1 per MH & 2MH/W @ $0.10 per kWh)
1GH rig = $1000 hardware cost + $438 per year.  Total cost over 3 years = $2314.

FPGA Rig (22MH/W @ $0.10 per kWh)
1GH rig (@ $2.50 per MH) = $2500 hardware costs + $40 per year.  Total cost over 3 years = $2620.
1GH rig (@ $2.00 per MH) = $2000 hardware costs + $40 per year.  Total cost over 3 years = $2120.
1GH rig (@ $1.50 per MH) = $1500 hardware costs + $40 per year.  Total cost over 3 years = $1620.

Given FPGAs' massively lower operating costs, if they even get close to GPUs they are the smart place to deploy new hardware.
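The comparison above can be reproduced mechanically. A minimal sketch, assuming three years of electricity in every row (the $2314 GPU total works out to $1000 hardware plus three years at $438/yr):

```python
def total_cost(price_per_mh, mh_per_watt, years=3, gh=1.0, elec=0.10):
    """Hardware plus electricity over the rig's life, for a `gh` GH/s rig.

    price_per_mh: hardware cost in $ per MH/s
    mh_per_watt:  hashing efficiency
    elec:         electricity price in $ per kWh
    """
    mh = gh * 1000
    hardware = mh * price_per_mh
    kw = mh / mh_per_watt / 1000          # rig power draw in kW
    electricity = kw * 24 * 365 * elec * years
    return hardware + electricity

print(round(total_cost(1.0, 2)))    # GPU rig:  ~$2314
print(round(total_cost(2.5, 22)))   # FPGA rig @ $2.50/MH: ~$2619
print(round(total_cost(1.5, 22)))   # FPGA rig @ $1.50/MH: ~$1619
```

The FPGA totals land within a dollar of the quoted figures (the post rounds the $39.8/yr electricity bill up to $40).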
ElectricMucus
Legendary
Activity: 1540

Drama Junkie
November 07, 2011, 09:54:00 PM
 #94

Why aren't there any PCIe cards with only a few FPGAs & some power converters on them?

That would be the most cost-effective solution, as modern FPGAs have native PCIe endpoints, and PCIe even has a JTAG interface built in. All we need is drivers, a 2-layer PCB, and a panel sheet to mount it.

First they ignore you, then they laugh at you, then they keep laughing, then they start choking on their laughter, and then they go and catch their breath. Then they start laughing even more.
eldentyrell
Donator
Legendary
Activity: 966

felonious vagrancy, personified
November 07, 2011, 10:30:15 PM
 #95

Why aren't there any pcie cards with only a few fpgas & some power converters on it?

Lack of demand.

Other than bitcoin, there are not many uses for large FPGAs outside of (1) development boards and (2) integration into a product of some kind (like a router) in which the user is not even aware that FPGAs are involved.  The second one is where Xilinx's large-device profits come from.

If you're buying a dev board, you're either an academic (in which case Xilinx cuts you a special deal) or you don't mind paying $4,000+ for a card with all sorts of doo-dads you'll never use since odds are the development costs of the product you're working on make expenses like this irrelevant.  That's why with each generation of chips you see Xilinx (or one of its partners) produce some uber-board with everything-and-the-kitchen-sink on it.  FWIW, many of these "kitchen sink" boards have PCIe interfaces.

The printing press heralded the end of the Dark Ages and made the Enlightenment possible, but it took another three centuries before any country managed to put freedom of the press beyond the reach of legislators.  So it may take a while before cryptocurrencies are free of the AML-NSA-KYC surveillance plague.
ngzhang
Hero Member
Activity: 592

We will stand and fight.
November 08, 2011, 05:02:24 PM
 #96

Why aren't there any pcie cards with only a few fpgas & some power converters on it?

That would be the most cost effective solution as modern fpgas have native pcie endnodes and pcie even has a jtag interface built in. All we need is drivers, a 2 layer pcb and a panel sheet to mount it.

Why use PCIE interfaces? USB1.1 is much better.

CEO of Canaan-creative, Founder of Avalon project.
https://canaan.io/
Business contact: love@canaan.io
All PMs will be unread.
DeathAndTaxes
Donator
Legendary
Activity: 1218

Gerald Davis
November 08, 2011, 05:13:57 PM
 #97

Why aren't there any pcie cards with only a few fpgas & some power converters on it?

That would be the most cost effective solution as modern fpgas have native pcie endnodes and pcie even has a jtag interface built in. All we need is drivers, a 2 layer pcb and a panel sheet to mount it.

Why use PCIE interfaces? USB1.1 is much better.

I am thinking scalability and density.  Eventually, if Bitcoin grows and flourishes, mining will move beyond hobbyists and garages full of open rigs full of noisy cards into high-density, datacenter-capable designs.  Getting a large number of GPUs into a rack-mount server is unviable due to the thermal load.  FPGAs may make that possible someday.  A PCIe board can supply power and data over a single connector, which makes deployment easier.  More importantly, it provides a way to securely mount multiple FPGAs using existing standards (off-the-shelf motherboards, rackmount chassis, ATX power supplies, etc.).

I would love someday to be able to put an FPGA array in a co-location datacenter to reduce the risk of loss due to theft, power, fire, or damage.  A full-length board would be able to mount maybe 5 FPGAs for a half-height board and maybe 10 for a full-height board.  That creates some interesting datacenter-quality arrays.  A 2U server could mount 4 boards, 20 FPGAs (or more).  That is ~4GH/s on 300W in 2U of space.  A standard datacenter rack could hold 80GH/s of hashing power, run on a single 30A 208V power connection, and make things like remote power control, KVM over IP, and enterprise-grade redundant power supplies more economical.



bulanula
Hero Member
Activity: 518
November 08, 2011, 08:34:33 PM
 #98

Why aren't there any pcie cards with only a few fpgas & some power converters on it?

That would be the most cost effective solution as modern fpgas have native pcie endnodes and pcie even has a jtag interface built in. All we need is drivers, a 2 layer pcb and a panel sheet to mount it.

Why use PCIE interfaces? USB1.1 is much better.

I am thinking scalability and density.  Eventually Bitcoin will move beyond hobbyists and open boards into high-density datacenter designs.  Getting a large number of GPUs into a rack-mount server is simply impossible due to the thermal load.  FPGAs may make that possible someday.  I see that as the endgame for FPGAs.  A PCIe board can supply power and data over a single connector.  It also makes a convenient way to mount multiple FPGAs inside a standardized chassis.  I would love someday to put an FPGA array in a co-location datacenter to reduce the risk of loss due to theft, power, fire, or damage.

A full-length board would be able to mount maybe 5 FPGAs for a half-height board and maybe 10 for a full-height board.  That creates some interesting datacenter-quality arrays.  A 2U server could mount 4 boards, or 20 FPGAs total, for ~4GH/s using maybe 300W for the entire system (at the wall).  A standard datacenter rack could hold 80GH and run on a single 30A 208V power connection.  The higher density would make things like remote power control and KVM over IP economical.

Too bad the demand is too low now. I think BFL Labs is a scam too. I mean, why go through all that development when the price of BTC could crash any day and people would stop buying mining equipment?  Even in other industries FPGAs are almost never heard of. I had never heard about FPGAs until Bitcoin.
aTg
Legendary
Activity: 1302
November 08, 2011, 08:38:43 PM
 #99

I am thinking scalability and density.  Eventually Bitcoin will move beyond hobbyists and open boards into high-density datacenter designs.  Getting a large number of GPUs into a rack-mount server is simply impossible due to the thermal load.  FPGAs may make that possible someday.  I see that as the endgame for FPGAs.  A PCIe board can supply power and data over a single connector.  It also makes a convenient way to mount multiple FPGAs inside a standardized chassis.  I would love someday to put an FPGA array in a co-location datacenter to reduce the risk of loss due to theft, power, fire, or damage.

I was thinking exactly that, but couldn't we start from here with that standard design for a rack?
I think that having many small modules, each with a single FPGA, is not efficient: you spend on individual fans, and, especially since a single USB controller could handle an entire plate of FPGAs, each module in the rack could instead be connected via USB to a hub and a computer within the same cabinet.
DeathAndTaxes
Donator
Legendary
Activity: 1218

Gerald Davis
November 08, 2011, 08:48:12 PM
 #100

I was thinking exactly that, but couldn't we start from here with that standard design for a rack?
I think that having many small modules, each with a single FPGA, is not efficient: you spend on individual fans, and, especially since a single USB controller could handle an entire plate of FPGAs, each module in the rack could instead be connected via USB to a hub and a computer within the same cabinet.

Agreed, but PCIe "solves" 4 problems:

1) power distribution
2) data connectivity
3) standardized mounting
4) server-scale cooling rather than individual board cooling

Sure, you could have larger boards, figure out a way to rig USB cables through a hub to the host, run custom power lines to each of them, and then figure out some non-standard method to securely mount and cool them.  However, using PCIe allows you to leverage existing technology like chassis with redundant midplane cooling, backplanes for securely mounting cards, and ATX motherboards for connectivity and power.  I don't think we will see PCIe solutions anytime soon, but on the other hand, if Bitcoin is around in 5 years, I can't imagine that the "solution" will be a bunch of USB boards jury-rigged inside a case connected to a USB hub.

For example, take a look at this "industrial chassis":
http://www.adlinktech.com/PD/marketing/Datasheet/RK-440/RK-440_Datasheet_1.pdf

Notice the midplane fans designed to cool expansion cards, and the 18 expansion slots.  It uses a "single board computer" where the "motherboard" is actually mounted perpendicular to a backplane, just like any other expansion card.  This is the kind of setup used for other "industrial" servers, like cable video multiplexing, high-speed network switching, digital signal processing, etc.