dbalz
Newbie
Offline
Activity: 20
Merit: 0
|
|
July 15, 2018, 05:01:34 AM |
|
Where is the chart, I can't find a link anywhere!
|
|
|
|
GPUHoarder (OP)
Member
Offline
Activity: 154
Merit: 37
|
|
July 15, 2018, 05:13:42 AM Last edit: July 15, 2018, 05:28:07 AM by GPUHoarder |
|
All risers support 3.0 (x1). If the motherboard supports only 2.0, then the riser will do 2.0 as well.
It's simple: the USB 3 cables don't have the quality to negotiate PCIe 3.0. Think about it: USB 3 cables are made for 8b/10b, the same encoding PCIe 2.0 runs, but PCIe 3.0 uses 128b/130b. There are faster standards like USB 3.1 Gen 2 or USB 3.2 that do use 128b/130b, but I would imagine they also have much shorter cables, and they aren't cheap. EDIT: Even at PCIe 2.0 these USB risers suck. If you get "GPU has fallen off the bus" errors in your syslog and your GPUs aren't overclocked too hard, you can usually fix it by forcing PCIe 1.0. So the risers are iffy even for 2.0; you say they can do 3.0, I say NO!

8b/10b vs 128b/130b has absolutely nothing to do with this. That is simply the line encoding for DC balancing, ensuring a periodic bit flip so the transceivers can stay locked. It was changed in PCIe 3.0 thanks to improved transceivers and to lower overhead.

Also, for everyone checking lspci: most GPUs nowadays are designed for low-power idle. They run the link at the lowest speed unless they are actually being used for transfers! Run a bandwidth-heavy test or workload and then run lspci; you'll see LnkSta kick up to 3.0 (8 GT/s).

We just spent weeks testing dozens of risers and slots on three dozen motherboards, while monitoring PCIe for any errors, and I haven't found a single riser that can't sustain PCIe 3.0 speeds. I have, however, found crap "mining" motherboards that throw PCIe errors on their slots even with a x1 device plugged straight in. I'll take that actual testing over your anecdote.

Software (miner) and APIs: this is a pre-order for the hardware. The software will launch when the hardware ships in late August. We've already asked devs to reach out for API and SDK information, which will be available before launch. Developer boards haven't even been sent out to other devs yet.
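For readers keeping score, the encoding-overhead argument works out as follows. A quick back-of-envelope sketch in Python, using the standard PCIe per-lane line rates (not measurements from these risers):

```python
# Rough effective-bandwidth math behind the 8b/10b vs 128b/130b point.
# Line rates are the standard PCIe per-lane figures.

def effective_gbps(line_rate_gt: float, payload_bits: int, symbol_bits: int) -> float:
    """Usable data rate (Gbps) after line-encoding overhead."""
    return line_rate_gt * payload_bits / symbol_bits

gen2 = effective_gbps(5.0, 8, 10)     # PCIe 2.0: 5 GT/s with 8b/10b
gen3 = effective_gbps(8.0, 128, 130)  # PCIe 3.0: 8 GT/s with 128b/130b

print(f"PCIe 2.0 x1: {gen2:.2f} Gbps usable")  # 4.00
print(f"PCIe 3.0 x1: {gen3:.2f} Gbps usable")  # ~7.88
```

The takeaway: going from 2.0 to 3.0 on a single lane roughly doubles usable bandwidth, and only ~1.5% of the 3.0 line rate is lost to encoding, versus 20% for 8b/10b.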
|
|
|
|
Marvell2
|
|
July 15, 2018, 06:27:27 AM |
|
Software (miner) and APIs: this is a pre-order for the hardware. The software will launch when the hardware ships in late August. We've already asked devs to reach out for API and SDK information, which will be available before launch. Developer boards haven't even been sent out to other devs yet.

Darn, so to use the miner you have to have an Acorn running? And if you have, say, one Nest adapter board with one or two Acorns installed, would the miner work for all cards on the rig, or only the Acorn-boosted GPUs?
|
|
|
|
dsmn
Jr. Member
Offline
Activity: 68
Merit: 4
|
|
July 15, 2018, 08:09:05 AM |
|
8b/10b vs 128b/130b has absolutely nothing to do with this. That is simply the line encoding for DC balancing, ensuring a periodic bit flip so the transceivers can stay locked. It was changed in PCIe 3.0 thanks to improved transceivers and to lower overhead.
Also, for everyone checking lspci: most GPUs nowadays are designed for low-power idle. They run the link at the lowest speed unless they are actually being used for transfers! Run a bandwidth-heavy test or workload and then run lspci; you'll see LnkSta kick up to 3.0 (8 GT/s).
We just spent weeks testing dozens of risers and slots on three dozen motherboards, while monitoring PCIe for any errors, and I haven't found a single riser that can't sustain PCIe 3.0 speeds. I have, however, found crap "mining" motherboards that throw PCIe errors on their slots even with a x1 device plugged straight in. I'll take that actual testing over your anecdote.
Software (miner) and APIs: this is a pre-order for the hardware. The software will launch when the hardware ships in late August. We've already asked devs to reach out for API and SDK information, which will be available before launch. Developer boards haven't even been sent out to other devs yet.

I must have some really shitty risers then, because they refuse to do 3.0 on any of my boards, including normal non-mining boards, and not only on the chipset's PCIe ports but also on the CPU's x16 port. As for the lspci idle thing: if you were seeing 2.5 GT/s, that could be it, but I'm getting 5 GT/s, and that's while mining ETH, so the bus is in use. And yes, some chipsets have shitty PCIe that throws errors with just a standard network card, and it's not only mining boards; I've seen it on normal B250 boards too.
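For anyone checking their own links: the 5 GT/s figure dsmn quotes comes from the `LnkSta:` line that `sudo lspci -vv` prints per device. A toy Python helper for mapping that line to a PCIe generation is sketched below; the sample string is illustrative, not captured from real hardware:

```python
import re

# Toy parser for the "LnkSta:" line printed by `sudo lspci -vv`.
# 2.5 GT/s = gen 1.x, 5 GT/s = gen 2.0, 8 GT/s = gen 3.0, 16 GT/s = gen 4.0.
SPEED_TO_GEN = {"2.5": "1.x", "5": "2.0", "8": "3.0", "16": "4.0"}

def parse_lnksta(line: str):
    """Return (generation, lane width) from an lspci LnkSta line, or None."""
    m = re.search(r"Speed\s+(\d+(?:\.\d+)?)GT/s.*Width\s+x(\d+)", line)
    if not m:
        return None
    return SPEED_TO_GEN.get(m.group(1), "unknown"), int(m.group(2))

sample = "LnkSta: Speed 5GT/s, Width x1, TrErr- Train- SlotClk+ DLActive-"
print(parse_lnksta(sample))  # ('2.0', 1)
```

Run this against the GPU's entry while a bandwidth-heavy load is active, since (as noted above) an idle card may have downtrained its link.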
|
|
|
|
vuli1
Jr. Member
Offline
Activity: 238
Merit: 3
|
|
July 15, 2018, 08:48:59 AM |
|
I think on Z77 I maxed out at PCIe 1.1 with a Celeron and an HD 6870, but on Z170 all my RX cards are running at 3.0 (006 ver.). Either way, 1.0 is more than enough for mining cards. For the Acorn, not sure, I didn't pay attention.
|
|
|
|
Longsnowsm
|
|
July 15, 2018, 12:46:30 PM |
|
I think this looks very interesting, especially the 215+ standalone mining mentioned on the website. I am going to hold off and hope you guys share your experiences and recommendations on the right motherboard combos for these rigs. I would like to avoid the fumbling around I typically do on these things and just get something working, instead of spending the next weeks and months trying to figure it out.
Thanks to all you guys in this thread who are mentioning what you are building with and what you are thinking. Looking forward to the reports on how this works. I would love to see how it does in standalone mode.
|
|
|
|
ronnieb
Jr. Member
Offline
Activity: 199
Merit: 1
|
|
July 15, 2018, 01:57:10 PM |
|
What's the cheapest board I can get away with for 4 RX 580s and an Acorn?
|
|
|
|
ronnieb
Jr. Member
Offline
Activity: 199
Merit: 1
|
|
July 15, 2018, 01:58:38 PM |
|
Why not set up 32 Acorn CLE-215+s on an Octominer and let them mine by themselves using little power?

The Octominer is completely starved of PCIe lanes, which is precisely what you need to run lots of Acorns. The Acorn CLE-215+ needs PCIe 2.0 x4 (or equivalent) speed to operate, and ideally the GPU(s) it is paired with would have that speed as well. Every mining-specific motherboard that I am aware of limits you to x1 speed (either gen 2.0 or 3.0), which is too slow to be as effective. Take a few minutes to research PCI Express lanes and how the different generations compare speed-wise. Then research your motherboard(s): see how many PCIe lanes they have (don't forget the ones from the CPU) and how they are routed. You will find that the cheap motherboards we like to buy as miners are poorly suited for Acorn acceleration.

All I have is cheap boards; looks like I may need an upgrade.
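To put numbers on the lane argument above (standard PCIe line rates and encodings, decimal MB/s; a sketch, not board-specific measurements):

```python
# Approximate usable bandwidth per link configuration, showing why a
# PCIe 2.0 x4 requirement rules out x1-only mining boards.

def usable_mbps(line_rate_gt: float, payload: int, symbol: int, lanes: int) -> float:
    """Usable bandwidth in MB/s after encoding overhead, across all lanes."""
    gbps = line_rate_gt * payload / symbol * lanes
    return gbps * 1000 / 8  # Gbps -> MB/s (decimal)

links = {
    "2.0 x1": usable_mbps(5.0, 8, 10, 1),     # 500 MB/s
    "3.0 x1": usable_mbps(8.0, 128, 130, 1),  # ~985 MB/s
    "2.0 x4": usable_mbps(5.0, 8, 10, 4),     # 2000 MB/s
}
for name, mb in links.items():
    print(f"PCIe {name}: ~{mb:.0f} MB/s")
```

Even a full PCIe 3.0 x1 slot (~985 MB/s) delivers less than half the 2.0 x4 bandwidth the post says the Acorn wants, which is why x1-only mining boards fall short.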
|
|
|
|
dragonmike
|
|
July 15, 2018, 02:18:22 PM |
|
Can anybody tell me if I could make good use of a 101 in the M.2 slot of my 7x RX 570 rigs powered by weaksauce Celeron 1620s?
|
|
|
|
crazydane
|
|
July 15, 2018, 03:22:04 PM Last edit: July 15, 2018, 05:51:07 PM by crazydane Merited by vapourminer (1) |
|
I changed a 2nd Asus PRIME Z270-A from Gen2 to Gen3. This one is running Win10 and 7x 1080 Tis. Unfortunately, one of the GPUs dropped out after the change: that GPU is not being assigned an interrupt line. All 7 GPUs in this rig are identical and use identical risers. I switched back to Gen2 and all 7 GPUs are working again. I'll try this on another Win10 rig (identical to this one) to see if I run into the same issue.
|
|
|
|
philipma1957
Legendary
Offline
Activity: 4256
Merit: 8587
'The right to privacy matters'
|
|
July 15, 2018, 03:26:53 PM |
|
I think on Z77 I maxed out at PCIe 1.1 with a Celeron and an HD 6870, but on Z170 all my RX cards are running at 3.0. Either way, 1.0 is more than enough for mining cards.

I have 4 or 5 Z170 boards from Biostar: https://www.amazon.com/Biostar-RACING-Z170GT7-Intel-Motherboard/dp/B06XDKHN1S
I think that these boards with the Nest x2G and Acorn 215+ will be true beasts.
|
|
|
|
TheYankeesWin!
|
|
July 15, 2018, 03:29:28 PM |
|
Phil,
if you buy 4x x2G Nests = $400, and 8x Acorn 215+ = $2,800.
$3,200 to fill the board = $$
|
|
|
|
philipma1957
Legendary
Offline
Activity: 4256
Merit: 8587
'The right to privacy matters'
|
|
July 15, 2018, 03:31:17 PM |
|
@yankees I have 12 1080 Tis and 2 1050 Tis.
I don't need to fill the boards.
I could do 2 Nests per board and 2 1080 Tis.
|
|
|
|
Marvell2
|
|
July 15, 2018, 04:40:25 PM |
|
The jury is still out anyway on whether it will be better to buy next-gen cards vs these FPGA deals. Too many moving parts IMO: you need new motherboards, specific CPUs, etc. How about I grab a GTX 1180 for $600 that hashes at straight-up twice the rate of a 1080, and worry about FPGAs later when new 8-card boards with better features come out?
It makes no sense to replace hardware (motherboards and CPUs) with new gear unless you are building a new system or app-specific boards are released.
Fuck risers.
|
|
|
|
bswilmington
Newbie
Offline
Activity: 39
Merit: 0
|
|
July 15, 2018, 05:32:11 PM |
|
How about I grab a GTX 1180 for $600 that hashes at straight-up twice the rate of a 1080, and worry about FPGAs later?

Where have you found the hash rates of the new GPUs? I can't even find a release date.
|
|
|
|
Pennywis3
|
|
July 15, 2018, 05:51:58 PM |
|
How about I grab a GTX 1180 for $600 that hashes at straight-up twice the rate of a 1080, and worry about FPGAs later?

I seriously doubt you will get an 1180 for $600.
|
|
|
|
philipma1957
Legendary
Offline
Activity: 4256
Merit: 8587
'The right to privacy matters'
|
|
July 15, 2018, 07:22:14 PM |
|
Marvell2 is correct on "fuck risers". As to what is best to do, buy a shit-ton of these Acorns or wait? Waiting is better. Buying 2 and a Nest at $330 + $330 + $99 = $759 + shipping is also okay. Buying 20 of these, with or without Nests, is hard to do: it is a lot of coin up front, and a clear picture is not here yet. I.e., a 2-card Nest does x hash at x watts = I don't know. If I knew a 2-card Nest did 40 MH of X16R at 60 watts, I would like that, as a 1080 Ti does 20 MH at 200 watts.
|
|
|
|
ZedZedNova
Sr. Member
Offline
Activity: 475
Merit: 265
Ooh La La, C'est Zoom!
|
|
July 15, 2018, 08:57:43 PM |
|
If I knew a 2-card Nest did 40 MH of X16R at 60 watts, I would like that, as a 1080 Ti does 20 MH at 200 watts.

From what I understand from reading the Squirrels web pages and the Discord (before the message traffic went crazy, mostly with the same questions over and over), the CLE-215 and CLE-215+ consume up to ~15 watts each, so your proposed 2-card Nest scenario would consume ~30 watts. The question then becomes what the hash rate of the CLE-215+, the only one so far that GPUHoarder and team say can mine stand-alone, would be on X16R. The stand-alone mining numbers have not been released yet, but should be coming Soon(TM).

When a CLE-215+ is used to boost a 1080 Ti in a 1:1 configuration, you get a 76% improvement (1.76x) in hash rate on X16R, so the 20 MH of the 1080 Ti alone becomes ~35 MH when boosted, and the combined total power is 216 watts.

Additionally, it appears that using a CLE-215+ can allow dual mining. If I understand correctly, the CLE-215+ handles the core-heavy algorithm while the GPU handles the memory-heavy algorithm.
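Using only the numbers quoted in this thread (20 MH at 200 W for a bare 1080 Ti, a 1.76x boost, 216 W combined; treat these as the thread's claims, not verified benchmarks), the efficiency gain works out to:

```python
# Hash-per-watt comparison from the figures quoted above.

def efficiency_kh_per_w(mh: float, watts: float) -> float:
    """Convert MH/s and watts into kH per watt."""
    return mh * 1000 / watts

gpu_alone = efficiency_kh_per_w(20, 200)        # bare 1080 Ti on X16R
boosted = efficiency_kh_per_w(20 * 1.76, 216)   # with a CLE-215+ attached

print(f"1080 Ti alone: {gpu_alone:.0f} kH/W")  # 100
print(f"With CLE-215+: {boosted:.0f} kH/W")    # ~163
```

By these figures the Acorn improves hashes-per-watt by roughly 60%, since almost all the extra hash rate comes at ~16 W of added draw.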
|
No mining at the moment.
|
|
|
OurManInHavana
Member
Offline
Activity: 91
Merit: 10
|
|
July 15, 2018, 09:23:46 PM |
|
With current low prices for the coins an Acorn can help with, all the scenarios I've looked at show that they'll pay for themselves in 12-18 months. Is that what other people are coming up with too?
Can't wait to see some real $/day improvements when they ship: so many GPU/motherboard combos to try!
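A minimal payback sketch behind the 12-18 month estimate; the $330 unit price comes from earlier in the thread, while the daily revenue figures are placeholders to swap for your own numbers:

```python
# Simple payback-period sketch: months to recover a unit's cost from
# the extra revenue it adds. All dollar-per-day figures are placeholders.

def payback_months(unit_cost: float, extra_usd_per_day: float) -> float:
    """Months (30-day) until extra revenue covers the unit cost."""
    return unit_cost / (extra_usd_per_day * 30)

# e.g. a $330 CLE-215+ adding $0.60-$0.90/day:
for daily in (0.60, 0.75, 0.90):
    print(f"${daily:.2f}/day -> {payback_months(330, daily):.1f} months")
```

At $0.60-$0.90/day of added revenue, a $330 unit lands in roughly the 12-18 month window the post describes.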
|
|
|
|
dsmn
Jr. Member
Offline
Activity: 68
Merit: 4
|
|
July 15, 2018, 10:06:31 PM |
|
With current low prices for the coins an Acorn can help with, all the scenarios I've looked at show that they'll pay for themselves in 12-18 months. Is that what other people are coming up with too?

Hmm, they do 2x-3x the speed on some algorithms if your GPUs are set up right; if that's not a real $/day improvement, I don't know what is, hehe. The big problem right now is that we are in a major trend reversal.
|
|
|
|
|