Why won't the rest of the PCI-e slots be detected after installing a graphics card in the x4 slot?

JBaETH Member Posts: 40
I am setting up a mining rig comprising 5 graphics cards using this motherboard, the Gigabyte P35-DS3P. It has the following PCI-e slots, as listed on the Gigabyte page:
  • 1x x16 PCI-e slot
  • 3x x1 PCI-e slots
  • 1x x4 PCI-e slot
I have been able to plug in 4 graphics cards using powered PCI-e risers/extenders, in the following arrangement, and they work fine:
3x R9 280X cards, one in each of the x1 slots.
1x R9 280X card in the x16 slot, using an x1 riser cable.

Now when I plug the 5th card into the x4 slot, only 1 card gets detected in Windows Device Manager, i.e. the one plugged into the x4 slot. I have tried installing it both with and without the risers, but with the same result.

Can someone suggest a workaround for this? Thanks!

Comments

  • Jukebox Member Posts: 554 ✭✭✭
    edited March 2016
    Look carefully at the motherboard manual, page 8:
    http://download1.gigabyte.eu/Files/Manual/motherboard_manual_ga-p35-ds3p_2.0_e.pdf

    The upper PCI-E x16 slot is controlled by the P35 northbridge, which provides 16 PCI-E lanes.
    The bottom PCI-E x16 slot (x4 electrically) is controlled by the ICH9R southbridge.

    The ICH9R southbridge has only 6 PCI-E lanes. 2 of them are used by the Gigabit LAN and the Gigabyte SATA 2 chip.
    The other 4 are used either for the bottom PCI-E x16 slot OR for the 3 PCI-E x1 slots.

    You can try disabling the Gigabyte SATA 2 chip in the BIOS - maybe then you can use the bottom PCI-E x16 slot together with the 3 PCI-E x1 slots.
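    If a Linux live USB is handy, the enumeration can be checked from the command line rather than by eyeballing Device Manager. A minimal sketch, using a hypothetical `lspci` sample (the device names below are made up; on a real system just pipe `lspci` straight into `grep`):

    ```shell
    # Hypothetical sample of `lspci` output; on a live system replace the
    # variable with the real thing:  lspci | grep -c 'VGA compatible controller'
    lspci_output='01:00.0 VGA compatible controller: AMD/ATI Tahiti XTL [Radeon R9 280X]
    02:00.0 VGA compatible controller: AMD/ATI Tahiti XTL [Radeon R9 280X]
    03:00.0 VGA compatible controller: AMD/ATI Tahiti XTL [Radeon R9 280X]
    04:00.0 VGA compatible controller: AMD/ATI Tahiti XTL [Radeon R9 280X]
    05:00.0 SATA controller: JMicron JMB363'

    # Count the GPUs the kernel has enumerated; only VGA lines match.
    echo "$lspci_output" | grep -c 'VGA compatible controller'
    ```

    If the count drops when the 5th card goes into the x4 slot, that points at the lane-sharing issue above rather than at drivers.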
  • Logicaluser Member Posts: 214 ✭✭
    Fascinating, I'd never really considered using Core 2 Duo era boards.... (shit, I actually have an EP35-DS3R somewhere)
    What kind of hashrate are you getting off those 280Xs on such a setup?
  • adaseb Member Posts: 1,043 ✭✭✭
    Tape up your GPU's pins so you force it to run in x1 mode instead of x16.

    Just use clear tape, and only leave the x1 rail uncovered.
  • o0ragman0o Member, Moderator Posts: 1,254 mod
    You can also try to force x1 detection by hotwiring the PCI-E slots...
  • JBaETH Member Posts: 40

    Fascinating, I'd never really considered using Core 2 Duo era boards.... (shit, I actually have an EP35-DS3R somewhere)
    What kind of hashrate are you getting off those 280Xs on such a setup?

    My intention is to shorten the ROI period as much as I can, so I found the cheapest board on the market with the maximum number of PCIe slots.

    4 R9 280X cards are running at 1150/1500 and generating ~88 MH/s for nyoc. Not bad, I'd say.

    If only I could get the 5th card running.
  • JBaETH Member Posts: 40

    You can also try to force x1 detection by hotwiring the PCI-E slots...

    Hi @o0ragman0o, thanks for your reply. Actually, I don't need to hotwire anything, since the x1 slots are already being detected. I took your advice nonetheless, but no success. All the x1 slots disappear from Device Manager once a GPU is installed in the x4 slot, whether on-board or via a riser cable.
  • JBaETH Member Posts: 40
    adaseb said:

    Tape up your GPU's pins so you force it to run in x1 mode instead of x16.

    Just use clear tape, and only leave the x1 rail uncovered.

    Why would I want to tape it when I already have an x1-to-x16 riser that can be plugged into the x16 slot?
  • entelechy Member Posts: 24
    @JBaETH Are you running Windows or Linux out of curiosity?
  • JBaETH Member Posts: 40
    entelechy said:

    @JBaETH Are you running Windows or Linux out of curiosity?

    Windows 7
  • JBaETH Member Posts: 40
    Jukebox said:

    Look carefully at the motherboard manual, page 8:
    http://download1.gigabyte.eu/Files/Manual/motherboard_manual_ga-p35-ds3p_2.0_e.pdf

    The upper PCI-E x16 slot is controlled by the P35 northbridge, which provides 16 PCI-E lanes.
    The bottom PCI-E x16 slot (x4 electrically) is controlled by the ICH9R southbridge.

    The ICH9R southbridge has only 6 PCI-E lanes. 2 of them are used by the Gigabit LAN and the Gigabyte SATA 2 chip.
    The other 4 are used either for the bottom PCI-E x16 slot OR for the 3 PCI-E x1 slots.

    You can try disabling the Gigabyte SATA 2 chip in the BIOS - maybe then you can use the bottom PCI-E x16 slot together with the 3 PCI-E x1 slots.

    This is by far the most useful answer.
    And oh, my mobo even has a manual :wink:
    I'll take your advice and try turning off the SATA 2 chip to see if that makes it work.

    Thank you so much for taking the time to do the research that I should have done myself. Jukebox, you're amazing.
  • MrYukonC Member Posts: 627 ✭✭✭
    @JBaETH @Jukebox

    Man, the motherboard situation is a disaster. I've been using a particular Gigabyte board that has worked without issues with 6 GPUs active (I have 3 rigs that use it).

    But they've apparently stopped making it, so I ordered an MSI Intel-based board (Z170-A PRO, LGA 1151) and it's a disaster.

    It has 6 PCIe slots, but with all 6 GPUs plugged in, it only recognizes 5 of them and the onboard NIC doesn't work. WTF?

    And there is no option in the BIOS to disable some of the SATA ports, which are also on the PCIe bus.

    So, in order to get the 6th GPU recognized, I had to disable the onboard NIC, but that leaves me without networking, which is obviously required. So I had to order a USB NIC (since USB doesn't run through the PCIe bus). Total disaster.
  • JBaETH Member Posts: 40
    MrYukonC said:

    @JBaETH @Jukebox

    Man, the motherboard situation is a disaster. I've been using a particular Gigabyte board that has worked without issues with 6 GPUs active (I have 3 rigs that use it).

    Which mobo are we talking about, exactly?
    MrYukonC said:

    But they've apparently stopped making it, so I ordered an MSI Intel-based board (Z170-A PRO, LGA 1151) and it's a disaster.

    It has 6 PCIe slots, but with all 6 GPUs plugged in, it only recognizes 5 of them and the onboard NIC doesn't work. WTF?

    Were you able to see the 6th card in Device Manager or 'lspci' prior to disabling the onboard NIC? Did you also have to use the '6 GPU mod' to get it working, and did you eventually get the 6th card running?
  • MrYukonC Member Posts: 627 ✭✭✭
    edited March 2016
    @JBaETH
    JBaETH said:

    MrYukonC said:

    @JBaETH @Jukebox

    Man, the motherboard situation is a disaster. I've been using a particular Gigabyte board that has worked without issues with 6 GPUs active (I have 3 rigs that use it).

    Which mobo are we talking about exactly?
    GA-F2A88X-UP4 -- http://amazon.com/gp/product/B00FBCCKIW

    It's a rock-solid mobo. Never had to mess with anything in the BIOS. All 6 cards detected on the first try, onboard LAN works with the cards present, etc. It even has a power button on the board, which is nice.
    JBaETH said:


    MrYukonC said:

    But they've apparently stopped making it, so I ordered an MSI Intel-based board (Z170-A PRO, LGA 1151) and it's a disaster.

    It has 6 PCIe slots, but with all 6 GPUs plugged in, it only recognizes 5 of them and the onboard NIC doesn't work. WTF?

    Were you able to see the 6th card in Device Manager or 'lspci' prior to disabling the onboard NIC? Did you also have to use the '6 GPU mod' to get it working, and did you eventually get the 6th card running?
    Prior to disabling the onboard NIC, lspci would only list 5 of the GPUs. After disabling the NIC, the 6th one showed up.
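    That before/after check can be scripted instead of counted by hand. A sketch with hypothetical `lspci` excerpts (all device names below are made up for illustration) showing the 6th GPU appearing once the NIC's lane is freed:

    ```shell
    # Hypothetical `lspci` excerpts before/after disabling the onboard NIC in
    # the BIOS; on a real rig, capture these with:  lspci > before.txt
    before='01:00.0 VGA compatible controller: GPU 1
    02:00.0 VGA compatible controller: GPU 2
    03:00.0 VGA compatible controller: GPU 3
    04:00.0 VGA compatible controller: GPU 4
    05:00.0 VGA compatible controller: GPU 5
    06:00.0 Ethernet controller: onboard NIC'
    after='01:00.0 VGA compatible controller: GPU 1
    02:00.0 VGA compatible controller: GPU 2
    03:00.0 VGA compatible controller: GPU 3
    04:00.0 VGA compatible controller: GPU 4
    05:00.0 VGA compatible controller: GPU 5
    06:00.0 VGA compatible controller: GPU 6'

    # Count enumerated GPUs in each capture.
    echo "before: $(echo "$before" | grep -c 'VGA compatible controller') GPUs"
    echo "after:  $(echo "$after" | grep -c 'VGA compatible controller') GPUs"
    ```

    Diffing the two captures also shows exactly which device gave up its lane.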

    I wasn't able to mine, though, because I didn't have a damn network connection at that point. :|

    I have a USB wireless NIC arriving today, intended for another rig, but I'll hijack it until tomorrow when the USB wired NIC shows up. Talk about a 3-ring circus!

    Edit: What is the "6 GPU mod"?
  • JBaETH Member Posts: 40
    MrYukonC said:

    @JBaETH

    Edit: What is the "6 GPU mod"?

    This.
    Good luck, mate! :)
