Sunday, March 27, 2016

Hacking GPU PCIe power connections

Until recently, I never thought much about PCIe power connectors.  Three 12V power wires and three ground wires was all I thought there was to them.  I thought it was odd that the 8-pin connectors just added two more ground pins and not another power pin, but never bothered to look into it.  That all changed when I got a new GPU card with a single 8-pin connector.

My old card had two 6-pin connectors, which I had plugged a 1-to-2 18AWG splitter cable into.  That was connected to a 16AWG PCIe power cable, which is good for about 200W at a drop of under 0.1V.  My new card with the single 8-pin connector wouldn't power up with just a 6-pin plug installed.  Using my multimeter to test for continuity between the pins, I realized that it's not just a row of 12V pins and a row of ground pins.  There was continuity between the three 12V pins, and between three of what I thought were five ground pins.  After searching for the PCIe power connector pinout, I found out why.
(Pinout diagram, edited from an external source)

Apparently some 6-pin PCIe cables only have 2 12V wires, 2 ground wires, and a grounded sense wire (blue in the diagram above).  With just two 12V wires, a crap 18" 20AWG PCIe power cable would have a drop of over 0.1V at 75W.  Since the 8-pin connector guarantees three 12V pins, it can provide 50% more power.  My 16AWG 6-pin PCIe cable would have a voltage drop of only 40mV at 75W, so I just needed to figure out a way to trick the GPU card into thinking I had an 8-pin connector plugged in.  The way to do that is to ground the second sense pin (green in the diagram above).
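The voltage-drop figures above are just Ohm's law applied to standard copper wire resistance per AWG.  Here's a rough sketch of the arithmetic (the 18" length and the specific cables are assumptions for illustration; connector contact resistance is ignored, so real-world drops run somewhat higher than these numbers):

```python
# Round-trip voltage drop across a PCIe power cable, wire resistance only.
# Connector contact resistance (several milliohms per pin) adds to this,
# which is likely why measured drops exceed these calculated values.
# Standard copper resistance in ohms per foot:
R_PER_FOOT = {16: 0.004016, 18: 0.006385, 20: 0.01015}

def vdrop(power_w, n_pairs, awg, length_in, volts=12.0):
    """Drop across the 12V wires plus the ground return, for power_w
    watts split evenly over n_pairs 12V/ground wire pairs."""
    i_per_wire = power_w / volts / n_pairs
    r_one_way = R_PER_FOOT[awg] * (length_in / 12.0)
    return 2 * i_per_wire * r_one_way  # supply drop + return drop

# Cheap 18" 20AWG cable with only two 12V wires, at 75W:
print(round(vdrop(75, 2, 20, 18) * 1000, 1), "mV")  # ~95 mV from wire alone
# 18" 16AWG cable with three 12V wires, at 75W:
print(round(vdrop(75, 3, 16, 18) * 1000, 1), "mV")  # ~25 mV
```

Wire resistance alone puts the cheap cable near 0.1V of drop; add a few milliohms at each crimped contact and it goes over.  The 16AWG cable stays comfortably low either way.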

I didn't want the modification to be permanent, so soldering a wire to the sense pin was out.  The PCIe power connectors use the same kind of pins as ATX power connectors, and I had an old ATX power connector I had cut from a dead PSU.  To get one of the female contacts out, I used a hack saw to cut the ATX connector apart.  Not pretty, but I'm no maker, I'm a hacker. :-)  I stripped the end of the wire (red in the first photo) and wrapped the bare part around the screw that holds the card bracket in the case.  I powered up the computer, and the video card worked perfectly.

Looking for a cleaner solution, I decided to make a jumper wire to go between the sense pin and the adjacent ground.  I also did some searching on better ways to remove the female contacts from the connectors.  For this, there's a good technique using staples.  When the staples aren't enough to get the contacts out, I found a finish-nail counter-sink punch helps.

Here's the end result, using a marrette (wire nut) to make the jumper:

See my related post Powering GPU Mining Rigs.


  1. Awesome! I'll give it a try. Thank you :)

  2. Thanks for the info on the pinouts, also the link to the staple tip! Resolved a couple of issues for me. Actually, though, the 8-pin connector provides only 50% more power; that is 150% as much as the 6-pin. People get into trouble a lot when using percentages for comparison.

    1. Good eye. I've been meaning to correct that typo for ages, and finally did.

  3. Funny, because certain configurations I have been coming across at work prompted me to look this up. We have end-users that will oftentimes purchase GTX 1080s to stick into our machines, typically Precision 5810 and 3600 models. The last guy I saw set one of these up plugged in the 6-pin connector, and I told him it's probably not going to POST because the card requires an 8-pin connector and these Precisions only come with a 6-pin connector by default. We decided to try it, and it indeed displayed and worked.
    Now, I don't know that it's performing optimally, and these are developers and engineers that do leverage the power of these cards, which is why they have them... but is there any way to tell if these are being used to their full capacity? I would imagine that if the 6-pin connector wasn't going to work, you'd get no display or poor performance, and I've heard no complaints.

    I read an article stating that you can indeed deliver 150 watts over a 6-pin connector, but 6-pin connectors are only rated at 75 watts. So does that mean that there are circumstances where a graphics card can draw 150 watts of power over six pins? Or was that just a hypothetical notion?

    1. The 75 watts for a 6-pin is just a standard; they are supposed to be designed with thick enough wires and pins to handle 75W without overheating. Lots of them can easily handle 100W or more. Some cheap crap 6-pin power cables might even melt at 50W, especially if the wires aren't sized correctly.

    2. Hi, so I did some research on the matter and found your findings to be perfectly in line with what I've found elsewhere and already knew. As a general rule, PSU 12V rails can do 18A, which equates to far more than the mere 150 watts that 8-pins are rated for. I know you know that already, though. I wouldn't recommend running either at 20A, but if you, for instance, did a BIOS hack on a GPU to further overclock it, you could easily add 5-20 watts per 12V rail and be fine :)

    3. Like Ralph said, it's nothing more than a standard and partially a MASSIVE safeguard; you could more than likely pull 150 watts over a 6-pin from your average PSU's 12V rail and be fine, and 5-20 watts over 75 would be more than safe if you were interested in a slight BIOS hack. NEVER pull over 75 watts through the PCI-E slot itself though; as far as I can tell, that WILL burn out the slot.