Friday, June 3, 2016

When does 18 = 26? When buying cheap cables.


I recently bought some cheap molex to PCI-e power adapters from a seller on AliExpress.  Although there are deals for quality goods on AliExpress, I was a bit suspicious when I ordered these given just how cheap they were.  PCI-e power connectors are supposed to be rated for 75W carried over 2 conductors at 12V, which works out to about 3.1A per conductor.  In order to avoid a large voltage drop, the wires used are usually 18AWG, although 20AWG wires (with 1.6x the resistance) would be reasonably safe.

When the package arrived, I inspected the adapter cables, which were labeled 18AWG.  Despite the label, they didn't feel like 18AWG wires, which have a conductor diameter of 1mm.  I decided to do a destructive test on one of the adapters by cutting and stripping one of the wires.  The conductor measured only 0.4mm in diameter, which is actually 26AWG.  The first photo above shows a real 18AWG wire taken from an old ATX PSU next to the fake 18AWG wire from the adapter cables.

When I opened a dispute through AliExpress, things got more amusing.  I provided the photo, as well as an explanation that real 18AWG wire should be 1mm in diameter.  The seller claimed "we never heard of this before", and after exchanging a couple more messages said, "you can't say it is fake just because it is thin".  At that point I realized I was dealing with one of those "you can't fix stupid" situations.

So what would happen if I actually tried to use the adapter cables on a video card that pulls 75W through the PCI-e power connector?  Well, you can find posts on overclocking sites about cables that melted and burst into flames.  If you have a cheap PSU without short-circuit protection, when the insulation melts and the wires short, your power supply could be destroyed.  And if that happened, I'm sure the AliExpress seller is not going to replace your power supply.  How much hotter the cables would get compared to genuine 18AWG cables is a function of the resistance.  Each step up in gauge increases resistance by a factor of about 1.26, so 20AWG has 1.26^2 = 1.59 times the resistance of 18AWG.  The 26AWG wire used in these cheap adapter cables has 1.26^8, or just over 6 times, the resistance of 18AWG wire, and would heat up about 6 times as much as 18AWG for a given current.
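
If you want to check the numbers yourself, here's a quick back-of-the-envelope Python calculation (the 1.26-per-step factor is an approximation, and heating scales with I^2*R):

# Relative resistance (and heating) of thinner wire vs. real 18AWG
def resistance_ratio(awg_from, awg_to):
    # each step up in AWG multiplies resistance by roughly 1.26
    return 1.26 ** (awg_to - awg_from)

current = 75.0 / 12 / 2   # ~3.1A per conductor: 75W over two 12V wires
for awg in (20, 26):
    ratio = resistance_ratio(18, awg)
    print("{}AWG: {:.1f}x the resistance and I^2*R heating of 18AWG at {:.2f}A".format(
        awg, ratio, current))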

It could make for a fun future project: create a 75W resistive load, take an old ATX PSU, hook up the adapter cables, and see what happens.  People do seem to like pictures and videos of things bursting into flames on the internet...


Thursday, May 26, 2016

Installing Python 3.5.1 on Linux


Perl has been my go-to interpreted language for over 20 years now, but in the last few years I've been learning (and liking) Python.  Python 2.7 is a standard part of Linux distributions, and while many recent distributions include Python 3.4, Python 3.5.1 is not so common.  I'm working on some code that will use the async and await primitives, which are new in Python 3.5.  I've searched Extra Packages for Enterprise Linux and other repositories for Python 3.5 binaries, but the latest I can find is 3.4.  That means I have to build it from source.
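
For anyone who hasn't seen the new syntax yet, here's a toy example of what async and await look like in 3.5 (not from my actual project):

import asyncio

async def fetch(name, delay):
    # stand-in for some I/O-bound work
    await asyncio.sleep(delay)
    return "{} done".format(name)

async def main():
    results = await asyncio.gather(fetch("a", 1), fetch("b", 2))
    print(results)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())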

While the installation process isn't very complicated, it does require installing gcc and associated build tools first.  Since I'm installing it on a couple of servers (devel and prod), I wrote a short (10-line) install script for rpm-based Linux distributions.  Download the script, then run "sh py35.sh".  The python3.5 binary will be installed in /usr/local/bin/.

When installing pip packages for python3, use "pip3", while "pip" will install python2 packages.  And speaking of pip, you may want to update it to the latest version:
sudo /usr/local/bin/pip3 install --upgrade pip

Friday, April 22, 2016

More about mining


In my last post, I gave a basic introduction to ethereum mining.  Since there is not much information available about eth mining compared to bitcoin mining, and some of the information I have found is even wrong, I decided to go into more detail on eth mining.

Comparing the bitcoin protocol to ethereum, one of the significant differences is the concept of uncle blocks.  When two miners find a block at almost the same time, only one of them can be the next block in the chain, and the other will be an uncle.  They are equivalent to stale blocks in bitcoin, but unlike bitcoin where the stale blocks go unrewarded, uncle blocks are rewarded based on how "fresh" they are, with the highest reward being 4.375 eth.  An example of this can be found in block 1,378,035. Each additional generation that passes (i.e. each increment of the block count) before an uncle block gets included reduces the reward by .625 eth.  An example of an uncle that was 2 generations late getting included in the blockchain can be found in block 1,378,048.  The miner including the uncle in their block gets a bonus of .15625 eth on top of the normal 5 eth block reward.
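
The reward schedule is simple enough to compute yourself; here's a small Python sketch based on the numbers above (uncles can be included up to 6 generations back):

BLOCK_REWARD = 5.0   # eth

def uncle_reward(generations_back):
    # the reward shrinks by 0.625 eth for each generation between mining and inclusion
    return (8 - generations_back) / 8.0 * BLOCK_REWARD

INCLUSION_BONUS = BLOCK_REWARD / 32   # 0.15625 eth to the miner who includes the uncle

for g in range(1, 7):
    print("{} generation(s) back: {:.3f} eth".format(g, uncle_reward(g)))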

Based on the current trend, I expect the uncle rate to be in the 6-7% range over the next few months.  With the average uncle reward being around 3.5 eth (most uncles are more than one generation old), uncles provide a bonus income to miners of about 4%.  Since uncles do not factor into ethereum's difficulty formula, when more uncles are mined the difficulty does not increase.  The mining calculators I've looked at don't factor in uncle rewards, so real-world returns from mining in an optimal setup should be slightly higher than the estimates of the mining calculators.

Another thing the calculators do not factor in is the .15625 eth uncle inclusion reward, but this is rather insignificant, and most pools do not share the uncle inclusion reward.  Assuming a 6% uncle rate, the uncle inclusion reward increases mining returns by less than 0.2%.  If your pool is down or otherwise unavailable for 3 minutes of the day, that is a 0.21% loss in mining rewards.  So a stable pool with good network connections is more important than a pool that shares the uncle inclusion reward.  Transaction fees are another source of mining revenue, but most pools do not share them, and they amount to even less than the uncle inclusion reward in any case.
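
Here's the quick comparison behind those two percentages, assuming a 6% uncle rate:

uncle_rate = 0.06          # assumed uncle rate
inclusion_bonus = 0.15625  # eth per uncle included
block_reward = 5.0         # eth

bonus_pct = uncle_rate * inclusion_bonus / block_reward * 100
downtime_pct = 3.0 / (24 * 60) * 100   # 3 minutes of downtime per day

print("uncle inclusion bonus: ~{:.2f}% of mining income".format(bonus_pct))
print("3 min/day of downtime: ~{:.2f}% of mining income".format(downtime_pct))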

Finding a good pool for ethereum mining has been much more difficult than for bitcoin, where it is pretty hard to beat Antpool.  For optimal mining returns, you need to use stratum mode, and there are two main variations of the stratum protocol for eth mining: dwarf and coinotron.  Coinotron's stratum protocol is directly supported by Genoil's ethminer, which avoids the need to run eth-proxy in addition to the miner.  Coinmine.pl and miningpoolhub.com support coinotron's stratum protocol, while nanopool, f2pool, and miningpoolhub support dwarf's protocol.  Miningpoolhub is able to support both on the same port since the JSON connection string is different.

Coinmine.pl and coinotron only have servers in Europe, and half the time I've tried to go to coinotron's web site it doesn't even load after 15 seconds.  Miningpoolhub has servers in the US, Europe, and Asia, and has had reasonable uptimes.  As well, the admin responds adequately to issues and speaks functional English.  They have a status page that shows enough information to confirm that your mining connection to the pool is working properly.  I have a concern over how the pool reports rejected shares, but the impact on mining returns does not appear to be material.  Rejected shares happen on other pools too, and since I am still investigating what is happening with rejected shares, there is not much useful information I can provide about it.

So for now my recommended pool is ethereum.miningpoolhub.com.  My recommended mining program is v1.0.7 of Genoil's ethminer, which added support for stratum connection failover, so it can connect to a secondary pool server if the first goes down.  The Ethereum Foundation is supporting the development of open-source mining pool software, so we may see an ideal eth mining pool in the near future, and maybe even stratum support in the official ethminer.

Saturday, April 16, 2016

Digging into ethereum mining

After bitcoin, ethereum (eth) has the highest market capitalization of any cryptocurrency.  Unlike bitcoin, there are no plug-and-play mining options for ethereum.  As was done in the early days of bitcoin, ethereum mining is done with GPUs (primarily AMD) that are typically used for video gaming.

The first ethereum mining I did was with an AMD R9 280x card using the ethereum foundation's ethminer program under Windows 7/64.  The installer advised that I should use a previous version of AMD's Catalyst drivers, specifically 15.7.1.  Although the AMD Catalyst utilities show some information about the installed graphics card, I like GPU-Z as it provides more details.  After setting up the software and drivers, I started mining on dwarfpool since it was the largest ethereum mining pool.

As an "open" pool, dwarf does not require setting up an account in advance.  One potential problem with that is the eth wallet address used for mining does not get validated.  I found this out because I had accidentally used a bitcoin wallet address, and dwarfpool accepted it.  After fixing it, I emailed the admin and had the account balance transferred to my eth wallet.

Dwarf recommends the use of their eth-proxy program, which proxies between the get-work protocol used by ethminer and the more efficient stratum protocol that dwarfpool also supports.  Even using eth-proxy, I wasn't earning as much ethereum as I expected.

The ethereum network has been running the homestead release since 2016/03/14, which replaced the beta release called frontier.  The biggest change in homestead was the reduction in the average block time from 17 seconds to 14.5 seconds, moving halfway to the ultimate target of a 12-second block time.  I wasn't sure if the difference in the results I was getting from mining was due to the calculators not having been updated from frontier, or some other reason.  After reading a comment in the ethereum mining forum, I realized returns can be calculated with a bit of basic math.

The block reward in ethereum is 5 eth, and with an average block generation time of 14.5 seconds, there are 86400/14.5 * 5 = 29,793 eth mined per day.  Ethereum blockchain statistics sites like etherscan.io report the network hash rate, which is currently around 2,000 gigahashes per second.  An R9 280x card does about 20 megahashes per second, or 1/100,000th of the network hashrate, and therefore should earn about 29,793/100,000 or 0.298 eth per day.  The manual calculations are in line with my favorite eth mining calculator (although it can be a bit slow loading at times).  Due to the probabilistic nature of mining, returns will vary by 5-10% up or down each day, but in less than a week you can tell if your mining is working optimally.
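
The whole estimate fits in a few lines of Python, if you want to plug in your own hashrate:

block_reward = 5.0          # eth per block
block_time = 14.5           # seconds, homestead average
network_hashrate = 2000e9   # ~2,000 GH/s
my_hashrate = 20e6          # ~20 MH/s for an R9 280x

eth_per_day = 86400 / block_time * block_reward
print("network mines ~{:.0f} eth/day".format(eth_per_day))
print("expected earnings: ~{:.3f} eth/day".format(eth_per_day * my_hashrate / network_hashrate))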

Using the regular ethminer, or even using eth-proxy, I was unable to get pool returns in line with the calculations.  However, using Genoil's ethminer, which natively supports the stratum protocol, I have been able to get the expected earnings from ethereum.miningpoolhub.com.  Dwarf uses an unsupported variation of the stratum protocol, so I could not use Genoil's ethminer with it.  I briefly tried nanopool, but had periods where the pool stopped sending work for several minutes, even though the connection to the pool was still live.

Both the official ethminer and Genoil's version were built using MS Visual C++, so if your system doesn't already have them installed, you'll need the MS Visual C++ redistributable files.  Getting the right version of the AMD Windows Catalyst drivers so that ethminer works, and works well, can be problematic.  Version 15.12 works at almost the same speed as 15.7.1; however, the Crimson version 16 drivers perform about 20% slower.

For me, as a Linux user for over 20 years, the easiest setup for eth mining was with Linux/Ubuntu.  I plan to do another post about mining on Ubuntu.

Sunday, March 27, 2016

Hacking GPU PCIe power connections


Until recently, I never thought much about PCIe power connectors.  Three 12V power and three ground wires were all I thought there was to them.  I thought it was odd that the 8-pin connectors just added two more ground pins and not another power pin, but never bothered to look into it.  That all changed when I got a new GPU card with a single 8-pin connector.

My old card had two 6-pin connectors, which I had plugged a 1-to-2 18AWG splitter cable into.  That was connected to a 16AWG PCIe power cable, which is good for about 200W at a drop of under 0.1V.  My new card with the single 8-pin connector wouldn't power up with just a 6-pin plug installed.  Using my multimeter to test for continuity between the pins, I realized that it's not just a row of 12V pins and a row of ground pins.  There was continuity between the three 12V pins, and between three of what I thought were five ground pins.  After searching for the PCIe power connector pinout, I found out why.
Diagram edited from http://www.overclock.net/a/gpu-and-cpu-power-connections

Apparently some 6-pin PCIe cables only have two 12V wires, two grounds, and a grounded sense wire (blue in the diagram above).  With just two 12V wires, a crap 18" 20AWG PCIe power cable would have a drop of over 0.1V at 75W.  Since the 8-pin connector has three 12V pins, it is rated for 150W, double the 75W of the 6-pin connector.  My 6-pin 16AWG PCIe cable would have a voltage drop of only 40mV at 75W, so I just needed to figure out a way to trick the GPU card into thinking I had an 8-pin connector plugged in.  The way to do that is to ground the 2nd sense pin (green in the diagram above).
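
To put rough numbers on those voltage drops, here's a quick Python estimate assuming about 18 inches (0.46m) of wire each way and ignoring connector contact resistance (which makes real-world drops somewhat higher):

# approximate resistance of copper wire in ohms per meter
OHMS_PER_M = {16: 0.0132, 18: 0.0210, 20: 0.0333}

def drop(watts, awg, wires_12v, wires_gnd, length_m=0.46):
    current = watts / 12.0
    r = OHMS_PER_M[awg] * length_m
    loop_r = r / wires_12v + r / wires_gnd   # parallel wires on the 12V and ground sides
    return current * loop_r

print("20AWG, 2x12V/2xGND at 75W: {:.0f}mV".format(drop(75, 20, 2, 2) * 1000))
print("16AWG, 3x12V/3xGND at 75W: {:.0f}mV".format(drop(75, 16, 3, 3) * 1000))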

I didn't want the modification to be permanent, so soldering a wire to the sense pin was out.  The PCIe power connectors use the same kind of pins as ATX power connectors, and I had an old ATX power connector I had cut from a dead PSU.  To get one of the female contacts out of the ATX connector, I used a hack saw to cut apart the ATX connector.  Not pretty, but I'm no maker, I'm a hacker. :-)  I stripped the end of the wire (red in the first photo), wrapping the bare part of the wire around the screw that holds the card bracket in the case.  I powered up the computer, and the video card worked perfectly.

Looking for a cleaner solution, I decided to make a jumper wire to go between the sense pin and the adjacent ground.  I also did some searching on better ways to remove the female contacts from the connectors.  For this, overclock.net has a good technique using staples.  When the staples aren't enough to get the contacts out, I found that a finish-nail countersink punch helps.

Here's the end result, using a marrette (wire nut) to make the jumper:

Wednesday, January 20, 2016

LED low power limbo: light below 1uA


Anyone reading this blog has likely noticed how LED efficiency has significantly improved in the last decade.  If you follow the old rule of thumb and use a 330-Ohm series resistor to power a modern LED from a 5V supply, looking directly at the LED will leave a dot floating in your vision for a few minutes, like a camera flash.  Series resistors for 0603 SMD LEDs like those on the Baite Pro Mini board are often around 1K-Ohm, and even then I find them too bright.  Even when powered through an MCU's ~40K-Ohm pull-up resistor, I find LEDs clearly visible.  This got me wondering: how low can you go?

I started with a cheap (<$2 for a bag of 100) 5mm blue LED and a 470K resistor, powered from a 3.3V supply.  The room was lit with two 800-lumen CFL bulbs, and the LED was still clearly visible.  The voltage across the resistor (measured with a meter that has 10M input impedance) was 856mV, so solving for I in V=IR means the current was only 1.8uA.  The next step was to try two 470K resistors, and although it was dim, the LED was still clearly visible, especially when looking directly into it.  In the photo above the LED looks brighter than it does to the naked eye because the light sensitivity of the camera is different from that of the human eye.
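
That current figure, and the ones that follow, come from nothing more than Ohm's law applied to the measured drop across the series resistor:

def series_current(v_across_resistor, resistance):
    # the same current flows through the resistor and the LED
    return v_across_resistor / resistance

print("{:.1f}uA".format(series_current(0.856, 470e3) * 1e6))   # ~1.8uA with one 470K resistor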


The largest resistors I have in my collection are 1M-Ohm, and I really wanted to try at least 4.7M-Ohm.  Without an obvious solution, I put my breadboard away in a drawer for a few days.  My inspiration came when I thought of my TL431A voltage references.  With a 270-Ohm resistor and a TL431 I made a simple low-current 2.5V supply.  With the 2.5V supply and a 470K series resistor, the LED was still visible!  The current through the LED was about 0.2uA, and the voltage drop across the LED, which is normally around 3V at 20mA, was slightly above 2.3V.  I found that the LED looked brighter when I didn't look directly at it, which is to be expected with dim objects, since the most light-sensitive cells in our retina (the rods) are located outside the center of our vision.  After adding the 2nd 470K resistor back into the circuit, with just 144nA of current, the LED was still faintly perceptible.

Rather than digging out the 1M resistors, I grabbed a 1n4148 diode and connected it in place of one of the 470K resistors.  If you didn't read my diodes, diodes everywhere post, you might think a 1n4148 would drop about 0.6V from the supply, leaving too little to light the LED.  But with all diodes, the lower the current, the lower the voltage drop.  With the 1n4148 and a single 470K resistor, the blue LED was no longer obviously visible (sometimes I thought I noticed some blue out of the corner of my eye), and the voltage drop across the 1n4148 was just 157mV.  The current through the LED was now 103nA.  With my camera pointed directly at the LED and the room lights still on, I could still clearly see blue light.  To be sure, I cycled the power on the circuit a few times, and the blue light came on and off as expected.

Since even the lithium coin cell batteries that you might use for a low-power project have an internal leakage current in the hundreds of nano-amperes, I had reached the end of the practical application of the experiment.  But like any good hacker (hat tip to Jamie and Adam), I wanted to see how far I could take it.  A human is able to detect when a few dozen photons enter their dark-adapted eye.  The next step was pretty simple: turn off the lights.

Once I turned off the lights, the blue LED was again visible.  My next step was adding the 1n4148 back into the circuit along with both 470K resistors (after turning the room lights back on).  At this point the current was only 17nA, and I was questioning whether I would be able to see anything, even after my eyes adjusted to the dark.  I turned out the lights and went to bed.

While in bed I wondered how much light, quantitatively, was being emitted by the LED.  The light is the result of photons emitted when electrons recombine across the LED's junction.  Super-high-efficiency LEDs can supposedly emit 1 quantum of light (i.e. a photon) per 3 electrons.  My cheap LEDs are nowhere near as efficient, perhaps emitting 1 quantum of light for every 300 electrons.  Dust off the old physics texts, and you can figure out that 100nA of current is about 600 billion electrons per second, and 17nA is a bit more than 100 billion electrons per second.  The chance of seeing light at 17nA was looking more likely.
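
The arithmetic is just the current divided by the charge of an electron, with my guess of 1 photon per 300 electrons thrown in:

ELECTRON_CHARGE = 1.602e-19   # coulombs

for amps in (100e-9, 17e-9):
    electrons_per_s = amps / ELECTRON_CHARGE
    photons_per_s = electrons_per_s / 300   # guessing 1 photon per 300 electrons
    print("{:.0f}nA: {:.0f} billion electrons/s, ~{:.1f} billion photons/s".format(
        amps * 1e9, electrons_per_s / 1e9, photons_per_s / 1e9))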

After about 10 minutes in bed letting my eyes adjust to the dark, I got up to look at the test circuit.  With only the faint green glow of the ethernet link lights from my router a couple meters away, I stared at the blue LED.  I thought I could make out a fuzzy ball of light when I got very close (~10cm) to the LED.  I cycled the power on the circuit a couple times, and sure enough the light came and went.

Getting back to the practical applications of this experiment, think of a wireless sensor running on a CR2032 coin cell.  Using the internal ~35K-Ohm pull-up on an AVR MCU to power a blue indicator LED will use 10-15uA of current while still making the LED easily visible.  Blinking the LED for 100ms out of every 5s will consume an average of just 300nA, while making a useful heartbeat indicator.
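
The average works out like this, assuming the LED draws about 15uA while lit through the pull-up:

on_current = 15e-6   # amps while the LED is lit
on_time = 0.1        # seconds on per blink
period = 5.0         # seconds per blink cycle

average = on_current * on_time / period
print("average current: {:.0f}nA".format(average * 1e9))   # ~300nA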

Thursday, January 14, 2016

Lessons in buying Bitcoin


While bitcoin is far from mainstream, it has been making headlines, like Mark Zuckerberg's nemesis twins Tyler and Cameron launching Gemini, so I figured I'd learn how to use bitcoin.  Aside from nerds using it to tip on reddit and github, bitcoin doesn't have much practical use.  Personally, my interest is primarily educational, so if any bitcoin-related business opportunities arise in the future, I may be able to capitalize on them.

With their logo at the start of this post, you can probably guess that I recommend coinbase for Canadians and Americans looking to buy small amounts of bitcoin (under US$500 worth).  This recommendation comes after looking at the offerings from several bitcoin exchanges, including bitstamp, bitfinex, Kraken, cex.io, and Canadian exchange QuadrigaCX.  I registered for accounts at bitstamp and coinbase, and traded bitcoin on the latter.

Bitstamp supports funding (sending money to bitstamp so you can buy bitcoin) from Canadian bank accounts.  Any funds are converted to USD, but finding out the exchange rate takes some work.  I emailed bitstamp on Christmas Eve asking for their foreign exchange fees, and received a reply on the 28th:
to view our exchange rates, please see the following link and click on the "Corporate exchange rates" for the correct rates: http://www.raiffeisen.si/en/pripomocki/exchange_rates/ .Please note that all currencies are converted to USD free of charge by our bank.
I checked the exchange rates, and found that their bank adds about 0.6% to the spot rate for CAD/USD.  On top of that you'd have to add their trading fee of 0.1% for a limit order or 0.2% for a market order.  Adding that to their $1 minimum e-check fee means that the total fees to buy $100 in bitcoin would be $1.70.  That's reasonable compared to most other exchanges, but you'll have to pass their account verification first.  Despite providing a 300dpi high-quality jpeg scan of my driver's license, my account verification request was denied with the message, "the quality of the image/scan cannot be accepted according to UK AML standards."  In other words, no bitstamp for me!
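
Adding those up for a US$100 purchase with a limit order:

purchase = 100.00
fx_spread = 0.006    # ~0.6% their bank adds to the CAD/USD spot rate
trading_fee = 0.001  # 0.1% for a limit order (0.2% for a market order)
echeck_fee = 1.00    # $1 minimum e-check fee

total = purchase * (fx_spread + trading_fee) + echeck_fee
print("total fees: ${:.2f}".format(total))   # $1.70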

For coinbase I couldn't completely figure out their fees until I actually set up an account.  Some of their support pages refer to 0% maker and 0.25% taker fees on their exchange.  Other support pages refer to a 1% fee for buying bitcoin.  In the end I figured out both are correct, since there are two ways to buy and sell bitcoin.

With coinbase, unlike bitstamp, you can fund your account without ID verification.  You will need a cell phone with a US or Canadian number for a basic residency check.  Once your account is set up and you choose to deposit funds to your "CAD wallet", you are presented with the following options:

When I chose "Deposit with Interac", I wasn't able to proceed and was given an error message that I need to verify my account.  A good account interface design wouldn't have presented the option, and instead would have it greyed out with a note that the option is available after ID verification.  To use the bank account, you need to provide your bank, transit, and account number.  After a day or two you'll see a small deposit to your account (for me it was under 50c).  You need to log into your online banking to see the amount of the deposit, and then enter the exact amount in your coinbase account to link with your bank account.  Once that is done, you can take funds from your bank account, and after a few days the funds will become available for buying bitcoin.

Being a bit impatient, I decided to provide ID verification so I could use Interac Online to get funds into my account instantly.  A few minutes after uploading the same jpeg file of my license that bitstamp refused to accept, I got an email stating my identity had been verified.  In addition to making Interac Online available, verifying my account increased my limits from $500/day to $3000/day (not that the $500 limit was a problem for me).  However, I still couldn't use Interac Online because "Interac online is not available for visa debit card holders."  If you have a relatively new bank card (issued in the last couple years) with the Visa debit logo in the corner, you're out of luck.

My wife's bank card, however, doesn't have the Visa debit logo.  After noticing this, and reading about the referral program that gives $10 in BTC to both the new account and the referring account, I set up another account for my wife.  Now that I knew how coinbase worked, the process was a lot quicker.  I helped her set up the account, and had her account verified with a copy of her license.  Then I used Interac Online to withdraw C$149 from her bank account, leaving C$148 in her CAD wallet after the $1 fee was deducted.  I then chose to buy bitcoin, entered $148, which left $146.52 after the 1% fee, and completed the transaction at the quoted exchange rate (about C$620/BTC).  A few minutes later I got an email about my invitation bonus, and at the same time an additional US$10 worth of BTC showed up in my wife's account.

So what about those 0% maker and 0.25% taker fees?  For that you need to use the coinbase exchange, which you can do by clicking on "exchange" from your account and then clicking on "log in with coinbase".  The interface is similar to a discount stock broker's, with a list of bid and ask prices along with a calculation of the current spread.  With a Canadian account you can only trade CAD/BTC, but you can view the USD/BTC order book.  The spreads on the USD/BTC exchange are usually only 1-2c, while spreads of $1-$2 are common for CAD/BTC.  Because of that I was able to get better prices than I could have if I were trading USD/BTC.  I tried a couple of small (<0.1BTC) limit sell orders, with a price a few cents below the lowest listed sell order price.  The first one filled in about 10 minutes, and the second filled in less than a minute.  Both, as expected, had no fees as "maker" orders.

As long as the invitation program continues, the net fees to get about C$160 in bitcoin are actually negative.  After paying about $2.50 in fees to buy $146 in bitcoin, you'll get US$10 (C$14) in bitcoin as a bonus.  Probably not worth the trouble for most people, but certainly worth it for the nerds and geeks that want to give out bitcoin tips.
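
For anyone who wants to tally it up, here's the math for the route I took with my wife's account:

deposit = 149.00      # C$ withdrawn via Interac Online
interac_fee = 1.00    # C$ flat fee
buy_fee_rate = 0.01   # 1% fee on the bitcoin purchase
bonus_cad = 14.00     # US$10 referral bonus, roughly C$14

in_wallet = deposit - interac_fee
bought = in_wallet * (1 - buy_fee_rate)
fees = interac_fee + in_wallet * buy_fee_rate
print("bought C${:.2f} of bitcoin, paid C${:.2f} in fees, net after bonus: C${:+.2f}".format(
    bought, fees, bonus_cad - fees))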