How To Build A Litecoin GPU Miner
Constructing a Litecoin Miner
Let's be honest with each other - we all missed the boat on mining Bitcoins easily at home. Short of investing tens of thousands of dollars into specialized equipment or buying large swaths of computing power from a cloud, the ability to self-generate Bitcoins is now out of reach. Fortunately there are many other alternatives, although not nearly as popular or mainstream as Bitcoin, that are still within reach. The challenge becomes which one to take a risk on.
Litecoins, Peercoins, Feathercoins and More
Consider it both a blessing and a curse: the open-source nature of Bitcoin has allowed anybody to take the code and make their own digital currency. As of January 2014, there are 80 different coins tracked in circulation. Whether a digital currency is in circulation means very little, though - most are "worth" less than a dollar and have less than $1000 in daily volume. Others, like Dogecoin, began as a joke and yet, despite still having a minuscule value, drive an enormous number of transactions. Investing in any of them is risky, especially when signals from nation states and financial institutions vary so widely. So how does one determine which ones are worth dealing with?
Jump back to the market capitalization for starters. It's pretty easy to eliminate everything that basically has no volume - or is it? Those coins could be ideal to treat as penny stocks. Their current processing difficulty is very low, so pointing powerful hardware at them for a few weeks or months could yield thousands of coins. Just as with penny stocks, they offer a mammoth potential return because it's so easy for them to double, triple, or quadruple in value. But it's just as likely they will fizzle out. Ideally, there are some coins in the top 10 for volume and value whose difficulty level is still within reach of ordinary hardware. Or perhaps the coin was modified somewhat to account for the perceived problems with Bitcoin like the 51% issue, the susceptibility to ASICs, denials of service, and others.
One development that keeps mining in the hands of normal people is a transition from SHA256 hashes to Scrypt hashes. SHA256 hashing was quite easy to develop hardware for using FPGAs and ASICs, which is what led to the Bitcoin processing explosion at the end of 2013 when miners switched over to dedicated hardware. Scrypt is a key derivation function rather than a plain hash, and it differs from SHA256 in that it was designed to be computationally difficult via large memory requirements. In a nutshell, the various stages of Scrypt rely upon previously derived bit string vectors (pseudorandom, of course), in a sense making the algorithm dependent on its own feedback and forcing a large working set into memory. Needless to say, although some ASICs have been produced for this, none of them are yet available to the public, meaning Scrypt based digital currencies are still within the computational reach of home mining rigs. Alpha Technology is currently accepting pre-orders for their $2200 and $9000 Scrypt compatible units capable of 5MHash/s and 25MHash/s respectively. While there are many coins to choose from, Litecoin has emerged as the runner-up to Bitcoin - not so much a contender as the silver to its gold.
Perhaps to keep the description more focused, I'm just going to quote straight from the CGMiner README file - "Scrypt mining, AKA Litecoin mining, for GPU is completely different to SHA256 used for Bitcoin mining. The algorithm was originally developed in a manner that it was anticipated would make it suitable for mining on CPU but NOT GPU. Thanks to some innovative work by Artforz and mtrlt, this was proven to be wrong. However, it has very different requirements to Bitcoin mining and is a lot more complicated to get working well. Note that it is a RAM dependent workload, and requires you to have enough system RAM as well as fast enough GPU RAM. If you have less system RAM than your GPU has, it may not be possible to mine at any reasonable rate."
Why Litecoin versus any of the others? Right now, Litecoin is second only to Bitcoin and is actually growing faster. The Scrypt based coin is experiencing its first major "bump" just as Bitcoin did several times before (each time with doubters believing it would not last). A $100 investment into Litecoins one year ago would be worth $30,000 today. According to its inventor, Charlie Lee, a former Google employee, "It’s a year and a half behind Bitcoin in age and maturity. Litecoin is the silver to Bitcoin’s gold. It has taken 2nd place in digital currency because it was created early and it was fair." He adds, "Litecoin versus Bitcoin is like Facebook versus Google Plus. It would be hard for Plus to overtake Facebook. But if something catastrophic happens to Bitcoin, I could see Litecoin positioned to overtake it."
Regardless of which digital coin you opt to mine at home, what do you do with them? Considering Bitcoin is about the only digital currency that has broken into mainstream use for purchasing goods, most of the alternate currencies don't exactly empower you to do much. Unless your intent is to simply sit on your stash hoping for the value to rise, you need an exchange to convert them into something else. In many cases, the optimal exchange is not to convert the alternate currencies straight back into state-sponsored fiat. Instead, the exchanges generally favor a direct conversion into Bitcoins and perhaps into dollars after that. For exchanging, I chose Cryptsy because it was simple and allowed for converting practically every conceivable alternate coin into either Bitcoins or Litecoins. From there, I chose CoinBase for converting Bitcoins back into US dollars. (NOTE: If you click the CoinBase link, sign-up, and actually use their service, we BOTH get a $5 referral award.)
Pick a Pool
Even with the iMac I already have and the hardware I was about to assemble, the realistic likelihood of successfully mining Litecoins solo is pretty poor. I could spend a large amount of time computing against a block to earn 50 LTC only to find that a pool already solved it, rendering my work moot and stale. Of course, I could always fork out some serious money to buy a whole farm of hardware to achieve an insane hashrate (with a corresponding electric bill). But that's a ridiculous idea, which is exactly why joining a pool is useful - it's much better to gain fractions of an LTC cumulatively and frequently than to wait a long time and still have nothing.
There are plenty of options out there, but I chose WeMineLTC to start with. My reasons were not scientific - I just needed something to try it out and they had no fees. Anyway, my iMac's AMD Radeon HD 6970M is not awesome at this sort of thing (iMacs use mobility chips) and could only produce about 80KHash/s, which nets a super lame 0.02 LTC per day at the current 3998.2 difficulty. According to WeMineLTC's on-line calculator, I could expect to make $183 a year like that while burning even more in electricity. Yes!
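That pool estimate is easy to sanity check yourself. The sketch below uses the standard expected-value formula for proof-of-work mining and assumes the 50 LTC block reward in effect at the time; the hashrate and difficulty are the ones quoted above.

# Expected LTC per day = hashrate * 86400 * block_reward / (difficulty * 2^32)
awk 'BEGIN {
  hashrate   = 80000     # hashes per second (80KHash/s)
  difficulty = 3998.2    # current Litecoin difficulty
  reward     = 50        # LTC awarded per block
  print hashrate * 86400 * reward / (difficulty * 2^32)
}'
# prints roughly 0.02 - matching the WeMineLTC calculator

Swap in your own hashrate and the current difficulty to keep the estimate honest as the network grows.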
Many of the pools also offer automatic coin switching for maximizing profitability. For instance, MultiPool, Coinotron, and WeMineAll have interfaces into the various coin exchanges. They continually check the exchange rates for easy to mine Scrypt currencies (like Feathercoins, Dogecoins, etc.) and identify which one is the most profitable. What does that mean? At different points in time, you can actually make more Bitcoins by mining an alternate currency and rapidly exchanging it than you could by simply mining Bitcoins - and the same advantage applies among Scrypt based currencies. These pools automatically dispatch the most profitable work to you and some even perform the Bitcoin exchange as well.
There are other intelligent things to consider when joining a pool. A very simple, but potentially lucrative, pool feature is merged mining - submitting the same proof of work to more than one currency at a time. At present, unlike for Bitcoin, I have not found any cases of merged mining for Scrypt coins, probably because the computations don't cross streams well. If one does turn up, though, you get the benefit of mining multiple crypto-currencies for the same computing power.
Fees and payout plans are also a huge factor to consider. Obviously picking a pool that does not tax your earnings enormously is advised, as those percentage takes for pool maintenance (and operator profit) will really nickel and dime you over time. Identifying a 0-10% skim off your mining is easy - just browse the Wiki comparison pages and read the pool's fine print. More importantly, you need to understand the payout system. Right now, PPLNS (Pay Per Last N Shares) is becoming very common. The different mechanisms have arisen due to miners "pool hopping," where computational whales jumped into a pool with a simpler block to effectively grab a disproportionate share of the reward before jumping again. An excellent description of the different reward systems can be found on the Bitcoin wiki.
Another factor is the mining pool's size. Obviously a big pool is going to solve blocks faster than a small pool because its collective processing power is greater. However, unless you're contributing a mammoth amount of that horsepower yourself, you are a very small fraction of that pool and your share of each reward will be correspondingly small. Participating in a smaller pool increases your fractional share of the block reward, but you run the risk of waiting much longer between blocks - and of the block you're working on going stale because a bigger pool solved it first.
The bottom line is, get in a pool. The water's great.
Getting Hardware
I've been running Apple gear for nearly a decade, so it's been a long time since I've messed around with hardware. That is discounting the fun I had with the ancient Apple Cube which was more upgradeable than most people realized. But I digress, even if this whole coin mining endeavor fails completely, at least it was fun getting back into system building again. Right? Following arcane steps to make Linux recognize proprietary drivers is always a good time.
Unless you have some equipment on hand to turn into a mining rig, you're going to need a motherboard, processor, and RAM. Actually, you may want to opt for a new motherboard anyway just to have a lot of PCI Express slots to plug a bunch of video cards into. In truth, the CPU and RAM can be minimal since you should be running Linux and the GPUs are doing all of the work. One note, however: unlike SHA256D mining, which was almost purely computational, Scrypt mining with GPUs will leverage your system RAM - though not to the degree that you need to go purchasing more than 8-16GB. If your machine will do nothing but mine, save the money; otherwise it could be handy to beef some parts up so the machine can take on a few additional tasks.
- GIGABYTE GA-G1.Sniper A88X Motherboard - This bad boy has two PCIe 16x and three PCIe 1x slots, which aligns with my target goals.
- AMD A8-5600K Trinity 3.6GHz (3.9GHz Turbo) Socket FM2 Quad-Core CPU with AMD Radeon HD 7560D - AMD processors are certainly cheaper than Intel and for a few bucks more you can get extra hashes out of the on-board Radeon.
- Crucial Ballistix Sport 8GB 240-Pin DDR3 SDRAM - What can I say ... more RAM more fun!
The most critical pieces, of course, are the GPUs. At present, buying nVidia is pointless, so stick with AMD - the reason comes down to chip design philosophy. Remember, these cards were originally designed primarily with gamers in mind, so 3D graphics performance was the driving factor, not parallel computation of cryptographic hashes. AMD went with a VLIW (very long instruction word) architecture with heaps of shaders to simply pump massive amounts of data in parallel. nVidia's approach was actually more elegant, using fewer shaders at a higher clock rate on a more multipurpose chip ... but that doesn't help you here. Although I chose from the following devices, always consult the most recent hardware comparison matrices to see the latest GPUs with expected hashrates and power consumption data.
NOTE: As of right now, the proprietary AMD drivers apparently struggle sometimes with non-homogeneous GPU setups - i.e. an R9 290 and a 7950, etc. Try to keep multi-card setups using the same hardware to maintain some sanity. (Something I wish I had known before buying them ....)
- SAPPHIRE Radeon HD 7950 3GB - The 7950s have long been regarded as one of the most optimal GPUs for power consumption, price and decent hashrates. But they're becoming harder and harder to find for that very reason.
- ASUS Radeon R9 290 4GB - The 290s and 290Xs use the new AMD Hawaii chips, sit near the top of the food chain, and can churn out 850KHash/s apiece. Later on, however, they proved to be a problem to configure under Linux and to get playing well with other devices.
- 1X To 16X PCI-E Extension Extender Cable Ribbon Riser Card - A riser card/ribbon is absolutely critical to multi-GPU mining because the units take up two slots apiece and there are limited PCI-E slots available. Riser cables allow you to mount additional GPUs off the motherboard, which has the added bonus of improving airflow to all the cards.
If you can get your hands on a VisionTek CryoVenom R9 290, that would make the ideal mining card - its water cooling solution nearly halves the running heat and the card comes overclocked from the factory. This obviously prolongs the card's life and keeps your rig quieter (hugely important when your significant other complains about the computer gear in the living room/bedroom). Or it allows you to go crazy overclocking the thing ... your choice.
The power supply is usually an afterthought for many computer builders, but it's worth some thought for a coin miner. Each Radeon 290 can consume up to 330 watts running at full bore, and there is still the baseline power consumption to consider for the processor, drives, RAM, fans, etc. Running four to five GPUs can easily require 1200-1600W alone (a quick budget check follows the parts list below). One way to save a little money is to spend a little extra up front for more efficient power supplies that don't waste energy as heat or through poor voltage conversion and buffering. Another is to simply get two units and chain them together (two units together often cost less than a single larger one), ideally running each at its most efficient load rather than maxing one out. Unlike power supplies from years gone by, modern ones come with modular plugs allowing you to attach "just the right amount" of equipment to reduce cable clutter. Make sure the power supply includes enough cables to specifically juice the PCI-E cards (which draw power from both the motherboard AND directly off the power supply).
- RAIDMAX RX-1000AE 1000W
- RAIDMAX RX-850AE 850W
- 1 x 24PIN TO 2 x 24PIN Cable - This is handy for running multiple power supplies to chain them together so that the motherboard's power switch can activate both simultaneously.
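Here is the quick power-budget check promised above. It is only a sketch - the 330W per-card figure comes from the discussion above, while the ~150W base system draw and the 80% load target are my own assumptions to adjust for your parts.

awk 'BEGIN {
  cards = 3; card_watts = 330   # three R9 290s at full bore
  base_watts = 150              # assumed draw for CPU, drives, RAM, and fans
  total = cards * card_watts + base_watts
  printf "estimated draw: %d W, PSU capacity to shop for (~80%% load): %d W\n", total, total / 0.8
}'
# estimated draw: 1140 W, PSU capacity to shop for (~80% load): 1425 W

Chaining the 1000W and 850W units above covers that estimate with headroom to spare.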
Finally, bring everything to a close with the remaining peripherals like a case, keyboard, mouse, etc. Once the unit is up and running, you really won't need the keyboard, mouse, or monitor anymore, so it's ideal to just borrow these from a friend or transplant them temporarily from another computer. The case is really a matter of personal preference. If you want the rig to look nice, get a nice case but make sure it's adequately ventilated. Open air cases are ideal for keeping the components secure and organized, but there are plenty of build examples across the Internet showing bare hardware strewn about a table or packed into a milk crate. Bench cases are readily available from standard PC consumer sites, but to run a lot of GPUs, you'll probably need a customized case that holds the cards "off-board" (requiring riser cables) in order to fit them and give them airflow. Another advantage of the custom mining cases is the ability to hold more than one power supply.
- eBay: Open Air Mining Case - A custom case is needed to hold more than two GPUs at a time in order to get them off the board and spaced appropriately to avoid cooking in their own heat.
If you can't figure out how to put your hardware together ... this venture isn't for you. Slap yourself in the face.
Configure Radeon GPUs on Linux
This is one of the stages where you'll beat your head against the wall and perhaps opt to simply use Windows. You have to use the proprietary AMD drivers because there is no OpenCL support in the open source Radeon driver. Speaking of which, if you do opt to somehow get that driver working, it can be found in the Linux 3.13 kernel release and will handle non-homogeneous mixes of multiple Radeons much better, though at the cost of OpenGL gaming performance and power consumption. There's also a project for an open OpenCL library ... but none of that does any good right now since you're not trying to play games, save power, or use a generic library.
Your system may have already fired itself up using the open source Radeon driver so you'll need to kill that off. Open the /etc/modprobe.d/blacklist.conf file with your favorite text editor. Simply add "blacklist radeon" (without the quotes) to the bottom of the file. Now reboot your computer. Doing this upfront will at least help alleviate issues with the open source and proprietary drivers fighting with one another later on. A driver thunderdome usually ends up with no winners.
After the reboot, quickly confirm the open source driver is no longer loaded. Look for the driver listed in the configuration line from this command's output and make sure it does not say radeon.
sudo lshw -c video
The following procedure documents my battles with Ubuntu 13.10, but for the RHEL/CentOS/Fedora users, an excellent guide was put together on the bitcointalk forums. To get started, there are going to be pre-requisite dependencies to pull down before installing the AMD Catalyst driver. For me, using Ubuntu 13.10, the driver whined about not having debclean and dpkg-buildpackage. This was solved using the dependency list on the silverlinux blog.
sudo apt-get install dh-make dh-modaliases execstack libxrandr2 libice6 libsm6 libfontconfig1 libxi6 libxcursor1 libgl1-mesa-glx libxinerama1 libqtgui4
You may as well install these extra packages now, too. These dependencies are necessary after the Catalyst driver is built in order to perform the installation.
sudo apt-get install lib32gcc1 libc6-i386 dkms
The second step is to acquire the AMD Catalyst drivers. As of 8 January 2014, version 13.12 was stable and 13.11 was beta (seems numerically backward). With that file downloaded and unzipped, run it as a super-user. After decompressing itself, the file self-identified as version 13.251 (really helps with confusing the user - thanks AMD) and pops up a screen prompting you to either install the pre-built driver or generate a driver specific to your distribution. I chose the latter - optimized performance, right?! You'll be prompted for the specific distribution (RedHat and Suse are listed), which it auto-detected at the bottom. After the drivers are built, install all of the fglrx packages and reboot the system.
sudo ./amd-catalyst-13.12-linux-x86.x86_64.run
sudo dpkg -i fglrx*.deb
sudo reboot
Awesome ... it failed. Choosing the pre-built binary, however, installed fine - as did downloading the 13.11 beta version and performing a distribution based build. Reading around the Internet reveals that nobody seems to have the same fortune as anybody else, even when following the same steps verbatim. Go figure. If the drivers are failing you, make sure you remove them completely before trying different versions. There are plenty of cases where Internet users report that doing the same thing two, three, and even four times over eventually makes it work. Why? Who knows. Blame the NSA's bugs lurking in your machine.
sudo apt-get remove --purge fglrx*
Once you think everything is functioning, you can also get the AMD utilities to report on the GPUs they've detected using the command:
aticonfig --lsa
If everything went well and all the cards are identified, issue the following command to force a new X configuration to be built. NOTE: This command will need to be reissued any time there is a change to the configuration of your GPUs - e.g. change their slots, add one, remove one, etc.
aticonfig --adapter=all --initial --force
There's really no need to go all out setting up your X configuration file. If you only have a single GPU, it's a fairly moot and easy task. But multiple GPU setups are going to be highly dependent on your local configuration and needs. Frankly, my system is running headless and only needed X as a dependency for the AMD driver build.
Configure CGMiner 3.7.2
CGMiner is pretty much the de facto standard for GPU mining - whether for SHA256D or Scrypt based coins. You can download the source for version 3.7.2 from http://ck.kolivas.org/apps/cgminer/3.7/. Grabbing this specific version is important as there are plenty of newer releases - 3.7.2 was the last version to support GPU mining (because Bitcoin had migrated to ASICs at this point). Obviously it is possible to simply download a pre-built binary, but that wouldn't be very nerdy of you. Besides, compiling a version native to your hardware results in a tighter, leaner binary that makes use of the performance enhancing features available specifically on your system.
Of course, just downloading and compiling the source would be too easy. Naturally Murphy's Law will get involved and throw up a bunch of dependencies your system is not prepared for. On an Ubuntu 13.10 system, you will also need to install the following packages:
sudo apt-get install libcurl4-openssl-dev
sudo apt-get install libncurses-dev
sudo apt-get install opencl-headers
That's enough to get CGMiner to compile, but it won't be accessing the GPUs yet. You must install the AMD APP SDK to provide the actual OpenCL support (beyond the header files) to the tool. The latest SDK can be downloaded from AMD's SDK page along with historical versions. Version 2.9 is the most recent; after decompressing the archive, the whole package can be installed from AMD's provided script.
tar xvf AMD-APP-SDK-v2.9-lnx64.tgz
sudo ./Install-AMD-APP.sh
sudo reboot
The ADL support isn't entirely necessary, but it allows CGMiner to have insight into the card's environment - operating temperature, fan control, etc. ADL version 6 can be downloaded from AMD's Display Library page. For this one, simply decompress the zip file and locate the include directory. This just has to be moved into the CGMiner compilation path where it will be automatically included during the build. Assuming both are unzipped within the same parent:
cp include/* ./cgminer/ADL_SDK/
With all of the necessary dependencies installed, it's time to run the configuration utility, which locates all of the necessary resources. It is important to add the --enable-opencl and --enable-scrypt command line options. These switches enable support for GPU computations and the scrypt hashing algorithm.
./configure --enable-opencl --enable-scrypt
At this point, at the end of the configuration run, you should see output similar to the following. If it's different, check your spelling or scroll up to identify whether there were particular dependency issues specific to your system.
------------------------------------------------------------------------
cgminer 3.7.2
------------------------------------------------------------------------
Configuration Options Summary:

  libcurl(GBT+getwork).: Enabled: -lcurl
  curses.TUI...........: FOUND: -lncurses
  OpenCL...............: FOUND. GPU mining support enabled
  scrypt...............: Enabled
  ADL..................: SDK found, GPU monitoring support enabled
  Avalon.ASICs.........: Disabled
  BFL.ASICs............: Disabled
  KnC.ASICs............: Disabled
  BitForce.FPGAs.......: Disabled
  BitFury.ASICs........: Disabled
  Hashfast.ASICs.......: Disabled
  Icarus.ASICs/FPGAs...: Disabled
  Klondike.ASICs.......: Disabled
  ModMiner.FPGAs.......: Disabled

Compilation............: make (or gmake)
  CPPFLAGS.............:
  CFLAGS...............: -g -O2
  LDFLAGS..............: -lpthread
  LDADD................: -lcurl compat/jansson-2.5/src/.libs/libjansson.a -lpthread -lOpenCL -lm -lrt

Installation...........: make install (as root if needed, with 'su' or 'sudo')
  prefix...............: /usr/local
With a proper configuration file generated, issue the make command and wait a few moments while CGMiner builds itself. That's all it takes and now CGMiner is ready to run. But you only wish it were that easy - you still need to tune your GPUs for optimal hashrates and inevitably (especially with multiple GPUs), something will still decide not to work. Tuning hashrates is an art unto itself and there are plenty of guides, including Justin Soo's tips on achieving 1MHash/s with the 290 and 290X GPUs.
In multiple GPU setups, a common error that presents itself is that OpenCL and ADL see a different number of cards. Unless you have a monitor plugged into everything (or a fake dongle plug), the OpenCL components may not "see" as many devices as the ADL libraries. This can be addressed using the --gpu-map option to inform the system of the mapping between OpenCL and ADL GPUs. Another strange occurrence that some folks have and some folks don't is the need to run CGMiner with sudo.
Otherwise, at this time simply go ahead and fire up CGMiner pointed at your preferred mining pool and let her process data. At a bare minimum, a command-line like the following will fire everything up with pure default values (bear in mind, those defaults are not very efficient).
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
./cgminer --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x
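Rather than retyping that command line every time, the same launch can be captured in a JSON configuration file. This is only a sketch using the default settings - the option names mirror the command-line flags, and CGMiner reads cgminer.conf automatically or takes the file explicitly with --config.

cat > cgminer.conf <<'EOF'
{
  "pools": [
    {
      "url": "stratum+tcp://usa.wemineltc.com:80",
      "user": "VnutZ.www",
      "pass": "x"
    }
  ],
  "scrypt": true
}
EOF
./cgminer --config cgminer.conf

The tuning options discussed later can generally be added to this same file once they are worked out.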
Configuring CPUMiner
If you're mining with GPUs, you'll find that CPUs really don't contribute much to your hashrate. But if you've built the rig and the CPU is otherwise idling, it couldn't hurt to get them cranking away at a digital currency as well. CPUs can be somewhat effective if you find an alternate coin that is newer and therefore has a low computational difficulty (such as FeatherCoin).
One of the fastest miners for CPU only work is Pooler's CPUMiner. The source tree for CPUMiner can be found on GitHub at https://github.com/pooler/cpuminer. A simple git clone will acquire the code and put it into a directory that I've unimaginatively named "cpuminer" as follows:
git clone https://github.com/pooler/cpuminer.git cpuminer
There are definitely pre-built binaries available for your target platform from the repository. However, nothing really beats a custom rolled compilation for speed when it's matched to your hardware. On my iMac, using a locally compiled version doubled my hashrate from insanely slow to slightly faster than insanely slow (compared to GPUs). Your hash rate mileage may vary, of course, but if you've gotten this far, go full geek and compile it.
Unfortunately, unless you've been installing packages already, a stock distribution is probably missing some of the install dependencies for CPUMiner as well. The usual culprit is aclocal which is easy to acquire.
sudo apt-get install autotools-dev
sudo apt-get install automake
Once you have the necessary dependencies, run the included configuration utilities (basically just follow the README instructions):
./autogen.sh
./configure CFLAGS="-O3"
make
At this point, you should have a compiled binary. Sign up with your favorite mining pool and then run the utility. It helps to drop it into a background process so that it continues regardless of the window or terminal state. Then just run a quick process list to identify the instance's process id. Use the renice command to lower its priority so that the CPU mining operation won't impact system performance (particularly by slowing down the queuing of tasks to the GPUs).
./minerd --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --background
ps -A | grep minerd
renice 19 ##### <-- where ##### is the identified pid
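If you'd rather not hunt down the pid by eye, the lookup and renice can be combined into one line (a sketch; it assumes pgrep is available, which it is on stock Ubuntu):

renice 19 -p $(pgrep minerd)   # lower the miner's priority in one shot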
Now watch that CPU just mint coins ... at the rate of days, weeks, months. My iMac's 3.4GHz i7-2600 CPU (hyperthreaded quad-core) can get about 35KHash/s and the 3.6GHz AMD A8-5600K Trinity (quad-core) in my coin mining rig can sustain about 25KHash/s. At the present difficulty of 3931, the paltry power of a 35KHash/s CPU will net you 0.01 LTC per day. However, having thousands or millions of CPUs doing this can be profitable for criminals - it's becoming more common for malware to incorporate mining software for that purpose.
What About Windows?
While I really did enjoy constructing a mining rig and fighting through Linux configurations and drivers to get it set up properly - in the end, it was still finicky. There are a lot of guides out there where identical solutions only work for certain hardware configurations, so ultimately everyone is really somewhat on their own hoping one arcane combination will work for them. I spent several days fighting with my Linux system to get it working. Out of frustration, perhaps in a 2am fit of cyber rage, I forsook Linux, spoke Bill Gates' name three times while sacrificing several nerd points, and installed Windows 7 to another partition.
Literally within 30 minutes, the whole system was up and running. Just throw down a stock installation of Windows 7, load all the requisite drivers for your base hardware, and install the latest Catalyst drivers from AMD (at least version 13.12 if you're running the newer Hawaii based Radeons). Then download and decompress the CGMiner Windows binary and you're ready to start processing. The nice thing about Windows is that AMD actually cares about their Windows customers because of the 3D games market and makes an all-inclusive driver that actually does work - no kernel matching/patching, no additional SDKs to support OpenCL, no additional libraries to support ADL, no obscure dependent libraries, no crazy configuration files hidden in /etc/. Everything is just part of the native driver package.
I hate Windows ... but dammit, it was just easier to get going and each day the rig isn't running is a day it's not trying to pay for itself. Just as with the Linux system, this bare minimum command-line will fire everything up with pure default values.
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_USE_SYNC_OBJECTS 1
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x
Despite Windows being so easy and quick to set up, it does have its annoyances. I really wanted to run the system "headless" and just ssh in, like a Linux system, to check on the box's health and GPU statuses. I found that whether I used Microsoft's Remote Desktop Connection or VNC, the mere act of connecting to the box like that managed to diddle something in the AMD driver. What? I have no idea ... again, I'm going to blame the NSA just because it's hip right now. What would happen, though, is that CGMiner would often freeze up and require a restart, and the ADL link would be lost, meaning no card temperatures without a reboot. Subsequently, the fans would go apeshit. Needless to say, I've decided it's not that big a deal to just push the power button on a monitor and move a mouse. But still, I didn't want to do that. (I've actually since switched to having CGMiner record results to a log file sync'd via DropBox so that I can check on the status without actually connecting to the host.)
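One hedged way to pull that off is to disable the curses display and redirect CGMiner's log output (which goes to stderr) into a file inside the Dropbox folder. The path below is just a placeholder - substitute your own Dropbox location.

REM --text-only disables the ncurses screen so the log lines can be redirected
REM the Dropbox path is a placeholder - substitute your own
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --text-only 2> C:\Dropbox\cgminer.log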
Optimizing GPU Hashrates
Great, the mining rig is up and running with default values, but those hashrates are abysmally small. There are many ways to start bumping up performance. The quickest is to search the Internet for people with the same hardware and see what configurations they chose for intensity, threads, concurrency, memory clocks, and GPU clocks. But even though those values work for a 7950 or an R9 290 in somebody else's system, there is enough variation in base hardware and even in card manufacturing to require personal tweaking. As an obvious example, different OEMs cool cards differently, meaning some are able to handle hotter runs on a sustained basis than others. Before you start plugging in values, it's important to understand all the statistics displayed by CGMiner. The official forum thread on Bitcoin Talk is from the developer and details what all the information means. Make sure you can find your hashrate, accepted shares, hardware errors, and intensity in the CGMiner interface before moving on.
A nice feature of CGMiner is that you can run it against particular devices. Obviously the configuration settings for a Radeon 7950 will differ from a Radeon R9 290, so running a single CGMiner instance for all your devices will result in poor performance. Use the cgminer.exe -n command option to enumerate all the devices CGMiner sees. The following output tells me the corresponding device numbers to use with the --device 0, --device 1,2,3, and --device 4 options to run separate instances of CGMiner for independent control of the Radeon 7950, the Radeon R9 290s, and the CPU's embedded Radeon 7560D.
[2014-01-18 16:06:04] CL Platform 0 vendor: Advanced Micro Devices, Inc.
[2014-01-18 16:06:04] CL Platform 0 name: AMD Accelerated Parallel Processing
[2014-01-18 16:06:04] CL Platform 0 version: OpenCL 1.2 AMD-APP (1348.5)
[2014-01-18 16:06:04] Platform 0 devices: 5
[2014-01-18 16:06:04]  0       Tahiti
[2014-01-18 16:06:04]  1       Hawaii
[2014-01-18 16:06:04]  2       Hawaii
[2014-01-18 16:06:04]  3       Hawaii
[2014-01-18 16:06:04]  4       Devastator
[2014-01-18 16:06:04] GPU 0 AMD Radeon HD 7900 Series hardware monitoring enabled
[2014-01-18 16:06:04] GPU 1 AMD Radeon R9 200 Series hardware monitoring enabled
[2014-01-18 16:06:04] GPU 2 AMD Radeon R9 200 Series hardware monitoring enabled
[2014-01-18 16:06:04] GPU 3 AMD Radeon R9 200 Series hardware monitoring enabled
[2014-01-18 16:06:04] GPU 4 AMD Radeon HD 7560D hardware monitoring enabled
[2014-01-18 16:06:04] Failed to ADL_Overdrive5_FanSpeedInfo_Get
[2014-01-18 16:06:04] 5 GPU devices max detected
[2014-01-18 16:06:04] USB all: found 7 devices - listing known devices
[2014-01-18 16:06:04] No known USB devices
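Putting that enumeration to work, the sketch below launches one instance per card family so each can carry its own tuning; run each line in its own command window (the device numbers follow the listing above).

REM one instance for the Tahiti based 7950 ...
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --device 0
REM ... and a second instance for the three Hawaii based R9 290s
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --device 1,2,3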
There is really no substitute for reading the CGMiner README file included with the source and the binary. It fully details the command-line options and what their effect is on hashing. Armed with this knowledge, the settings discovered by folks around the Internet make a lot more sense and you can actually troubleshoot your own configuration. Nothing more annoying than the n00b that just copies their values and starts whining for help cluelessly. The most important are as follows:
- --intensity XX (-I XX) -- The intensity is arguably one of the most important settings to apply. It defaults to 8, which produces horribly slow hashrates, but it can be pushed as high as 20. As intensity increases, the GPU may actually start overwriting its own RAM while it's still in use. This becomes evident from an increase in hardware errors in the statistics, and maxing out this value will drastically decrease your effective hashrate because of those errors.
- --shaders XXX -- Shaders are the computational building blocks within GPUs for rendering all sorts of 3D special effects. They are specifically the core workers performing the hashes for mining. The number of shaders can usually be determined from manufacturer specifications.
- --thread-concurrency - The thread concurrency instructs CGMiner as to how much work is performed simultaneously. This number is typically a multiple of the number of shaders on the GPU.
- --gpu-threads - The README file recommends that you simply don't touch this, but a lot of Internet settings claim success with it. The value correlates to the number of threads assigned to a given core and typically affects CGMiner's stability. This field generally only goes between 1 and 2.
- --lookup-gap -- The README file also recommends that you simply don't touch this. It relates to a balance between RAM usage and performance. The value should never exceed 2 and is calculated internally from the thread concurrency anyway.
- --worksize XX - This value must be a multiple of 64 and no greater than 256. The README suggests it may have an effect, but it's fairly marginal.
- --gpu-engine -- This value defines the clockrate for the GPU itself. Settings here will vary based on underlying hardware.
- --gpu-memclock -- This value defines the clockrate for the GPU RAM itself. Settings here will vary based on underlying hardware.
To walk through these examples, I'll use the Radeon R9 290. Proper tuning is important, as there are a lot of folks on the Internet just pumping up high hashrates without checking that the work is really valid. Hitting 1MHash/s is not impressive with hardware errors and few accepted shares at the pool. A good starting point for tuning is to use the shaders option. As the settings are tweaked, this field will no longer be used, but it helps CGMiner establish a good baseline estimate for the other options. The Radeon R9 290 has 2560 shaders, and this value is going to guide the determination of thread concurrency.
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_USE_SYNC_OBJECTS 1
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --shaders 2560 --device 1,2,3
When this command is run, a binary file is created that contains information about what CGMiner is doing. The filename is important here. In the Radeon R9 290's case, it was scrypt130511Hawaiiglg2tc12800w256l4.bin. Make note of the number following tc, as that is the calculated thread concurrency. Also, for reference, the R9 290 is hashing a mere 14.4KHash/s using nothing but the shaders hint and the default intensity. In this case, CGMiner has applied a calculated multiplier of 5 to the shaders for 12800 concurrent threads. At this point, there is no longer any reason to use the --shaders option; the --thread-concurrency setting should be used instead.
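If you would rather not squint at the filename, the thread concurrency can be pulled out of it with a quick sed one-liner (a sketch for a Unix-style shell; it assumes the kernel binaries follow the naming pattern shown above):

ls scrypt*.bin | sed 's/.*tc\([0-9]*\)w.*/\1/'   # prints 12800 for the file above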
Another good way to determine the possible thread concurrency is to walk up the intensity until you encounter hardware errors. When the values aren't set, the intensity alone drives CGMiner to compute the possible threads. Fire up CGMiner at a low intensity, shut it down, and start it again, slowly working the intensity up until the first hardware errors appear (all without specifying shaders or threads). On my Radeon R9 290 the hardware errors began at 13 (but just barely). Look at the most recently generated binary file to identify the valid concurrent threads - this time, CGMiner calculates 30592 concurrent threads for intensity 13. Now when the --thread-concurrency option is set, it is possible to raise the intensity while keeping that value fixed (and avoiding further hardware errors). Bear in mind, having hardware errors is okay so long as they account for less than 1% of the accepted work the GPU is performing. Setting the thread concurrency manually and then setting a higher intensity results in increased hashrates without the hardware errors.
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_USE_SYNC_OBJECTS 1
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --thread-concurrency 30592 --intensity 15 --device 1,2,3
The following results show the improvement in hashrate when increasing intensity at a fixed thread concurrency. For the Radeon R9 290, I opted for the previously determined 30592 concurrent threads, which pretty much kept the hardware errors at bay. You can see there comes a point of diminishing returns - just because you can run intensity 20 doesn't mean it's worth it (think heat and card damage over time).
Intensity | Hashrate |
---|---|
8 | 14.4KHash/s |
9 | 28.2KHash/s |
10 | 62.1KHash/s |
11 | 125.6KHash/s |
12 | 256.5KHash/s |
13 | 503.8KHash/s |
14 | 597.6KHash/s |
15 | 629.1KHash/s |
16 | 720.3KHash/s |
17 | 779.5KHash/s |
18 | 789.7KHash/s |
19 | 792.5KHash/s |
20 | 792.6KHash/s |
Although not in the README file, there is a --gpu-powertune XX option. According to forums on Reddit, this setting allows the card to adjust its voltages as needed to meet higher performance requirements. While the other settings are unlikely to harm the hardware, voltage is the one user configurable knob that may cause damage, so use such configurations with caution. The Internet postings in the 700KHash/s range at intensity 20 utilize a powertune of 20; however, nobody ever states what their hardware error rate, temperatures, or voltages were, so it's hard to know if they safely sustained those levels or were bragging about a peak hashrate.
You could stop there, easily hitting 780KHash/s with little effort simply by slowly eking up the intensity ... but you know you won't. Those last settings for the clockrates are just too enticing. You could read the README file again, or browse a forum post on TekSyndicate about walking up the clockrates. The GPU memclock is specified in megahertz and should be slowly incremented in steps of 25MHz until CGMiner cannot start. Then back off that crash point by a safe amount - roughly 50MHz - and proceed to testing the GPU engine speed. This value is also in megahertz, and there is an optimal ratio between the engine speed and memory speed that falls between 0.55 and 0.60. Once again, start CGMiner with incrementally rising engine speeds. This time it's not instability you're looking for but peak hashrate - there will come a point where additional engine MHz will decrease the hashrate. It's easy to find a good starting point by checking out the technical literature on the device - AMD's reported clocks and the Guru3D review suggest a rule of thumb of sticking within 5% of default (1075MHz engine and 5400MHz memory) - though this can of course increase depending on your ability (and risk aversion) with manipulating card voltages and employing exotic cooling solutions. That sort of configuration would look like the following:
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_USE_SYNC_OBJECTS 1
cgminer.exe --scrypt --url=stratum+tcp://usa.wemineltc.com:80 --userpass=VnutZ.www:x --thread-concurrency 30592 --intensity 15 --gpu-memclock 5400 --gpu-engine 1075 --device 1,2,3
Bear in mind, my Radeon R9 290s were made by Asus - all manufacturers have slight variances in their versions and what not. The overclocking numbers from other Internet users were not successful on my GPUs and led to glorious Windows 7 BSODs. Needless to say, I've left them alone at intensity 17 on stock clock rates and sustain a very stable 792KHash/s per card.
EDIT: I've since added --gpu-fan 65 to my configuration to run the fans constantly at 65%, which is above what they ran at when managed internally by the card. Since doing so, the card temperature dropped from 94°C to 85°C and my hash rate increased to 850KHash/s per card at intensity level 19. The temperatures can be lowered into the 70°C range with fan speeds above 70%, at the cost of enormous noise and only a very small increase in hash rate. Including the box fan I'm using for cooling, the entire system is consuming only 1080W of power according to the Kill-A-Watt meter.
Summary
It is still possible to play in the relatively early Litecoin mining world, but there is an investment to put up in order to participate. CPUs are barely powerful enough, but high end GPUs (especially multiple units) can make effective hashing rigs. While it is fortunate that a lack of ASICs has kept the mining to "home use" type equipment, it does require power hungry GPUs that consume astronomical amounts of electricity. Perhaps Litecoins aren't the sexy, hip digital currency, but they are still a means of obtaining Bitcoins via the trade exchanges and have shown a roughly proportional relationship to Bitcoin's price swings, too.
As far as what to run on your rig - it's really down to what sort of pain you're willing to put up with. A Windows system can be up and mining in less than an hour; taking that approach, of course, comes at the cost of nerd credibility and some high-end performance. Linux is much more irritating to configure and get stable, but it comes with mammoth street cred, a simpler ability to script automated runs, and far less bloat overhead. Regardless of which operating system you choose, make sure you build the rig with adequate airflow and consider the fan noise it will generate (or your significant other might hurt you).
Consider a Litecoin donation to LSMfqFpueWKAe6AoMPaWFRAhodfSHtZJr4 if you're feeling generous and found this article helpful. Thanks!
Resources
- Vea, Matthew. "Getting Started With Bitcoins In A Post GPU World", VnutZ, accessed 20 January 2014 from http://www.vnutz.com/articles/Getting_Started_With_Bitcoins_In_A_Post_GPU_World
- Hern, Alex. "Bitcoin Me: How To Make Your Own Digital Currency", The Guardian, accessed 20 January 2014 from http://www.theguardian.com/technology/2014/jan/07/bitcoin-me-how-to-make-your-own-digital-currency
- "Crypto-Currency Market Capitalizations", CoinMarketCap, accessed 20 January 2014 from http://coinmarketcap.com/
- Vaishampayan, Saumya. "Dogecoin Transactions Outpacing Those In Bitcoin; Here’s Why That’s Not Surprising", Market Watch, accessed 20 January 2014 from http://blogs.marketwatch.com/thetell/2014/01/14/dogecoin-transactions-outpacing-those-in-bitcoin-heres-why-thats-not-surprising/
- Hern, Alex. "Bitcoin Should Not Be Seen As A Currency, Warns Ernst & Young", The Guardian, accessed 20 January 2014 from http://www.theguardian.com/technology/2013/dec/11/ernst-young-warn-bitcoin-payment-problems
- Liu, Alec. "Bitcoin's Fatal Flaw Was Nearly Exposed", Motherboard, accessed 20 January 2014 from http://motherboard.vice.com/blog/bitcoins-fatal-flaw-was-nearly-exposed
- "Weaknesses", Bitcoin Wiki, accessed 20 January 2014 from https://en.bitcoin.it/wiki/Weaknesses
- "Key Derivation Function", Wikipedia, accessed 20 January 2014 from http://en.wikipedia.org/wiki/Key_derivation_function
- "Scrypt", Wikipedia, accessed 20 January 2014 from http://en.wikipedia.org/wiki/Scrypt
- Southurst, Jon. "Alpha Technology Takes Pre-Orders For Litecoin ASIC Miners", Coin Desk, accessed 20 January 2014 from http://www.coindesk.com/alpha-technology-pre-orders-litecoin-asic-miners/
- "Index of /apps/cgminer/3.7", kolivas.org, accessed 17 January 2014 from http://ck.kolivas.org/apps/cgminer/3.7
- "Main Page", Litecoin Wiki, accessed 20 January 2014 from https://litecoin.info/
- Wile, Rob. "WHAT IS LITECOIN: Here's What You Need To Know About The Digital Currency Growing Faster Than Bitcoin", Business Insider, accessed 20 January 2014 from http://www.businessinsider.com/introduction-to-litecoin-2013-11
- Hill, Kashmir. "A $100 Worth Of Litecoin A Year Ago Is Worth $30,000 Today", Forbes, accessed 20 January 2014 from http://www.forbes.com/sites/kashmirhill/2014/01/13/a-100-worth-of-litecoin-a-year-ago-is-worth-30000-today/
- "Charlie Lee Talks About Litecoin, Bitcoin, and Coinbase", YouTube, accessed 20 January 2014 from http://www.youtube.com/watch?v=_P9h6aIemp0
- "Cryptsy", Cryptsy, accessed 16 February 2014 from https://www.cryptsy.com/users/register?refid=183594
- "Coin Base", CoinBase, accessed 16 February 2014 from https://coinbase.com/?r=52c225
- "We Mine LTC", WeMineLTC, accessed 20 January 2014 from http://www.wemineltc.com
- "How Does Merged Mining Work?", Bitcoin StackExchange, accessed 8 January 2014 from http://bitcoin.stackexchange.com/questions/273/how-does-merged-mining-work
- "Comparison Of Mining Pools", Litecoin Wiki, accessed 20 January 2014 from https://litecoin.info/Mining_pool_comparison
- "Mining Pool Reward FAQ", Bitcoin Wiki, accessed 8 January 2014 from https://en.bitcoin.it/wiki/Mining_pool_reward_FAQ
- "Power Mac G4 Cube", Wikipedia, accessed 15 January 2014 from http://en.wikipedia.org/wiki/Power_Mac_G4_Cube
- Wilson, Tracy. "How PCI Express Works", How Stuff Works, accessed 15 January 2014 from http://computer.howstuffworks.com/pci-express.htm
- "GIGABYTE GA-G1.Sniper A88X FM2+ / FM2 AMD A88X (Bolton D4) HDMI SATA 6Gb/s USB 3.0 ATX AMD Motherboard", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16813128653
- "AMD A8-5600K Trinity 3.6GHz (3.9GHz Turbo) Socket FM2 100W Quad-Core Desktop APU (CPU + GPU) with AMD Radeon HD 7560D", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16819113281
- "Crucial Ballistix Sport 8GB 240-Pin DDR3 SDRAM DDR3 1333", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16820148653
- "Why A GPU Mines Faster Than ACPU", Bitcoin Wiki, accessed 16 January 2014 from https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU
- "Very Long Instruction Word", Wikipedia, accessed 16 January 2014 from http://en.wikipedia.org/wiki/Very_long_instruction_word
- "Mining Hardware Comparison", Litecoin Info, accessed 16 January 2014 from https://litecoin.info/Mining_hardware_comparison
- "SAPPHIRE 11196-19-CPO Radeon HD 7950 3GB 384-bit GDDR5 PCI Express 3.0 x16 CrossFireX Support Plug-in Card Video Card ", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16814202071
- "ASUS R9290-4GD5 Radeon R9 290 4GB 512-Bit GDDR5 PCI Express 3.0 Video Card", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807
- "1X To 16X PCI-E Extension Extender Cable Ribbon Riser Card Cable Adapter Cord", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=9SIA3XT18W8386
- "VisionTek CryoVenom R9 290", VisionTek, accessed 20 January 2014 from http://www.visiontekproducts.com/index.php/component/virtuemart/graphics-cards/visiontek-cryovenom-liquidcooled-series-r9-290-detail?Itemid=0
- "RAIDMAX RX-1000AE 1000W ATX12V v2.3 / EPS12V SLI Certified CrossFire Ready 80 PLUS GOLD Certified Modular Active PFC Power Supply ", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16817152044
- "RAIDMAX RX-850AE 850W ATX12V v2.3 / EPS12V SLI Certified CrossFire Ready 80 PLUS GOLD Certified Modular Active PFC Power Supply", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16817152043
- "APEVIA CVT24Y 12" 1 x 24PIN TO 2 x 24PIN Cable", NewEgg, accessed 3 January 2014 from http://www.newegg.com/Product/Product.aspx?Item=N82E16812201037
- "Open Air Mining Rig Computer Case", eBay, accessed 31 January 2014 from http://www.ebay.com/sch/i.html?_trksid=p2050601.m570.l1313&_nkw=+Aluminum+Open+Air+Mining+Rig+Computer+Case++&_sacat=0&_from=R40
- Larabel, Michael. "13 Reasons Linux 3.13 Is Going To Be Very Exciting", Phoronix, accessed 17 January 2014 from http://www.phoronix.com/scan.php?page=news_item&px=MTUxNTk
- "libclc", llvm.org, accessed 17 January 2014 from http://libclc.llvm.org/
- Viceroy. "Building A Rock Solid Multi-GPU Linux Mining Rig With CEntOS 6.0", Bitcoin Talk, accessed 17 January 2014 from https://bitcointalk.org/index.php?topic=170516.0
- "Minimal Headless OpenCL + cgminer on Ubuntu 13.04 Server", Silver Linux, accessed 17 January 2014 from http://silverlinux.blogspot.com/2013/10/minimal-headless-opencl-cgminer-on.html
- "AMD Graphics Drivers and Software", AMD, accessed 17 January 2014 from http://support.amd.com/en-us/download
- "Downloads", AMD, accessed 17 January 2014 from http://developer.amd.com/tools-and-sdks/heterogeneous-computing/amd-accelerated-parallel-processing-app-sdk/downloads/
- "Display Library (ADL) SDK", AMD, accessed 17 January 2014 from http://developer.amd.com/tools-and-sdks/graphics-development/display-library-adl-sdk/
- Soo, Justin. "Litecoin GPU Mining With AMD R9 290 And R9 290X – Sweet Spot For 1000KHash/Sec", Rumor City, accessed 17 January 2014 from http://rumorscity.com/2013/12/03/litecoin-gpu-mining-with-amd-r9-290-and-r9-290x-sweet-spot-for-1000khashsec/
- Cyber Druid. "The 30 Second Dummy Plug", OverClock, accessed 17 January 2014 from http://www.overclock.net/t/384733/the-30-second-dummy-plug
- "pooler / cpuminer", GitHub, accessed 14 January 2014 from https://github.com/pooler/cpuminer
- Pichel, Abigail. "Cybercriminals Unleash Bitcoin-Mining Malware", Trend Micro, accessed 16 January 2014 from http://about-threats.trendmicro.com/us/webattack/93/Cybercriminals%2BUnleash%2BBitcoinMining%2BMalware
- "AMD Catalyst™ Display Driver", AMD, accessed 18 January 2014 from http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
- ckolivas. "CGMINER ASIC FPGA miner monitoring RPC linux/win/osx/mips/arm/r-pi 3.10.0", Bitcoin Talk, accessed 18 January 2014 from https://bitcointalk.org/index.php?PHPSESSID=3udnml4g0vrs6agatvc9s98503&topic=28402.msg357369#msg357369
- "Shader", Wikipedia, accessed 18 January 2014 from http://en.wikipedia.org/wiki/Shader
- "What Exactly Does The Powertune Option In CGMiner Do?", Reddit, accessed 18 January 2014 from http://www.reddit.com/r/litecoinmining/comments/1ihcvk/powertune_option_in_cgminer/
- infam0usne0. "How To Tweak Your Settings And Squeeze The KH/s Out Of Your Card Using CGMiner", Tek Syndicate, accessed 18 January 2014 from https://teksyndicate.com/forum/litecoin/how-tweak-your-settings-and-squeeze-khs-out-your-card-using-cgminer/137706
- "AMD Radeon™ R9 290 Graphics Card Delivers Stunning UltraHD Performance For Just $399", AMD, accessed 20 January 2014 from http://www.amd.com/us/press-releases/Pages/amd-radeon-r9-290-2013nov05.aspx
- Hagedoorn, Hilbert. "AMD Radeon R9-290 Review - Overclocking", Guru3D, accessed 20 January 2014 from http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,30.html