My latest toy: 24KH/s in a single box!

My latest toy: One box, eight Xeon Phi 7220 PCI cards, and a total (sustained!) hash rate of about 24kH/s for monero/sumo/etn/etc … Yeehaw!

This box has actually been up and running for over a week now, but since I realized that a lot of people might like hearing about this, I decided to finally share the build … after all, I think it's the highest-performing single-box build (for cryptonight/monero) out there right now.

Okay, where did this start? When I started writing lukMiner earlier this (wait – now “last”) year, Intel had just announced a PCI version of the x200 Xeon Phis, the 7220 cards. While googling around for where to get one I found a page from Exxact Corp offering a system with 8 such cards for around $20k or so. Got me curious ;-).

Unfortunately, when I tried to place an order I was told that those cards had been un-announced, and would not be released after all…. However, I could never forget this setup, and eventually found some cards on ebay. After scrounging some of those cards together (and initially having quite a bit of trouble finding motherboards they would actually run in #@!!@), I finally got Exxact to sell me a no-cards-included version of that box they had listed earlier in the summer. Once it arrived I popped in the cards, installed the matching software stack, and here we are: eight 7220 cards, each doing 2800-2850 H/s (running at about 80-85 degrees C), plus two low-end Xeons in the host … a bit of linux magic to start it all automatically upon boot, and voilà – a machine that DwarfPool and NiceHash say makes an average of 24kH/s, at a power draw of only about 2.4kW … (and all in all, I paid less than $12k for the parts!).
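For the curious: the “bit of linux magic” is just launching everything at boot. Here is a minimal sketch of how one might do that – note that the launcher path and the per-card command are hypothetical placeholders; the post doesn’t show the actual lukMiner invocation:

```shell
#!/bin/sh
# start-miners.sh -- hedged sketch of a boot-time launcher: one worker
# process per Phi card, each with its own log file. The per-card command
# is a placeholder; substitute the real lukMiner invocation.
start_miners() {
    launcher=$1          # command to run per card (placeholder)
    logdir=$2            # directory collecting per-card logs
    mkdir -p "$logdir"
    for i in 0 1 2 3 4 5 6 7; do
        # detach each worker so it survives the launching shell
        nohup "$launcher" "mic$i" > "$logdir/mic$i.log" 2>&1 &
    done
    wait                 # block until all workers exit (a real launcher
                         # might omit this, since miners run forever)
}

# At boot, e.g. via `@reboot /opt/lukminer/start-miners.sh` in root's
# crontab, this would be called as something like:
# start_miners /opt/lukminer/run-card.sh /var/log/miners
```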

The pic above is from when I assembled it in my basement … I had to pull power cords from two different rooms to avoid popping my circuit breakers, and eventually had to move it to a co-hosting data center due to “somewhat excessive noise” (think “jet engine”) … but still, that was one heck of a fun project!

Happy Mining!
Published by lukMiner

To learn more about me, see the “About” page at http://lukminer.org

23 thoughts on “My latest toy: 24KH/s in a single box!”

  1. Hey, can I get the name of the motherboard you ended up using?

    It should be good to go as long as the board has the designated slots and the main CPU is a Xeon, right?

    Greets 🙂


    1. Unfortunately it’s not that easy. First of all you can’t even get those cards all that easily – I was just lucky to see some on ebay, but they’re very rare. If you ever get them, they only work in certain motherboards – I have one x99 board that runs them, but at least 5 others that don’t (don’t ask – no clue).

      That particular machine I have them in is a SuperMicro SYS4028GR-TR (https://www.supermicro.com/products/system/4u/4028/sys-4028gr-tr.cfm). Follow that link to get the exact specs.


  2. I’ll be going for the ASUS X99-A II Intel X99 LGA 2011-v3 ATX.
    Thanks for the reply, and great work on the miner!
    I’ll keep you posted on whether or not I get them hashing 🙂


    1. The 5110 has about 25% of the hashrate of the x200s and uses the same amount of power. In other words, unless you have free or very cheap electricity, it’s likely not worth it.


      1. I kind of agree, but not entirely – in fact, it depends very much on how much you pay for those cards. In terms of revenue and power draw, the x100s are still comparable to a GTX 1060 or 1070, so using them for mining isn’t all _that_ outlandish a thought: at around 200-250 W and around 600-650 H/s they’ll certainly pay for the power, even at full residential prices.

        Now if you want to purchase something _new_, then most likely the x200s are the better deal (five times the revenue at same power draw and only about twice the cost) – but if you already have some of the x100s – or a good line to get them cheaply – then using them still makes total sense.
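To make the trade-off concrete, here’s the back-of-the-envelope math using the numbers quoted in this thread (midpoints; the $/kWh rate is my assumption, not from the post):

```shell
#!/bin/sh
# Efficiency and power-cost math for the figures in this thread.
# x100 (5110): ~625 H/s at ~225 W; 7220: ~2825 H/s per card, with the
# whole 8-card box drawing ~2.4 kW (so ~300 W/card including host share).

# hashes per second per watt of draw
hs_per_watt() { awk -v h="$1" -v w="$2" 'BEGIN { printf "%.2f\n", h / w }'; }

# daily electricity cost in dollars for a given draw (W) and price ($/kWh)
daily_cost()  { awk -v w="$1" -v p="$2" 'BEGIN { printf "%.2f\n", w / 1000 * 24 * p }'; }

echo "x100 (5110): $(hs_per_watt 625 225) H/s per watt"
echo "x200 (7220): $(hs_per_watt 2825 300) H/s per watt"
echo "whole box at 2400 W and \$0.12/kWh: \$$(daily_cost 2400 0.12) per day"
```

So the x200s come out roughly 3-4x more efficient per watt, which is the whole argument above in one number.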


      2. It’s kind of ridiculous how much these Xeon Phis are selling for – in some cases double or triple MSRP. This is a cool project, but I can’t imagine it being worthwhile for anyone unless they can get these well under $1,000. The AMD Vegas are the best bang for the buck right now. I still browse ebay occasionally, though, in hopes I can get a halfway cheap Phi 🙂


      3. For the x100s I agree – I bought some for development, but would not buy them for actual mining… at least not at what they’re listed for on ebay :-/

        For the x200s, the story is different, though: if you talk to a distributor such as Exxact, you can get a 4-node 7250 system (e.g., https://www.exxactcorp.com/Exxact-TS2-210339-IPS-E210339) for around $5.5k right now (and that includes memory and OPA cards that you wouldn’t need for mining, and might be able to sell on ebay). At that price you’re getting a 7250 (which should actually be even faster than the 7220s in this blog) for less than $1,500 each – without needing a host machine to put them in, without having to fight with drivers (I already spent two days getting my Vega to work, still no luck !@#!@), and in a form factor you can move to any co-hosting facility without any issues …

        I have no doubt that the Vegas are good (I’ll know more when I ever get mine to work!@#!@)… but at least for myself, I’m now going with the system listed above 😉


  3. I was initially curious whether the new ASUS B250 19-slot mobo would work for these cards, but after reading this, I have my doubts. I currently have 8x of the 7220s as well – just need to finish out the setup.


    1. If you have no other use for them: for the right price I’d take them ;-).
      No, seriously – they definitely don’t work in all motherboards. I can confirm this machine works, and can confirm a smaller 1U 2-card machine works (I can dig up the exact specs); I even have three working in a regular workstation – but I have at least 10 machines I tried them in where they didn’t work at all.


      1. I’m looking through some different options to help with cooling. Obviously lowering the ambient temp of the room will help as far as airflow is concerned, but is there anything else that can be done to help with cooling in your system?


      2. There are basically two options: buy a professional rackable server that’s optimized for lots of airflow through the “GPU” slots, or add some custom cooler fans to the back of the card. For one of my builds I took these: https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313.TR0.TRC0.H0.Xcustom+cooler+xeon+phi.TRS0&_nkw=custom+cooler+xeon+phi&_sacat=0 – the blue shroud didn’t actually fit (it was intended for the x100s; the 7220s have different power connectors :-/), but the blower itself worked. I’ll write another post on that second system later.


  4. I know these things are fussy about what motherboard they’ll run in, but does anyone know if they’ll run in a x1 slot? This application doesn’t need high performance IO anyway, so it would be nice to use every slot available (assuming it works with the motherboard).

    What were the symptoms of the systems they wouldn’t run in? They wouldn’t show up at all, or was it something else?


    1. No clue. I have one in a x8 slot, but that’s all I know.

      As to symptoms of incompatible motherboards: I’ve seen many different ones. Some don’t even boot with the cards plugged in. Others boot, but immediately reboot. Some boot, but the cards don’t show up in lspci. And yet others boot, have the cards show up in lspci and micctrl, have them properly start in the mpss service (i.e., you can even log in), but then see the cards “hang” (with DMA errors in dmesg) after a while. The latter is also what happens when the cards overheat (say, going over 90 degrees C), so maybe that’s actually not a mobo issue, but simply my inadequate cooling when I played with it.

      That 8-card system can actually take 10 cards (it has 10 PCI slots), but with 10 cards it also starts showing the DMA errors – no idea if that’s because there are then too many PCI agents, or if it’s simply overheating.
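Given the failure modes described above, a quick triage script can help narrow things down. The grep patterns here are assumptions based on the symptoms in this thread, not official diagnostics – adjust them to whatever your lspci and dmesg actually print:

```shell
#!/bin/sh
# Triage helpers for the Phi symptoms described above. Match strings are
# guesses from this thread, not official diagnostics.

# Does any Phi co-processor enumerate on the PCI bus at all?
phi_visible() { lspci 2>/dev/null | grep -qi "co-processor"; }

# Count DMA-related lines in a kernel log (pass a file, e.g. dmesg output).
dma_errors() { grep -ci "dma" "$1"; }

# Typical use on the host:
#   dmesg > /tmp/kern.log
#   phi_visible || echo "cards not enumerating -- likely a mobo issue"
#   echo "DMA error lines so far: $(dma_errors /tmp/kern.log)"
```

If the cards never enumerate you are probably in the “incompatible motherboard” bucket; if they enumerate and then rack up DMA errors, check cooling first.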

