Wow. What a community response. Unbelievable. Last week I posted about my latest toy, a rig that used eight Xeon Phi 7220 cards in a single 4U server to achieve a total of roughly 24kH/s for cryptonote coins (in case you missed it, the original post is right below).
I had expected some interest in that, merely because it was, to my knowledge, the new speed record for a single-node(!) mining rig … but man, was I wrong: three thousand unique visitors in the first 48 hours, on a blog that didn’t even exist until two days before. And tons and tons of interesting comments and questions, on both the blog and reddit.
Based on the feedback from the last three days, it’s now become very clear that there are two follow-ups I’ll have to write (else I’ll drown in emails :-/). The first topic that came up again and again was “Vegas vs Phis” in terms of mining revenue, profitability, and so on … and I promise, I’ll write one – but I’ll first wait for my 4×7250 node to arrive, so bear with me. The second big group of questions revolved around “how can I replicate that build” – i.e., how do the Phis work at all, where can I get them, and once I have them, how can I build a machine that’ll take them. This latter question is what this post is about.
Mining on Phi 7220 cards…
First off, it is kind of hard to get those 7220 cards – if you don’t already have some, you’ll probably have a hard time finding any. I got mine off ebay, but that seller – at least right now – doesn’t list any more, so maybe they’re gone. Also, be sure not to confuse the x100 cards (3120, 5100, 7120, etc) with the newer x200 Phis – the old ones will be about 5x slower, so think hard about whether that’s worth it. Quite frankly, if you’re interested in mining with Phis you’ll be best off buying one of the 4×7250 self-bootable machines (I’ll write more on that once mine arrives).
That said, if you already have 7220 cards – or miraculously found a good source for them – you’ll have to find a way of actually hosting them, getting them to run, and mining on them. In terms of mining software, lukMiner will run on them, and will run rather profitably. To get it to run, though, you’ll need a copy of Intel’s MPSS 4.x software stack to drive those 7220 cards – and since Intel took that product down, it isn’t all that easy to get any more. I still have an older copy from back then, but don’t have permission to share it, so you’ll have to find somebody else to share it with you (if anybody who has a copy wants to share, please feel free to send a comment with a link!). Once you get both the hardware and the MPSS stack up, you copy luk-xmr-phi to the card and run it; that part is easy.
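For what it’s worth, here’s a rough sketch of that last step – not gospel, just how it can look once MPSS is up. MPSS gives each card a virtual network interface, and card 0 usually answers to ssh as `mic0`; the binary name is from the post, but the little wrapper and dry-run trick are my own additions, and you’ll want to append your own pool/wallet options (check lukMiner’s docs for those):

```shell
#!/bin/sh
# Sketch of the "copy luk-xmr-phi to the card and run it" step.
# Assumes MPSS 4.x is running and card 0 answers via ssh as "mic0"
# (MPSS sets up a virtual network interface per card).
deploy_and_run() {
  card=${1:-mic0}   # MPSS hostname of the card
  run=${2:-}        # "" = really run it, "echo" = dry run that prints commands
  $run scp luk-xmr-phi "$card":      # copy the miner binary onto the card
  $run ssh "$card" ./luk-xmr-phi     # start it (append your pool options here)
}

deploy_and_run mic0 echo   # dry run: just prints the scp/ssh commands
```

Running it with the `echo` argument first is a cheap way to double-check the commands before pointing them at real hardware.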
Now to the tricky question: how to actually build a rig with these cards. The problem is that they seem to work only in certain motherboards, and since they got pulled off the market there’s no support. So all I can do is share the three successful builds I managed to create – without any warranty whatsoever that they’ll work on your side.
Building a Rig – Option #1: the 24kH, 8-card, 4U server
The build I used in my original post is a professional, off-the-shelf server from Exxact Corp. I used theirs because I knew they had listed this product with Phis in the summer, so I had pretty high confidence it would still work. For those interested in more details: it’s a 4U server with two Xeon CPU sockets, a C612 chipset (which seems to be important), 10 full-length PCIe slots, and (8+6) PCIe power connectors, which works out perfectly. Since I didn’t want to take any risks I bought a complete system from Exxact, with CPUs, memory, disks, 2x1600W power (fully redundant, so 4 PSUs actually), and everything else except the Phi cards, which I already had. In total – and with shipping – that set me back something like $6k, just as much as the cards themselves.
Of course, I could probably have built that thing from parts for much less (the two un-used redundant PSUs alone are worth several hundred bucks) … but with $6k in Phi cards already on the line I didn’t want to take any risks – and quite frankly, so far I’ve been extremely pleased with this purchase (and Exxact have been most helpful so far, too!). For anybody wanting to get this system, here’s a pic of both the rig and the exact sticker of that machine (and I’m sure Exxact would be happy to help, too – just mention what it’s for, they’ll remember me :-/).
One final note: the careful reader will have noticed that I mentioned ten PCIe slots, yet my build uses only eight cards. Yes, I did fit 10 cards in there, but ran into some issues. First of all, I blew some circuits in my basement … and worse, at home (where I built the rig) I only had 110V power, which wasn’t all that good for the 2x1600W PSUs. And finally, when I did get it to boot, the machine became unstable with 10 cards – maybe because of heat, or maybe because the drivers don’t like that many cards, I don’t know. Eight work; 10 I’m not sure about. I might get back to trying, but for now I don’t have any spare cards any more, anyway. Here’s a pic with 10 cards, but again, right now I only run eight. I’m not crazy. Not really, anyway.
Option #2: A Cheaper, but still professional rig
Since $12k in a single rig is admittedly a somewhat scary thought, I also played around with finding cheaper, smaller options. Looking primarily for the same board generation and C612 chipset, I ended up with a SuperMicro SYS-5018GR-T server.
Initially this didn’t produce enough airflow to cool the cards, but after a bit of “friendly persuasion” of the fans (i.e., cutting their two control cables to make them go full blast – see pic) that worked fine, too.
Got the barebone for $1100 off ebay, plus a refurbished Xeon and a single DIMM of memory … all together probably around $1400 (plus cards) – not that much more than what you’d pay for a typical desktop PC to put GPUs in, but in a rackable form factor, so you can actually farm it out to a co-hosting place.
Option #3: The totally Stone-Soup, Do-it-yourself Build
OK, before I went ahead and bought all these servers and cards for now close to $20k, I (obviously?) first did some simpler tests, buying only a single card and testing it in something I already had. “Luckily” I had lots of unused workstations lying around that I could test with … the reason I put “luckily” in quotes is that I only have those in the first place because they started out as GPU mining rigs – but since “several” of those GPUs have died the mining death over the last few months, I now have some unused workstations :-/. (Yes, one of the reasons I switched to mining on Phis is that I simply had too many GPUs die on me – in particular a certain brand, but I don’t want to offend anybody, so I’ll keep that part to myself.)
Anyway – I tested many different machines, and most didn’t work. Either they didn’t boot at all, or they booted but didn’t show the cards, or had BIOSes that were too old (the cards need “above 4G decoding”), etc. I finally found one of my machines that took the card, and for everybody who wants to replicate it, here are the specs:
- Motherboard (likely the most important part): Asrock X99 Deluxe
- CPU: some X-core Xeon E5 bought off ebay – probably won’t matter
- A cheapo x1 PCIe GPU to drive a monitor (you won’t need it for mining, though).
- Three Xeon Phi 7220 cards (started with one, but then put in two more)
- EVGA 850GQ (850W) PSU
- Phanteks Enthoo Pro M case (won’t matter), and a single 8GB DIMM of RAM.
- Lots of fans.
- CentOS 7.3 with MPSS 4 stack and lukMiner 0.8.6.
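On the “didn’t show the cards / too old BIOS” problem: before blaming a board, it’s worth checking whether Linux even enumerates the card. The little check below is my own addition, not from the build itself – `lspci` reports Phi cards with a “Co-processor” class string, and if the card is missing, the “Above 4G Decoding” BIOS option is the usual suspect, since these cards expose very large PCI BARs:

```shell
#!/bin/sh
# Quick sanity check (my addition): is a Phi visible on the PCI bus at all?
# If not, check "Above 4G Decoding" in the BIOS and reseat the card.
phi_visible() {
  # reads lspci-style output on stdin; succeeds if a coprocessor shows up
  grep -qi 'co-processor'
}

if lspci | phi_visible; then
  echo "Phi enumerated on the PCI bus"
else
  echo "no Phi found -- check 'Above 4G Decoding' in the BIOS and reseat the card"
fi
```

If the card shows up in `lspci` but MPSS still can’t see it, the problem is more likely on the driver side than the BIOS side.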
Et voilà, here we are:
In terms of cooling, a regular workstation’s case fans won’t be enough. Not by a long shot they won’t. At first, I added this semi-professional Lasko fan:
This obviously blows air to the inside rather than out of the case, but if you leave the case open that’s OK (and if not, it’s strong enough to make all other case fans go backwards, too, LOL).
Eventually, however, that setup looked a bit shaky even by my standards, so I went ahead and scouted for some smaller fans to cool this. Typical case fans won’t do, even if mounted right behind the cards. Eventually, though, I found some higher-powered fans on ebay (see, for example, here for a listing). The blue (printed) shroud doesn’t actually fit the x200s (their power connectors are organized differently from the x100s’), but the case wouldn’t have had enough space for those shrouds anyway, so I simply took them off – the fans are strong enough to push enough air through the cards even without a perfect fit. Oh, and of course: duct tape is your friend. In the following two pics, the left one shows two such fans connected with two shashlik skewers and some duct tape (I didn’t say it was professional grade, did I?); the right one shows those fans mounted right behind the cards – one fan does one card, the other does two.
Again, I used the trick of messing with the fans’ control cables to simply have them go full blast (the image on the left shows the fan cable cut open to allow a four-pin fan connector to connect to a two-pin 12V connector – the control wires aren’t cut, just not connected, so the fans go full blast; the right image shows that stuff connected to a 12V molex). With that, the machine has now been up and running for three weeks with no issues whatsoever (well, I had to fix a few issues with hung nicehash connections in the miner, but the hardware works all right).
As mentioned above, I’m not sure these builds can easily be recreated – for example, I have some other X99 boards that do not work, and I have no clue why this one does (i.e., no warranties, and your mileage may vary). Anyway – for those that have some of those cards, I hope this info will at least open a path to getting them up and running. As such: