Sometime in – h*ck I don't know, 2006? – I last built my personal computer.

At the time, it was a combination Serious Desktop Workhorse and File Server.

Since then, I've replaced the former with a ${day_job}-provided machine, relegated my personal computing to a dedicated VM (plus devices) isolated from those work concerns, and the machine I built has become a very unbalanced NAS server. It has far too much compute power (and thus baseline energy consumption) for the menial task it was left with. As well, the size of the RAID array was great at the time, but in an era of 4K video and Time Machine backups it's limiting.

So, I decided to replace it with a proper NAS device.

My goals were the following:

  • order-of-magnitude storage increase: the current box has 1.8 TB; I'm shooting for 18-24 TB

  • a bit more redundancy: the current box is RAID10, and initially had a recurring issue with /dev/hdc that thankfully resolved after a few replacements. Shipping from Newegg to VT is ~2 days if a drive does fail, but I'd rather be able to survive two drive failures, especially given these drive sizes and rebuild times.

  • serious power reduction: the current box (an old i7) draws 140W idle, and I'm shooting for ~40W

  • simplicity in OS management: the machine has been running linux-4.4.1 (and grub-0.99) forever, because it's too fragile to change and I'm a scaredy-pants

As such, the plan became:

  • 5-6 drives of 6±2 TB each in a RAID5, RAID6, or RAID-Z2 configuration (with 6 TB drives and dual parity, that's (5-2)×6 = 18 TB up to (6-2)×6 = 24 TB usable, in line with the storage target above)

  • explicitly lower-power CPU, limited memory

  • (eventually-)read-only thumb-drive boot, OS managed via virtual machine


I want to take a moment as early as possible to recognize Brian Moses' DIY NAS builds, which have been a significant guide for this build; by that I mean I stole his plans on the hardware side, with some minor tweaks.

I wound up with the following hardware/costs:

    component    description                                  cost
    -----------  -------------------------------------------  ----
    case         SilverStone DS380B                            150
    psu          Corsair SF450                                  85
    motherboard  ASRock Z270M-ITX                              130
    cpu          Intel Core i5-7600T                           255
    memory       Ballistix DDR4 2666 (2×8GB)                   185
    drives (a)   ×3 WD Red 8TB NAS - WD80EFZX                  725
    drives (b)   ×3 Seagate IronWolf 8TB NAS - ST8000VN0022    725
    -----------  -------------------------------------------  ----
    total                                                     2255

The SilverStone because of the hot-swappable drive bays.

The ASRock because of its six native SATA 6 Gb/s ports.

The Core i5-7600T because of the balance of high benchmarking scores and modern features with a TDP of only 35W.


I decided to stick with Gentoo as the OS, because I love it and more importantly I'm comfortable with it.

In advance of building the new server ("earth", to complement fire (the firewall), air (the wifi), and water (my personal VM)), I decided to at least upgrade the software side of my current server (phoenix) with a thumb-drive OS build.

So, this OS build is not only going to be the basis for the next server; it is also going to take over as the current server's OS.

I've leveraged UUID- and LABEL-based configuration in grub and /etc/fstab in order to have the OS image work in both the virtualized and real environments.
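
(The UUIDs and labels themselves just come from the filesystems; blkid is the usual way to list them. Roughly, with illustrative device names and the btrfs UUID elided:)

    # list filesystem UUIDs and labels (device names illustrative)
    $ blkid
    /dev/sda2: UUID="563893f3-c262-4032-84ac-be12fddff66b" TYPE="ext2"
    /dev/sda3: UUID="489dd7ad-a5e5-4727-8a9c-b11cca382038" TYPE="ext4"
    /dev/sdb: LABEL="DATA" UUID="..." TYPE="btrfs"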

In particular:

In reality, I have four 1 TB drives in a btrfs RAID10 configuration.

In the virtualized environment, I have four 1 GB "drives" in the same configuration.

Both get mounted at /data; in /etc/fstab, it's simply:

    LABEL="DATA" /data btrfs defaults,noatime,compress=lzo 0 0
    /data/home /home none bind 0 0

So no matter which is booting, the same thing is mounted.
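
(For reference, the label gets set at mkfs time, or after the fact. A quick sketch with hypothetical device names, not the exact commands I ran:)

    # create a 4-device btrfs RAID10 filesystem labeled DATA
    mkfs.btrfs -L DATA -d raid10 -m raid10 /dev/sdb /dev/sdc /dev/sdd /dev/sde

    # or (re)label an existing, mounted btrfs filesystem
    btrfs filesystem label /data DATA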

For the boot disks, it's all by UUID:

    # /dev/sda2 /boot ext2 defaults 0 0
    UUID=563893f3-c262-4032-84ac-be12fddff66b /boot ext2 defaults 0 0
    # /dev/sda3 / ext4 noatime 0 0
    UUID=489dd7ad-a5e5-4727-8a9c-b11cca382038 / ext4 noatime 0 0

So no matter where the image is booted (VM, thumb drive, whatever), the mounts work fine.
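
The grub side is the same idea. A minimal sketch of a UUID-based menu entry, assuming GRUB 2 (the kernel filename is a placeholder, not my literal config):

    menuentry "Gentoo" {
        # find the /boot partition by filesystem UUID, not device name
        search --no-floppy --fs-uuid --set=root 563893f3-c262-4032-84ac-be12fddff66b
        # hand the root filesystem to the kernel by UUID as well
        # (root=UUID=... needs an initramfs that knows how to resolve it)
        linux /vmlinuz-4.x-gentoo root=UUID=489dd7ad-a5e5-4727-8a9c-b11cca382038 ro
    }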


See part 1 for the next in the series.

