Home Lab Series: Storage v2

This is all about my journey to a more modern Home Lab. A few months ago I upgraded all my networking kit and compute; it's now time to upgrade the storage. I already have a patching rack and a server rack at home for all my Cisco networking gear, so I decided to go down the route of buying a 2U rack-mount server to serve as a storage server for my Home Lab. I am very familiar with DELL hardware, so that was a simple choice.

I decided on the DELL PowerEdge R510, a member of the DELL 11th Generation family of servers. After a BIOS upgrade to v1.12.0 it is capable of running a pair of INTEL Xeon X5675s, which are hex-core 3.06GHz 95W TDP Hyper-Threaded CPUs, and it supports 8 x 16GB DDR3 memory modules for a total capacity of 128GB of RAM.

 

[Photo: DELL PowerEdge R510]

I purchased an InnoDisk 16GB SATA III internal SSD (SATADOM, iSLC flash, 5V), which I plugged into one of the onboard SATA headers with the intention of installing FreeNAS v9.10 on it. It turns out, however, that the onboard SATA ports are disabled when the 12-port backplane is connected, so I ended up using an 8GB SanDisk USB drive attached to the internal header on the side of the backplane.

I purchased an iDRAC 6 Enterprise card, which I recently upgraded to v2.85. In order to run the pair of INTEL Xeon X5675 CPUs and the 14 SSDs that I planned to purchase, I had to replace the DELL 750W power supplies with DELL 1100W power supplies. I also purchased an INTEL X520-SR2, a dual-port 10 Gbit SFP+ Ethernet network card, giving me a total of 2 x 10 Gbit Ethernet links plus a dedicated 100Mbit DELL iDRAC link on top of the onboard 2 x 1 Gbit Ethernet links.

The DELL PowerEdge R510 12-bay comes with 2 internal 2.5″ bays, which are also connected to the SAS backplane, and the SAS backplane is connected to a DELL PERC H700 with 1GB of onboard battery-backed cache.

 

[Photos: r510-c, r510-e]

Since I am planning on using FreeNAS to expose iSCSI and NFS over the 10GbE network card that I installed, I purchased a pair of Samsung NVMe SSDs for the ZIL and L2ARC, along with a pair of Lycom PCIe-to-NVMe adapter cards to mount them in. For the data drives I finally decided on 14 x SSDs from the Samsung EVO range, purely because I had a load left over from PC upgrades; a rough sketch of a possible pool layout follows the drive list below.

2 x 512GB Samsung 950 PRO M.2 (2280) SSD, PCIe 3.0 (x4) NVMe 1.1, UBX controller, 3D V-NAND, 2,500MB/s read, 1,500MB/s write, 300K IOPS (these are brand new)

 

[Photos: img_2861, img_2860]

14 x 120GB Samsung 840 EVO 2.5″ SSD, SATA III 6Gb/s, AHCI, 3-bit MLC (TLC) NAND, 540MB/s read, 410MB/s write, 94K/35K IOPS (these are temporary old SSDs which were removed from old PCs, but WOW they perform!)
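Here is that rough sketch of what a pool built from the 14 EVOs could look like from the command line. The pool name "tank", the da0–da13 device names, and the striped-mirror layout are all placeholders for illustration, not a final design; behind the PERC H700 the disks may well appear as mfid devices instead, and FreeNAS would normally build the pool through the GUI anyway.

# 7 two-way mirrors striped together: strong random IOPS for iSCSI/NFS,
# at the cost of half the raw capacity.
zpool create tank \
  mirror da0 da1 mirror da2 da3 mirror da4 da5 \
  mirror da6 da7 mirror da8 da9 mirror da10 da11 \
  mirror da12 da13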

 

[Photos: r510-d, img_2807]

FreeNAS uses ZFS to store all the data.

ZFS has a bunch of features that ensure all the data will be safe, and on top of that it has some very effective read and write caching techniques.

The write cache is called the ZFS Intent Log (ZIL) and can be placed on a drive or drives of your choice (a dedicated SLOG device); the read cache is called the Adaptive Replacement Cache (ARC) and lives in RAM, and there is also a Level 2 Adaptive Replacement Cache (L2ARC), which is stored on a drive or drives of your choice.
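As a sketch of how these attach to a pool (again "tank" is a placeholder pool name, and the gpt/slog and gpt/l2arc labels refer to the NVMe partitions created in the partitioning sketch further down):

# The drive holding the ZIL (the SLOG) is mirrored, because losing it
# can cost the last few seconds of synchronous writes.
zpool add tank log mirror gpt/slog0 gpt/slog1
# The L2ARC is only a read cache, so the two partitions are simply striped.
zpool add tank cache gpt/l2arc0 gpt/l2arc1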

 

[Photo: r510-f]

I have 2 x 10GbE worth of bandwidth, although this will be limited by the INTEL X520-SR2 being installed in a PCIe Generation 2 x4 slot: the slot can deliver a maximum of 2 Gigabytes per second, capping the card's theoretical maximum of 2.5 Gigabytes per second. I accepted this since it is still far faster than I could possibly ever need (I'm going to regret that statement). That means the maximum amount of data that can arrive per second is roughly 2 Gigabytes, so sizing the ZIL for around 10 seconds of incoming writes gives 2 x 10 = 20 Gigabytes, which is a reasonable amount of space for today. However, I plan to upgrade to 2 x dual-port 10GbE cards and also 2 x dual-port 8Gb FC cards, so to allow for future growth I partitioned the NVMe SSDs into 64GB for the ZIL and the remaining space for the L2ARC.
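For reference, here is a minimal sketch of carving up one of the NVMe SSDs that way under FreeNAS/FreeBSD. The nvd0 device name assumes the stock nvd(4) NVMe driver, and the slog0/l2arc0 labels are placeholders matching the pool sketch above.

# GPT scheme, a 64GB partition for the ZIL, and the rest for the L2ARC.
gpart create -s gpt nvd0
gpart add -t freebsd-zfs -a 1m -s 64G -l slog0 nvd0
gpart add -t freebsd-zfs -a 1m -l l2arc0 nvd0
# Repeat on nvd1 with the labels slog1 and l2arc1.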
