
Saturday 3 February 2018

Increase Synology DS412+ NAS capacity

We take a lot of photos and videos, and after a few years we have started to run out of storage on our main Network Attached Storage (NAS).


It is time to add more storage to our Synology DS412+ NAS, which was fitted with 4x 2TB drives.

With larger capacity drives available it would have been nice to drop down to only three disks and keep a hot standby. I did a lot of checking, and there is no way to reduce the number of disks in an existing array. I'm not prepared for days of downtime to do a backup and restore, so I have to stick with the full four disks that I am using.

Synology Hybrid RAID (SHR) volumes support mixed-size disks, so I am upgrading only some of them from 2TB to 8TB.

The way the fault tolerance works makes the space calculation fairly easy: capacity equal to the largest disk is lost to parity. Another way to think about it is to add up the total capacity of all the disks and deduct the capacity of the largest; that gives you the usable space.
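
That sum-minus-largest rule is easy to sketch in code (a simple illustration of the arithmetic, not Synology's actual algorithm):

```python
def shr1_usable_tb(disk_sizes_tb):
    """Approximate usable space of a single-redundancy SHR volume:
    the total of all disks minus the largest one (lost to parity)."""
    return sum(disk_sizes_tb) - max(disk_sizes_tb)

print(shr1_usable_tb([2, 2, 2, 2]))  # original array -> 6 (nominal TB)
print(shr1_usable_tb([8, 8, 2, 2]))  # after the upgrade -> 12 (nominal TB)
```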

I decided to swap two disks, so my capacity goes from a usable 5.4TB to 10.8TB.
Due to the way hard disk manufacturers calculate size and the overhead of file systems, a disk rated at 2TB only has about 1.8TB of usable space, and an 8TB disk has a little over 7.2TB!
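
Most of that gap is the difference between the manufacturers' decimal terabytes and the binary units the operating system reports; a quick check (ignoring the additional file-system overhead):

```python
def tb_to_tib(tb):
    """Convert a manufacturer's decimal terabytes (10^12 bytes)
    to binary tebibytes (2^40 bytes)."""
    return tb * 10**12 / 2**40

print(round(tb_to_tib(2), 2))  # -> 1.82, close to the ~1.8TB reported
print(round(tb_to_tib(8), 2))  # -> 7.28, close to the ~7.2TB reported
```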


I have also bought an additional 8TB drive to keep as a cold spare in case of disk failure. I trust the Western Digital Red NAS drives that I have, but no matter how good they are, drives are mechanical devices and can fail.

There are already a few good sets of instructions about how to increase the capacity of the array, so I won't go into much detail here.

The DS412+ that I have supports hot swapping of drives, which means the NAS can carry on running as normal while the work takes place. Its processor is also fast enough that, unless you are particularly sensitive, performance remains acceptable throughout: logins and the user interface respond slightly more slowly, but access to files from the network is not noticeably changed.

The steps:

  • Backup all your data.
  • Just in case you didn't read that: make sure you have a good backup before you start.
  • Open Storage Manager and check that all disks are normal and that no repair is already in progress.
  • Pull out one of the drives (I have a hot swap model; you may have to power yours off to do this bit).
  • Put in the larger capacity drive.
  • Back in Storage Manager, choose Manage, then Repair, then Next.
  • Select the new drive, which is probably the only choice and already ticked, then click Next.
  • Read and accept the warning. Only continue if you are sure.
  • Wait, many hours...



After the first drive has finished repairing you can do the next one.

In my case it was the backups before I started that took the time. Luckily my monthly backups had just run, but my less frequent photo backup was months out of date, so I had to run that first, and it took nearly two days! It is the growth in that type of data which requires me to increase the capacity of the NAS.

The repair of the first drive took less than 11 hours. I didn't sit and watch it so I can't be more accurate. The second drive took less than 6 hours.

The first disk does not add any more space because its extra capacity is absorbed by the parity information, but on completion of the repair of the second drive the Synology automatically expands the volume.
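
The arithmetic behind that two-stage behaviour can be checked with a quick sketch (nominal TB, using SHR's sum-minus-largest rule for single-disk redundancy):

```python
# Usable space (total minus largest disk) at each stage of the swap:
stages = {
    "original 4x 2TB": [2, 2, 2, 2],
    "after 1st swap":  [8, 2, 2, 2],
    "after 2nd swap":  [8, 8, 2, 2],
}
for name, disks in stages.items():
    print(name, sum(disks) - max(disks), "TB usable")
# original 4x 2TB -> 6, after 1st swap -> still 6, after 2nd swap -> 12
```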

Job done. I now have nearly double the capacity.

--

As a historical note, going back about 30 years, my first IBM PC clone had a massive, for the time, storage of 80MB. That was in 2x 40MB 5.25" hard drives.

The total capacity I have connected to my home network today is now over 200,000 times that!

==

Note:
I have now expanded the capacity on several Synology NAS boxes and all but one has worked cleanly. On just one I ended up with a 'System Volume error' being repeatedly reported in the log. I was unable to repair this with any of the Synology tools, so I had no choice but to back up everything, then delete and recreate the volume. Easy enough to do, but the backups took days, especially as I took multiple copies of the most critical data.

==
