dataDyne - 9.5TB network storage tank
The 250GB hard drive in my MacBook just isn’t enough anymore, so I bought some equipment to set up a NAS (Network Attached Storage). The equipment I bought:
- 4x SAMSUNG Spinpoint F2EG HD154UI 1.5TB 5400 RPM 32MB Cache SATA 3.0Gb/s 3.5” Internal Hard Drive
- 1x TRENDnet TEG-PCITXR 10/100/1000/2000Mbps PCI Copper Gigabit Network Adapter
- 1x SYBA SD-SATA-4P PCI SATA Controller Card
- 1x Sunbeam PSU-H680-REV-US-BL 680W ATX 12V 2.0 Power Supply
I also bought a 4GB CF card and a CF-to-IDE adapter, because I had originally planned on using FreeNAS as the operating system and freeing up space in the case (by not needing a hard drive to install the operating system on). But I decided to go with Ubuntu instead and ditch the CF card route. I chose Ubuntu because of limitations in the software RAID implementation in FreeNAS. FreeNAS doesn’t have the ability to grow an existing RAID, so if down the line I wanted to add another hard drive, FreeNAS wouldn’t allow it, but Ubuntu would.
I’m going to set up the four 1.5TB drives in a RAID 5, which will protect me from losing data if a drive fails and give me 4.5TB of usable space (drive size × (drives − 1), so 1.5TB × (4 − 1) = 4.5TB).
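The capacity arithmetic can be sanity-checked with a couple of lines of shell (integer GB keeps it simple):

```shell
# RAID 5 stores one drive's worth of parity, so usable = size * (drives - 1).
drive_gb=1500   # 1.5TB per drive
drives=4
echo "$(( drive_gb * (drives - 1) )) GB usable"   # prints "4500 GB usable"
```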
I traded the CF card and adapter to Mike for his clear computer case, old motherboard (P4 3.06GHz), and 40GB IDE hard drive.
10/15/09 - I have installed Ubuntu onto the 40GB hard drive and configured the operating system; now I am just waiting for Newegg to deliver the 4-port SATA card and my power supply. I already received the 4 hard drives, and they are sitting in the case, mounted and ready to be plugged in!
10/16/09 - I got the power supply and the SATA card today. I put all of the equipment in, fired it up, and everything worked 99%! The power supply was able to power the 4 SATA drives and the 1 IDE drive just fine, no problems there. The BIOS saw the SATA card and asked if I wanted to configure a hardware RAID (bonus, I didn’t know it could do that!). When it booted from the IDE drive, however, GRUB hung, but it fixed itself after a reboot.
I partitioned each drive, giving each a new primary partition with the type set to Linux RAID autodetect. I started the mdadm command to create the new RAID at 7 PM; it’s 1 AM now, and I still have about 10 hours to go! Haha. Four 1.5TB hard drives take a lot longer to join together than the 4 virtual 1GB hard drives I tested on.
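The create step looks roughly like this (a sketch of the usual mdadm invocation, not a transcript of the exact command I ran; the device names match the array shown below):

```shell
# Build a 4-drive RAID 5 array from the freshly partitioned drives (run as root).
sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 \
    /dev/sdb1 /dev/sdc1 /dev/sdd1 /dev/sde1

# Watch the initial build progress.
watch cat /proc/mdstat
```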
Output from cat /proc/mdstat:

    Every 2.0s: cat /proc/mdstat                        Sat Oct 17 01:06:04 2009

    Personalities : [linear] [multipath] [raid0] [raid1] [raid6] [raid5] [raid4] [raid10]
    md0 : active raid5 sde1 sdd1 sdc1 sdb1
          4395407808 blocks level 5, 64k chunk, algorithm 2 [4/3] [UUU_]
          [=======>.............]  recovery = 36.7% (538709632/1465135936) finish=596.2min speed=25896K/sec
10/17/09 - After 15 hours of building the RAID, I started writing a filesystem to it (ext4) and boom, the power goes out. After ten minutes I powered it back on, re-created the array with --assume-clean so it wouldn’t resync for another 15 hours, and wrote the filesystem to it.
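The recovery looks roughly like this (a sketch; --assume-clean is only safe here because the drives already held a fully built, consistent array):

```shell
# Re-create the array over the same partitions without a parity rebuild.
# --assume-clean tells mdadm the members are already in sync; using it on
# drives that are NOT actually consistent would corrupt data.
sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 --assume-clean \
    /dev/sdb1 /dev/sdc1 /dev/sdd1 /dev/sde1

# Then write the ext4 filesystem and mount it.
sudo mkfs.ext4 /dev/md0
sudo mount /dev/md0 /mnt/raid
```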
Now everything is up and running, and I am serving 4.5TB of storage to my apartment!
Pictures coming up soon.
All the drives seem to be running very cool (probably because the SATA card is limited to 1.5Gb/s and they are all 5400 RPM).
10/28/09 - I did some case modding on dataDyne. I added 12 LEDs to the side plexiglass window to complement the 2 LEDs that are in the power supply! Haha.
5/10/10 - I just bought 2 more drives for dataDyne and a new 4-port SATA card! It took about 3 days to reshape the array and grow the filesystem to accommodate the new drives, but it was worth it. dataDyne’s usable space just jumped from 4.1TB to 6.8TB!
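Growing an mdadm RAID 5 in place goes roughly like this (a sketch of the standard procedure; the new device names /dev/sdf1 and /dev/sdg1 are taken from the drive list below):

```shell
# Add the two new drives to the array; they join as spares.
sudo mdadm --add /dev/md0 /dev/sdf1 /dev/sdg1

# Reshape the RAID 5 from 4 to 6 members -- this is the multi-day step.
sudo mdadm --grow /dev/md0 --raid-devices=6

# Once the reshape finishes, expand the ext4 filesystem to fill the array.
sudo resize2fs /dev/md0
```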
System Drive
    /dev/sda: WDC WD400BB-00FJA0: 33°C

Raid Drives
    /dev/sdb: SAMSUNG HD154UI: 23°C
    /dev/sdc: SAMSUNG HD154UI: 25°C
    /dev/sdd: SAMSUNG HD154UI: 29°C
    /dev/sde: SAMSUNG HD154UI: 26°C
    /dev/sdf: SAMSUNG HD154UI: 26°C
    /dev/sdg: SAMSUNG HD154UI: 25°C

    Filesystem            Size  Used Avail Use% Mounted on
    /dev/sda1              36G  4.0G   30G  12% /
    /dev/md0              6.8T  4.0T  2.5T  63% /mnt/raid
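The per-drive readings above match the output format of hddtemp (an assumption on my part; the post doesn’t name the tool), which reads temperatures over SMART for a list of drives in one go:

```shell
# Read drive temperatures via SMART; requires the hddtemp package and root.
sudo hddtemp /dev/sd[a-g]
```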
1/15/11 - This update is a little old, but I figured I would toss it up here. I’ve moved dataDyne into a new case, the Norco RPC-4020. This case offers 20 drive bays, and I plan to fill it completely when I can gather the necessary funds to do so. Right now I have a total of 8 drives, which gives me 9.5TB of usable space. Pictures coming soon!
    dave@[datadyne]:~/$ sudo mdadm --detail --scan
    ARRAY /dev/md0 level=raid5 num-devices=8 metadata=00.90 UUID=46f72581:23e7243c:d8d955ac:4f7a50a5

    dave@[datadyne]:~/$ sudo df -h /dev/md0
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/md0              9.5T  5.7T  3.3T  64% /mnt/raid