Turning trash from work into a home server

  • I salvaged some parts and pieces from a test/dev hardware refresh and general storage cleaning at work. I bought some used parts off eBay, picked up a Rosewill 4U case (JUNK) and an EVGA G2 power supply, and built a NAS for a home lab. I've tried FreeNAS, OmniOS w/ napp-it, NexentaStor, and Windows Server 2012 R2 as my storage backend.


    My initial plan was an all-in-one box running FreeNAS with VirtualBox in a jail for my virtualization needs. I scrapped that after I couldn't get the networking features I wanted with VB in a jail, namely multiple NICs so I can have guests on different networks. That led me to thinking that discrete storage and hypervisor boxes were the way to go, since I do have some complete servers suitable for use as a virtual host.


    After trying all those storage platforms I was all set to move forward with Windows Server 2012 R2 w/ Storage Spaces as my storage, and to proceed to deciding between XenServer, ESXi, or Proxmox as my hypervisor, until...


    Based on a suggestion from someone on another forum I decided to give OMV a try. I am really glad I did! OMV gives me back the AIO goal I started with which saves me on physical space, heat generation, noise, utility bills, and UPS capacity with the network flexibility I need. All with a nice UI since I am a very novice Linux user. It fits all of my needs and I just really like everything about it so far.


    I built a server:

    • Supermicro X8DT6-F motherboard (onboard LSI 2008 flashed with IT firmware)
    • Supermicro LGA1366 heatsinks and fans
    • 2 x Intel Xeon X5670 CPU 6c/12t @ 2.933 GHz
    • 12 x 8GB (96GB) DDR3-1333 ECC memory
    • IBM BR10i RAID controller (flashed with IT firmware)
    • 12 x HGST 2TB 7200RPM SATA2 enterprise NAS drives (I have 32 of them total)
    • 2 x Intel 710 100GB SATA2 SSD
    • EVGA G2-850 power supply
    • Rosewill 4U server case
    • 2 x Rosewill 4in3 hotplug SATA cages
    • 2 x Cyberpower 1500 UPS

    *Items in blue were saved from the e-waste bins at work.


    Planned uses:

    • MythTV or similar PVR backend and any support services needed
    • Streaming recorded and live content to up to 6 set top boxes, most likely Raspberry Pis running Kodi or something similar
    • Home automation/security camera DVR
    • Minecraft server and Mumble/Teamspeak or similar servers for the kids
    • Shared storage and backup target for 5 Windows desktops/laptops
    • 2 or 3 general purpose Windows/Linux guests


    The Rosewill case and drive cages will be replaced ASAP. I do not like them at all.


    In just a few hours over the past 2 days I have installed and updated OMV with VirtualBox and ZFS plugins, provisioned a RAIDZ2 pool with my 2TB drives, and installed Windows and Linux guests for testing. I've hit a couple bumps along the way but a quick search of the forums provided solutions to almost everything so far.


    My initial OMV install is to a Sandisk USB flash drive, but I will probably wipe everything and start over with a new install on the Intel SSDs connected to the onboard Intel SATA2 ports. I'd like to figure out how to mirror them either with the Intel RAID functionality if Debian supports it or software RAID. I will probably also redo the ZFS pools and make a new one with 2 x 6 drive RAIDZ2 vdevs. Or, I may just do a pool of all mirrored vdevs.
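    The pool layouts mentioned above can be sketched with `zpool create`. This is only a sketch — the pool name and device names are hypothetical; in practice you would use `/dev/disk/by-id/` paths so the pool survives device reordering.

```shell
# Sketch only: "tank" and the sdX names are placeholders.
# Prefer /dev/disk/by-id/... paths on a real system.

# Two 6-drive RAIDZ2 vdevs in one pool (each vdev tolerates 2 failed drives):
zpool create tank \
  raidz2 sda sdb sdc sdd sde sdf \
  raidz2 sdg sdh sdi sdj sdk sdl

# Alternative: a pool of mirrored vdevs (better IOPS and faster resilver,
# but only 50% usable capacity):
# zpool create tank mirror sda sdb mirror sdc sdd mirror sde sdf

zpool status tank   # verify the vdev layout
```

    The mirrored-vdev layout trades capacity for rebuild speed and random I/O, which matters more for VM storage than for bulk media.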


    I'm really having fun with this!

  • And what's the exact location of these bins?


    You have so many (free) toys to play with - I am jealous :)


    Have fun!


    I've been testing various combinations of software since the middle of March to get where I am right now so they were hauled away a couple months ago, sorry. I did share some of my bounty with two co-workers though. CPU and RAM mostly. I talked them out of these hot and loud data center 7200 RPM drives. :whistling:

  • Who the hell throws away those kind of things? Dude, that thing is an overkill of a home server!!!! Don't you want to share the secret location of this "e-waste bin of happiness"? You, sir, have a lot of fun with the new toys. You just got an awesome set of parts.


    PS: Get rid of the USB install, you just got 2 Intel SSDs. Hell, you can put them in RAID 0 and install OMV there just for the sake of it!



  • Wow.


    I can tell you that my work's bin has things like 2.5" 20GB PATA laptop drives... not 2TB HGST SATA drives! I agree with the others here... you need to start an eBay shop!


    On a slightly serious note, as a dumpster diver myself, those drives will probably have had a hard life, so I would recommend that you don't RAID0 anything. RAID1 at the least, but preferably RAID6 everything, because (IMHO) you'll be swapping out drives for a while until you've sifted the worst out of your collection.


    I agree with Eryan:

    Get rid of the USB install


    If you're planning to use this as a Myth backend, be aware of the old Mythweb and the new Myth WebFrontend that they have - just in case it conflicts with the OMV web GUI - and the myriad other web GUIs from the OMV plugins :)


    Personally, I use another device as my Myth backend, then rsync the data from that to my OMV NAS periodically, but that's just because of the way I've built my systems over time.

  • PS: Get rid of the USB install, you just got 2 Intel SSDs. Hell, you can put them in RAID 0 and install OMV there just for the sake of it!


    USB install already gone. I reinstalled on the SSD last night, but I still need to mirror them. I went ahead and installed the flashmemory plugin again just to cut down on the writes. The only issue now is finding a place to physically mount them, as all of my actual drive bays are full.
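    For the mirroring, Linux software RAID (mdadm) is one option. This is a rough sketch only — the device names are hypothetical (check with `lsblk`), and converting an already-installed OS to RAID1 after the fact involves more steps (copying the system, fixing fstab and the bootloader) than shown here:

```shell
# Rough sketch: assumes the OS is on /dev/sda and /dev/sdb is the empty
# second SSD. Device names are hypothetical - verify with lsblk first.

# Create a degraded RAID1 with one member missing, using the empty SSD:
mdadm --create /dev/md0 --level=1 --raid-devices=2 missing /dev/sdb1

# After migrating the system onto md0 and booting from it,
# add the original disk's partition to complete the mirror:
mdadm --manage /dev/md0 --add /dev/sda1

cat /proc/mdstat   # watch the resync progress
```

    The "missing" member trick is what makes an in-place migration possible: the array runs degraded while you copy the OS over, then heals once the original disk is added.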


    On a slightly serious note, as a dumpster diver myself, those drives will have probably had a hard life, so I would recommend that you don't RAID0 anything, RAID1 at the least, but preferably RAID6 everything because (IMHO), you'll be swapping out drives for a while until you've sifted the worst out of your collection.


    Roger that. SMART data says they have between 30k and 40k hours of use on average. I know they saw very little actual read/write use, though I doubt that has much to do with their longevity going forward for me. I plan to use ZFS and right now have them split into two 6-drive RAIDZ2s. I've cycled most of them through the system running SMART short and long tests and haven't encountered any bad ones yet. Before I put any real data on them I will run badblocks.
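    That burn-in routine looks roughly like this per drive. `/dev/sdX` is a placeholder, and note that `badblocks -w` is destructive — it erases the disk, so it must only be run before the drive holds data:

```shell
# Per-drive burn-in sketch; replace /dev/sdX with the actual device.

smartctl -t long /dev/sdX     # queue an extended SMART self-test
smartctl -a /dev/sdX          # inspect results: look at Reallocated_Sector_Ct,
                              # Current_Pending_Sector, and Power_On_Hours

# DESTRUCTIVE: four-pass write/read pattern test over the whole disk.
# -s shows progress, -v is verbose, -b 4096 matches the physical sector size.
badblocks -wsv -b 4096 /dev/sdX

smartctl -a /dev/sdX          # re-check the counters after the stress run
```

    Rising reallocated or pending sector counts between the two `smartctl -a` runs are the usual sign a salvaged drive should be culled.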
