File system missing

  • My four HDDs were connected to a faulty serial SATA power cable. Everything was working fine until I restarted today and the power to two of the HDDs failed; when I first checked the web interface it showed only two HDDs, and later none.


    I fixed the power issue, but once the system came back up there are no RAIDs to display in RAID Management, and the File Systems tab shows n/a.


    Following another thread, I tried the commands below (which regenerate OMV's monit configuration), but they didn't help either.


    Code
    # rm -f /etc/monit/conf.d/*
    # omv-mkconf monit
    # service monit restart



    This is the output of cat /etc/fstab:



    This is the output of cat /etc/mtab:



    Code
    cat: /etc/mtab: No such file or directory


    This is the output of blkid:



    Code
    /dev/sdc: UUID="12505788-fecf-a058-19b9-265d0b406889" UUID_SUB="68e2d1d1-dde4-133d-656c-7478577d409e" LABEL="JsMediaVault:MediaRaid" TYPE="linux_raid_member"
    /dev/sda: UUID="12505788-fecf-a058-19b9-265d0b406889" UUID_SUB="10cfbb8a-a494-4d3c-29c2-838c15cdd297" LABEL="JsMediaVault:MediaRaid" TYPE="linux_raid_member"
    /dev/sdd: UUID="12505788-fecf-a058-19b9-265d0b406889" UUID_SUB="cbfe5fc0-6a29-8df0-f7f3-2a6bbba7193b" LABEL="JsMediaVault:MediaRaid" TYPE="linux_raid_member"
    /dev/sde1: UUID="e6c1d965-45a7-4d78-82f9-1892174952cc" TYPE="ext4" PARTUUID="9e89de26-01"
    /dev/sde5: UUID="4d396a6c-e23a-4ce8-b319-3d007a83fa35" TYPE="swap" PARTUUID="9e89de26-05"
    /dev/sdb: UUID="12505788-fecf-a058-19b9-265d0b406889" UUID_SUB="30e79b55-c5de-d8a3-5213-6ddce7bb988d" LABEL="JsMediaVault:MediaRaid" TYPE="linux_raid_member"
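

    All four drives still report the same array UUID and TYPE="linux_raid_member", so the on-disk RAID metadata looks intact. A way to double-check what the kernel currently sees (a sketch, assuming mdadm is installed and the members really are sda through sdd):


    Code
    # Show any arrays the kernel has already assembled:
    cat /proc/mdstat
    # Print each member's on-disk metadata (array state, event counters):
    mdadm --examine /dev/sd[abcd]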


    The data is really important. :(

    • Official Post

    The first thing I would do is try to determine if the NAS itself is still healthy. (Otherwise, testing drives is pointless.)
    The PC you're using as a NAS may have other unknown problems created by the power supply.


    You're going to need to burn a few diagnostic tools - live CDs. Go here, do a quick read, and make a choice: 5 Rescue CDs. Since it's a bit more geared toward what happened to you, I lean toward "The Ultimate Boot CD", but there are plenty out there and you can burn more than one.


    If a rescue CD indicates there are problems in the NAS (it fails memory tests, CPU tests, or others - be thorough):
    I'd install one of the drives in another PC temporarily. (You just need a SATA and a power connection; it doesn't have to be permanently installed. Take the side of a case off, disconnect the internal drive, and hook it up using the existing cables.)


    Boot up on a live GParted CD (you can get a live CD here -> Gparted).
    (Alternately, you could go to the drive manufacturer's web site. Some have live CD diagnostic tools.)


    If the partitions still exist, you should see them. If they don't, I'd check my cable connections to be sure, but your drives may be toast.
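

    If you'd rather do a quick drive-health check from a shell first, smartmontools can query each drive's own SMART data. A sketch, assuming the package is installed and the drive shows up as /dev/sda (adjust the device name):


    Code
    # Quick pass/fail health verdict from the drive's SMART data:
    smartctl -H /dev/sda
    # Full attribute table - watch the reallocated/pending sector counts:
    smartctl -a /dev/sda
    # Optionally start the drive's built-in long self-test (takes hours):
    smartctl -t long /dev/sda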


    In any case, you need to prepare yourself for the idea that your drives may be dead, and your NAS PC may be mortally wounded as well.
    _______________________________


    Lastly, as you're now aware, RAID is not backup.


    Please read and heed (in the future) the very next line of my signature below.

  • I will check up on it with the Live CD.


    I tried a few commands; these were the outputs:


    fdisk -l:


    mount:





    • Official Post

    Even when trying to create another RAID, the HDDs are not shown.


    I had turned on the Flash Memory plugin briefly before the restart.


    And no, I really don't have a backup.

    I looked at the screen captures from this post. The fact that the hard drives show up at all, under Physical drives, is promising. However, you'd have to wipe the drives, in Physical drives, before creating a new file system or (heaven forbid) another array. If you do that (wipe the disks), any chance of data recovery is out the window.


    First, test your NAS "extensively". Again, if something is wrong with it, looking at the hard drives is a waste of time. (And don't dismiss the idea arbitrarily, thinking that your NAS PC is OK. Some of the voltages that go to your drives also go to the MOBO.)


    If the NAS tests OK, and you want to take a shot at reconstituting the array, this is a good guide to do it. -> Linux RAID Recovery
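

    In rough outline - and only after the hardware checks out - the sequence that guide walks through looks something like this (device names are assumptions based on your blkid output above):


    Code
    # 1. Compare the event counters of all members; counts that are close
    #    mean a forced assembly has a good chance of succeeding:
    mdadm --examine /dev/sd[abcd] | grep -i events
    # 2. Try a normal assembly first; add --force only if it refuses:
    mdadm --assemble --scan --verbose
    mdadm --assemble --force /dev/md127 /dev/sd[abcd]
    # 3. Do NOT use --create on the old members unless all else fails -
    #    it rewrites the metadata and can destroy the data layout.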
    _______________________________________________________


    If you manage to get your array operating again, give serious thought to getting your data off of it and setting up real backup.

  • I was away from home with only remote access to my home PC, so I kept trying commands from other similar threads,


    fdisk -l :


    which led me to more searching, and finally to issuing this:


    Code
    mdadm --assemble /dev/md127 /dev/sd[abcd] --verbose --force


    Well, this brought the RAID up, with output showing that only two HDDs are functioning. Anyway, I'm backing up the data and will try another fresh install, or maybe another distro. It seems the help here is usually sparse or selective.
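

    Before trusting it for the copy, it's worth confirming how degraded the array actually is. A minimal check, assuming it stays assembled as /dev/md127:


    Code
    # Array state, failed/missing members, and any rebuild progress:
    mdadm --detail /dev/md127
    cat /proc/mdstat
    # Mount read-only while copying the data off, to avoid further writes:
    mount -o ro /dev/md127 /mnt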

  • Thank you for your help.

    • Official Post

    Thank you for your help.


    I've been away myself for the last few days. (Unfortunately, there was no Internet where I was. Yeah, it's "that" remote.)


    BTW: Support for RAID issues is sparse "everywhere". When an array goes south, the news is usually not good and troubleshooting tends to go down the "RAID rabbit hole". Recovery, if it's possible at all, must be done in a specific sequence that can easily be botched. Few, if any, want to touch it or be the bearer of bad news.


    In any case, I hope you got your data off of the array without loss.
    _____________________________________________________________


    Seriously, give some thought to a new approach that includes full backup. If you look at my signature (below), you'll see how I do a complete server backup, and note that it doesn't have to be expensive. I'm using a Raspberry Pi (with OMV) and rsync'ing the main server's data folders to a 4TB USB drive on the R-PI. You could just as easily retask an old PC to do the same thing by adding a big drive to it, or two of the drives from your RAID array.
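

    As a sketch, that job boils down to something like the line below; the source path and the "backuppi" hostname are placeholders, not anything OMV creates for you:


    Code
    # Mirror the data share to the backup box; -a preserves permissions and
    # timestamps, --delete keeps the copy an exact mirror of the source:
    rsync -a --delete /srv/data/ root@backuppi:/mnt/usb-backup/data/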


    As it seems, many folks want a single mount point which is made up of multiple drives. While RAID does that, there are other methods of doing it as well. LVM2 and UnionFS will do the same thing, without virtually eliminating the available tools for recovering data.
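

    For illustration only, pooling two empty drives into a single mount point with LVM2 goes roughly like this (device names are assumptions, and pvcreate destroys whatever is on them):


    Code
    # Turn two empty drives into LVM physical volumes:
    pvcreate /dev/sdb /dev/sdc
    # Pool them into one volume group, then carve out one big logical volume:
    vgcreate pool /dev/sdb /dev/sdc
    lvcreate -l 100%FREE -n data pool
    # Put a file system on it and mount it as a single mount point:
    mkfs.ext4 /dev/pool/data
    mount /dev/pool/data /srv/data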


    (Frankly, I don't understand why people want to consolidate drives into a single mount point. While it may help somewhat in administration, things can get really complicated when there's a failure. When a folder is shared to the network, network clients have no idea which of the server's physical drives the share is on. In practical terms, it doesn't matter.)
    ______________________________________________________________


    So what was the final outcome of the drives - the two that didn't respond? Are they toast or did a wipe and format bring them back?
