Hi everyone!
On Friday evening we had a power outage and everything went down for hours. Since my colleague had saved an important file to the server that needed to be sent to the client, I took one of the RAID disks out and accessed it via a docking station at home to finalize the report and send it on time.
Now, when I came back to the office, a colleague had already switched the NAS on before I could put the disk back (not sure whether this contributed to my problem).
The disk is back in the NAS, but the NAS doesn't seem to recognize it and reports the RAID1 as "clean / degraded".
When I click on "Details", this is what appears:
Version : 1.2
Creation Time : Fri Dec 21 20:09:10 2012
Raid Level : raid1
Array Size : 976761424 (931.51 GiB 1000.20 GB)
Used Dev Size : 976761424 (931.51 GiB 1000.20 GB)
Raid Devices : 2
Total Devices : 1
Persistence : Superblock is persistent
Update Time : Thu Mar 13 10:10:01 2014
State : clean, degraded
Active Devices : 1
Working Devices : 1
Failed Devices : 0
Spare Devices : 0
Name : openmediavault:raid1
UUID : 85c070e3:0215692c:f667e7dc:78abd5b2
Events : 1712
Number Major Minor RaidDevice State
0 0 0 0 removed
1 8 32 1 active sync /dev/sdc
So from what I can see, the array is missing /dev/sdb, which is the disk that was removed.
When I click on "Recover", the disk doesn't show up.
How do I get OMV to recognize the second disk for the RAID1 without having to delete everything on the disk? Is there any chance?
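From reading around, I believe the command-line equivalent would be something like the following; I'm assuming here that the array is /dev/md0 and the returned disk shows up as /dev/sdb (device names may differ after a reboot), so please correct me if this is wrong:

```shell
# Inspect the superblock on the returned disk; the Array UUID shown here
# should match the array's UUID (85c070e3:0215692c:f667e7dc:78abd5b2)
mdadm --examine /dev/sdb

# If the superblock is intact, try a re-add first (avoids a full resync
# if the write-intent bitmap allows it)
mdadm /dev/md0 --re-add /dev/sdb

# If re-add is refused, a plain add rebuilds the mirror from scratch
mdadm /dev/md0 --add /dev/sdb

# Watch the rebuild progress
cat /proc/mdstat
```

Is this the right approach, or is there a way to do it from the OMV web interface?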
My setup is the following:
HP Microserver N36L
Disk1: 250GB (OMV installation only)
Disk2: 1TB (Raid1)
Disk3: 1TB (Raid1)
Disk4: 1TB (backup disk for the RAID1, mirrored via rsync daily, or at least 2-3 times a week).
For additional backup I have two external 1TB HDDs which mirror the RAID1 in turns, always on Friday.