Hi,
Today I added two drives as a RAID 1 array in my OMV server. Unfortunately, when I moved my computer, one of the drives got unplugged, so the array was degraded when I booted again. I noticed the missing drive, plugged it back in, and rebooted. The drive now shows up under physical drives (/dev/sdc, the other disk being /dev/sdd), but on the RAID management page the devices column only lists /dev/sdd, while the state column alternates between "active, degraded" and "clean, degraded".
No drive is available to select when I try the recover option (nor in the create option, for that matter). Rebooting doesn't change anything.
cat /proc/mdstat shows this:
Personalities : [raid1]
md0 : active raid1 sdd[0]
2930135360 blocks super 1.2 [2/1] [U_]
unused devices: <none>
Should I wipe the sdc disk and re-add it? This is the first time I've used an mdadm array. I previously used an Nvidia fakeraid array, where removing the disk and adding it back was the way to go.
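In case it helps frame the question, here's the sequence I'd tentatively try from the command line, based on what I've read about mdadm (device names taken from my setup above; the --re-add vs. wipe-and-add choice is exactly what I'm unsure about, so please correct me if this is wrong):

```shell
# Check whether sdc still carries md superblock metadata from the array
mdadm --examine /dev/sdc

# If the superblock is intact, try re-adding the member as-is
mdadm /dev/md0 --re-add /dev/sdc

# Only if --re-add fails: wipe the stale metadata and add it as a fresh
# member, which triggers a full resync
mdadm --zero-superblock /dev/sdc
mdadm /dev/md0 --add /dev/sdc

# Watch the rebuild progress
cat /proc/mdstat
```
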
Thx