Hi. Newbie here.
I have a RAID 5 setup on my OMV NAS, version 1.12 (Kralizec). After moving halfway around the world and not touching the NAS for about three months, the RAID 5 now reports "clean, degraded", and the detail view says one of the five disks (/dev/sdf) has been "removed". The disk is still in the machine, though, and it IS recognized under S.M.A.R.T., complete with temperature readings.
I tried swapping that "removed" disk with another one in the hot-swap bay (switched /dev/sdf with /dev/sdc). The RAID then sees the two drives with their labels reversed, but it still reports /dev/sdf as "removed"; i.e., the drive that was sdc is now sdf, and that one now shows as removed.
When I click "Recover" under the RAID Management tab, no device appears for me to select.
Does anyone know the best way to restore the RAID for someone who is VERY illiterate with the command line?
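From what I've been able to google, the web UI's "Recover" button may not offer the disk if mdadm still considers it part of the array, and the fix may have to happen on the command line. This is only my guess at the procedure, assuming the array is /dev/md0 and the dropped member really is /dev/sdf (both names are guesses from my setup). Could someone confirm before I run anything?

```shell
# Read-only checks first, to confirm which member is missing:
cat /proc/mdstat                # overall array state
mdadm --detail /dev/md0         # per-slot detail; shows the "removed" slot

# If the disk itself is healthy, try re-adding it. --re-add reuses the
# existing metadata on the disk, so it may only need a short resync:
mdadm --manage /dev/md0 --re-add /dev/sdf

# If --re-add is refused, a plain --add triggers a full rebuild instead:
mdadm --manage /dev/md0 --add /dev/sdf

# Watch the rebuild progress:
watch cat /proc/mdstat
```

Is that roughly right, or is there a safer way through the OMV web interface?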
A separate problem: I forgot my CLI login name and password. Any tips on how to recover them without doing a total reinstall?
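In case it helps anyone answer: from searching, I believe the usual Debian trick (OMV 1.x is Debian-based, if I understand correctly) is to boot into a root shell from GRUB and reset the password there. Does this apply to OMV, or is there an easier route?

```shell
# Rough sketch of the standard Debian password reset, as I understand it.
# At the GRUB menu, press 'e' on the boot entry and append to the linux line:
#     init=/bin/bash
# then boot with Ctrl-X (or F10). At the resulting root shell:
mount -o remount,rw /    # the root filesystem is read-only at this point
passwd root              # set a new root password
sync
# Then power-cycle and log in with the new password.
```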