Posts by Majorpayne

    I seem to be running out of space, and I was hoping there's an easy way to look at the system drive and tell which files are taking up space... I'm worried that through my tinkering I saved a couple of large files to my C drive.

    Currently sitting at 187 GB used out of 256 GB.
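    One way to answer this from the command line (the depth and path are just examples; `ncdu` is an optional extra package):

```shell
# Show the largest directories on the root filesystem.
# -x stays on one filesystem; run as root so protected dirs are counted.
du -xh --max-depth=2 / 2>/dev/null | sort -rh | head -20

# Or browse interactively (may need installing first, e.g. apt install ncdu):
# ncdu -x /
```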

    The user and group "911" is a Deluge-created system user and group. If you look at Deluge's config directory, you'll see it in the permissions. Similarly, if you look at Sonarr's config folder, you'll see the "users" group and the "dockeruser" user, with their PGIDs and PUIDs. (I now understand why Sonarr crashed without user and group IDs assigned.)
    Sorry for the confusing start. Once committed, I should have configured it up instead of trying to do it from memory. As they say, the devil is in the details. In any case, all's well that ends well.
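    For anyone following along: the numeric owner and group on each container's config directory show the PUID/PGID it actually runs as. The paths below are examples from a typical OMV Docker setup, not necessarily yours:

```shell
# Numeric IDs (no name lookup) reveal the PUID:PGID a container writes as.
stat -c '%u:%g %n' /srv/appdata/deluge/config    # example path
ls -ln /srv/appdata/sonarr/config | head -5      # -n keeps IDs numeric
```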

    I have to say, I learned a few things in this process. I used to wonder about these packages, what they did, etc., but without a reason to do it, I wouldn't have loaded them up. Now, with a peek under the hood, I'm giving thought to configuring up Sonarr and Deluge on a dedicated ARM board.


    Yeah, I just like not seeing that error message all the time. I appreciate all the help, as it helps me learn more about this. I used to run Linux for work back in the day, but that was a long time ago and I've lost at least 90% of the knowledge.

    I have it mostly working now. Sonarr is speaking with Deluge and Jackett. Deluge is saving the file to the proper place. Sonarr is not even attempting to check for the downloaded show, though. I'm very close now.

    EDIT: OK, I found out that a permission was missing, and it made the group "911", which I assume means "help, something is wrong"?

    I changed it and now it's able to move/extract the file.
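    For the record, the fix amounted to something like this (the path, user, and group are examples from my setup; adjust to whatever PUID/PGID your containers actually use):

```shell
# Hand the downloads tree to the user/group the containers run as,
# then make sure owner and group can read and write everything under it.
sudo chown -R dockeruser:users /srv/dev-disk/Downloads   # example path/names
sudo chmod -R ug+rw /srv/dev-disk/Downloads
```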

    What container do you use for storage? Is it safe to assume that you're using a Docker container for that?

    Every once in a while Sonarr doesn't move a show over, and I can only assume it's because of this error:
    You are running an old and unsupported version of Mono. Please upgrade Mono for improved stability.

    The last time I attempted to update Mono, I broke update management to the point that I wasn't able to get new updates and had to reinstall OMV.

    I've done option 2 before. Symlinks are great for getting past program limitations and other weirdness. Symlinks can also be used in a manner similar to mergerfs: to spread storage over multiple drives while providing the appearance and function of storing all data on one drive.
    With symlinks and a remote mount, one can transparently export/import data from/to remote servers. Between the two, there's lots of flexibility for moving data around.
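    A minimal sketch of the mergerfs-like trick (the drive paths are hypothetical):

```shell
# Movies physically live on disk2, but appear under disk1's media tree.
mkdir -p /srv/disk1/Media /srv/disk2/Movies
ln -s /srv/disk2/Movies /srv/disk1/Media/Movies
ls -l /srv/disk1/Media   # shows Movies -> /srv/disk2/Movies
```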

    Do you actually have all 9TB backed up?

    No, I do not have all 9 TB backed up. Only about 5 TB currently.

    Option 1 isn't possible, as all 9 TB of files are sitting in /srv/disk name/Fileserver. I will try option 2.

    OK, you have a folder called "Fileserver" in the path. Apply the same permissions that were applied to "Downloads" to "Fileserver". (Note that the reset perms plugin won't help you with this.)
    I've experienced the same sort of problem with permissions when I shared a sub-directory under the root folder of the data drive. If a shared folder is not at the root of the drive, the entire path to the shared folder needs to have compatible permissions. Change "Fileserver" to "Others", "Read / Write".
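    On the command line (or via WinSCP's permissions dialog), that change looks roughly like this. The path is an example, and note that directories also need execute for "Others" so they can be traversed:

```shell
chmod -R o+rw /srv/disk/Fileserver                       # Others: Read / Write
find /srv/disk/Fileserver -type d -exec chmod o+x {} +   # allow traversal
```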

    This is why I asked if you have WinSCP installed. It's far easier to look at standard, unshared folders and change their permissions, with WinSCP, versus doing the same on the command line.

    The above is an attempt to resolve the "denied" error dialog you provided. Since I don't use this particular Docker, I don't know how the container interacts with the host.

    My apologies, I did not see the question about WinSCP. Yes, I do have it installed.


    Import failed, path does not exist or is not accessible by Sonarr: /srv/9a94fceb-ff72-4c70-9562-591fcc600b9e/Fileserver/Downloads/DCs.Legends.of.Tomorrow.S03E10.iNTERNAL.720p.WEB.x264-BAMBOOZLE
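    Two quick checks for that error: does the path exist on the host, and is it visible inside the container? The container name "sonarr" and an identical in-container path are assumptions here; your volume mapping may differ:

```shell
# Path taken from the error message above.
P="/srv/9a94fceb-ff72-4c70-9562-591fcc600b9e/Fileserver/Downloads"
ls -ld "$P"                      # exists on the host? owner and permissions?
docker exec sonarr ls -ld "$P"   # same path visible from inside the container?
```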