any luck fbeye ?
Naw. I'm no coder, just one who pays coders to work for me, so I know nothing about that error, what it means, or where to even begin. For now I am good with just manually removing my stalled downloads.
So far no luck here either. It just does not seem to do anything.
Darn, you were my last hope Bern lol. Might have to bung ryecoaaron some money to do it lol! 2 minutes with him and it would be working.
Don't know about that. My poking around so far is leading me to suspect potential python/pyarr version problems too. When I popped it on an Ubuntu 20 LXC (basically the Ubuntu equivalent of Debian 11) I got python-related errors pointing to pyarr and its JSON parsing. (I generally use Ubuntu on the LXCs as it goes on with a little less messing around than Debian.)
Not sure if the scripting is python 2 based while Ubuntu is installing python 3 and the matching pyarr, or if there is something else going on that I am missing.
Gonna have one more kick at it with a debian 10 or 11.
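One quick way to settle the python 2 vs 3 question is to ask the interpreter itself, run with whichever command the script actually uses (this is just a diagnostic sketch, not part of the cleaner script):

```python
import sys
import importlib.util

# Which interpreter version is actually running
print(sys.version.split()[0])

# Is pyarr importable for this interpreter? (False means it is missing here)
print(importlib.util.find_spec('pyarr') is not None)
```

If the first line starts with 2, or the second line is False, the script is being launched under a different python than the one pyarr was installed into.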
Ok so, I fired up a Debian 11 LXC. It's pinging the host OK, and running the scripts returns no errors, but they still seem to do nothing.
There is no access activity in the *arr logs. I tried echoing the parsed IDs that the python scripts are supposed to return and am not seeing anything, which tells me that the scripts are not actually pulling anything from the containers.
Once again though, if this is an issue with a docker versus a native install, I don't really know how to address that. If it is because of a module missing from the container, you could add it, but the next restart would drop the module unless the base image is rebuilt from scratch. A try with a native install of radarr and these scripts on another system or a VM might work and answer the question of the missing IDs, but building up another system for testing is something I don't have time to get into right now.
While this would be nice for automation, I don't have the ability (without some time spent learning) or the desire to chase this further by rebuilding docker base images.
***edit***
The deb lxc was not giving the pyarr errors that the ubuntu lxc was giving.
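The "echoing the parsed IDs" check above can also be done offline, against a sample of what the queue endpoint returns, to separate parsing problems from connectivity problems. This is a minimal sketch assuming the Sonarr/Radarr v3 queue shape; the sample records are made up:

```python
import json

def stalled_ids(queue):
    """Return queue ids of records flagged as stalled (same test the cleaner script uses)."""
    ids = []
    for item in queue.get('records', []):
        if (item.get('status') == 'warning'
                and item.get('errorMessage') == 'The download is stalled with no connections'):
            ids.append(item['id'])
    return ids

# Trimmed-down sample of a v3 /api/v3/queue response, only the fields used here
sample = json.loads('''{
  "totalRecords": 2,
  "records": [
    {"id": 101, "title": "Show A", "status": "warning",
     "errorMessage": "The download is stalled with no connections"},
    {"id": 102, "title": "Show B", "status": "downloading", "errorMessage": ""}
  ]
}''')

print(stalled_ids(sample))  # -> [101]
```

If this prints the IDs but the real script prints nothing, the problem is the request never returning records, not the parsing.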
No worries Bern, thank you for trying. I would not be any good to help as my coding skills are zero. I'll have a look about and see if there is something like this one out there.
I've found one that works! Enjoy peeps
Interesting. I'll have to give it a whirl. Maybe see if it can be extended to do the other servarr apps too.
Works great BernH. I just made a folder and ran the two commands after filling in the API key etc... I've set it to every 30 minutes though.
Looking at the Dockerfile, it looks like the env variables are set for only sonarr and radarr, but theoretically we should be able to add variables for the other apps. If their JSON structures are the same (hopefully they are), extending it to work with the others should not be a problem.
Since it builds a docker container to run this in, there should be no worries about it polluting the native python install for OMV.
Yea I don't see why not.
Indeed, it's its own container.
Hey BernH!
So I have been messing around with this, changing it, for me, from Radarr to Readarr... It seems to be a little more involved, as if something is more embedded than just the cleaner.py file.
I modified mine as such:
# Simple Sonarr and Readarr script created by Matt (MattDGTL) Pomales to clean out stalled downloads.
# Couldn't find a python script to do this job so I figured why not give it a try.

import os
import asyncio
import logging
import requests
from requests.exceptions import RequestException
import json

# Set up logging
logging.basicConfig(
    format='%(asctime)s [%(levelname)s]: %(message)s',
    level=logging.INFO,
    handlers=[logging.StreamHandler()]
)

# Sonarr and Readarr API endpoints
SONARR_API_URL = (os.environ['SONARR_URL']) + "/api/v3"
READARR_API_URL = (os.environ['READARR_URL']) + "/api/v3"

# API key for Sonarr and Readarr
SONARR_API_KEY = (os.environ['SONARR_API_KEY'])
READARR_API_KEY = (os.environ['READARR_API_KEY'])

# Timeout for API requests in seconds
API_TIMEOUT = int(os.environ['API_TIMEOUT'])  # 10 minutes

# Function to make API requests with error handling
async def make_api_request(url, api_key, params=None):
    try:
        headers = {'X-Api-Key': api_key}
        response = await asyncio.get_event_loop().run_in_executor(None, lambda: requests.get(url, params=params, headers=headers))
        response.raise_for_status()
        return response.json()
    except RequestException as e:
        logging.error(f'Error making API request to {url}: {e}')
        return None
    except ValueError as e:
        logging.error(f'Error parsing JSON response from {url}: {e}')
        return None

# Function to make API delete with error handling
async def make_api_delete(url, api_key, params=None):
    try:
        headers = {'X-Api-Key': api_key}
        response = await asyncio.get_event_loop().run_in_executor(None, lambda: requests.delete(url, params=params, headers=headers))
        response.raise_for_status()
        return response.json()
    except RequestException as e:
        logging.error(f'Error making API request to {url}: {e}')
        return None
    except ValueError as e:
        logging.error(f'Error parsing JSON response from {url}: {e}')
        return None

# Function to remove stalled Sonarr downloads
async def remove_stalled_sonarr_downloads():
    logging.info('Checking Sonarr queue...')
    sonarr_url = f'{SONARR_API_URL}/queue'
    sonarr_queue = await make_api_request(sonarr_url, SONARR_API_KEY, {'page': '1', 'pageSize': await count_records(SONARR_API_URL, SONARR_API_KEY)})
    if sonarr_queue is not None and 'records' in sonarr_queue:
        logging.info('Processing Sonarr queue...')
        for item in sonarr_queue['records']:
            if 'title' in item and 'status' in item and 'trackedDownloadStatus' in item:
                logging.info(f'Checking the status of {item["title"]}')
                if item['status'] == 'warning' and item['errorMessage'] == 'The download is stalled with no connections':
                    logging.info(f'Removing stalled Sonarr download: {item["title"]}')
                    await make_api_delete(f'{SONARR_API_URL}/queue/{item["id"]}', SONARR_API_KEY, {'removeFromClient': 'true', 'blocklist': 'true'})
            else:
                logging.warning('Skipping item in Sonarr queue due to missing or invalid keys')
    else:
        logging.warning('Sonarr queue is None or missing "records" key')

# Function to remove stalled Readarr downloads
async def remove_stalled_readarr_downloads():
    logging.info('Checking readarr queue...')
    readarr_url = f'{READARR_API_URL}/queue'
    readarr_queue = await make_api_request(radarr_url, READARR_API_KEY, {'page': '1', 'pageSize': await count_records(READARR_API_URL, READARR_API_KEY)})
    if readarr_queue is not None and 'records' in readarr_queue:
        logging.info('Processing Readarr queue...')
        for item in readarr_queue['records']:
            if 'title' in item and 'status' in item and 'trackedDownloadStatus' in item:
                logging.info(f'Checking the status of {item["title"]}')
                if item['status'] == 'warning' and item['errorMessage'] == 'The download is stalled with no connections':
                    logging.info(f'Removing stalled Readarr download: {item["title"]}')
                    await make_api_delete(f'{READARR_API_URL}/queue/{item["id"]}', READARR_API_KEY, {'removeFromClient': 'true', 'blocklist': 'true'})
            else:
                logging.warning('Skipping item in Readarr queue due to missing or invalid keys')
    else:
        logging.warning('Readarr queue is None or missing "records" key')

# Make a request to view and count items in queue and return the number.
async def count_records(API_URL, API_Key):
    the_url = f'{API_URL}/queue'
    the_queue = await make_api_request(the_url, API_Key)
    if the_queue is not None and 'records' in the_queue:
        return the_queue['totalRecords']

# Main function
async def main():
    while True:
        logging.info('Running media-tools script')
        await remove_stalled_sonarr_downloads()
        await remove_stalled_readarr_downloads()
        logging.info('Finished running media-tools script. Sleeping for 10 minutes.')
        await asyncio.sleep(API_TIMEOUT)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
But when starting the container, it seems Sonarr works but the Radarr-to-Readarr modified part does not, and I get this error:
[INFO]: Checking readarr queue...
Traceback (most recent call last):
File "/app/cleaner.py", line 111, in <module>
loop.run_until_complete(main())
File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
return future.result()
File "/app/cleaner.py", line 105, in main
await remove_stalled_readarr_downloads()
File "/app/cleaner.py", line 79, in remove_stalled_readarr_downloads
readarr_queue = await make_api_request(radarr_url, READARR_API_KEY, {'page': '1', 'pageSize': await count_records(READARR_API_URL,READARR_API_KEY)})
NameError: name 'radarr_url' is not defined
Like it wants to find radarr_url... Where is it looking for that?
Holy F Balls! Just modified it... Now it says:
[INFO]: Checking readarr queue...
2023-09-20 20:47:05,469 [ERROR]: Error making API request to http://192.168.5.44:8787/api/v3/queue: 404 Client Error: Not Found for url: http://192.168.5.44:8787/api/v3/queue
2023-09-20 20:47:05,471 [ERROR]: Error making API request to http://192.168.5.44:8787/api/v3/queue: 404 Client Error: Not Found for url: http://192.168.5.44:8787/api/v3/queue?page=1
2023-09-20 20:47:05,471 [WARNING]: Readarr queue is None or missing "records" key
2023-09-20 20:47:05,471 [INFO]: Finished running media-tools script. Sleeping for 10 minutes.
I copied and pasted the key, and it is indeed the correct location.
Are you modifying Dockerfile at all?
It's saying the IP address is wrong, or the key.
No, the Dockerfile is run only once to install it to a container.
I run it like this
docker run -d --name media-cleaner -e SONARR_API_KEY='xxx' -e RADARR_API_KEY='xxx' -e SONARR_URL='http://10.0.0.222:8989' -e RADARR_URL='http://10.0.0.222:7878' -e API_TIMEOUT='600' media-cleaner
What command did you run it with?
docker run -d --name media-cleaner -e SONARR_API_KEY='xxx' -e READARR_API_KEY='xxx' -e SONARR_URL='http://192.168.5.44:8989' -e READARR_URL='http://192.168.5.44:8787' -e API_TIMEOUT='600' media-cleaner
Hmm, looks fine. I'll install readarr tomorrow and have a go.
Just weird!
Sonarr works but not Readarr; all I did was change every instance of Radarr to Readarr, and of course the port.
Sonarr is 192.168.5.44:8989
Readarr is 192.168.5.44:8787
Both are open and available. I even copy/pasted the API Key!!
I wonder if Sonarr and Radarr use the same API, but Readarr does not.
I was digging into this too. Readarr and Lidarr are using a different API set from what I can see (v1, not the v3 that Sonarr and Radarr use). I was unable to get it to connect to either of them.
I suspect the entire script would have to be changed to talk to the v1 API. Unfortunately I don't know python, so I'm at a loss on actually programming something here.
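If that observation is right, the 404 on http://192.168.5.44:8787/api/v3/queue would be explained by the hard-coded "/api/v3" suffix. A hedged sketch of building the API base per app instead (the app-to-version mapping here is an assumption based on the above, not verified against every app):

```python
# Assumed API versions per *arr app: sonarr/radarr expose v3, readarr/lidarr expose v1
API_VERSIONS = {
    'sonarr': 'v3',
    'radarr': 'v3',
    'readarr': 'v1',
    'lidarr': 'v1',
}

def api_base(base_url, app):
    """Build the API base URL for a given app, e.g. http://host:8787/api/v1 for readarr."""
    version = API_VERSIONS[app.lower()]
    return f"{base_url.rstrip('/')}/api/{version}"

print(api_base('http://192.168.5.44:8787', 'readarr'))  # -> http://192.168.5.44:8787/api/v1
print(api_base('http://192.168.5.44:8989', 'sonarr'))   # -> http://192.168.5.44:8989/api/v3
```

The queue endpoint shapes may still differ between v1 and v3, so this only fixes the URL, not necessarily the JSON handling.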
All that said, I couldn't actually get this to work with my setup. The initial run reads the status of anything in the queue, and then goes into its timeout before running again, but it never does a second run, so it never actually blocklists or removes the items from the queue. And of course, since it just sits there at the timeout, there is no additional logging info that would point me to an error.