Nextcloud download of files larger than 1GB fails

    • Official Post

    Hey Soma!

    I am sorry that I hit a nerve with this introduction, but sometimes it is intimidating to ask "professionals" for step-by-step tutorials. Using the search here leads to great guides, but sometimes I cannot follow along that easily.

    I will post my stack when I'm home. In the meantime I tried to find the php.ini in the /etc/... subfolders, but no luck.
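    If Nextcloud runs in a Docker container, the php.ini is most likely inside the container rather than under /etc on the host, which would explain why it doesn't turn up there. Something like this should locate it (assuming the container is simply named nextcloud; adjust the name to your stack):

    Code
    # ask PHP inside the container which ini files it actually loads
    docker exec -it nextcloud php --ini
    # or search the container's filesystem directly
    docker exec -it nextcloud find / -name php.ini 2>/dev/null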


    Another proof that you should not follow YouTube videos for installing something. The authors are not going to support you.

    Well, YouTube is a good guide (but it's just that, and it should be verified). My issue is with some of the things the YouTubers do that defy common sense (e.g., as you pointed out, using user 998 for Docker PUIDs). Unfortunately, videos are more popular than written tutorials. The other big issue is that as the software evolves and Docker containers change, videos are often not updated. The creators don't remove them because they want the views/ad revenue, but that leaves outdated info out there. Written tutorials, on the other hand, are easily updated when things need to change.

  • Yes KM0201, YouTube tutorials have helped me a lot of times already, but I had to watch 3 different videos to set up Nginx, Cloudflare etc., spread over something like a 1.5-year timespan... Once they're published, they don't get updated, so you have to look for a newer one. Written tutorials are easier to maintain.


    So, the download of this 4.6GB file was successful. :) Thank you all again! But it leads me to some other questions:

    Does anybody know where the 16GB temp file I set up in the Nginx config is stored? And what would be the maximum size of it if I want, e.g., to store and download an ISO of one of my Blu-rays?

  • Does anybody know where the 16GB temp file I set up in the Nginx config is stored? And what would be the maximum size of it if I want, e.g., to store and download an ISO of one of my Blu-rays?

    If you have bad luck, it is stored in the filesystem of the container. This will fill up your disk over time. The solution would be to destroy and recreate the container every now and then.
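    If you want to check for yourself where those temp files end up and how much space they are eating, you can ask nginx inside the container (the container name nginx-proxy below is just a placeholder; use your own, and the path from the first command):

    Code
    # compile-time default temp paths of the nginx build inside the container
    docker exec nginx-proxy nginx -V 2>&1 | tr ' ' '\n' | grep temp-path
    # any proxy_temp_path set explicitly in the loaded configuration
    docker exec nginx-proxy nginx -T 2>/dev/null | grep proxy_temp_path
    # disk usage of that directory (replace the path with the one reported above)
    docker exec nginx-proxy du -sh /var/cache/nginx/proxy_temp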


  • Hmm, let's see if anybody knows for sure.

    I'm asking because I just tried to download an 18GB file from the SMB share I linked Nextcloud to, through a link shared by email, and it failed at 16GB, the max temp file size.
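    For context, that 16GB cut-off matches the proxy_max_temp_file_size you configured: with proxy_buffering enabled, nginx spools the proxied response into a temp file capped by that directive, and in practice the download stops there. If you want to keep buffering, the cap itself can be raised or removed; the values below are only examples:

    Code
    # raise the cap, e.g. to 32GB ...
    proxy_max_temp_file_size 32768m;
    # ... or set it to 0, which disables buffering to temp files entirely
    # proxy_max_temp_file_size 0;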

  • I know it's not related to your specific issue because you're using another container, but I'll also add how to fix this problem when using letsencrypt/nextcloud in conjunction with letsencrypt/swag as a reverse proxy.


    Nextcloud documentation says

    Code
    By default, downloads will be limited to 1GB due to proxy_buffering and proxy_max_temp_file_size on the frontend.

    Therefore, simply add this configuration to the nextcloud.conf file from swag. You can add it just before the end.

    proxy_buffering off;


    And the magic will happen!
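
    To show roughly where that line goes (a trimmed sketch only, not the full file SWAG ships; your nextcloud.subdomain.conf will contain more than this):

    Code
    server {
        listen 443 ssl;
        server_name nextcloud.*;

        # ... the directives SWAG already generates ...

        location / {
            include /config/nginx/proxy.conf;
            set $upstream_app nextcloud;
            set $upstream_port 443;
            set $upstream_proto https;
            proxy_pass $upstream_proto://$upstream_app:$upstream_port;
        }

        # added just before the closing brace of the server block
        proxy_buffering off;
    }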

    I had this issue for a very long time, and only a few months ago did I find the way to solve it.


    Nextcloud is a great piece of software, but IMHO it is very hard to master and to run properly. I invested more hours than I'd like to admit to get NC running "fine".

  • Therefore, simply add this configuration to the nextcloud.conf file from swag. You can add it just before the end.

    proxy_buffering off;

    Damn, I missed that one on SWAG, :O


    Only now that I am finally downloading big files did I hit that snag:

    My proxy_max_temp_file_size 2048m; was killing my download of ~3GB.


    Thank you for the pointer, ;)
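
    If you want to verify the fix without clicking through the web UI, a quick check with curl against a public share link works as well (the URL below is only a placeholder; use one of your own share links):

    Code
    # download to /dev/null and print only the transferred size;
    # it should now match the real file size instead of stopping at the old cap
    curl -L -o /dev/null -w "%{size_download} bytes\n" "https://cloud.example.com/s/SHARETOKEN/download"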
