• Resolved jorgonfla

    (@jorgonfla)


    Your plugin is great, but I have a major issue: every time I create a backup there is a zip (for images) that is close to 1 GB. When downloading that file, it pretty much takes over the whole server and connection and prevents visits to the site by regular visitors and robots. Only when it finishes downloading does it release the server and connection.

    I own a massive web server with 28 cores and 56 processing threads at my disposal, hosted at a large Tier 3 facility with 14 network feeds going to it, so it is not a hardware or internet connection issue.

    Why does your plugin not throttle the downloading of large files? Why does it take over the entire server and connection?

    Backups are very important, but not at the cost of messing up our Google indexing or losing visitors.

    I can download very large zip or tar files from other storage locations on my server, and those use only one thread and never prevent anything from working properly. Only your plugin does that.

    • This topic was modified 1 year, 3 months ago by jorgonfla.
Viewing 2 replies - 1 through 2 (of 2 total)
  • Plugin Author David Anderson

    (@davidanderson)

    Hi,

    When you download a file from the UpdraftPlus settings page, this is sent to the browser using the PHP function readfile() (https://www.php.net/readfile; see https://plugins.trac.www.ads-software.com/browser/updraftplus/tags/1.23.8/class-updraftplus.php#L4825) – which thus by definition takes place only within the single PHP process involved.
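
    In essence, that code path is just the standard download headers followed by a readfile() call – a minimal sketch (not the actual UpdraftPlus code; the file path here is made up):

    ```php
    <?php
    // Hypothetical, simplified version of "send a backup archive to the
    // browser". The entire transfer happens inside this one PHP process.
    $file = '/path/to/backup-images.zip'; // illustrative path only

    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($file));
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');

    readfile($file); // streams the file to the client; one process, little CPU
    ```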

    Not ever having had any similar reports, I tested downloading a 2GB zip file over the local network on my 2015 laptop, and this was instant (less than 1 second) and had no measurable impact on the CPU – which, given what readfile() does, is what you’d expect. 1GB is not a significant amount of data for UpdraftPlus; we’ve seen users with 100GB backups. Since we don’t have any reports of anything similar to draw on, you should ask your server administrator to put a trace on process activity to see what’s happening. Perhaps you have something like a very limited number of PHP processes allowed to run in PHP-FPM, such that just having a few dashboard pages open at the same time maxes them out?
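
    For illustration only, that kind of bottleneck would come from pool settings like these (pm and pm.max_children are standard PHP-FPM directives; the file path and numbers below are hypothetical):

    ```ini
    ; Hypothetical pool configuration (e.g. pool.d/www.conf). With a static
    ; pool this small, one long-running download request plus a few open
    ; dashboard pages can occupy every worker, so all other PHP requests
    ; (including ordinary visitors) queue until a worker is freed.
    pm = static
    pm.max_children = 4
    ```

    A process trace showing how many PHP workers are busy while the download runs would confirm or rule this out.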

    David

    Thread Starter jorgonfla

    (@jorgonfla)

    I am my own server administrator; that server is one I personally assembled and configured. I have been a web server deployment specialist and manager for 25 years and have run upwards of 20 servers in large co-location facilities, managing more than 10 thousand websites at a time. So this is not a novice talking.
    The only difference this time is that I configured this specific server with LiteSpeed and CyberPanel, and it is for my own business alone (a whole massive web server for just one large website). CyberPanel has been unstable in a few aspects, I have to admit, and that could be the source of the problem; I don’t remember whether I had the same problem previously when that site was running on a cPanel/Apache server.

    CyberPanel/OpenLiteSpeed uses lsphp, not PHP-FPM. I will look into that as well; that might be part of the problem.

    As a workaround I am downloading those backups using FTP instead and throttling the connection; I tested this earlier and it will be the temporary solution. When I throttle and go via the FTP port there is no issue: I was able to open as many pages as I wanted, instantaneously.
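
    For anyone else hitting this, the effect the FTP rate limit achieves – pacing the transfer instead of sending it flat-out – would look roughly like this if done in PHP (a hypothetical sketch, not plugin code; the function name and limits are invented):

    ```php
    <?php
    // Hypothetical throttled download: stream the archive in fixed-size
    // chunks and sleep between chunks to cap bandwidth, instead of handing
    // the whole file to a single readfile() call.
    function send_file_throttled(string $path, int $bytesPerSec = 10485760): void
    {
        $chunkSize = 1048576; // 1 MiB per iteration
        $fh = fopen($path, 'rb');
        if ($fh === false) {
            return;
        }
        header('Content-Type: application/zip');
        header('Content-Length: ' . filesize($path));
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        while (!feof($fh)) {
            echo fread($fh, $chunkSize);
            flush();
            // Pace the loop to roughly $bytesPerSec (ignores send time).
            usleep((int) ($chunkSize / $bytesPerSec * 1000000));
        }
        fclose($fh);
    }

    send_file_throttled('/path/to/backup-images.zip'); // illustrative path
    ```

    The trade-off is a slower download in exchange for leaving bandwidth and workers free for regular visitors.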

    I will talk to the company selling me the co-location service as well; there might be a problem with the settings on the switch where they have me running.

  • The topic ‘When downloading large files the server is completely taken over by the process’ is closed to new replies.