• Resolved

    Moderator James Huff

    (@macmanx)


    In the FAQ, you mention that DreamObjects can handle a maximum of 2 GB.

    My almost-10-year-old blog misses that limit by a tiny fraction of one massive GB. I see some development work mentioning multipart uploads, and was wondering whether the plan is to eventually side-step that 2 GB limitation by breaking the backup into smaller chunks.

    Thanks in advance!

    https://www.ads-software.com/plugins/dreamobjects/

Viewing 6 replies - 1 through 6 (of 6 total)
  • Plugin Author Ipstenu (Mika Epstein)

    (@ipstenu)

    Advisor and Activist

    Yes and no.

    I’m in the middle of a massive backend rewrite that uses some of the newer Ceph features to mitigate this, buuuuuuttttt Ceph’s hard limits on what it can move and PHP’s are separate matters.

    1) PHP memory on most hosts tends to crap out after around 500 MB. Less on shared hosting, obviously.

    2) PHP has a hard limit of 2 GB. https://docs.aws.amazon.com/aws-sdk-php/guide/latest/faq.html

    3) The new SDK way of doing multipart uploads is a bear, and not as logical as the old way, which is saying something. https://blogs.aws.amazon.com/php/post/Tx7PFHT4OJRJ42/Uploading-Archives-to-Amazon-Glacier-from-PHP

    So when you break it down, the real issue with large backups and how this plugin works is item #1: PHP will choke making the zip long before we hit the upload part, and that’s what I don’t have a great fix for. Even if I tell PHP to build the zip in its own multipart pieces, it STILL uses a lot of memory.
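    For contrast, here’s roughly what keeping memory flat during archiving looks like when the runtime makes it easy. This is a minimal Python sketch, not the plugin’s PHP code, and the function name is hypothetical; it copies a file into a zip in fixed-size chunks, so peak memory stays near the chunk size rather than the file size:

    ```python
    import shutil
    import zipfile

    CHUNK = 64 * 1024  # copy 64 KB at a time; memory use stays near this size

    def stream_into_zip(src_path, zip_path, arcname):
        """Add src_path to a new zip without loading the whole file into memory."""
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
            # zf.open(..., mode="w") returns a writable stream inside the
            # archive (Python 3.6+); copyfileobj moves data chunk by chunk.
            with open(src_path, "rb") as src, zf.open(arcname, mode="w") as dest:
                shutil.copyfileobj(src, dest, length=CHUNK)
    ```

    The point of the sketch is the shape of the fix being discussed: stream data through a small buffer instead of materializing the archive in memory first.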

    The ultimate solution would be to build a VaultPress-esque back-end, OR a totally boto-rsync one that would stream uploads ‘live.’ Both are beyond my skill-set at the moment. Boto is likely to happen, from a DreamHost server level, faster and would be a great thing for a much bigger backup of the WHOLE site (not just WP). And … I kind of think on many levels that would be better for people, y’know?

    Which may be why a Panel <-> WP interface is also on my massive to-do list. “Panel, run a whole domain backup.” Affirmative, Captain.

    Moderator James Huff

    (@macmanx)

    Ah, that makes perfect sense. One crazy question though: making a massive zip is always going to be the hard part for PHP, but what if the backup could be split into a zip for each /wp-content/uploads/ year?

    The uploads are always going to be the biggest part, unless you have a mostly text blog, but then again that wouldn’t be a problem in the first place.

    If the backup were split into multiple zips, one for the database, wp-config, themes, and plugins, then one zip for each upload year, each zip might skate past any limits. On the other hand, running all those zip jobs could tank a server if they weren’t spaced far enough apart.
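    The per-year split described above is simple to sketch. This is an illustrative Python example, not the plugin’s code, and it assumes the standard /wp-content/uploads/&lt;year&gt;/ directory layout; it writes one archive per year directory so no single zip has to hold the whole upload history:

    ```python
    import os
    import zipfile

    def zip_uploads_by_year(uploads_dir, out_dir):
        """Write one zip per <year> subdirectory of uploads_dir.

        Returns the list of zip paths written. Each archive only covers one
        year, so each stays well under a per-object size cap.
        """
        written = []
        for year in sorted(os.listdir(uploads_dir)):
            year_dir = os.path.join(uploads_dir, year)
            if not os.path.isdir(year_dir):
                continue
            zip_path = os.path.join(out_dir, f"uploads-{year}.zip")
            with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
                for root, _dirs, files in os.walk(year_dir):
                    for name in files:
                        full = os.path.join(root, name)
                        # store paths relative to uploads/ so the archives
                        # unpack back into the same tree
                        zf.write(full, os.path.relpath(full, uploads_dir))
            written.append(zip_path)
        return written
    ```

    The trade-off is exactly the one raised here: many small jobs instead of one big one, which caps memory per job but means the jobs need scheduling so they don’t all run at once.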

    Oomph, I think I just talked myself out of that idea. :/

    Thanks for working on this plugin though. From what I can tell, it’s a super-handy way to automate backups on a service that doesn’t charge through the roof.

    Plugin Author Ipstenu (Mika Epstein)

    (@ipstenu)

    Advisor and Activist

    Yeah, that’s pretty much how I work through these too. If I can get PHP to safely extend the timeout, and multithreading to work, we should be okay…

    Moderator James Huff

    (@macmanx)

    Awesome, looking forward to it, thanks!

    Plugin Author Ipstenu (Mika Epstein)

    (@ipstenu)

    Advisor and Activist

    So far, I’ve managed to crash MAMP twice. I’m VERY talented. But that is why it’s so slow. I’m being careful and making sure it’s at least as good as today before I release it and optimize the living *(^@#&*^@# out of it.

    Moderator James Huff

    (@macmanx)

    Ha, no worries, I completely understand.

    “I managed to crash MAMP twice” isn’t something I’d want to read in release notes anyway.

  • The topic ‘Multipart Uploads and DreamObjects Limit’ is closed to new replies.