• lorenzocroccolino

    (@lorenzocroccolino)


    I am facing an error during the upload to S3 for large backups (> 5 GB).
    I have already read other topics in the support section about the same problem, but there are no real solutions.

    The error log says: “ERROR: Cannot transfer backup to S3! (0)”
    I can provide the complete log file, but it does not seem to be very useful.

  • Plugin Support happyAnt

    (@duongcuong96)

    Hello @lorenzocroccolino
    Hmm, which BackWPup version are you running now? 3.10? Also, could you please give me the server information from the BackWPup -> Settings -> Information tab?
    Thank you!

    Thread Starter lorenzocroccolino

    (@lorenzocroccolino)

    Hi, the current plugin version is “3.10.0”.
    Information tab:

    WordPress version: 5.8.2
    BackWPup version: 3.10.0
    PHP version: 7.3.31-1+0~20210923.88+debian9~1.gbpac4058 (64bit)
    MySQL version: 10.1.48-MariaDB-1~stretch
    cURL version: 7.52.1
    cURL SSL version: OpenSSL/1.0.2u
    WP-Cron URL: https://www.sideaita.it/wp-cron.php
    Server self connect: Unexpected HTTP response:
    Status code: 200
    Date: Wed, 24 Nov 2021 09:55:18 GMT
    Content-type: text/html; charset=UTF-8
    Expires: Wed, 11 Jan 1984 05:00:00 GMT
    Cache-control: no-cache, must-revalidate, max-age=0
    X-newrelic-app-data: PxQEU1JbCQQJR1NXAAMOUFwFDxFORDQHUjZKA1ZLVVFHDFYPbU5mEA1qGBYWTltBXwpPEl9BFUpUHwYDUlZTTgBMCFIICgMeHlQVQwJRCgcAAQFVUwELAARRVQYVHVEHCEJTbg==
    Vary: Accept-Encoding
    Cf-cache-status: DYNAMIC
    Expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
    Server: cloudflare
    Cf-ray: 6b31c2c9389a4ab5-FRA
    Content-encoding: gzip
    
    Document root: /home/500367.cloudwaysapps.com/qcktfmjwbn/public_html/
    Temp folder: /home/500367.cloudwaysapps.com/qcktfmjwbn/public_html/wp-content/uploads/backwpup-73423f-temp/
    Log folder: /home/500367.cloudwaysapps.com/qcktfmjwbn/public_html/wp-content/uploads/backwpup-73423f-logs/
    Server: Apache/2.4.25 (Debian)
    Operating system: Linux
    PHP SAPI: fpm-fcgi
    Current PHP user: qcktfmjwbn
    Maximum execution time: 899 seconds
    Maximum execution time for the BackWPup script: 30 seconds
    Alternative WP cron: Off
    Disable WP cron: Off
    Folder CHMOD: 493
    Server time: 9:55
    Blog time: 10:55
    Blog timezone: Europe/Rome
    Blog time offset: 1 hour
    Blog language: it-IT
    MySQL client encoding: utf8
    PHP memory limit: 256M
    WP memory limit: 256M
    WP maximum memory limit: 256M
    Memory in use: 56.00 MB
    Disabled PHP functions: pcntl_alarm, pcntl_fork, pcntl_waitpid, pcntl_wait, pcntl_wifexited, pcntl_wifstopped, pcntl_wifsignaled, pcntl_wifcontinued, pcntl_wexitstatus, pcntl_wtermsig, pcntl_wstopsig, pcntl_signal, pcntl_signal_get_handler, pcntl_signal_dispatch, pcntl_get_last_error, pcntl_strerror, pcntl_sigprocmask, pcntl_sigwaitinfo, pcntl_sigtimedwait, pcntl_exec, pcntl_getpriority, pcntl_setpriority, pcntl_async_signals
    Loaded PHP extensions: Core, PDO, PDO_Firebird, PDO_ODBC, Phar, Reflection, SPL, SimpleXML, Zend OPcache, apc, apcu, bcmath, bz2, calendar, cgi-fcgi, ctype, curl, date, dba, dom, enchant, exif, fileinfo, filter, ftp, gd, gettext, gmp, hash, iconv, igbinary, imagick, imap, interbase, intl, ionCube Loader, json, libxml, mbstring, memcached, mongodb, msgpack, mysqli, mysqlnd, newrelic, odbc, openssl, pcre, pdo_dblib, pdo_mysql, pdo_pgsql, pdo_sqlite, pgsql, posix, readline, recode, redis, session, shmop, soap, sockets, sodium, sqlite3, standard, sysvmsg, sysvsem, sysvshm, tidy, tokenizer, wddx, xml, xmlreader, xmlrpc, xmlwriter, xsl, zip, zlib

    Plugin Support happyAnt

    (@duongcuong96)

    @lorenzocroccolino
    Could I ask which S3 destination you are backing up to? AWS S3 or a custom S3 endpoint?
    There is an option in your job’s S3 tab called “Destination supports multipart”; it should be checked to let you upload a large file if you back up to a custom S3 endpoint.
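    For reference, AWS caps a single PUT upload at 5 GB, so anything larger has to go through the multipart upload API, which is what that option enables. Below is a minimal sketch of that flow in Python with boto3, using hypothetical bucket, key, and file names; it is not the plugin’s actual code.

        # Minimal multipart upload sketch (boto3). Bucket, key, and file path
        # are hypothetical placeholders, not values from this thread.
        import boto3

        s3 = boto3.client("s3", region_name="eu-central-1")
        bucket, key, path = "example-backup-bucket", "backwpup/backup.tar.gz", "backup.tar.gz"
        part_size = 100 * 1024 * 1024  # 100 MB per part (every part except the last must be >= 5 MB)

        mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
        parts = []
        try:
            with open(path, "rb") as f:
                part_number = 1
                while True:
                    chunk = f.read(part_size)
                    if not chunk:
                        break
                    resp = s3.upload_part(
                        Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
                        PartNumber=part_number, Body=chunk,
                    )
                    parts.append({"ETag": resp["ETag"], "PartNumber": part_number})
                    part_number += 1
            s3.complete_multipart_upload(
                Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
                MultipartUpload={"Parts": parts},
            )
        except Exception:
            # Abort so the incomplete upload does not linger and keep using storage.
            s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=mpu["UploadId"])
            raise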

    Plugin Support happyAnt

    (@duongcuong96)

    Since we haven’t heard back from you, I’m going to mark this topic as resolved.
    In case you’re still having problems, feel free to let us know.

    Thread Starter lorenzocroccolino

    (@lorenzocroccolino)

    Sorry for my long absence; I was not able to respond until now.
    The destination is a standard AWS S3 bucket in the eu-central-1 region.
    We are using the plugin on other websites and it works fine with the same bucket.
    The problem occurs only on this one website, where the backup is larger than 5 GB.

    I will reply faster this time, sorry again.
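
    Side note: one way to rule out the bucket or the credentials is to push a file larger than 5 GB from the same server with boto3’s managed transfer, which switches to multipart automatically above a configurable threshold. A rough sketch, with a hypothetical archive path and a placeholder bucket name:

        # Hypothetical reproduction test using boto3's managed transfer; the
        # bucket name and file path are placeholders, not values from this thread.
        import boto3
        from boto3.s3.transfer import TransferConfig

        s3 = boto3.client("s3", region_name="eu-central-1")
        config = TransferConfig(
            multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
            multipart_chunksize=64 * 1024 * 1024,
        )
        # upload_file splits the file into parts, retries failed parts, and completes the upload.
        s3.upload_file("large-backup.tar.gz", "example-backup-bucket",
                       "tests/large-backup.tar.gz", Config=config)

    If a transfer like this succeeds, the limit is more likely on the plugin or PHP side than on the bucket itself.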

    Plugin Support happyAnt

    (@duongcuong96)

    @lorenzocroccolino could you please install this modified version instead and then try again?
    https://www.dropbox.com/s/6olutuyy951s3hr/backwpup_s3_mod.zip?dl=0
    Thank you so much!

  • The topic ‘Error S3 large backups’ is closed to new replies.