Hi,
I backed up my site remotely and downloaded the .tar backup file. I created a new WordPress site to test whether I could restore the backed-up site (i.e. recreate it elsewhere). This page https://docs.xcloner.com/#/wordpress/restore-a-backup/restore-locally says I will see “Site Backup > Restore Backup” as an option, but I see “Site Backup > Restore Site”. Under the “Restore Backup Archive” dropdown there are no files to choose from, nor is there a place for me to upload my .tar file. What’s going on?
Thank you!
Kerry
I keep getting this error when creating a backup. I get a popup that says:
Response Code:
{"readyState":0,"status":0,"statusText":"error"}
Does anyone know what the issue is?
Hi,
From time to time, but not every time, I receive an error message by email from XCloner Backup as follows:
Subject: pre_auto_update – backup error
Backup Site Url: https://thewebsite.fr
Error Message: SplFileInfo::getType(): Lstat failed for /home/blabla/www/wp-content/backups-Nt9TX//.xcloner/database-backup.sql
What’s it about?
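For context, PHP's SplFileInfo::getType() reports "Lstat failed" when the path it was given no longer exists at the moment of the lstat call, which typically means the temporary database dump was deleted or moved while the backup scan was still iterating the directory. A minimal sketch of that race, written in Python for brevity (os.lstat is the same underlying system call):

```python
import os
import tempfile

# Sketch of the failure mode: a file is seen by a directory scan,
# then deleted before a later step tries to stat it.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "database-backup.sql")
    open(path, "w").close()                          # temporary dump is created
    assert "database-backup.sql" in os.listdir(tmp)  # the scan sees it...
    os.remove(path)                                  # ...another step deletes it first
    try:
        os.lstat(path)            # same lstat that SplFileInfo::getType() performs
        vanished = False
    except FileNotFoundError:
        vanished = True           # PHP reports this as "Lstat failed for ..."
```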
Hello,
I’m having trouble getting XCloner to save backups into a specific subdirectory within a Backblaze B2 bucket. The backups are uploaded successfully, but they always end up in the root of the bucket, even though I tried to specify a subdirectory.
I’ve tried entering the bucket name followed by the desired subdirectory in different formats (with and without a trailing slash), but the backups still go directly to the root.
Has anyone encountered this issue or know how to configure XCloner to save backups in a subdirectory within a B2 bucket? Any guidance would be appreciated.
Thanks in advance for your help!
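For what it's worth, B2 (like S3) has no real directories: a "subdirectory" is just a prefix on the object key, so the folder path has to end up in the stored file name rather than in the bucket name field. A small sketch of how such keys are built (the prefix and file names are just examples):

```python
import posixpath

def object_key(prefix: str, filename: str) -> str:
    """Build a B2/S3-style object key; 'folders' are only a key prefix."""
    prefix = prefix.strip("/")  # tolerate "backups/", "/backups", ""
    return posixpath.join(prefix, filename) if prefix else filename

# "backups/site1/backup-2024-01-01.tar" shows up as a subfolder in the B2 UI
key = object_key("backups/site1/", "backup-2024-01-01.tar")
```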
Hi,
Having failed to get SFTP and WebDAV storage to work, I fell back on FTP. After creating a dedicated account on an FTP server of my own with the simplest possible username and password (no special characters), and of course testing it from my computer and another source (same FTP server, port 21, same user/pass), I only get the error: FTP connection error: Could not connect to host: ftp.myhost.com, port:21
in XCloner when clicking the Verify button.
I also tried switching the Ftp Transfer Mode and Ftp Secure Connection settings, with no more success.
The XCloner debug log shows only:
[2024-09-12 08:51:49] xcloner_remote_storage.INFO: Creating the FTP remote storage connection [] []
[2024-09-12 08:51:49] xcloner_remote_storage.INFO: Checking validity of the remote storage FTP filesystem [] []
It’s getting pretty frustrating!…
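One thing worth ruling out: "Could not connect to host" is a TCP-level failure, and the fact that FTP works from your own computer does not prove the web server can reach port 21; many hosts firewall outbound FTP. A quick, hedged way to check from the server itself (the host name below is your own; run it e.g. as a throwaway script on the web host):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run this on the web server itself, not on your PC; outbound
# firewall rules usually differ between the two:
# can_reach("ftp.myhost.com", 21)
```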
Using a NextCloud instance as WebDAV storage for remote backups results in a
Fatal error: Type of Sabre\DAV\Xml\Service::$elementMap must be array (as in class Sabre\Xml\Service) in /var/www/ud01_320/html/OV_live/wp-content/plugins/xcloner-backup-and-restore/vendor/sabre/dav/lib/DAV/Xml/Service.php on line 17
I checked several NextCloud instances (my own, my organization's, my employer's); none worked. This first occurred about three months ago; it had worked for years before.
Hi,
Trying to set up the export of backups to an SFTP server that is fully functional and already tested with other clients such as FileZilla and a terminal, I systematically get the error: “SFTP connection error: Could not login with username: myuser, host: A.B.C.D”, where myuser is the actual user on the remote host at address A.B.C.D.
The SFTP port to be used is 65002, and that is what is in the SFTP setting.
Hi,
Trying to set up the export of backups to a cloud using WebDAV (a private Nextcloud) that is fully functional and already tested with various clients, I systematically get the error: “WebDAV connection error: Could not write data”.
The WebDAV settings are correct and validated from other clients.
The error log on the cloud server shows nothing about the connection from XCloner.
Hello!
When I try to modify the “Regex Exclude Files” option, I get the reply:
Not Implemented
POST not supported for current URL.
So I tried using phpMyAdmin: in the wp_options table, I tried to modify the value of the xcloner_regex_exclude key. From
(wp-content\/updraft|wp-content\/uploads\/wp_all_backup|wp-content\/ai1wm-backups|wp-content\/plugins\/akeebabackupwp\/app\/backups)(.).(.)$(?<!config|php|html|htaccess|htm)
(.).(svn|git)(.)$
wp-content\/cache(.)$ (.)error_log$
xcloner-[0-9a-zA-Z]{5}$
to
(wp-content\/updraft|wp-content\/uploads\/wp_all_backup|wp-content\/ai1wm-backups|wp-content\/plugins\/akeebabackupwp\/app\/backups)(.).(.)$(?<!config|php|html|htaccess|htpasswd|htm)
(.).(svn|git)(.)$
wp-content\/cache(.)$ (.)error_log$
xcloner-[0-9a-zA-Z]{5}$
but I get the reply:
Error in processing request
Error code: 501
Error text: error (rejected)
It seems that the connection to server has been lost. Please check your network connectivity and server status.
For information, the plugin is otherwise working well; the backups are performed correctly.
I am trying to install XCloner to use Google Drive as a storage location. XCloner is working perfectly for backup to a local drive on the server, but when I set up Google Drive and go to save the settings, I get this error and have not figured out what is causing it.
Fatal error: Uncaught Error: Call to undefined method GuzzleHttp\Utils::chooseHandler() in /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-backup-and-restore/vendor/guzzlehttp/guzzle/src/functions.php:64 Stack trace: #0 /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-google-drive/vendor/guzzlehttp/guzzle/src/HandlerStack.php(42): GuzzleHttp\choose_handler() #1 /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-google-drive/vendor/guzzlehttp/guzzle/src/Client.php(65): GuzzleHttp\HandlerStack::create() #2 /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-backup-and-restore/lib/Xcloner_Remote_Storage.php(803): GuzzleHttp\Client->__construct(Array) #3 /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-backup-and-restore/lib/Xcloner_Remote_Storage.php(860): Watchfulli\XClonerCore\Xcloner_Remote_Storage->gdrive_fetch_token_with_watchful_configuration(Array) #4 /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-backup-and-restore/lib/Xcloner_Remote_Storage.php(401): Wa in /home/cnyfindi/public_html/n2amg/wp-content/plugins/xcloner-backup-and-restore/vendor/guzzlehttp/guzzle/src/functions.php on line 64
Any help with this would really be appreciated.
Rick Ellison
Hello,
It seems like encryption causes problems with backups. When using encryption, backups will randomly fail to upload to a remote destination. Furthermore, when the backup consists of multiple parts, the plugin will upload only the last part to remote storage. Without encryption, all parts are uploaded.
Best regards,
Nadav
Hello,
I noticed that in recent versions the MySQL options are not available. How can I back up additional databases?
Thank you.
While it says “Transferring backup to remote storage…” during a transfer to Backblaze, a message pops up saying “Response Code”, which is empty, with an Okay button to click on. Clicking that makes it go away, but the transfer does not go through. It is a 9 GB backup, and my server's max input time and max execution time are both long enough. The error log simply says:
[2024-05-05 03:04:07] xcloner_remote_storage.INFO: Transferring backup…….to remote storage BACKBLAZE [“”] []
Then it just ends. The file does not get uploaded, and it is not removed from local storage as it is set to be.
I'm looking to limit the Backblaze application key to a specific prefix. Can XCloner not limit itself to working under a specific prefix? Without being able to specify a prefix, you simply receive a 401 authorization error, since it tries to do a lookup on the whole directory.
Hi,
I translated the plugin into Dutch this week, but I have a small suggestion to optimise the translation work.
You have included the full changelog in the “readme.txt” file. I would suggest taking the same approach as the WooCommerce plugin and many others: include only the changelog of the latest minor/major release in “readme.txt” and put the older changes in a separate file (e.g. changelog.txt). See https://www.ads-software.com/plugins/woocommerce/#developers
This way the older changelog fixes/releases will not pop up in the translation files.
Regards
Hello,
When doing a manual backup I get the following error:
Response Code: 504 error
{"readyState":4,"responseText":"<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\n<html><head>\n<title>504 Gateway Timeout</title>\n</head><body>\n<h1>Gateway Timeout</h1>\n<p>The gateway did not receive a timely response\nfrom the upstream server or application.</p>\n</body></html>\n","status":504,"statusText":"error"}
How can I fix this?
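A 504 means the front-end proxy gave up waiting before PHP finished building the backup. Beyond splitting the backup into smaller parts, the usual knobs are the proxy read timeout and PHP's execution time. The directive names below assume an nginx + PHP-FPM stack (an Apache front end uses different ones), and 300 seconds is only an example value:

```
# nginx server/location block (assumed stack)
fastcgi_read_timeout 300s;
proxy_read_timeout   300s;

# php.ini
max_execution_time = 300
```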
Ralf
Hello,
When I try to clone a website, I have a problem where the backup and MySQL backup do not appear in the dropdown menu.
Often the backup becomes so large that my web host cannot use XCloner to upload the file, so I have to upload it manually.
That's when the dropdown menus are empty.
(I have tried unpacking the tar file so that all files are in the same directory as the restore script)
Link to screenshot:
https://prnt.sc/RxCn6Pr8-aTN
Is it a browser issue (Chrome), or have I missed something?
Thank you in advance.
I’m getting this error every day:
Thanks in advance!
Hello, I just tested XCloner Backup, because it seems to be the only free tool to back up WordPress to external cloud services like WebDAV. Everything seems to work; the backup files even got stored to my cloud via WebDAV – great!
BUT: when viewing the list of backups, everything looks OK as long as I stay at the local backup storage location. When I switch to the remote location, the tool says there are no backups at my WebDAV storage from which I could restore. But I can see the files on my cloud storage! What use is a backup tool that can't find the backup files to restore when needed?
AND: when I tried to restore the files from a backup (even a local one), I just get a plain error 500. Nothing more. What use is a backup tool that isn't able to restore?
Could you please give some advice on what could be wrong?
Hi,
I'm trying to do a backup of my site with the XCloner plugin, but when I press “Start Backup” it gives me this error:
404 Not Found – {"readyState":4,"responseText":"<html>\r\n<head><title>404 Not Found</title></head>\r\n<body>\r\n<center><h1>404 Not Found</h1></center>\r\n<hr><center>nginx</center>\r\n</body>\r\n</html>\r\n<!-- a padding to disable MSIE and Chrome friendly error page -->\r\n<!-- a padding to disable MSIE and Chrome friendly error page -->\r\n<!-- a padding to disable MSIE and Chrome friendly error page -->\r\n<!-- a padding to disable MSIE and Chrome friendly error page -->\r\n<!-- a padding to disable MSIE and Chrome friendly error page -->\r\n<!-- a padding to disable MSIE and Chrome friendly error page -->\r\n","status":404,"statusText":"Not Found"}
Can you please tell me how to proceed to be able to do the backup?
Thank you very much
Hi,
I am getting the below warning:
Warning: is_readable(): open_basedir restriction in effect. File(/home/XXXX/.aws/config) is not within the allowed path(s): (/home/XXXX/webapps/XXXX1:/var/lib/php/session:/tmp) in /home/XXXX/webapps/XXXX1/wp-content/plugins/xcloner-backup-and-restore/vendor/aws/aws-sdk-php/src/DefaultsMode/ConfigurationProvider.php on line 152
It is breaking my site in debug mode.
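The warning comes from the AWS SDK bundled with the plugin probing ~/.aws/config, which open_basedir forbids. The SDK honors the AWS_CONFIG_FILE and AWS_SHARED_CREDENTIALS_FILE environment variables, so one commonly suggested workaround is to point them at a path inside your allow-list; the pool-config placement and the exact paths below are assumptions about your setup:

```
; php-fpm pool config (or SetEnv in .htaccess, or putenv() early in wp-config.php)
; point the SDK's config probe at a path inside the open_basedir allow-list
env[AWS_CONFIG_FILE] = /home/XXXX/webapps/XXXX1/.aws-config
env[AWS_SHARED_CREDENTIALS_FILE] = /home/XXXX/webapps/XXXX1/.aws-credentials
```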
Hi, I'm trying to find a regular expression to exclude image thumbnails, to reduce the backup size. The pattern is imagename-100x100.png, i.e. imagename-{size}x{size}.png.
I tried:
(.*).(-\0-9+x\0-9.+)$(?<=(gif|png|jpg|webp))
(.*).(0-9+x+0-9.+)$(?<=(gif|png|jpg|webp))
(.*).(/-\d+x\d+\.+)$(?<=(gif|png|jpg|webp))
Can anyone help me, please?
Has anyone found a pattern to exclude thumbnails in the backups?
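In the attempts above the digit classes are malformed (`\0-9` and `0-9+` are not character classes; that needs `[0-9]` or `\d`), and the lookbehind is anchoring on the wrong thing. WordPress thumbnails end in a `-{width}x{height}.{ext}` suffix, so matching that suffix directly is enough. XCloner's field takes PHP PCRE patterns; the sketch below checks the same pattern with Python's `re`, whose syntax agrees for this expression:

```python
import re

# WordPress thumbnails look like "imagename-300x200.jpg":
# a "-{width}x{height}" size suffix right before the extension.
THUMB = re.compile(r"-\d+x\d+\.(gif|png|jpg|webp)$", re.IGNORECASE)

names = [
    "header-100x100.png",   # thumbnail -> exclude
    "photo-1024x768.webp",  # thumbnail -> exclude
    "header.png",           # original  -> keep
    "report-2024.pdf",      # unrelated -> keep
]
excluded = [n for n in names if THUMB.search(n)]
# excluded == ["header-100x100.png", "photo-1024x768.webp"]
```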
Hello everyone,
I host multiple WP instances and I would like to back them up to my home NAS (Zyxel NAS326) via the secured WebDAVs protocol.
At home I have a public IPv4 address and the NAS sits behind NAT, with the connection set up by port forwarding (the NAS's dav service runs on port 5002, davs on port 5003).
Unfortunately I cannot use FTPS because, when the NAS is behind NAT, the FTP service in XCloner tries to connect to the data transfer port on the public IPv4 address and not the local one, so the FTP connection simply does not work; hence I would like to use WebDAVs.
If I connect from the outside network to my davs service via, for example, FileZilla, there isn't any problem (I just need to manually accept trust for the presented SSL certificate).
But when I try to connect to the davs via XCloner, I get the error “WebDAV connection error: SSL certificate problem: unable to get local issuer certificate”.
When I try to connect to dav without a secure SSL connection, the connection works and my backups run as they should. But to be honest, this isn't optimal, and I would like to use secured davs, not dav.
On my NAS I have a valid, signed SSL certificate from certbot, which is used for HTTPS access, WebDAVs, and FTPS. What I can't do on my NAS (simply because the device does not offer the option) is upload the complete certificate chain. I can upload and apply only a single SSL certificate, which is used for all services.
Is it possible this could be the problem (the missing certificate chain)? Is there any chance or workaround in the XCloner configuration to bypass or manually accept trust for the connection?
Of course, if someone has a tip on how to make FTPS work better than WebDAVs (I would just need to be able to tell XCloner's FTP service to use the local IP range for the data ports), please let me know.
Many thanks in advance.
The manual inputs for client ID & client secret are not working correctly, as Google Drive refuses the connection.
Leaving those inputs blank, clicking the “Sign in with Google” link, and filling out the authorization code works, but only temporarily: it lets me do a manual backup at that exact time without issue. However, any attempt at a scheduled backup fails at the point of remote transfer (the local compressed backup file is created), and any attempt at a manual backup at a later time (i.e. the next day) without redoing the auth code also fails. It seems the authorization obtained this way is only temporary.
Perhaps someone can troubleshoot why the manual setup is not working and fix it, so that it works properly at all times?
Hi, I tried to check a full backup of my site today to make sure it is usable. Unfortunately, I noticed that the database is not included.
I tried to redo a full backup from scratch with the WordPress database option checked, but the database is not present in the backup.
I used to find the database in an /xcloner folder in the root of the site, but that folder is not present either.
If I back up just the database, everything works as usual.
Can you please help me out? Without the database I basically have no backup.
Thank you
Hi,
From time to time, but not every time (it is a daily backup), sending the backup to a WebDAV volume results in an error.
Most of the time it is ‘Error Message: Unauthorized’, but I have also occasionally seen ‘Error Message: OpenSSL SSL_read: SSL_ERROR_SYSCALL, errno 104’.
I've searched the log but can't find anything revealing.
Here’s the log of an interactive attempt this morning.
[2023-08-20 08:12:10] xcloner_file_system.INFO: Cleaning the backup storage LOCAL on matching rules [] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part1.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250270 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part2.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250411 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part3.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250442 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part4.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250471 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part5.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250517 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part6.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250562 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part7.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250584 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-multipart.csv from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250588 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-17_05-30-nosql-part8.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692250588 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_pre_auto_update_translation_mydomain.fr-2023-08-17_13-30-sql.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692279028 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part1.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333079 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part2.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333222 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part3.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333252 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part4.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333281 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part5.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333327 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part6.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333372 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part7.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333394 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-multipart.csv from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333399 =< 2"] []
[2023-08-20 08:12:10] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-18_04-31-nosql-part8.tgz from storage system LOCAL matching rule ["RETENTION LIMIT TIMESTAMP","1692333399 =< 2"] []
[2023-08-20 08:13:29] xcloner_remote_storage.INFO: Creating the WEBDAV remote storage connection [""] []
[2023-08-20 08:13:29] xcloner_file_system.INFO: Cleaning the backup storage WEBDAV on matching rules [] []
[2023-08-20 08:13:30] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-09_04-30-nosql-part7.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","38 >= 30"] []
[2023-08-20 08:13:30] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-09_04-30-nosql-part8.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","37 >= 30"] []
[2023-08-20 08:13:30] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-multipart.csv from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","36 >= 30"] []
[2023-08-20 08:13:31] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-part1.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","35 >= 30"] []
[2023-08-20 08:13:31] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-part2.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","34 >= 30"] []
[2023-08-20 08:13:31] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-part3.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","33 >= 30"] []
[2023-08-20 08:13:31] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-part4.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","32 >= 30"] []
[2023-08-20 08:13:31] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-part5.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","31 >= 30"] []
[2023-08-20 08:13:31] xcloner_file_system.INFO: Deleting backup backup_mydomain.fr-2023-08-10_04-30-nosql-part6.tgz from storage system WEBDAV matching rule ["BACKUP QUANTITY LIMIT xcloner_webdav_cleanup_retention_limit_archives","30 >= 30"] []
[2023-08-20 08:13:31] xcloner_remote_storage.INFO: Transferring backup backup_mydomain.fr-2023-08-20_04-31-nosql-multipart.csv to remote storage WEBDAV [""] []
[2023-08-20 08:13:31] xcloner_remote_storage.INFO: Transferring backup backup_mydomain.fr-2023-08-20_04-31-nosql-part1.tgz to remote storage WEBDAV [""] []
[2023-08-20 08:13:36] xcloner_remote_storage.INFO: Transferring backup backup_mydomain.fr-2023-08-20_04-31-nosql-part2.tgz to remote storage WEBDAV [""] []
[2023-08-20 08:15:27] xcloner_remote_storage.INFO: Transferring backup backup_mydomain.fr-2023-08-20_04-31-nosql-part3.tgz to remote storage WEBDAV [""] []
[2023-08-20 08:22:16] xcloner_file_system.INFO: Cleaning the backup storage LOCAL on matching rules [] []
The question is: what can I do to avoid these errors? Sometimes the transfer completes, sometimes it doesn't.
UPDATE: I see this is probably a JSON error, so I will be working that angle.
Installed XCloner to do a backup before switching a testbed site to WP 6.3. I could not complete a backup because of a 415 error: {"readyState":4,"responseText":"\n","status":415,"statusText":"error"}. Any help with what this means and how to fix it would be much appreciated.
Can you please add an option to send an email notification only if the backup fails?
Hi, whether you're creating a new backup or editing an existing one, there's an option ‘Backup Only Files Modified/Created After’, which in the first case shows you a calendar to choose a day, but not in the second.
That is, you can choose a specific date (e.g. 25 June 2023), which may be useful for a “one shot” backup, but for automatic daily backups (for instance) you'd need to be able to specify ‘yesterday’.
How can this be achieved?
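As far as I can tell the field stores a fixed date, so a rolling 'yesterday' would have to be recomputed at every run. For clarity, this is all the option conceptually does: compare each file's mtime against a cutoff timestamp (the helper below is illustrative, not the plugin's API):

```python
import os
import time

def modified_since(path: str, cutoff: float) -> bool:
    """True if the file's mtime is at or after the cutoff timestamp."""
    return os.path.getmtime(path) >= cutoff

# A rolling "yesterday" cutoff, recomputed at each run, instead of a fixed date:
yesterday = time.time() - 24 * 3600
```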
If XCloner is enabled alongside WooCommerce and you visit the super admin dashboard or any WooCommerce super admin pages, the header portion of the HTML is removed, resulting in this: https://prnt.sc/L3x56z5bIL-d and https://prnt.sc/Bl3jIAO9BUmg
I traced it to a line of code (LINE 71) in the following file which, if removed/disabled, makes everything work fine:
wp-content/plugins/xcloner-backup-and-restore/lib/Xcloner_Api.php https://prnt.sc/c5mlDd7FbYIT
Versions
WP – 6.2.2
XCLONER – 4.6.4
PHP – 8.0.2
Hope this helps.