Thank you very much for your reply.
I must admit, this was a rather weird situation that I am neither able nor willing to reproduce because it has to do with a server misconfiguration. Let me try to explain:
It’s all about a directory on a Linux server that was connected to another local server via CIFS. The directory had initially been created empty and was then used as a mount point (mount -t cifs). As long as the remote server was in place, everything was fine. But then that server was replaced by another machine, and the CIFS connection was no longer needed; since the remote machine had changed, the mount became stale. The empty directory, however, remained in place.

If you did an ‘ls’ one directory level up to show the directory structure, it would still show up, and any attempt to inspect that specific directory from the command line simply presented it as empty. The SplFileInfo::getSize() routine used by your software, though, seems to behave differently and throws an error (the log excerpt below is translated from the original German):
[19-Jul-2024 02:02:55] Compressing files as TarGz. Please be patient for a moment.
[19-Jul-2024 02:03:36] ERROR: SplFileInfo::getSize(): stat failed for /data/web/www.xyz.de/vi
[19-Jul-2024 02:03:36] Attempt 2 to create the backup archive …
[19-Jul-2024 02:04:17] ERROR: SplFileInfo::getSize(): stat failed for /data/web/www.xyz.de/vi
[19-Jul-2024 02:04:17] Attempt 3 to create the backup archive …
[19-Jul-2024 02:04:58] ERROR: SplFileInfo::getSize(): stat failed for /data/web/www.xyz.de/vi
[19-Jul-2024 02:04:58] ERROR: Step aborted: too many attempts!
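For what it’s worth, the error itself seems to be just how SplFileInfo::getSize() reacts whenever the underlying stat() call fails, which the stale mount point apparently triggers. A minimal stand-alone illustration (the path here is made up, not the real one):

<?php
// Hypothetical path standing in for the stale CIFS mount point.
$info = new SplFileInfo('/data/web/example/stale-mount');

try {
    // getSize() throws a RuntimeException as soon as stat() fails on the path.
    echo $info->getSize(), PHP_EOL;
} catch (RuntimeException $e) {
    // e.g. "SplFileInfo::getSize(): stat failed for /data/web/example/stale-mount"
    echo 'ERROR: ' . $e->getMessage(), PHP_EOL;
}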
But this is not the main point. What I was wondering is why your software tries to collect information *at all* on directories that it is configured to exclude. If it tries to determine the size of such a directory, how can I be sure it does not collect other (more sensitive) data?
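I don’t know how your traversal is implemented, but what I had in mind is that the exclude list could be applied before any file information is read. As a rough sketch using the SPL iterators (the paths and the exclude list here are made up, not taken from your code):

<?php
// Hypothetical exclude list; in a real setup this would come from the backup configuration.
$excluded = ['/data/web/example/stale-mount'];

$dirIterator = new RecursiveDirectoryIterator(
    '/data/web/example',
    FilesystemIterator::SKIP_DOTS
);

// Filter out excluded paths before the iterator descends into them,
// so they are never stat()ed at all.
$filtered = new RecursiveCallbackFilterIterator(
    $dirIterator,
    function (SplFileInfo $current) use ($excluded): bool {
        return !in_array($current->getPathname(), $excluded, true);
    }
);

foreach (new RecursiveIteratorIterator($filtered) as $file) {
    // Only non-excluded entries reach this point, so getSize() cannot
    // fail on the stale mount point (or read anything from it).
    $size = $file->getSize();
}

With the exclusion applied at that level, an excluded directory would never be touched, let alone have its size queried.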