Hey rudyz,
One of the issues with trying to duplicate larger sites is how hosts like GoDaddy deal with processes and load. Some hosts will kill any long-running process if the bandwidth usage on that server gets too heavy. In other words, if the other 300-500 sites (on some hosts even more) that share your server are demanding resources and you're running something like a long backup routine, then your process just might get killed; it really depends on the load on the server.
This happens because the host's monitoring processes see your script as a possible culprit chewing up server resources. Unfortunately, this is the drawback of shared hosting. Some hosts are actually better than others, and I have done package creations over 1GB without any issues. I'm currently starting to keep track of the hosting companies that have a better track record with the Plugin and have posted the host company list here. If anyone else reading this thread has a solid, reliable host they know works well with the Duplicator, please shoot me an email and I'll add it to the list.
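For anyone who wants a concrete picture of what I mean by a long-running process, below is a rough sketch (plain PHP, not Duplicator's actual code) of the general time-boxed, resumable approach that avoids tripping those monitors: each request does a bounded amount of work, records where it stopped, and a later request picks up from there. The file names, progress file, and 20-second budget are just made-up examples.

```php
<?php
// A rough sketch (not Duplicator's actual code) of a time-boxed, resumable
// loop: no single request runs long enough for the host's process monitor
// to flag it. The file list, progress file, and 20-second budget are
// made-up examples.

$budget   = 20;                                  // seconds of work per request
$start    = microtime(true);
$progress = __DIR__ . '/backup-progress.txt';    // where we left off last time
$files    = file(__DIR__ . '/file-list.txt', FILE_IGNORE_NEW_LINES);
$done     = file_exists($progress) ? (int) file_get_contents($progress) : 0;

$zip = new ZipArchive();
$zip->open(__DIR__ . '/backup.zip', ZipArchive::CREATE);

for ($i = $done; $i < count($files); $i++) {
    // Stop well before the host's limits kick in; the next request resumes here.
    if (microtime(true) - $start > $budget) {
        break;
    }
    $zip->addFile($files[$i]);
}

$zip->close();
file_put_contents($progress, $i);

echo ($i >= count($files)) ? "Archive complete\n" : "Paused at file $i, run again to resume\n";
```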
Another common error I have run across is that some people don't have enough space allocated for the package to be created. In short, they are trying to build a 1GB archive file but only have 250MB of space left on that hosting account. On most shared hosts, determining how much space is actually available is very difficult. So when the process quits because it runs out of space, there isn't really any way to detect it without custom API work for each host's service (if they even expose one). We are currently working on ways to help people with these issues and hurdles. Until then we just suggest working with a host that has a really good reputation and is reliable.
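To illustrate why a simple space check isn't reliable, here is a rough pre-flight sketch in plain PHP (again, not Duplicator's code). disk_free_space() does exist in PHP, but on shared hosting it usually reports free space for the whole server rather than your account's quota, so a check like this can pass and the package build can still run out of room. The paths and the 1.5x safety margin are made up.

```php
<?php
// Rough pre-flight space check before building a package. disk_free_space()
// reports free space on the filesystem, which on shared hosting usually
// reflects the whole server rather than your account's quota -- which is
// exactly why this kind of check can pass and the build can still fail.
// The paths and the 1.5x safety margin are illustrative.

$siteDir  = '/home/example/public_html';   // hypothetical site root
$buildDir = sys_get_temp_dir();            // where the archive would be written

// Estimate the archive size by summing the files that would be packaged.
$estimate = 0;
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($siteDir, FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $file) {
    if ($file->isFile()) {
        $estimate += $file->getSize();
    }
}

$free = disk_free_space($buildDir);

if ($free !== false && $free < $estimate * 1.5) {
    printf("Warning: ~%d MB needed, only %d MB reported free.\n",
        $estimate / 1048576, $free / 1048576);
} else {
    echo "Reported free space looks OK, but per-account quotas may still apply.\n";
}
```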
Hope this helps