Oh dear! I thought you faced a more common issue. Thank you for sharing those details; it helps to deduce a lot!
I think I found the issue: your hosting provider starts blocking visitors once too many concurrent requests (Apache/PHP processes) are active at the same time.
Google won’t tolerate this forever, but I’m sure they’ll retry later.
For now, as the message states, Google has fallen back to the last known good robots.txt file, which is why your site is still on Google.
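If you want to check whether a fresh fetch would succeed right now, a plain request to your robots.txt is enough, since an HTTP 200 is all Googlebot needs. Here’s a rough sketch; the URL is a placeholder for your own domain:

```python
# Quick check: is robots.txt fetchable right now? A 200 is what Googlebot
# needs; the URL below is a placeholder for your own domain.
import urllib.request
from urllib.error import HTTPError, URLError

URL = "https://example.com/robots.txt"  # placeholder: your domain

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print(f"HTTP {resp.status}: robots.txt is reachable")
except HTTPError as err:
    print(f"HTTP {err.code}: a 5xx here is what Google treats as a fetch error")
except URLError as err:
    print(f"Connection problem: {err.reason}")
```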
Just try hitting CTRL+R a few times in rapid succession on your site, and you’ll see this poorly written “English” message (there’s also a small script after the quoted message if you’d rather trigger it programmatically):
This site/page has used all avaialble php / apache processes allowed on free hosting account.
Refreshing the page once the amount of apache / php processes are reduced will cause the site to work
We would recommend upgrading your hosting account at [some host] , premium hosting accounts have MUCH higher resources dedicated to them.
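If mashing CTRL+R feels too manual, here’s a minimal sketch that does the same thing: it fires a burst of concurrent requests and counts how many responses contain the host’s process-limit message. The URL and the error-text snippet are placeholders you’d swap for your own site and your host’s exact wording:

```python
# Minimal sketch: burst the site with concurrent requests and count how many
# responses show the host's process-limit page. URL and ERROR_SNIPPET are
# placeholders; adjust them for your site and your host's exact message.
import concurrent.futures
import urllib.request
from urllib.error import HTTPError

URL = "https://example.com/"              # placeholder: your site
ERROR_SNIPPET = "php / apache processes"  # fragment of the host's error page

def fetch(_):
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            body = resp.read()
    except HTTPError as err:    # the limit page may be served with a 5xx status
        body = err.read()
    except Exception:
        return False            # other failures don't prove the limit was hit
    return ERROR_SNIPPET in body.decode("utf-8", errors="replace")

# 20 simultaneous requests should be enough to exceed a free plan's process cap.
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    hits = sum(pool.map(fetch, range(20)))

print(f"{hits} of 20 responses showed the process-limit message")
```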
I know you’re hosting for about $2/month (or even for free?) with excellent performance, so I don’t know whether you’re willing to pay more than that. I’d advise watching how the business grows before adding more costs.
Why doesn’t this happen to the other sites you have with the same hosting provider? I can’t tell from here, but it could be that the newer site simply receives a lot more traffic.
I believe the message you saw will disappear the next time Google successfully fetches the robots.txt file. Keep me posted!