Thanks so much for your help, everyone! I have it working now.
The issue was the .conf file for the Nginx server. To pay it forward, I've written the solution up as step-by-step instructions for the fix.
If you have permalinks enabled and still can’t get a virtual robots.txt to work, here is a fix to try.
Find your Nginx .conf file; mine were in /etc/nginx/conf.d, named wordpress_http.conf and wordpress_https.conf.
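If you're not sure which file has the block, you can search for it from the console (this assumes your config lives under /etc/nginx/, as mine did):

grep -rl "robots.txt" /etc/nginx/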
Make a backup of these files, then look for the following lines:
location = /robots.txt {
    allow all;
    log_not_found off;
    access_log off;
}
If you have the above, replace it with this:
location = /robots.txt {
    try_files $uri $uri/ /index.php?$args;
    access_log off;
    log_not_found off;
}
From your console, reload Nginx with “sudo systemctl reload nginx”.
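Tip: before reloading, you can ask Nginx to check the edited config for syntax errors, so a typo doesn't take the site down:

sudo nginx -t
sudo systemctl reload nginx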
Now check again: you should get the robots.txt generated by better-robots.txt.
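You can also verify it from the console; example.com here is just a placeholder for your own domain:

curl -i https://example.com/robots.txt

You should see a 200 response containing the plugin's rules instead of a 404.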
What have you just changed?
The equals sign makes this an exact-match location: when a request for /robots.txt comes in, Nginx applies only the rules in this block and nothing else. If you have a physical robots.txt file, that's fine; it just gets served. But if you don't, and are relying on a virtual one, nothing in the original block lets WordPress handle the request, so you get a 404.
You could also remove that block completely, and the request would be handled along with everything else; however, your access log would then fill up with robots.txt requests. With the change above, WordPress handles the request and the two log directives keep those requests out of your log file.
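For reference, “everything else” is normally handled by the general location block, which in a typical WordPress Nginx config looks something like this (yours may differ):

location / {
    try_files $uri $uri/ /index.php?$args;
}

The fix above just applies that same fallback to /robots.txt while keeping the logging for it switched off.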