• [Resolved] Andrew Brooks

    (@andybrooks)


    Hi,

    I have tried logging this via the typical support channel and have not gotten a response.

    I am running Rank Math SEO, but that doesn't seem to be the problem. If I delete robots.txt and save the settings in the plugin, nothing gets created; Google then says it can't reach robots.txt, and I get a 404 error when trying to open it.

    Andy


  • Plugin Author Pagup

    (@pagup)

    Hi,

    Thank you for contacting our support.

    About your issue: just to mention that you do have a robots.txt! However, if you're not able to see "our" robots.txt file, it means that either our plugin is conflicting with another plugin (here, Rank Math, which also generates a virtual robots.txt file) or our plugin is not compatible with your website (which currently seems broken to us).

    One more thing: protecting your robots.txt behind the Cloudflare firewall may not be a good idea for search engines. It's actually useless!

    Regards

    Thread Starter Andrew Brooks

    (@andybrooks)

    Hi

    I know I have a robots.txt file; when I delete it and then try to access it, I get a 404 error.

    The website is working and we are getting plenty of orders. Cloudflare validates IP address locations, and if you are not in New Zealand it should present a managed challenge; if you tick the box, you'll be directed to the site. Without this, we are the target of a bunch of attacks. I have moved robots.txt so it is not covered by the geographical restrictions.

    Are you saying Rank Math is not compatible with your plugin?



    Plugin Author Pagup

    (@pagup)

    Alright! That explains it: we are not in New Zealand.

    Regarding our plugin, it is compatible with Rank Math, as it was designed to detect sitemaps generated by Rank Math. However, in this case, Rank Math may override our plugin by generating its own virtual robots.txt. (A virtual robots.txt is not a physical file on disk: WordPress answers requests for /robots.txt dynamically, and only one plugin's output can win.)

    Here’s what you can do:

    Since you have already deleted your robots.txt file on your server, it is possible to recreate a virtual robots.txt with our plugin.

    1. Go to the settings page of our plugin.
    2. At the bottom of the page, click on “delete settings,” then “save.”
    3. Once done, uninstall our plugin, clear your cache, then reinstall it.
    4. Apply your settings, save, and again, clear your cache.

    Our plugin should be able to override Rank Math and its own robots.txt feature.

    Let us know if this helps.

    Regards,

    Thread Starter Andrew Brooks

    (@andybrooks)

    Thanks very much for your support; I still can't get it to work.

    I have followed your instructions, but it hasn’t worked. It looks like an issue with the Nginx server on my end.

    When I delete my existing robots.txt file without your plugin installed, Rank Math doesn't create an accessible robots.txt either; I just get a 404 error. It is the same with your plugin installed.

    I have tried disabling Rank Math and installing just your plugin, but that doesn't work either.

    So, the issue is definitely my configuration; I just don’t know where to start.

    Plugin Author Sajjad Ali

    (@the-rock)

    Hello @andybrooks

    The issue is most likely with the Nginx permalink settings. Here are some things you can try.

    1. Enable the Better Robots.txt plugin, then go to Settings > Permalinks and click Save Changes.
    2. If that doesn't solve the issue, try configuring permalinks for Nginx; a minimal example is sketched below.
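
    For reference, permalink handling on Nginx usually boils down to a fallback to index.php. A minimal sketch (the PHP-FPM socket path is an assumption; match it to your own setup):

    location / {
        # Serve the file or directory if it exists; otherwise hand the
        # request to WordPress, so pretty permalinks (and virtual files
        # like robots.txt) can be generated.
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        # Pass PHP requests to PHP-FPM (socket path is an assumption).
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }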

    Also, try disabling your cache plugins or the Cloudflare cache for testing. Please note that your website is broken on my end as well (Firefox on Windows); it is not loading a CSS stylesheet on the homepage. The best way to debug is on a staging website: try disabling all plugins and enabling only Better Robots.txt to see if that works.
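
    If you have shell access, WP-CLI makes that plugin-isolation test quick. A sketch, assuming WP-CLI is installed and the plugin slug is better-robots-txt (confirm slugs with "wp plugin list"):

    # Deactivate all plugins, then enable only Better Robots.txt for testing.
    wp plugin deactivate --all
    wp plugin activate better-robots-txt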

    Let us know if this helps.

    Regards

    Thread Starter Andrew Brooks

    (@andybrooks)

    Thanks so much for your help, everyone; I have it working now!

    The issue was the .conf file for the Nginx server. To pay it forward, I have written up the solution as instructions for the fix.

    If you have permalinks enabled and still can’t get a virtual robots.txt to work, here is a fix to try.

    Find your Nginx .conf file; mine were in /etc/nginx/conf.d, called wordpress_http.conf and wordpress_https.conf.
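
    If you are not sure which file contains the relevant block, a quick search of the config directory can locate it (the path is an assumption; use wherever your Nginx configuration lives):

    grep -rn "robots.txt" /etc/nginx/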

    Make a backup of these files and look for the following lines:

    location = /robots.txt {
        allow all;
        log_not_found off;
        access_log off;
    }

    If you have the above, replace it with this:

    location = /robots.txt {
        try_files $uri $uri/ /index.php?$args;
        access_log off;
        log_not_found off;
    }
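
    Before reloading, it is worth checking the configuration for syntax errors; if something is wrong, this reports the offending file and line:

    sudo nginx -t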

    From your console, reload the Nginx server with "sudo systemctl reload nginx".

    Now check again: you should get the robots.txt generated by Better Robots.txt.
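
    To check from the command line (example.com is a placeholder for your own domain):

    curl -i https://example.com/robots.txt

    A 200 response containing the plugin's rules means the virtual file is being served; a 404 means requests are still not reaching WordPress.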

    What have you just changed?
    The equals sign makes this an exact-match location: when a request for /robots.txt matches, Nginx applies only the directives in this block and nothing else. When you have a physical robots.txt file, there is no problem; it just gets served. But if you don't, and you are relying on a virtual one, nothing in the original block lets WordPress handle the request, so you get a 404.

    You could remove that block completely, and the request would be handled along with everything else; however, that would fill up your log file with robots.txt requests. With the change above, WordPress handles the request and your log file stays clean.

