• Resolved alexg9

    (@alexg9)


    Thanks for your effort Jeff, I don’t doubt that it is genuine. However, a few days ago my website metrics started to plummet. Investigating, I found that pages have stopped being indexed and many pages ranked #1 on Google have been deindexed, with an alert saying “The robots.txt file is blocking the indexing of the pages.” All of this began when I installed Blackhole and the plugin edited the robots.txt file. I’ve tried uninstalling it, but the robots.txt rules don’t change.

    The website I have put so many hours and years of work into is falling apart. Could you offer a solution?

    https://ibb.co/ygYKPNN
    https://ibb.co/DpCrWPg
    https://ibb.co/T4jZmGf
    https://ibb.co/MpbVhQt

  • Hello @alexg9

    I’m replying as a power user of the plugin, so please also wait for feedback from the author.

    First of all: I have been running the plugin on many websites for years, and over time I have collected a list of almost 3,000 bad bots across all domains. There has NEVER been an issue with search engines, especially Google. Some of the sites are monitored by their respective owners, and I would get an alert within hours …

    From your screenshots I can see that the robots.txt and the sitemap look fine so far and cannot be the cause – no exclusions exist there.

    However, you have not added the entries to robots.txt that the plugin explicitly specifies:

    User-agent: *
    Disallow: /*blackhole
    Disallow: /?blackhole

    Also, you can see in the page source that a “nofollow” is generated on the blackhole link.

    Google should respect this instruction on every link, but I don’t have enough experience to say what can happen when the robots.txt entries are missing – Jeff has surely included them in the settings for a reason.
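
    For illustration, the hidden trap link the plugin adds to the page markup looks roughly like this – the exact markup and trigger URL can differ by version and settings, so take this as a sketch, not the literal output:

    <a rel="nofollow" style="display:none;" href="https://example.com/?blackhole=trigger">Do NOT follow this link or you will be banned from the site!</a>

    The rel="nofollow" tells well-behaved crawlers not to follow the trap, and the robots.txt entries above are the second layer on top of that.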

    Right now it looks as if the relevant Google bots have already landed in your blacklist!

    Therefore, the immediate fix is to delete the blacklist and to add the entries shown above to your robots.txt.
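
    For example, a complete robots.txt with the rules in place could look like this (the sitemap URL is only a placeholder for your own):

    User-agent: *
    Disallow: /*blackhole
    Disallow: /?blackhole

    Sitemap: https://example.com/sitemap.xml

    The important point: ONLY the blackhole URLs are disallowed. There must be no blanket Disallow: / anywhere, because that would block all normal pages.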

    I think you will get further with this.
    Regards!

    Plugin Author Jeff Starr

    (@specialk)

    Hi @alexg9,

    In addition to @wp-henne’s recommendations, also want to emphasize that, as explained in the plugin docs, readme, homepage and settings page, Blackhole is not compatible with *page cache* plugins. So that would be another thing to check. Let me know if any questions about this.
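
    If you’re not sure whether a page cache is involved, one rough check is to look at your site’s response headers. Here is a minimal sketch using Python’s standard library; the header names are just common examples (Varnish/CDN, Cloudflare, LiteSpeed), example.com is a placeholder, and the absence of these headers does not prove no cache is running:

    import urllib.request

    # Fetch the homepage like a normal browser would (placeholder domain).
    req = urllib.request.Request(
        "https://example.com/",
        headers={"User-Agent": "Mozilla/5.0"},
    )

    with urllib.request.urlopen(req) as resp:
        # Print common cache-related response headers, if present.
        for name in ("Age", "X-Cache", "CF-Cache-Status", "X-Litespeed-Cache"):
            print(name, ":", resp.headers.get(name))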

    Thread Starter alexg9

    (@alexg9)

    Thank you for your extensive explanation @wp-henne. And to you too @specialk

    To get a quick solution I’ve disabled the plugin for the moment and manually rewrote the robots.txt. Now indexing is working well again. Anyway, I’ll read through all the information you’ve given me and test it on a staging site, because I wouldn’t want to repeat the indexing problems on my platform.

    Thank you again,

    álex

    Plugin Author Jeff Starr

    (@specialk)

    Sounds like a plan, álex. Should also mention there are complete testing steps under “Testing Blackhole” on the plugin Installation page. Basically, make sure no page-cache plugins are active and the robots rules are correct, and you should be good to go.
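
    If you want to double-check the robots rules programmatically, Python’s built-in robots.txt parser works as a rough sanity check. A minimal sketch, assuming example.com is your domain and /?blackhole is the trap URL; note the stdlib parser does simple prefix matching and ignores wildcards, so it’s only an approximation of how Google reads the file:

    import urllib.robotparser

    # Load the live robots.txt (placeholder domain).
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # A normal page should be crawlable by Googlebot...
    print(rp.can_fetch("Googlebot", "https://example.com/some-page/"))   # expect: True

    # ...while the blackhole trap URL should be disallowed.
    print(rp.can_fetch("Googlebot", "https://example.com/?blackhole=1"))  # expect: False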

    Let me know if I can provide any further info, glad to help anytime.

  • The topic ‘Blackhole is causing deindexing in Robots.txt’ is closed to new replies.