• Hi everyone,

    I ran SEO analysis on my site and it stated that a number of pages are blocked from appearing in the search engines.

    I never intentionally blocked any pages. After doing some research, I learned some pages can be unintentionally blocked from search engines.

    To remedy the issue, it seems I have to remove whichever blocking directive is currently in place: a Robots Meta Tag, an X-Robots-Tag header, or a rule in the robots.txt file.

    I know that I have to use an FTP program to access the files but don’t know where to find the robots.txt file and how to resolve the issue.

    My question is, can someone please explain how this is done and where to find the
    Robots Meta Tag, X-Robots-Tag, or robots.txt file? Any help will be greatly appreciated. Thanks much.
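
    For reference, these are the three forms a blocking directive can take (the `/private/` path below is just a hypothetical example):

    ```
    # robots.txt — lives at the site root (e.g. public_html/robots.txt)
    # and blocks crawling of matching paths:
    User-agent: *
    Disallow: /private/

    <!-- Robots meta tag — placed in a page's <head> to block indexing: -->
    <meta name="robots" content="noindex, nofollow">

    # X-Robots-Tag — sent as an HTTP response header,
    # e.g. via an .htaccess rule on Apache:
    Header set X-Robots-Tag "noindex, nofollow"
    ```

    Note that WordPress serves a virtual robots.txt if no physical file exists in the site root, so you may not find one via FTP at all; in that case the directives are being generated by WordPress itself or by a plugin.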

Viewing 4 replies - 1 through 4 (of 4 total)
  • Moderator James Huff

    (@macmanx)

    First, visit Settings > Reading in your site’s Dashboard, and make sure that “Discourage search engines from indexing this site” is _not_ checked.

    Thread Starter samnexus

    (@samnexus)

    James,
    Thanks for the reply. No, that option is not checked. I verified it when the SEO analysis results came up.

    Moderator James Huff

    (@macmanx)

    Are you using any SEO plugins? If so, which?

    Thread Starter samnexus

    (@samnexus)

    1. AmaLinksPro
    2. Amp
    3. Asset CleanUp: Page Speed Booster
    4. Classic Editor
    5. GDPR Cookie Consent
    6. iThemes Security Pro
    7. OptimizePress
    8. SiteGroundOptimizer
    9. Swift Performance
    10. Table of Contents
    11. WP Affiliate Disclosure
    12. WP-SpamShield

    I also thought Swift Performance could be the culprit and sent them a support ticket explaining the issue, but I haven’t received a reply yet.

  • The topic ‘Unblocking Pages In The Robots Meta Tag’ is closed to new replies.