• [Resolved] Robin

    (@robinmehta)


    I am using Pretty Links on my blog https://bloggingidol.com, and some of the links created by Pretty Links are shown as blocked by robots.txt. I am using domainname/go/affiliatename in the URL. When I inspect the links and test them against robots.txt, everything works fine, but at least 5 links are listed as “Blocked by robots.txt” under “Pages not indexed”. What could be the reason for this?
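
    For reference, the same kind of check can be reproduced with Python’s built-in robots.txt parser. This is only a quick sketch, and the /go/example-affiliate slug below is a made-up placeholder, not one of the real links:

        # Ask the standard-library parser whether Googlebot may fetch a /go/ URL,
        # roughly the same test Search Console runs against robots.txt.
        from urllib.robotparser import RobotFileParser

        parser = RobotFileParser()
        parser.set_url("https://bloggingidol.com/robots.txt")
        parser.read()  # downloads and parses the live robots.txt

        # Placeholder pretty link following the domainname/go/affiliatename pattern
        test_url = "https://bloggingidol.com/go/example-affiliate"
        print(parser.can_fetch("Googlebot", test_url))  # False means robots.txt blocks it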


  • Tyler

    (@tylerthedude)

    Hi Robin,

    Thank you for reaching out. Just to make sure I’m understanding the issue correctly: you have some Pretty Links that aren’t being indexed by search engines? If that’s the case, try editing those links on the Pretty Links page (WP Admin -> Pretty Links), click the “Advanced” tab, and check whether the “No Follow” option is enabled.

    This option will tell search engines not to index the link, so if it’s enabled then you might try disabling it and see if they start getting indexed correctly going forward.
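
    As a side note, one quick way to see how a given pretty link responds is to look at the headers on its redirect. The sketch below uses only the Python standard library; the /go/example-affiliate slug is a placeholder, and the idea that the plugin surfaces its hint in an X-Robots-Tag header is an assumption on my part, not something documented here:

        # Inspect the pretty link's redirect response without following it;
        # http.client does not follow redirects, so the 301/302 headers stay visible.
        import http.client

        conn = http.client.HTTPSConnection("bloggingidol.com")
        conn.request("HEAD", "/go/example-affiliate")  # placeholder slug
        resp = conn.getresponse()

        print(resp.status, resp.getheader("Location"))
        # If the redirect is marked for search engines, the hint may appear here:
        print(resp.getheader("X-Robots-Tag"))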

    Kind regards,

    Thread Starter Robin

    (@robinmehta)

    No, I don’t want the links to be indexed, so having the “No Follow” option enabled is correct for me. But I do want the links to be crawled by Google and not blocked by the robots.txt file.

    Tyler

    (@tylerthedude)

    Hi Robin,

    Thank you for getting back to me, and please forgive the delayed response. If the “No Follow” option is enabled for your links, then search engines such as Google will take it as a hint not to crawl them, so you might try disabling that option and see whether the links start getting crawled correctly after that.

    Kind regards,

  • The topic ‘Links Blocked by robots.txt’ is closed to new replies.