• Resolved dh333

    (@dh333)


    Hello,
I have about 1,700 active pretty links. Since pretty links look like regular site URLs, would using this many pretty links waste Google’s limited crawl budget for indexing?
    Thank you for any insight you can provide.

Viewing 7 replies - 1 through 7 (of 7 total)
  • Brian M

    (@millennialmoneyguide)

I also came here to ask this. Does anyone know how to block Google from crawling pretty links using robots.txt?

    Tyler

    (@tylerthedude)

    Hi @dh333 and @millennialmoneyguide,

Thank you for reaching out. To prevent your links from being indexed, please navigate to WP Admin -> Pretty Links -> Options -> Links and enable the “Enable No Follow” option.

This will add “noindex” and “nofollow” directives to the HTTP response headers; the “noindex” directive tells search engines not to index the links.

    Kind regards,

    Brian M

    (@millennialmoneyguide)

    Hi Tyler,

Don’t mean to hijack the thread, but I’ve already done this. The links aren’t being indexed, but they’re still being crawled.

I’ve got hundreds of pretty links, and they’re still getting crawled by Google, which uses up crawl budget. Is there a way to use robots.txt to prevent the pretty links from being crawled?

    Tyler

    (@tylerthedude)

    Hi Brian,

Thanks for getting back to me, and not a problem at all. Would it be possible to provide me with one of your pretty links which is being crawled by Google?

You could also prevent the links from being crawled in your robots.txt file by disallowing the slug of the link. If you’re using a slug prefix (/go/ for example), then you can exclude all of the links by disallowing just that prefix (note that a Disallow rule must sit under a User-agent line to be valid):

User-agent: *
Disallow: /go/
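As a quick sanity check, Python’s standard urllib.robotparser module can confirm that a prefix rule like this blocks every link under /go/ while leaving normal pages crawlable (the example.com URLs below are placeholders, not the poster’s site):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly -- no network request needed.
rules = [
    "User-agent: *",
    "Disallow: /go/",
]

rp = RobotFileParser()
rp.parse(rules)

# Any link under the /go/ prefix is blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/go/acorns"))   # False
# ...while regular site pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/blog/post"))   # True
```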

    Kind regards,

    Brian M

    (@millennialmoneyguide)

    Thanks for the response, here is an example of a pretty link that is being crawled: https://www.mymillennialguide.com/acorns

Unfortunately, I’m not using a slug prefix. Is there any way to prevent links from getting crawled without having a slug prefix?

    Tyler

    (@tylerthedude)

    Hi Brian,

    I’m able to see the “nofollow” and “noindex” directives in the HTTP headers, so I’d recommend going ahead and excluding the links from within the robots.txt file as we’ve been discussing.

If you’re not using a slug prefix, then you’ll need to list each pretty link’s slug individually in your robots.txt file. For example, to exclude the /acorns link you could add this under your User-agent: * group (without a trailing slash, so it matches the bare /acorns path):

Disallow: /acorns

Please note that it may take some time for Google to recrawl the site, so those links may not immediately disappear from Google’s crawl activity once they’re excluded.
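One detail worth double-checking, since robots.txt rules match by path prefix: a rule with a trailing slash such as Disallow: /acorns/ does not block the bare /acorns path, while Disallow: /acorns blocks both. Python’s standard urllib.robotparser demonstrates this (example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# A rule WITH a trailing slash only matches paths under /acorns/ ...
with_slash = RobotFileParser()
with_slash.parse(["User-agent: *", "Disallow: /acorns/"])
print(with_slash.can_fetch("*", "https://example.com/acorns"))   # True -- not blocked!

# ... while the rule WITHOUT the slash matches /acorns and everything beneath it.
no_slash = RobotFileParser()
no_slash.parse(["User-agent: *", "Disallow: /acorns"])
print(no_slash.can_fetch("*", "https://example.com/acorns"))     # False
print(no_slash.can_fetch("*", "https://example.com/acorns/sub")) # False
```

Because these rules are prefixes, Disallow: /acorns would also block any other path starting with /acorns (e.g. a hypothetical /acorns-review page). Google’s robots.txt parser supports a $ end-anchor (Disallow: /acorns$) for exact matches, though urllib.robotparser does not implement that extension.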

    Kind regards,

    Tyler

    (@tylerthedude)

As we haven’t heard back from you in a while, we’re going to go ahead and mark this as resolved. However, if you have any other questions, please feel free to open another thread and we’d be happy to assist.

    Kind regards,

  • The topic ‘Can Pretty Links Hurt Google Crawl Budget?’ is closed to new replies.