  • Plugin Author Emre Vona

    (@emrevona)

    What is the problem?

    Allow: /wp-content/cache/wpfc-minified
    Allow: /wp-content/cache/all

    Why do you need to do that?

    Plugin Author Emre Vona

    (@emrevona)

    Oh, I got it now. Thank you!

    Disallow: /wp-admin/
    Disallow: /wp-content/

    Lots of people use it like that.
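
    A minimal sketch of how those rules fit together, combining the Disallow lines above with the Allow lines quoted earlier in the thread. For Googlebot, the more specific (longer) Allow path wins over the broader Disallow, so the cache folders stay crawlable even though /wp-content/ is blocked:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-content/
    Allow: /wp-content/cache/wpfc-minified
    Allow: /wp-content/cache/all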

    Thread Starter Saleswonder Team Tobias

    (@tobias_conrad)

    With this, we do not get duplicate content?

    Plugin Author Emre Vona

    (@emrevona)

    Duplicate content? Are you talking about CSS and JS files?

    Thread Starter Saleswonder Team Tobias

    (@tobias_conrad)

    Hey,

    you reduce server load by saving pages as HTML, so the cached page and the real page create duplicate content.

    Please give us a good and complete solution.

    Plugin Author Emre Vona

    (@emrevona)

    you reduce server load by saving pages as HTML, so the cached page and the real page create duplicate content.

    Google cannot scan the /wp-content/cache/all path; the cached files are served under the normal page URLs, and nothing links to the cache directory itself, so Google never discovers it. If you are worried about it, you can add Disallow: /wp-content/cache/all to your robots.txt.
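
    A minimal sketch of that suggestion, assuming the WPFC cache path quoted above. Only the cached HTML copies are kept out of the crawl; the minified CSS/JS under /wp-content/cache/wpfc-minified is not disallowed, so it stays reachable for Google:

    User-agent: *
    # keep the cached HTML copies of pages out of the crawl
    Disallow: /wp-content/cache/all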

    Let me clear this up.

    Google's new guidelines say that Googlebot must be able to access the CSS and JS files on the server, so people should make sure that the caching plugin they use does not lock Google out.

    WPFC does not add a disallow rule to robots.txt, which means it is compliant with Google's new rules, so everything is fine.

    It has nothing to do with duplicate content.

    I think this can be closed now and marked as resolved.

    Plugin Author Emre Vona

    (@emrevona)

    Thank you so much, jackennils, for clarifying!

  • The topic ‘new google cache rules – duplicate content, robot.txt’ is closed to new replies.