Forum Replies Created

  • Thread Starter webQuest (@cwsleigh)

    Here is the response I have received so far from the good folks at MaxMind:

    Hello,

    It is correct that we cannot filter requests on our end to detect any kind of potential bot activity. We know that would be useful, but unfortunately adding any additional filters or checks would greatly impact the service.

    Since you want web-crawlers to index your site, you’ll just need to make sure that you are not sending us repeat queries. The best way to do this is to employ data caching on your end. We update our database weekly, on Tuesdays, so you could cache our response for up to a week without losing accuracy (if you have already queried a certain IP that week, then refer to the cached data instead of querying that IP again during the same week). That way, bots can hit your pages as much as they need to, and will only consume a single query per week instead of one for each visit (which can be a massive drain).

    I hope that is helpful! Please let me know if I can be of further assistance, or if you have any questions.

    Regards,
    Customer Support
    MaxMind, Inc.

    So, with that said: is the data caching ‘on my end’ (described above) a feature that Slim Stat Analytics already handles, or could it be in the future?

    Or is that something I would need to ‘turn on’ somewhere else within my WordPress environment? And what would the potential ‘downsides’ of doing that be?
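
    For illustration only, the weekly per-IP caching MaxMind describes above might look roughly like this (a minimal Python sketch of the pattern, not Slim Stat’s actual implementation; lookup_ip and query_maxmind are hypothetical names):

        import time

        ONE_WEEK = 7 * 24 * 60 * 60  # seconds; MaxMind updates its database weekly

        _cache = {}  # ip -> (timestamp, lookup result)

        def query_maxmind(ip):
            # Placeholder for the real paid MaxMind web-service query.
            return {"ip": ip, "country": "??"}

        def lookup_ip(ip):
            entry = _cache.get(ip)
            if entry and time.time() - entry[0] < ONE_WEEK:
                return entry[1]  # fresh cached result: no new query consumed
            result = query_maxmind(ip)  # at most one query per IP per week
            _cache[ip] = (time.time(), result)
            return result

    That way a crawler revisiting the same pages only triggers one MaxMind query per IP per week, as the support reply suggests.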

    Comments / answers?

    I have a question seeking a point of clarification on the above: were you able to verify that Google’s HTML version of the cache did include the hidden content as well?

    Then, as a follow-up to that: can we be certain that the full page is being indexed for search purposes?
