• Resolved Audiomonk

    (@audiomonk)


    My host doesn’t allow the litespeed crawler and has disabled it on the server.

    My issue is that when my pages are cached, the speed is great, when it isn’t cached, the site is slow to respond. Going from an F at Gtmetrix, to an A when cached. If I can’t crawl the pages to preload the cache, how can I make sure my pages are preloaded and have cached versions ready?

Viewing 6 replies - 1 through 6 (of 6 total)
  • Plugin Support qtwrk

    (@qtwrk)

    Hi,

Sadly, if your hosting provider doesn't allow the crawler, then you can't. If your site is pretty static, you can try making the cache stay alive as long as possible, but if it's a very dynamic site, I don't really have a good solution for it.

    Best regards,

    Thread Starter Audiomonk

    (@audiomonk)

    Hi,

    Happily, I found a solution, not perfect, but it will do.

    Free Site analyser

I configured it to crawl slowly through the site, and it acts the same way. My site is huge, so trying to keep the cache alive is tricky. I'm wondering if I can just remove all the triggers that clear the cache and do a "purge all" once I've made changes, or just purge the page I've edited?

Anyway, all is not lost with this Site Analyser, which of course has the massive advantage of giving me SEO information as well.

    Best wishes

    @audiomonk,

I tested SiteAnalyzer and, for me, it didn't generate a cache.
    It would have surprised me if it had, since Lisa Clark wrote on May 14, 2021:

    “Unfortunately we are not aware of any third-party crawlers that work with LSCache.”

    Here is the article:

    Best wishes

    Thread Starter Audiomonk

    (@audiomonk)

Then that's strange, because before running this SiteAnalyzer, all pages were slow to load once the cache had been cleared. Visiting them once, then visiting them again manually, made the huge difference, as I said. Tested with GTmetrix: it gave a poor score the first time, then re-testing (after the page had been visited) gave an A score, and the page loaded like lightning.

So, after crawling via SiteAnalyzer, every page I checked loaded really quickly. Obviously you know more about the caching, but this was my experience, and still is; I tried it again today. Doesn't LiteSpeed cache a page when it's visited? How does it work?

I noticed Lisa Clark mentioning a lighter version of the LS crawler that won't need server permissions; this would be a godsend. There's talk there of these crawlers working, but not "heating" the cache?

    Plugin Support qtwrk

    (@qtwrk)

Well, there is something like an online crawler that sends requests from a remote server, but it's still a long way off.

1. Make an HTML sitemap on your site – many plugins can do this.
    2. In your browser, install an extension like DownThemAll (https://addons.mozilla.org/en-US/firefox/addon/downthemall/) or similar.
    3. Open the HTML sitemap in your browser and let DownThemAll download all the HTML files to your computer.
    4. Cached!

(Be aware that this can overload your server if you download too many files simultaneously.)
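The same idea – visit every page on an HTML sitemap once so the server stores a cached copy – can also be scripted. Below is a minimal sketch in Python using only the standard library; the sitemap URL and the delay value are placeholders I've assumed, not anything from this thread, and the throttle between requests is there precisely to avoid overloading the server as warned above.

    ```python
    # Hedged sketch: warm the page cache by requesting every page linked
    # from an HTML sitemap, pausing between requests to go easy on the server.
    # The sitemap URL and delay below are illustrative assumptions.
    import time
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin


    class LinkExtractor(HTMLParser):
        """Collect the href targets of <a> tags found on the sitemap page."""

        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        # Resolve relative links against the sitemap URL.
                        self.links.append(urljoin(self.base_url, value))


    def extract_links(html, base_url):
        """Return all absolute link targets found in the given HTML."""
        parser = LinkExtractor(base_url)
        parser.feed(html)
        return parser.links


    def warm_cache(sitemap_url, delay_seconds=2.0):
        """Fetch each linked page once so a cached version exists for visitors."""
        with urllib.request.urlopen(sitemap_url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        for url in extract_links(html, sitemap_url):
            try:
                urllib.request.urlopen(url).read()
            except OSError:
                pass  # skip pages that fail; keep warming the rest
            time.sleep(delay_seconds)  # throttle so the crawl stays gentle


    if __name__ == "__main__":
        warm_cache("https://example.com/sitemap.html")  # placeholder URL
    ```

Run slowly (a delay of a few seconds) on a large site; a faster crawl defeats the point of the warning above.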

  • The topic ‘Crawler deactivated by host’ is closed to new replies.