Report number: BYREQKVY
Report date: 11/15/2024 09:53:15
Whenever I try to access any page as a logged-in user, I get an X-Litespeed-Cache: miss header and the page is generated in real time (even though it was previously cached by the same user).
However, if I access a page and then immediately (within 1-2 seconds) request the same URL again, the correct cached version of the page loads and I get X-Litespeed-Cache: hit:private.
The private cache TTL is set to the default value of 1800.
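For what it’s worth, here is a minimal sketch of how I reproduce this (the URL and login cookie are placeholders, since I can’t share the real site):

```python
import time
import requests

URL = "https://example.com/some-page/"  # placeholder, not the real test site
COOKIES = {"wordpress_logged_in_abc": "placeholder-session-cookie"}  # placeholder logged-in cookie

def fetch(label):
    r = requests.get(URL, cookies=COOKIES)
    print(f"{label}: X-LiteSpeed-Cache = {r.headers.get('X-LiteSpeed-Cache')}")

fetch("1st request (primes the private cache)")
fetch("2nd request, immediate")   # returns hit:private for me
time.sleep(30)                    # still well within the 1800 s TTL
fetch("3rd request, 30 s later")  # returns miss, which is the problem
```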
My report number is: XQSCKLOA
(I’d rather not share the URL publicly, as it is a temporary testing environment, but I believe it is included in the report.)
What could be causing this? Thank you.
First of all, I want to thank you guys for creating this free but extensive plugin. At our company we are trying to improve the performance of our website (uwcomputerstudent.nl). We use Cloudways hosting and are trying to set up Cloudflare Caching (Free), but I can’t seem to get it to work.

The header still shows x-cache: MISS, and when I click Test cache inside your plugin I get the message: “Page caching seems not working for both dynamic and static pages.” I purged all caches (from wp-admin and the Cloudflare and Cloudways dashboards), checked the page rules, workers, etc. I checked the FAQ and some other topics here, but no luck.
You can find all important info and screenshots here: https://imgur.com/a/NmPsQ1r
Note: we are using Object Cache Pro and Varnish. We don’t have any other optimization plugin active inside WP.
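In case it helps, here is roughly the check I am running; the domain is ours, but the exact paths are assumptions (the real pages tested are in the imgur album above):

```python
import requests

BASE = "https://uwcomputerstudent.nl"
PATHS = ["/", "/contact/"]  # assumed example pages; substitute one static and one dynamic page

for path in PATHS:
    for attempt in (1, 2):  # a second request should show HIT if caching works
        r = requests.get(BASE + path)
        print(f"{path} request {attempt}: "
              f"x-cache = {r.headers.get('x-cache')}, "
              f"cf-cache-status = {r.headers.get('cf-cache-status')}")
```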
If you need any more info, let me know
Thank you!
The dropdown is supposed to be like this page: https://atromitosconsulting.com/
Any guidance as to why this is happening would be great.
I am trying to understand how the crawler feature works.
I have submitted a sitemap to the crawler and am now trying to run it manually. I have the crawler set to “off” in the general settings, as I wish to trigger it only manually (so as not to overload my shared server).
I have manually started the crawler for all 4 entries in its list. When I go to my website after the crawling is done, only the homepage seems to be cached.
When I run the crawler again, it still seems like only the homepage is cached.
What should I do to make the crawler cache all the pages in the submitted sitemap?
Thank you!
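For context, my understanding is that a successful crawl should effectively do what this sketch does, i.e. request every URL listed in the sitemap so each page ends up cached (the sitemap URL is a placeholder for the one I submitted, and the cache header name assumes a LiteSpeed setup):

```python
import xml.etree.ElementTree as ET
import requests

SITEMAP = "https://example.com/sitemap.xml"  # placeholder for the submitted sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Assumes a flat sitemap of <url><loc> entries, not a sitemap index.
root = ET.fromstring(requests.get(SITEMAP).content)
for loc in root.findall(".//sm:loc", NS):
    r = requests.get(loc.text)
    print(r.headers.get("X-LiteSpeed-Cache", "(no cache header)"), loc.text)
```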
I have found an issue with the WooCommerce integration in the plugin.
When I load any product page, WooCommerce sets a cookie that persists for the session and is used to display up to 15 of your most recently viewed products:
set-cookie: woocommerce_recently_viewed=11477; path=/
The issue is that this custom cookie conflicts with the plugin and therefore causes the cache to report a BYPASS.
If I use the “Strip out all headers” option, my site breaks: the basket no longer works, because the headers that carry its data are removed.
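Here is a minimal reproduction sketch; the product URL is a placeholder, and the header list just covers whichever cache-status header a given stack emits:

```python
import requests

URL = "https://example.com/product/sample-product/"  # placeholder product page
CACHE_HEADERS = ("x-cache", "x-litespeed-cache", "cf-cache-status")  # whichever applies

def cache_status(resp):
    return {h: resp.headers[h] for h in CACHE_HEADERS if h in resp.headers}

r1 = requests.get(URL)  # no cookie: the page can be served from cache
print("without cookie:", cache_status(r1))

r2 = requests.get(URL, cookies={"woocommerce_recently_viewed": "11477"})
print("with cookie:   ", cache_status(r2))  # this is where the BYPASS shows up
```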
Could you please set it so that this cookie is no longer an issue?
Many thanks!
Unfortunately, and for a long time now, SG Optimizer’s Dynamic Cache feature does not appear to be working. Every time we perform a Dynamic Cache test, we get this message: “The URL is not cached.”
Research reveals SG was already aware of the above condition four months ago and indicated a fix would be released “next week,” as stated here.
Questions:
(1) Has the fix been implemented? If so, what plugin version incorporates the fix and why do we keep getting the above message?
(2) If the fix has been implemented, shouldn’t the SG Optimizer plugin be updated to auto-detect websites that use Cloudflare and provide a message in the backend that states: “Your website is using Cloudflare. Therefore, Dynamic Caching has been deactivated.” (or something like it)?
(3) If the fix has been implemented, why do we keep seeing the header “x-proxy-cache: MISS,” as shown here?
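For reference, here is the kind of repeat-request check behind question (3); the URL is a placeholder for one of our pages:

```python
import requests

URL = "https://example.com/"  # placeholder for one of our pages

for attempt in (1, 2):  # the second request should show a HIT if Dynamic Cache works
    r = requests.get(URL)
    print(f"request {attempt}: x-proxy-cache = {r.headers.get('x-proxy-cache')}")
```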
Many SG clients remain confused or frustrated with the above or are unaware that their website(s) are not being dynamically cached.
Again, a fix would be appreciated; alternatively, the plugin should be changed to auto-detect Cloudflare and provide additional information, as suggested above.
As always, thank you for your help!