• Resolved reviewmylife

    (@reviewmylife)


    Hi Donncha,

    ‘Use mod_rewrite to serve cache files’ is the fastest mode. However, if I want to use mfunc on certain pages I have to use the slower ‘Legacy page caching’ mode for the whole site.

    How about adding a mode which combines the best of both? By default pages are cached using mod_rewrite, and only those pages which use the dynamic tags get put in legacy mode. Is this something you’d consider doing one day?

    Cheers.

  • Well, if you have PHP caching enabled the plugin will look for a legacy file first, then the supercache one so it already works that way.

    Enable debugging mode and try it out as I never tried mixing dynamic and static pages myself.

    Thread Starter reviewmylife

    (@reviewmylife)

    Hi Donncha,

    I’ve verified the dynamic tags only work in legacy mode. This matches your documentation that says you need to use legacy mode.

    I’ve looked into what is stopping dynamic tags from working in PHP mode and have come up with a very rough proof of concept patch. I’ve uploaded the changes to a tmp area on trac. Here is the URL to view the changes.

    https://plugins.trac.www.ads-software.com/changeset?old_path=%2Fad-injection%2Fbranches%2Fwp_super_cache_tmp&old=321176&new_path=%2Fad-injection%2Fbranches%2Fwp_super_cache_tmp&new=321177

    To test these changes set the caching mode to ‘PHP’.

    With these updates, dynamic pages that use mfunc/mclude/dynamic-cached-content are stored as .php files and exec’d by WP Super Cache before the data is returned to the browser.

    Non-dynamic pages are either returned as static HTML, or as gzipped data if you have that box ticked.
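    To illustrate the behaviour described above, here is a minimal sketch (hypothetical function, not the actual patch) of how a serving script could choose between the three cache file types, assuming $cache_dir points at the page’s supercache directory and that a plain index.html takes precedence over index.html.php if both somehow exist:

        function serve_cached_page( $cache_dir ) {
            $gzip_file = $cache_dir . 'index.html.gz';
            $html_file = $cache_dir . 'index.html';
            $php_file  = $cache_dir . 'index.html.php';
            $wants_gzip = isset( $_SERVER['HTTP_ACCEPT_ENCODING'] ) &&
                false !== strpos( $_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip' );

            if ( $wants_gzip && file_exists( $gzip_file ) ) {
                header( 'Content-Encoding: gzip' );
                readfile( $gzip_file );       // static page, pre-gzipped
            } elseif ( file_exists( $html_file ) ) {
                readfile( $html_file );       // static page, plain HTML
            } elseif ( file_exists( $php_file ) ) {
                include $php_file;            // dynamic page: execute so mfunc/mclude blocks run
            } else {
                return false;                 // not cached yet, fall through to WordPress
            }
            return true;
        }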

    Result: Pages that don’t use the dynamic tags are now way faster on my testing site (not very scientific but it does make a big difference on my slow hosting).

    Is this enough for you to think about productising this improvement?

    BTW – I think you could go one step further. Now that the dynamic pages are stored as .php I think you could use the mod_rewrite rules to return them instead of firing up the WP Super Cache engine. Do you think that would be possible?

    So this basically moves dynamic pages for anonymous users into the supercache directory structure? That’s a good idea. It’s a starting point. The functions that clear the cache have to be checked because some of them clear index.html and index.html.gz but not the .php file of course.

    Did you ever profile where the slowdown is on your site that makes this method so much faster than using legacy mode? Do all your pages have dynamic code and as a result do you have lots of wp-cache files? Could be directory traversal if there are very many files.
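    Purely for illustration (a hypothetical helper, not the plugin’s existing prune functions), a clear routine covering all three variants might look like this, so the .php file isn’t left behind:

        function clear_supercache_dir( $dir ) {
            // Remove every cached variant, including the new index.html.php,
            // so a stale dynamic page can't outlive its static siblings.
            foreach ( array( 'index.html', 'index.html.gz', 'index.html.php' ) as $file ) {
                $path = rtrim( $dir, '/' ) . '/' . $file;
                if ( file_exists( $path ) ) {
                    @unlink( $path );
                }
            }
        }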

    Thread Starter reviewmylife

    (@reviewmylife)

    Hi Donncha, That’s a yes to your first question. And absolutely this code is very incomplete (yes the .php files aren’t being cleaned up as you point out). And I’m in no doubt that my changes are buggy! I have started adding a bit more code to my version, and I’ll let you have the updates when they work.

    The speed difference is mainly coming from the ‘non dynamic’ pages which can now be statically gzipped instead of having to be served directly or compressed on the fly. I haven’t done any formal profiling; it is more a case of watching the pages, and seeing whether they load fast or slow on average! The difference when serving the static gzipped files is big enough to be easily observable by eye.

    Yes some of my pages use the mfunc tags, and others don’t. The mfunc tags are being dynamically and conditionally inserted into the pages by my Ad Injection plugin depending on factors such as page length, page age, etc. So new pages for example don’t have these tags inserted, whereas older ones do. I want the newer pages to be served via mod_rewrite pre-gzipped, and the dynamic ones to be served by mod_rewrite as pre-built dynamic php files. All of which I know WP Super Cache could do with some modifications.

    Right now I’m looking into a bug relating to mod_rewrite. I want my pages to work with mod_rewrite, but there is a problem which I think may be specific to my host (1and1 shared hosting). I think it is a very simple bug to fix, and I’ll put another message up about this problem when I solve it (which I hope will be later today).

    Just checked in code based on this idea into trunk. Can you give the development version a go or check out the SVN at https://svn.wp-plugins.org/wp-super-cache/trunk

    It creates index.html.php files, and serves them. It also cleans them up when required. Does the version in trunk work for you?

    This should actually work in full mod_rewrite mode too, except that the plugin will fall back to PHP mode to serve the dynamic pages.

    If a supercache directory somehow has index.html and index.html.php files, then the index.html will be served. This might happen if two simultaneous requests come in and are treated differently (one gets adverts because it came from Google, the other didn’t have a referrer).

    Thread Starter reviewmylife

    (@reviewmylife)

    After modifying the original 1and1 patch (see the other thread) I’ve started testing this.

    I’ve commented out the debug wp_mail line you left in (wp-cache-phase2.php:578) so you don’t get loads of emails!

    It looks like it is working so far! I’m trying it on one real web site first to see how it goes. I’ll let you know.

    Serving the PHP files via WordPress PHP mode may be good for some people (such as 1and1 users) as it allows the php.ini ‘zlib.output_compression = On’ tweak to be used to compress these files on the fly. The php.ini settings don’t seem to work if the files are served with mod_rewrite.

    1and1 doesn’t support setting compression in the .htaccess, so the only way to compress the php files is to have PHP do it. A workaround that would allow the php file to be served compressed via mod_rewrite would be to stick a <?php ob_start("ob_gzhandler"); ?> right at the top of each cached php file before saving it to the supercache (obviously you’d only add this line to the buffer that was going to be stored, not to the one being sent to the browser).

    Maybe this would be slightly faster than serving them via PHP mode?
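    As a rough sketch of that workaround (hypothetical helper name, not plugin code), the gzip handler would be prepended only to the copy written into the supercache directory:

        function store_dynamic_page( $cache_file, $buffer ) {
            // The stored copy gets the compression opener; when it is later served as
            // PHP, ob_gzhandler gzips it on the fly for browsers that accept gzip.
            file_put_contents( $cache_file, '<?php ob_start("ob_gzhandler"); ?>' . $buffer );
            return $buffer; // the visitor who triggered caching gets the untouched buffer
        }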

    Oops, removed that wp_mail() – I haven’t received any emails so obviously nobody tried it when I tweeted it!

    Anyway, the plugin could gzip the output of dynamic pages if supported by the browser. I’ve avoided doing that in the past because of the extra CPU power required but it might be worth it because of the cut down in network traffic.

    Just added ob_gzhandler to the code that serves dynamic pages. Can you try trunk?
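    For anyone following along, a minimal sketch of what that serving path could look like (assumed helper name; the actual trunk code may differ):

        function serve_dynamic_cache_file( $php_file ) {
            // ob_gzhandler checks the Accept-Encoding header itself and passes the
            // output through uncompressed if the browser can't handle gzip.
            ob_start( 'ob_gzhandler' );
            include $php_file; // execute the stored page so the mfunc blocks run
            ob_end_flush();
        }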

    Thread Starter reviewmylife

    (@reviewmylife)

    I’ve had a go and it looks like it is working. It could be a good extra tweak for people who can’t define PHP output compression in php.ini.

    BTW – should you make the compression conditional on whether the ‘Compress pages so they’re served more quickly to visitors’ box is ticked?

    Also, is it worth noting in some docs/faq that this option won’t compress the first (non-cached) version of the page that is served from WordPress, whereas setting the compression in php.ini will? To test whether this compression is working you need to load the page twice so you get the cached version.

    Overall, the best option is for the user to define the compression in php.ini, but if they can’t do that then WP Super Cache can do the extra compression.

    If you then go one stage further and serve these php files via mod_rewrite then you could add the <?php ob_start("ob_gzhandler"); ?> line to the top of the stored file.

    But would you need extra options to turn these things on and off? Just in case some people’s servers can’t stomach the CPU compression.

    Yeah, I need to check the compression flag in the plugin first, and gzip the content when it’s first generated. Thanks for pointing those out. It’s a pity 1and1 users can’t use the compressed cached files. That would be a big win for them.

    Thread Starter reviewmylife

    (@reviewmylife)

    In case my comment was confusing: ‘for people who can’t define PHP output compression in php.ini’ does not include 1and1 shared hosting. I meant other hosts.

    1and1 users can define php output compression in php.ini. The <?php ob_start("ob_gzhandler"); ?> trick also works, so they have two options for compressing the PHP on the fly.

    The php.ini option would be better for 1and1 users as it means that all PHP output gets compressed… except if the php file is loaded via the mod_rewrite rules (which I’ve tested by adding a .html.php redirect section to the .htaccess). I don’t understand why the php.ini-enabled compression isn’t working in this case. Something I might look into later. Maybe it’s something simple I’m doing wrong.

    In summary, with your changes 1and1 users can:

    1. Have the static pages pre-gzipped (as .gz files) and served via mod_rewrite. i.e. minimal CPU load on average, and minimal file size.
    2. Have the dynamic pages served via PHP and gzipped on the fly by defining compression in php.ini (or alternatively by having the compression done in WP Super Cache, but at the expense of the first generated page being uncompressed). Both options give them minimal file size, but with a CPU hit.

    Hope that clears the 1and1 situation up.

    Another random thought I just had: if the CPU load of ‘on the fly’ compression in PHP were a problem for some people, maybe WP Super Cache could check the CPU load before serving the file, and then only compress if the CPU load were below a certain threshold. Is it expensive to check the CPU load? I know the broken link checker plugin for example can be configured to only run when the CPU load is acceptable.

    Things would become a bit more complicated if you were to add mod_rewrite support for the dynamic files.

    Would be cool to have options to turn all these features on and off though for users who know what they are doing.
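    A rough sketch of such a load check, assuming PHP’s sys_getloadavg() (which is Unix-only, so this is itself part of the portability problem):

        function should_compress( $max_load = 2.0 ) {
            if ( ! function_exists( 'sys_getloadavg' ) ) {
                return true; // e.g. Windows: can't measure the load, so just compress
            }
            $load = sys_getloadavg(); // 1, 5 and 15 minute load averages
            return $load[0] < $max_load;
        }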

    Unfortunately it’s impossible to check the load on the CPU reliably, especially across different platforms.

    When you served the .html.php file through the mod_rewrite rules, are you certain the PHP was executed? It probably didn’t load the rest of WordPress and you may have got an unknown function error if it had dependencies.

    Thread Starter reviewmylife

    (@reviewmylife)

    Yes the PHP is definitely being executed. I’m not loading the rest of WordPress – this is standalone PHP without WordPress/MySQL dependencies.

    I figured out why the php.ini wasn’t compressing files served via mod_rewrite. On 1and1 the php.ini only affects php files in the current directory. Therefore for mod_rewrite compression to work there would have to be a php.ini in each directory containing the .html.php file!

    Here are the rough ‘proof of concept’ mods to the rewrite rules and some code for creating the php.ini file if anyone wanted to test this out. I’ve not put any php.ini cleanup code in.

    Although from my brief testing, measuring the response and page load times using Firebug, there is no noticeable difference in speed between serving the PHP files via PHP or mod_rewrite. Presumably there would be less CPU load with mod_rewrite though, as WordPress doesn’t need to be loaded.

    $rules .= "CONDITION_RULES";
    	$rules .= "RewriteCond %{HTTP:Accept-Encoding} gzip\n";
    	$rules .= "RewriteCond {$apache_root}{$inst_root}cache/supercache/%{HTTP_HOST}{$home_root}$1/index.html.gz -f\n";
    	$rules .= "RewriteRule ^(.*) \"{$inst_root}cache/supercache/%{HTTP_HOST}{$home_root}$1/index.html.gz\" [L]\n\n";
    
    	$rules .= "CONDITION_RULES";
    	$rules .= "RewriteCond %{HTTP:Accept-Encoding} gzip\n";
    	$rules .= "RewriteCond {$apache_root}{$inst_root}cache/supercache/%{HTTP_HOST}{$home_root}$1/index.html.php -f\n";
    	$rules .= "RewriteRule ^(.*) \"{$inst_root}cache/supercache/%{HTTP_HOST}{$home_root}$1/index.html.php\" [L]\n\n";
    
    	$rules .= "CONDITION_RULES";
    	$rules .= "RewriteCond {$apache_root}{$inst_root}cache/supercache/%{HTTP_HOST}{$home_root}$1/index.html -f\n";
    	$rules .= "RewriteRule ^(.*) \"{$inst_root}cache/supercache/%{HTTP_HOST}{$home_root}$1/index.html\" [L]\n";
    	$rules .= "</IfModule>\n";

    And the php.ini creation changes (the changes are very small although they are a bit spread out in this block).

    [Code moderated as per the Forum Rules. Please use the pastebin]
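    Since that block was moderated, here is only an illustrative sketch of the idea it described (hypothetical helper, not the original patch): write a php.ini that switches on zlib compression into each supercache directory holding an index.html.php file.

        function write_compression_php_ini( $cache_dir ) {
            // 1and1 reads php.ini per directory, so each directory that contains an
            // index.html.php needs its own file enabling zlib.output_compression.
            $ini_file = rtrim( $cache_dir, '/' ) . '/php.ini';
            if ( ! file_exists( $ini_file ) ) {
                file_put_contents( $ini_file, "zlib.output_compression = On\n" );
            }
        }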

    Thread Starter reviewmylife

    (@reviewmylife)

    Oh and by the way, I’ve started testing your trunk at revision 323148 on my main reviewmylife website. The mixed mod_rewrite/PHP mode is working very nicely so far. All pages are being served compressed with a mix of the .gz files, and dynamic compression.

    I might give your new CDN support a go soon as well.

    Thread Starter reviewmylife

    (@reviewmylife)

    Just spotted that my code block in reply 15 was deleted and I can’t edit the post anymore. Here’s a link to it on pastebin.

  • The topic ‘[Plugin: WP Super Cache] Suggestion: mod_rewrite mode with legacy fallback’ is closed to new replies.