• Hi,
    I have been using this plugin for a long time now and really love it 🙂

    Being a novice at SEO, I barely paid attention to the stats in Google Webmaster Tools. Recently, however, I noticed it was reporting a huge number of duplicate title tags, and the majority of the URLs were the search pages the plugin creates, e.g. “/search/<search term>/page/2”.

    Is there a way to restrict the number of results shown by Relevanssi, so that so many sub-pages aren’t created and there are fewer duplicate title tags?

    Thanks for such an awesome plugin 🙂

    https://www.ads-software.com/extend/plugins/relevanssi/

  • Plugin Author Mikko Saari

    (@msaari)

    I suppose so, though the best way to handle this in my opinion is to set a robots noindex meta tag on your search results pages (or block them in robots.txt).
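
    If it helps, here is a rough, untested sketch of the meta tag approach for all search result pages (it would go in the theme’s functions.php; the function name is just an example):

    // Print a robots noindex tag on every search results page.
    add_action('wp_head', 'my_noindex_search_results');
    function my_noindex_search_results() {
    	if (is_search()) {
    		echo "<meta name='robots' content='noindex,follow' />\n";
    	}
    }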

    Thread Starter Perryb2006

    (@perryb2006)

    Isn’t it possible to index the first search page and set noindex, follow on the sub-pages?

    Thanks for the prompt reply 🙂

    Thread Starter Perryb2006

    (@perryb2006)

    Will this work:
    DISALLOW: /search/*/page
    to block all the pages except the first search page?

    Plugin Author Mikko Saari

    (@msaari)

    No, that won’t work; robots.txt doesn’t support wildcards like that.

    Google recommends setting all search results pages to noindex, so I’d go with that. But if you want to noindex everything except the first page, you should do it with meta tags set in WordPress; setting noindex on everything but the first page is not complicated that way.

    Thread Starter Perryb2006

    (@perryb2006)

    Could you please help me out with this? I am not that great at handling robots.txt.

    Thanks 🙂

    Plugin Author Mikko Saari

    (@msaari)

    In your header template, add

    <?php
    // Add noindex to search result pages beyond the first one.
    // (For search pagination the query var is 'paged', not 'page'.)
    if (is_search() && get_query_var('paged') > 1) {
    	echo "<meta name='robots' content='noindex' />";
    }
    ?>

    That should do the trick, I think.

    Thread Starter Perryb2006

    (@perryb2006)

    Thanks for the code, I’ll implement it and see if it takes care of the errors.
    Also, is there a way to restrict the number of search results Relevanssi shows? Right now it shows 10 results per page, up to a maximum of 500. If I could change the maximum from 500 to 10, each search would show only a single page of 10 results, so no sub-pages would be created and hence no duplication in the search engine.

    I tried to implement this by editing the plugin file and changing the following code:

    add_filter('relevanssi_query_filter', 'relevanssi_limit_filter');
    function relevanssi_limit_filter($query) {
    	if (get_option('relevanssi_throttle', 'on') == 'on') {
    		return $query . " ORDER BY tf DESC LIMIT 10";
    	}
    	return $query;
    }
    ~Changed 500 to 10~
    But this didn’t work :/
    IMO this would be a better option.

    Plugin Author Mikko Saari

    (@msaari)

    That doesn’t work, as that limit is applied per search term, not to the final list of results.

    Use the relevanssi_hits_filter and trim the $hits[0] array to the desired length.

    Thread Starter Perryb2006

    (@perryb2006)

    $filter_data = array($hits, $q);
    $hits_filters_applied = apply_filters('relevanssi_hits_filter', $filter_data);
    $hits = $hits_filters_applied[10];

    I did what you said and changed $hits_filters_applied[0] to $hits_filters_applied[10], but upon doing so the search page for every term yielded “No Search Results”.

    Thread Starter Perryb2006

    (@perryb2006)

    I tried the PHP code as well, but it didn’t work. I got the following error:
    PHP Parse error: syntax error, unexpected T_STRING, expecting ',' or ';' in <directory url>

    Plugin Author Mikko Saari

    (@msaari)

    Try this

    add_filter('relevanssi_hits_filter', 'cuttoten');
    function cuttoten($hits) {
    	// $hits[0] holds the found posts; keep only the first ten.
    	$ten = array_slice($hits[0], 0, 10);
    	return array($ten);
    }

    That’s how you take the first ten hits from the $hits array.
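
    A variation on the same idea (a rough, untested sketch; the function name is just an example): tie the cap to the posts_per_page option, so the hits never exceed one page of results and no /page/2/ URLs get generated.

    add_filter('relevanssi_hits_filter', 'cut_to_one_page');
    function cut_to_one_page($hits) {
    	// Keep only as many hits as fit on a single results page.
    	$per_page = (int) get_option('posts_per_page', 10);
    	$hits[0] = array_slice($hits[0], 0, $per_page);
    	return $hits;
    }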

  • The topic ‘[Plugin: Relevanssi – A Better Search] Limit No. Of Search Pages Created’ is closed to new replies.