mrclay
Forum Replies Created
I added this code at the bottom of our HEAD element to work around this issue:
<?php
/*
Currently the ajax calendar uses admin_url() to output ai1ec_calendar.ajaxurl,
but this can result in cross-protocol XHR attempts (which fail). This script
rewrites the protocol of ajaxurl to match the current page, which allows the
browser to send the requests.
*/
?>
<script type="text/javascript">//<![CDATA[
if (this.ai1ec_calendar) {
    ai1ec_calendar.ajaxurl = location.protocol
        + ai1ec_calendar.ajaxurl.replace(/^https?\:/, '');
}
//]]></script>
I rewrote the DoRobots() method in sitemap-core.php to list all our sitemaps in the robots.txt “file”:
public function DoRobots() {
    $this->Initate();
    if ($this->GetOption('b_robots') === true) {
        //$smUrl = $this->GetXmlUrl();
        //echo "\nSitemap: " . $smUrl . "\n";
        $blogs = get_blog_list(0, 'all');
        foreach ($blogs as $blog) {
            $info = get_blog_details($blog['blog_id']);
            if ($info->deleted || $info->archived) {
                continue;
            }
            $url = 'https://' . $blog['domain'] . $blog['path'] . 'sitemap.xml';
            // allow pre-fetching the sitemaps to initialize the plugin
            // in each subsite
            if (0 && isset($_GET['init'])) {
                file_get_contents($url);
                usleep(10000);
            }
            echo "\nSitemap: $url";
        }
    }
}
Multisite was designed to host separate sites, so each site having its own site map makes sense. That said, my organization is using multisite to separate access control, so we need a unified sitemap as well.
I extended the 4.0b4 version to output an index that points to all the site indexes, but Google won’t allow an index to point to another index.
Option 1 is to include all the sitemaps in robots.txt.
Option 2 is to make a script that consolidates all the sitemaps into one index.
Hopefully arnee will consider integrating something like this.
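Option 2 could work something like the sketch below (shown in Python for brevity; a real implementation for this plugin would be PHP and would fetch each subsite's index over HTTP). The idea is to merge the <sitemap> entries from every per-site sitemap index into one combined index that points directly at the leaf sitemaps, since Google rejects an index that points at another index. The function name and inputs here are illustrative, not part of the plugin.

```python
# Minimal sketch: combine several <sitemapindex> documents into one.
# Each input is the XML text of one subsite's sitemap index.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize with the default sitemap namespace

def consolidate(index_xml_strings):
    """Return one <sitemapindex> XML string containing the <sitemap>
    entries from every input index (so the result points only at
    leaf sitemaps, never at another index)."""
    root = ET.Element("{%s}sitemapindex" % NS)
    for xml_text in index_xml_strings:
        tree = ET.fromstring(xml_text)
        for sm in tree.findall("{%s}sitemap" % NS):
            root.append(sm)
    return ET.tostring(root, encoding="unicode")
```

The combined output would be served at the network's root (e.g. as the main site's sitemap.xml) and listed once in robots.txt, instead of listing every subsite's index separately.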
If “80×80.html” contains no PHP code, it needn’t be parsed by PHP. Use:
readfile('includes/80x80.html', true);