krisgunnars
Forum Replies Created
Thanks a lot, I will try it out!
Hey Sybre,
Thank you very much for the detailed response.
I was storing a lot of data in custom fields for some posts. Maybe the loop that created the sitemap was pulling all of that data into RAM, which caused the memory limit to fill up.
This data was stored in custom fields via the ACF plugin, in the wp_postmeta table, which was over 500MB in size. A couple of days ago I moved the data into my own custom database table, which caused the various loops running on the site to use much less memory than before.
The only explanation I can think of is WordPress pulling in all of the meta data when looping through each post. With enough posts, that ends up exceeding the memory limit.
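For reference, WordPress does preload the full postmeta cache for each post in a default query loop. A minimal sketch of how a custom loop can avoid that (the post type and meta key here are placeholders, not from the site in question):

```php
<?php
// Sketch: loop over posts without preloading every wp_postmeta row into RAM.
// 'fields' => 'ids' fetches only IDs, and the two cache flags stop WordPress
// from pulling all meta and terms for each post up front.
$query = new WP_Query( array(
	'post_type'              => 'post',
	'posts_per_page'         => -1,
	'fields'                 => 'ids',
	'no_found_rows'          => true,  // skip the extra found-rows count query
	'update_post_meta_cache' => false, // do not preload postmeta
	'update_post_term_cache' => false, // do not preload terms
) );

foreach ( $query->posts as $post_id ) {
	// Fetch only the single meta value actually needed, on demand.
	$value = get_post_meta( $post_id, 'some_key', true );
}
```

With a large `wp_postmeta` table, skipping the meta cache like this is usually the difference between a loop fitting in memory and exhausting it.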
But it all seems to be working fine now, so it’s all good.
Best,
Kris
I tried adding the filter and it actually caused the sitemap generation to fail again, so I removed the filter again.
2020/08/31 03:43:31 [error] 7269#7269: *760000 FastCGI sent in stderr: "PHP message: PHP Fatal error: Allowed memory size of 1610612736 bytes exhausted (tried to allocate 860160 bytes) in /var/www/stockanalysis.com/htdocs/wp-content/plugins/autodescription/inc/classes/builders/sitemap-base.class.php on line 200" while reading upstream, client: 110.77.164.94, server: stockanalysis.com, request: "GET /sitemap.xml HTTP/1.1", upstream: "fastcgi://unix:/var/run/php/php74-fpm-stockanalysis.com.sock:", host: "stockanalysis.com"
2020/08/31 03:44:14 [error] 7268#7268: *760116 FastCGI sent in stderr: "PHP message: PHP Fatal error: Allowed memory size of 1610612736 bytes exhausted (tried to allocate 20480 bytes) in /var/www/stockanalysis.com/htdocs/wp-includes/functions.php on line 6331" while reading upstream, client: 110.77.164.94, server: stockanalysis.com, request: "GET /sitemap.xml HTTP/1.1", upstream: "fastcgi://unix:/var/run/php/php74-fpm-stockanalysis.com.sock:", host: "stockanalysis.com"
Hey Sybre,
Thanks. I already have PHP 7.4 enabled, along with OpCache.
I tried increasing the PHP memory limit to 1536 MB and then the full sitemap generated, but it was very slow and the initial loading of the sitemap took like 10-20 seconds.
I’ve also tested the Google XML Sitemaps plugin, as well as WordPress’s native sitemap introduced in 5.5. Both generated very quickly compared to TSF’s sitemap, with no memory issues.
Not sure if it’s a bug or if it just needs to be solved with pagination.
Best,
Kris
Hey Sybre,
No they only showed up yesterday during the plugin update. I haven’t seen any errors since then.
It all seems to be working well now, but I will let you know if I see any more errors.
Thanks,
Kris
Ah, thanks for clarifying.
In that case, I will just remove the filter and change the SEO titles of the categories back; they aren’t very important from a traffic perspective anyway.
And thanks for the tip on the home page title in the breadcrumbs, I will change it as you suggested.
Hey Sybre,
Thank you very much.
I’m not sure what you mean by terms, but I tried the filter and it did in fact fix the breadcrumb code on the posts so it’s now using the internal category name instead of the custom SEO title.
It also changed the home (position 1) part of the breadcrumb to just the site name, instead of site name + tagline. That is fine with me.
Thanks a lot! This worked to get rid of the timestamps from the header of the pages.
Forum: Plugins
In reply to: [The SEO Framework – Fast, Automated, Effortless.] Noindex a page via php
Sweet, this is super useful. Thanks a lot.
Forum: Plugins
In reply to: [The SEO Framework – Fast, Automated, Effortless.] Noindex a page via php
So putting the post IDs there will cause them to both be noindexed and excluded from the sitemap?
Forum: Plugins
In reply to: [The SEO Framework – Fast, Automated, Effortless.] Noindex a page via php
Hey Sybre,
Thanks for the reply, I look forward to the updates.
For now, I have made a custom function that loops through all the posts when I run it manually and gives me an array of IDs to exclude, then I put the IDs manually into the filter in functions.php each time.
It would be better to have a fully automated way to do it, but this works for now.
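One hedged way to automate the manual step described above is to run the ID-collecting loop inside the exclusion filter itself and cache the result in a transient, so it doesn't rerun on every sitemap request. The helper function here is hypothetical; the filter name is the one from TSF used elsewhere in this thread:

```php
<?php
// Sketch: compute the exclusion list automatically instead of pasting IDs by
// hand. my_should_exclude_from_sitemap() is a hypothetical helper standing in
// for whatever per-post check the site needs; 'stocks' is the post type
// mentioned in this thread.
add_filter( 'the_seo_framework_sitemap_exclude_ids', function ( $ids = array() ) {
	// Cache the computed list so the full loop runs at most once per day.
	$excluded = get_transient( 'my_sitemap_excluded_ids' );

	if ( false === $excluded ) {
		$excluded = array();
		$post_ids = get_posts( array(
			'post_type'      => 'stocks',
			'posts_per_page' => -1,
			'fields'         => 'ids', // IDs only, to keep memory use low
		) );
		foreach ( $post_ids as $post_id ) {
			if ( my_should_exclude_from_sitemap( $post_id ) ) { // hypothetical helper
				$excluded[] = $post_id;
			}
		}
		set_transient( 'my_sitemap_excluded_ids', $excluded, DAY_IN_SECONDS );
	}

	return array_merge( $ids, $excluded );
} );
```

Deleting the transient (or hooking its deletion to `save_post`) would refresh the list whenever the underlying data changes.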
Kris
Forum: Plugins
In reply to: [The SEO Framework – Fast, Automated, Effortless.] Noindex a page via php
I found a way to do this via functions.php.
Now I am trying to remove the pages from the sitemap, as well.
This is the code I have, but it’s not working — the pages are still in the sitemap:
// Remove financials pages with limited data from sitemap
add_filter( 'the_seo_framework_sitemap_exclude_ids', function ( $ids = array() ) {
	$ids[] = -1;
	if ( get_post_type() == 'stocks' && ! is_search() ) {
		$page_type = get_field( 'page_type' );
		if ( $page_type == 'financialsincq' || $page_type == 'financialsbsq' || $page_type == 'financialscfq' ) {
			$root_id      = get_field( 'root_id' );
			$api_data_raw = get_field( 'fundamentals_quarterly', $root_id );
			list( $data ) = json_decode( $api_data_raw, true );
			$count        = count( $data['datatable']['data'] );
			if ( $count < 5 ) {
				$ids[] = $root_id;
			}
		} else if ( $page_type == 'financials' || $page_type == 'financialsbsa' || $page_type == 'financialscfa' ) {
			$root_id      = get_field( 'root_id' );
			$api_data_raw = get_field( 'fundamentals_annual', $root_id );
			list( $data ) = json_decode( $api_data_raw, true );
			$count        = count( $data['datatable']['data'] );
			if ( $count < 1 ) {
				$ids[] = $root_id;
			}
		}
	}
	return $ids;
}, 10, 1 );
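A likely reason that code has no effect: the sitemap filter runs outside any post loop, so `get_post_type()` and `get_field()` called without an explicit ID have no current post to act on. A hedged sketch of the same exclusion done per post ID instead (post type, field names, and thresholds are carried over from the snippet above and may need adjusting):

```php
<?php
// Sketch: query the relevant post IDs, then pass each ID explicitly to the
// ACF functions, since the filter has no "current post" context of its own.
add_filter( 'the_seo_framework_sitemap_exclude_ids', function ( $ids = array() ) {
	$stock_ids = get_posts( array(
		'post_type'      => 'stocks',
		'posts_per_page' => -1,
		'fields'         => 'ids',
	) );

	foreach ( $stock_ids as $post_id ) {
		$page_type = get_field( 'page_type', $post_id ); // ACF: read by ID
		if ( in_array( $page_type, array( 'financialsincq', 'financialsbsq', 'financialscfq' ), true ) ) {
			$field = 'fundamentals_quarterly';
			$min   = 5;
		} elseif ( in_array( $page_type, array( 'financials', 'financialsbsa', 'financialscfa' ), true ) ) {
			$field = 'fundamentals_annual';
			$min   = 1;
		} else {
			continue;
		}

		$root_id = get_field( 'root_id', $post_id );
		$decoded = json_decode( (string) get_field( $field, $root_id ), true );
		// Same shape as the original list( $data ) = json_decode( ... ):
		// the data table sits in the first element of the decoded array.
		$rows = isset( $decoded[0]['datatable']['data'] ) ? count( $decoded[0]['datatable']['data'] ) : 0;
		if ( $rows < $min ) {
			$ids[] = $root_id;
		}
	}

	return $ids;
} );
```

The guards around `json_decode()` also avoid fatals when a field is empty, which matters since this runs on every uncached sitemap request.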
Hey Sybre,
I just need a way to change the titles and descriptions of hundreds of pages at the same time. All of them have identical titles and descriptions, except that they pull in some dynamic variables via custom fields.
I managed to find a way to do it with the php filters in the plugin, so it’s all good. Everything but the titles and descriptions is being handled by the plugin.
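For anyone looking for the same thing, a hedged sketch of templated titles via TSF's PHP filters. The filter name and signature here are taken from TSF's filter documentation and should be verified against the current API reference; the `stocks` post type and `company_name` field are assumptions standing in for the site's own custom fields:

```php
<?php
// Sketch (assumed filter name/signature; verify against TSF's filter docs):
// generate the SEO title for a custom post type from a custom field, so
// hundreds of pages share one template instead of per-post meta settings.
add_filter( 'the_seo_framework_title_from_generation', function ( $title, $args ) {
	$post_id = isset( $args['id'] ) && $args['id'] ? $args['id'] : get_the_ID();

	if ( $post_id && 'stocks' === get_post_type( $post_id ) ) {
		$name = get_post_meta( $post_id, 'company_name', true ); // e.g. an ACF field
		if ( $name ) {
			$title = $name . ' - Stock Price & Financial Data';
		}
	}

	return $title;
}, 10, 2 );
```

Descriptions can be templated the same way with the corresponding description-generation filter listed on the same docs page.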
Kris
Never mind, I figured it out via this page: https://theseoframework.com/docs/api/filters/
Thanks! Is it possible to somehow just disable the HTML output but still keep the post type in the sitemap?
If not, is there a way to modify the SEO settings via PHP in the template?
The reason I’m asking is that I’m developing something that will have hundreds of URLs. I wanted to manage the SEO settings programmatically instead of having to change the meta settings for one post at a time.