Hi Francisco,
Your site (which I found listed on your profile) requires JavaScript to view. This forces Google to process your page in their renderer before it can display coherent information in the search results (SERP). That comes with caveats, and you may be running into them right now.
For details, see https://web.dev/javascript-and-google-search-io-2019/ and https://developers.google.com/search/docs/guides/javascript-seo-basics.
Briefly:
1. Googlebot queues URLs for crawling.
2. It then fetches each URL with an HTTP request, based on the crawl budget.
3. Googlebot scans the HTML for links and queues the discovered links for crawling.
4. Googlebot then queues the page for rendering.
5. As soon as possible, a headless Chromium instance renders the page, which includes JavaScript execution.
6. Googlebot uses the rendered HTML to index the page.
For most sites, Google can postpone steps 4 through 6 and use the content already obtained at step 3.
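If you want to check what Googlebot has available at step 3 (before any rendering), you can fetch the raw HTML yourself, without executing JavaScript, and look for the title and description tags. Here’s a minimal sketch in Python; it assumes the third-party “requests” package and uses a placeholder URL, so swap in your own page:

# Rough check of what is present in the raw (unrendered) HTML,
# i.e. roughly what Googlebot sees at step 3, before the render queue.
import re
import requests

url = "https://example.com/"  # placeholder; replace with your own page

html = requests.get(url, timeout=10).text

title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
description = re.search(
    r'<meta[^>]+name=["\']description["\'][^>]*content=["\'](.*?)["\']',
    html, re.I | re.S,
)

print("Title in raw HTML:", bool(title))
print("Description in raw HTML:", bool(description))

If both print True, Google doesn’t need the rendering step to pick up your meta tags; if they print False while the tags do appear in your browser, your page relies on JavaScript to output them. (The regex assumes the name attribute comes before content; it’s a quick check, not a full HTML parser.)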
To quickly see whether Google can read your description tags, use a “site:” query; for example, for our site, enter this query in Google Search: site:theseoframework.com. To inspect a specific page (or directory), simply append the full URL after “site:”.
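For instance, a query like site:theseoframework.com/docs/ (a hypothetical path, purely for illustration) limits the results to that directory, so you can check which title and description Google has stored for those pages.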
Such a “site:” query disables most post-processing of the meta tags. With typical (non-“site:”) queries, Google may alter the description on a best-effort basis; that post-processing is explained here: https://support.google.com/webmasters/answer/35624#meta-descriptions.
In summary, when your search query’s keywords aren’t found in the description, Google will supplement it with a line (or a few) from your content that does contain those keywords. That’s their “best effort.”
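If it helps to picture that behavior, here’s a toy Python sketch of the idea; it is not Google’s actual algorithm, just an illustration of “use the description unless the keywords only appear in the content”:

# Toy illustration only; not Google's actual snippet algorithm.
def pick_snippet(query, description, content):
    keywords = query.lower().split()
    # If every keyword already appears in the description, keep it as-is.
    if all(word in description.lower() for word in keywords):
        return description
    # Otherwise, fall back to the first sentence that mentions a keyword.
    for sentence in content.split("."):
        if any(word in sentence.lower() for word in keywords):
            return sentence.strip() + "…"
    return description  # nothing better found

print(pick_snippet(
    "javascript seo",
    "An automated SEO plugin for WordPress.",
    "Our plugin outputs plain meta tags. JavaScript SEO is then handled by Google's renderer.",
))
# Prints: JavaScript SEO is then handled by Google's renderer…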
P.S. I don’t recommend using sitemap plugins in combination with The SEO Framework, because they aren’t aware of the indexing settings, which causes them to list posts that aren’t indexable.