Hi,
The odd thing is that what's written in the robots.txt file on my server is different from what's actually being served at /robots.txt — and what's being served is NOT what I want it to say. What's written in my file is this:
# Rule 1
User-agent: Googlebot
Disallow: /nogooglebot/
# Rule 2
User-agent: *
Allow: /
I'm not sure WHY the live site is serving this instead:
User-agent: *
Disallow: /
Is it possible that something is overriding it, such as another plugin, my CDN, or something in my DNS settings?
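One way to confirm the two versions really do behave differently (and to test whatever file eventually gets served) is to run both rule sets through Python's standard-library robots.txt parser. This is just a sketch — the two rule sets below are copied from this post, and the bot name "SomeOtherBot" is a made-up stand-in for any non-Google crawler:

```python
# Compare what crawlers may fetch under the intended rules
# versus the rules the live site is actually serving.
from urllib.robotparser import RobotFileParser

# The rules as written in my file (intended behavior).
intended = """\
# Rule 1
User-agent: Googlebot
Disallow: /nogooglebot/

# Rule 2
User-agent: *
Allow: /
"""

# The rules the live site is serving (unwanted behavior).
served = """\
User-agent: *
Disallow: /
"""

def check(rules: str, label: str) -> None:
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    print(label)
    print("  Googlebot    -> /nogooglebot/ :", rp.can_fetch("Googlebot", "/nogooglebot/"))
    print("  Googlebot    -> /             :", rp.can_fetch("Googlebot", "/"))
    print("  SomeOtherBot -> /             :", rp.can_fetch("SomeOtherBot", "/"))

check(intended, "Intended rules:")
check(served, "Served rules:")
```

Under the intended rules only Googlebot is blocked, and only from /nogooglebot/; under the served rules every crawler is blocked from everything, which would explain indexing problems. To find out which layer is rewriting the file, you could also fetch /robots.txt directly from your origin server while bypassing the CDN (for example with `curl --resolve yourdomain.com:443:ORIGIN_IP https://yourdomain.com/robots.txt`, substituting your real domain and origin IP) and compare it with what the CDN serves.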
The sitemaps I submitted are these:
/slides_category-sitemap.xml
/testimonials_category-sitemap.xml
/portfolio_category-sitemap.xml
/post_tag-sitemap.xml
/category-sitemap.xml
/popup-sitemap.xml
/portfolio_page-sitemap.xml
/page-sitemap.xml
/post-sitemap.xml