Why is Google Webmaster Tools completely misreading my robots.txt file?

Below is the entire content of my robots.txt file.

User-agent: *
Disallow: /marketing/wp-admin/
Disallow: /marketing/wp-includes/

Sitemap: http://mywebsite.com/sitemap.xml.gz

It's apparently the one WordPress generates automatically; I haven't created one manually.


Yet when I signed up for Google Webmaster Tools today, this is the content it reports seeing:

User-agent: *
Disallow: /

… So ALL my URLs are blocked!

In WordPress, Settings > Reading > Search Engine Visibility: "Discourage search engines from indexing this site" is not checked. I unchecked it fairly recently. (Google Webmaster Tools tells me it downloaded my robots.txt file on Nov 13, 2013.)

…So why is it still reading the old version where all my pages are disallowed, instead of the new version?

Does it take a while? Should I just be patient?
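
If it helps with diagnosing, here's a rough Python sketch of how I can double-check what my server is actually serving right now (mywebsite.com just stands in for my real domain from above):

# Fetch the live robots.txt and print exactly what the server returns
# (mywebsite.com is the placeholder domain used above).
import urllib.request

with urllib.request.urlopen("http://mywebsite.com/robots.txt") as response:
    print(response.read().decode("utf-8"))

If that prints the "Disallow: /" version, then the server (or some cache or plugin) is still serving the old file; if it prints the WordPress version above, then presumably Google is just working from an older cached copy.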

Also, what is the ".gz" at the end of my sitemap line? I'm using the Yoast All-in-One SEO pack plugin. I'm thinking the plugin added the ".gz", whatever that is.
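
If ".gz" means the file is gzip-compressed (which is my guess), then something like this should let me peek inside it (a small Python sketch, assuming the URL really does point at a gzipped XML sitemap):

# Download the compressed sitemap and print the XML inside
# (assumes http://mywebsite.com/sitemap.xml.gz is a gzip-compressed XML file).
import gzip
import urllib.request

with urllib.request.urlopen("http://mywebsite.com/sitemap.xml.gz") as response:
    compressed = response.read()

print(gzip.decompress(compressed).decode("utf-8"))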
