I am working with a WP 3.9 install on a GoDaddy WordPress managed hosting plan running the Impulse Press theme. I do NOT have "Discourage search engines from indexing this site" checked.
As far as SEO plugins go, I have only installed Yoast SEO. The robots.txt file is editable in the Yoast SEO plugin under Edit Files. If you visit http://260.303.myftpupload.com/robots.txt you can see the robots.txt being served as:
User-agent: *
Disallow: /
In the Yoast plugin the robots.txt is shown as:
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-includes/
Disallow: /wp-admin/
Disallow: /wp-admin
I can view the file through FTP and can verify that the robots.txt file is present on the server and that its contents match what is shown in the Yoast plugin.
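To rule out a caching or FTP-path mix-up, I ran a quick diagnostic comparing what is served over HTTP with the physical file on disk. This is just my own sketch (it assumes allow_url_fopen is enabled and that the script is run from the web root; the URL and path are for my install):

<?php
// Quick diagnostic: compare the robots.txt actually served over HTTP
// with the physical robots.txt on disk.
// Assumes allow_url_fopen is enabled; adjust $site_url and $local_path as needed.
$site_url   = 'http://260.303.myftpupload.com/robots.txt';
$local_path = __DIR__ . '/robots.txt'; // run this script from the web root

$served = trim( (string) file_get_contents( $site_url ) );
$ondisk = trim( (string) file_get_contents( $local_path ) );

if ( $served === $ondisk ) {
    echo "Served robots.txt matches the file on disk.\n";
} else {
    echo "MISMATCH - something else is answering /robots.txt\n\n";
    echo "--- served over HTTP ---\n{$served}\n\n";
    echo "--- file on disk ---\n{$ondisk}\n";
}

The output confirms the mismatch: the file on disk has the Yoast rules, but the HTTP response is the blanket Disallow.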
It seems that the WordPress virtual file is overriding the manually created robots.txt file. I have deleted the file and regenerated it through Yoast. I have deleted the file and uploaded it directly through FTP. Either way, http://260.303.myftpupload.com/robots.txt only shows:
User-agent: *
Disallow: /
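From what I understand, WordPress only generates that virtual robots.txt (via do_robots()) when no physical robots.txt exists in the web root, and the output can be changed through the robots_txt filter. For reference, here is a minimal sketch of a plugin that would force the Yoast-style rules from the WP side (the plugin name and rules are just my example, and it does not explain why my physical file is being ignored):

<?php
/*
 * Plugin Name: Robots.txt Override Sketch
 * Description: Minimal example of filtering WordPress's virtual robots.txt.
 * Note: WordPress only serves this virtual output when no physical
 * robots.txt file exists in the web root.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public reflects the "Search Engine Visibility" setting (1 = allow indexing).
    $output  = "User-agent: *\n";
    $output .= "Crawl-delay: 1\n";
    $output .= "Disallow: /wp-content/plugins/\n";
    $output .= "Disallow: /wp-includes/\n";
    $output .= "Disallow: /wp-admin/\n";
    return $output;
}, 10, 2 );

Even with something like this in place, the served file did not change, which suggests the response is not coming from WordPress at all.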
Please help!
After we moved the WordPress managed hosting install from the temporary URL to the live URL, the robots.txt was served correctly. I suspect GoDaddy applies an overriding robots.txt on their servers for all hosting plans still on temporary URLs.