Forum Moderators: Robert Charlton & goodroi
This is not a penalty notification, but a warning: if Google cannot see your whole site, your rankings may suffer.
If you get this message, talk to your developers and discuss what, if anything, needs to be done. Use the Fetch and Render tool to diagnose the issue in more depth as well.
Update: I should add that many, many WordPress sites are getting this notification because their /wp-includes/ folder is blocked by robots.txt, and many popular CMS solutions block their include files by default.
User-agent: *
Disallow: /wp-content/
Allow: /wp-content/uploads/

User-agent: Googlebot
Allow: /*.js
Allow: /*.css
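A quick sketch of how a crawler might split a robots.txt file like the one above into per-user-agent groups. This is a simplified, hypothetical parser (it ignores comments and unknown fields, and does not handle consecutive User-agent lines sharing one group, which the real spec allows):

```python
def parse_robots(text):
    """Group robots.txt lines into {user-agent-token: [(directive, value), ...]}."""
    groups, current = {}, None
    for line in text.splitlines():
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            current = groups.setdefault(value.lower(), [])
        elif current is not None and field in ("allow", "disallow"):
            current.append((field, value))
    return groups

robots = """User-agent: *
Disallow: /wp-content/
Allow: /wp-content/uploads/

User-agent: Googlebot
Allow: /*.js
Allow: /*.css
"""
groups = parse_robots(robots)
```

With the sample file above, `groups["googlebot"]` holds the two Allow rules and `groups["*"]` holds the wp-content rules, which is the structure the group-selection rules below operate on.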
Only one group of group-member records is valid for a particular crawler. The crawler must determine the correct group of records by finding the group with the most specific user-agent that still matches. All other groups of records are ignored by the crawler. The user-agent is non-case-sensitive. All non-matching text is ignored (for example, both googlebot/1.2 and googlebot* are equivalent to googlebot). The order of the groups within the robots.txt file is irrelevant.
-- From about 2/3 of the way down the page.
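The "most specific user-agent that still matches" rule quoted above can be approximated in a few lines. This is only an illustrative sketch (it normalizes with lowercasing, strips a `/version` suffix and a trailing `*` as the quoted text describes, and uses prefix matching as a stand-in for Google's token matching):

```python
def select_group(product_token, groups):
    """Pick the group whose user-agent most specifically matches this crawler."""
    token = product_token.lower().split("/")[0]  # googlebot/1.2 -> googlebot
    best_key, best_ua = None, None
    for ua in groups:
        # googlebot/1.2 and googlebot* both normalize to googlebot;
        # "*" normalizes to "" and so matches everything with length 0
        key = ua.lower().split("/")[0].rstrip("*")
        if token.startswith(key) and (best_key is None or len(key) > len(best_key)):
            best_key, best_ua = key, ua
    return best_ua

groups = {"Googlebot-News": [], "Googlebot": [], "*": []}
```

Under these hypothetical groups, `Googlebot-News/2.1` selects the `Googlebot-News` group, plain `Googlebot` falls through to the `Googlebot` group, and any other crawler gets `*` — all other groups are ignored, as the quoted passage says.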
Sample situations:

URL                             allow:    disallow:  Verdict    Comments
http://example.com/page         /p        /          allow
http://example.com/folder/page  /folder/  /folder    allow
http://example.com/page.htm     /page     /*.htm     undefined
http://example.com/             /$        /          allow
http://example.com/page.htm     /$        /          disallow
-- From almost at the bottom of the page.
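Those verdicts follow from a longest-match rule: the matching rule with the longest path wins, and a tie goes to allow. The sketch below reproduces the non-wildcard rows; note it would output "disallow" for the /page vs /*.htm row, which Google's table instead marks undefined, so wildcard-vs-plain conflicts are deliberately left out of the checks:

```python
import re

def rule_to_regex(rule):
    """Compile a robots.txt path rule: * matches any chars, trailing $ anchors the end."""
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in rule)
    return re.compile("^" + body + ("$" if anchored else ""))

def robots_verdict(path, allow, disallow):
    """Longest matching rule wins; ties go to allow; no match at all means allowed."""
    a = len(allow) if rule_to_regex(allow).match(path) else -1
    d = len(disallow) if rule_to_regex(disallow).match(path) else -1
    if a < 0 and d < 0:
        return "allow"
    return "allow" if a >= d else "disallow"
```

For example, `robots_verdict("/", "/$", "/")` returns "allow" because `/$` (length 2) matches the homepage exactly and beats the shorter `/`, while `robots_verdict("/page.htm", "/$", "/")` returns "disallow" because the anchored `/$` no longer matches.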