Would putting directories in robots.txt cause problems with rendering?
born2run
6:39 am on Aug 5, 2017 (gmt 0)
Hi, so I added some directories to robots.txt. Some of the files in these directories are used by the browser to render pages.
Would adding these directories to robots.txt cause problems with the rendering of pages on mobile/desktop? Please advise. Thanks!
not2easy
1:36 pm on Aug 5, 2017 (gmt 0)
Yes, if by "adding some directories" you mean that you have disallowed them.
Go to GSC, in the left menu, under Crawl - look for Blocked Resources and you can get a list of the files that you have blocked that Google needs access to in order to render your pages.
lucy24
2:54 pm on Aug 5, 2017 (gmt 0)
Do you mean rendering by humans, or rendering by search engines (for example GSC's "Fetch and Render" and its behind-the-scenes equivalents)? Human browsers are not affected by robots.txt in any way whatsoever, since the whole point of robots.txt is to give instructions to robots.
not2easy
4:59 pm on Aug 5, 2017 (gmt 0)
lucy24 is right, humans would not be affected. But your success with Google can be affected if you block resources that are needed to render your pages. I recently needed to update a robots.txt file after a WP theme update changed the location of the responsive menu's .js files. Google had decided the site was not mobile friendly because it could not access that .js file - as silly as it sounds. Fetch and Render worked fine; I only saw the list at GSC's Blocked Resources.
Google tells you how to block access to directories while still allowing access to resources that are inside them. Be aware that Googlebot is (afaik) the only bot that uses the Allow: directive to do this. See their Pattern-matching rules [support.google.com] to streamline your robots.txt code. Those rules are about half-way down that page and you need to click to see them.
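To illustrate what those pattern-matching rules let you do, here is a hedged sketch of a robots.txt that blocks a directory for everyone but lets Googlebot fetch the scripts and stylesheets inside it (the /private/ path is just an example, not anyone's actual setup):

```
User-agent: Googlebot
Allow: /private/*.js$
Allow: /private/*.css$
Disallow: /private/

User-agent: *
Disallow: /private/
```

For Googlebot, the more specific Allow: rules win over the broader Disallow:, so rendering resources stay fetchable while the rest of the directory stays blocked. Other bots that ignore Allow: and wildcards will simply see the whole directory as disallowed.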
born2run
1:00 am on Aug 7, 2017 (gmt 0)
So in the Google Search Console I went to "Crawl" and then "robots.txt Tester" and it's not giving me any errors or warnings there. Things are fine for Googlebot then?
PS: you guys are right! Now the Blocked Resources page does show all those robots.txt-blocked resources as blocked. I'm removing all of them from robots.txt :-)
not2easy
1:31 am on Aug 7, 2017 (gmt 0)
I don't spend a lot of time in GSC, (maybe visit twice a month) which is why I posted (from memory) how to find the Blocked Resources. It is not under Crawl, it's listed just above that under Indexing or something close to that. Sorry for any confusion. I do suggest you take a few minutes to read those Pattern Matching rules so you aren't giving out more information than needed.
born2run
3:18 am on Aug 7, 2017 (gmt 0)
Again you are right! Now I have to worry about the other files in these allowed directories.
keyplyr
5:32 am on Aug 7, 2017 (gmt 0)
The robots.txt tester in GSC tests the syntax. This tool does not tell you anything about the way Google renders your page.
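Beyond syntax, you can sanity-check which URLs your rules actually block for a given user agent with Python's standard-library parser. A minimal sketch, using hypothetical paths (note one caveat: Python's parser applies the first matching rule, whereas Google uses the most specific match, so Allow: lines are listed first here):

```python
from urllib import robotparser

# Hypothetical rules: block a directory, but allow one script
# that the browser needs in order to render the page.
rules = """\
User-agent: Googlebot
Allow: /assets/menu.js
Disallow: /assets/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The allowed script is fetchable; the rest of the directory is not.
print(rp.can_fetch("Googlebot", "https://example.com/assets/menu.js"))      # True
print(rp.can_fetch("Googlebot", "https://example.com/assets/secret.html"))  # False
```

This only tells you what the rules permit; like the GSC tester, it says nothing about how Google actually renders the page once it fetches those resources.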
Do not block any of your directories via robots.txt. If you do, you risk Google not getting what it needs to properly determine the value of your site.
born2run, you need to read through the forums and learn from what other members have discussed. Use the Site Search to look up topics.
tangor
9:01 am on Aug 7, 2017 (gmt 0)
Pick and choose what is blocked in robots.txt ... Meanwhile, know that only HONORABLE bots will follow those directives.
Meanwhile #2: Why block any files or folders that are required to RENDER a page? That doesn't make sense.