Forum Moderators: Robert Charlton & goodroi
Blocked CSS & JS apparently causing large Panda 4 drops
If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user. We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.
What reasons would a webmaster have to block such files from Google?
Do you really think you can block spying with robots.txt?
It says that fonts.google and two of Google's AdSense .js files were blocked by robots.txt.
Whose robots.txt blocked it?
You might want to delete your robots.txt file, too, if it's only a few lines long.
<FilesMatch "\.(js|txt|xml)$">
Header set X-Robots-Tag "noindex"
</FilesMatch>

Now, iPhoned makes money from ads. It doesn't have a ridiculous number of them, but because it uses an ad network, a fair number of scripts and pixels get loaded. My hypothesis was: if Google is unable to render the CSS and JS, it can't determine where the ads on your page are. In iPhoned's case, it couldn't render the CSS and JS because they were accidentally blocked in its robots.txt after a server migration.
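For reference, this is the kind of accidental block being described (the paths here are hypothetical, not iPhoned's actual rules). A Disallow on an asset directory silently covers every stylesheet and script under it; the fix is removing the rule or adding explicit Allow lines for the resources crawlers need to render the page:

```
# Broken (e.g. a rule carried over after a server migration):
User-agent: *
Disallow: /assets/

# Fixed: keep the block narrow and allow rendering resources
User-agent: *
Disallow: /assets/private/
Allow: /assets/css/
Allow: /assets/js/
```

More specific Allow rules take precedence over a broader Disallow, so this lets Googlebot fetch the CSS and JS while still keeping the private directory out.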
We recommend making sure Googlebot can access any embedded resource that meaningfully contributes to your site's visible content or its layout.
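One way to sanity-check this yourself is to run your robots.txt rules through Python's standard-library parser and ask whether a given user agent may fetch a given resource. The rules and URLs below are made-up examples, not any real site's file:

```python
# Sketch: test robots.txt rules against resource URLs with the stdlib parser.
from urllib.robotparser import RobotFileParser

# Hypothetical rules that block asset directories for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /assets/js/
Disallow: /assets/css/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard group here, so both assets are blocked.
print(rp.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))    # False
print(rp.can_fetch("Googlebot", "https://example.com/assets/css/site.css")) # False
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))          # True
```

In practice you would point `RobotFileParser` at your live file with `set_url()` and `read()`; parsing an inline string just keeps the example self-contained.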
[edited by: aakk9999 at 10:42 pm (utc) on Jun 21, 2014]
[edit reason] Unlinked sample URLs [/edit]