
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

Blocked CSS & JS apparently causing large Panda 4 drops
Robert Charlton
msg:4681345 · 12:01 am on Jun 20, 2014 (gmt 0)

This interesting case study by Joost de Valk may explain some of the recent traffic drops reported in this forum. Joost dealt with several sites which reported losing roughly 2/3 of their traffic, at roughly the time of Panda 4....

Google Panda 4, and blocking your CSS & JS
19 June, 2014 by Joost de Valk
https://yoast.com/google-panda-robots-css-js/ [yoast.com]

He associated a site's drop with its accidental blocking of CSS and JavaScript files. This came shortly after Google had announced its new Fetch and Render feature in Webmaster Tools. The assumption is that rendered pages are now being used in the page layout algorithm. Unblocking CSS and JS appeared to produce quick recoveries.

This kind of association is entirely consistent with algorithmic changes I've seen over the years, where Google has been quick to make use of a capability that we first see in a reporting feature.

See our discussion here...

Rendering pages with Fetch as Google
May 27, 2014
http://www.webmasterworld.com/google/4675119.htm [webmasterworld.com]

In this earlier Google blog post, Google strongly hints at the importance of not blocking JS and CSS....

Understanding web pages better
Friday, May 23, 2014
http://googlewebmastercentral.blogspot.ca/2014/05/understanding-web-pages-better.html [googlewebmastercentral.blogspot.nl]

If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can't retrieve them, our indexing systems won't be able to see your site like an average user. We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.

Anyone seeing drops since Panda 4 which can be attributed to this change?
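For anyone wanting to check this on their own site, Python's standard-library robots.txt parser can show whether Googlebot is allowed to fetch a given CSS or JS URL. A minimal offline sketch; the rules and URLs below are hypothetical, not taken from any site in this thread:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks asset directories.
rules = """
User-agent: *
Disallow: /wp-content/
Disallow: /js/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the "*" group here, so both assets are blocked.
for url in ("https://example.com/wp-content/style.css",
            "https://example.com/js/app.js",
            "https://example.com/index.html"):
    print(url, parser.can_fetch("Googlebot", url))
```

With these rules, the two asset URLs print False (blocked) while the page itself prints True, which is exactly the situation the Yoast post describes.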

 

lucy24
msg:4681747 · 2:20 am on Jun 22, 2014 (gmt 0)

Edit: Ouch. Didn't mean for this to land on the top of a page.

why block it

Another one: I block all search engines from piwik.js -- not just in robots.txt but a flat-out 403 for any and all previewoid entities. There's not actually anything in the file, just 26k of boilerplate that's presumably identical on every site in the world that uses piwik. (And surely not that different from GA's own code?) I just don't think it's any of their ### business.
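A per-file 403 like that can be done at the web-server level. A minimal .htaccess sketch in Apache 2.2-era syntax; the user-agent list is illustrative, not a complete inventory of bots:

```apache
# Hypothetical rules: deny named crawlers access to piwik.js only.
<Files "piwik.js">
  SetEnvIfNoCase User-Agent "googlebot|bingbot|slurp" block_bot
  Order Allow,Deny
  Allow from all
  Deny from env=block_bot
</Files>
```

Compliant bots that already honor a robots.txt Disallow never reach this rule; the 403 only catches fetchers that ignore robots.txt.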

:: wondering if anyone else has noticed how very often the googlebot includes a referer when requesting css, and whether search engines have reason to fear they'd be served differing content depending on a stylesheet's referer ::

indyank
msg:4681748 · 2:58 am on Jun 22, 2014 (gmt 0)

Maybe the timing was the issue behind this misinterpretation... but I'll reiterate that this is in no way related to Panda. The best thing for some popular guys to do is to just describe the result of their action rather than linking it to any particular algo. By doing so they are only creating confusion in the minds of people. I am sure if this story had been described by me or other lesser-known guys, no one would have been discussing this at all...

Edge
msg:4681801 · 1:07 pm on Jun 22, 2014 (gmt 0)

Blocked CSS & JS apparently causing large Panda 4 drops


If true, I suspect that the website pages did not render properly without the css. Who knows what google saw when rendering the webpage without the css.

Presentation matters to Google...

I just don't think it's any of their ### business.
That's it? Anything in there that helps you with the serps?

lucy24
msg:4681903 · 7:37 pm on Jun 22, 2014 (gmt 0)

Anything in there that helps you with the serps?

Do you mean, could the mere act of exclusion be harmful to a site? Well, that's kind of what this whole thread is about, isn't it.

Some scripts have no effect on page content. Should we allow this material to be crawled as a precautionary measure?

Some scripts affect page content, but don't affect how Google (specifically) renders a page. That's one of my long-standing gripes with Google Preview. It doesn't actually show what the human visitor would see; it shows what the Googlebot would see if it were human.

Edge
msg:4681947 · 2:36 am on Jun 23, 2014 (gmt 0)

Do you mean, could the mere act of exclusion be harmful to a site? Well, that's kind of what this whole thread is about, isn't it.

Should we allow this material to be crawled as a precautionary measure?

Dead links, scripts, and other broken webpage elements are a sign of poor quality and will be viewed as such by Google.

The question every website operator should ask is:

Do you want to be found in Google's search results?

Yes? Fix your website so that Google can see what you have.

No? Then whatever...

IanCP
msg:4682291 · 8:26 am on Jun 24, 2014 (gmt 0)

Yes? Fix your website so that Google can see what you have

Err, silly me, but that isn't what I see the topic is all about.

Robert Charlton
msg:4682298 · 9:27 am on Jun 24, 2014 (gmt 0)

If this fetch and render thing forms the basis of Panda 4 then it's clearly quite flawed.

For now, just to clarify this particular point... I'm not even remotely suggesting that fetch and render is anything close to the basis of Panda 4. From what Yoast reported, I was suggesting that fetch and render is now apparently used for fine tuning the above-the-fold aspect of Panda (assuming, i.e., that above-the-fold is still part of Panda, and not now run separately). Whether or not this is part of Panda, in fact, isn't all that important.

If fetch and render applies, it would apply only to special types of pages. If your traffic has dropped a lot, and you have a lot of ads and you have CSS and JS blocked, take a look at what fetch and render shows of your page.

Why is this suddenly a problem...

I don't know that it is a problem. That's for you to determine, based on what's happening. It's a new capability, enabling Google to make finer discriminations in above-the-fold... and it's apparently had some dramatic repercussions... so it's something to check out if you are having problems. That's it.

For some sites, in fact, fetch and render was possibly beneficial, helping pages that were hit by the above-the-fold algorithm regain rankings.

There's definitely a lot more to the algorithm, and I have some thoughts and will try to get to them in the "Source of Info" thread, which I think says a lot more about what went on overall with Panda 4.0 than fetch and render does...

Panda 4.0: "Source of Info" websites are the winners?
http://www.webmasterworld.com/google/4677106.htm [webmasterworld.com]

Edge
msg:4682373 · 1:53 pm on Jun 24, 2014 (gmt 0)

Err, silly me, but that isn't what I see the topic is all about.


Ok then, enlighten me...

Thread title:
Blocked CSS & JS apparently causing large Panda 4 drops

My conclusion:
Yes? Fix your website so that Google can see what you have

SEOPTI
msg:4683133 · 11:10 pm on Jun 26, 2014 (gmt 0)

Be careful if you use a CDN. Most of them serve their own robots.txt, and if it disallows crawling, your site won't render at all in Fetch as Google.

Rahu
msg:4683511 · 3:55 am on Jun 28, 2014 (gmt 0)

Googlebot couldn't get all resources for this page. Here's a list:

http://pagead2.googlesyndication.com/pagead/show_ads.js
Script Denied by robots.txt

Now the silly Google AdSense has this robots.txt:
http://pagead2.googlesyndication.com/robots.txt

Who is to be blamed for this?

[edited by: Robert_Charlton at 6:26 am (utc) on Jun 28, 2014]
[edit reason] disabled autolink on urls [/edit]

Robert Charlton
msg:4683786 · 6:35 pm on Jun 29, 2014 (gmt 0)

Who is to be blamed for this?

OK, I'll swallow the bait since no one else has jumped in.

No one is to be "blamed" for this. To repeat, it is most likely not a problem for the great majority of sites. Itanium nails it in the ninth post in this thread...

I don't think Javascript blocking is a problem per se. Adsense is blocking the Google bot by default, as do many other ad networks.

Blocking JS and CSS related to the style of a website might be a problem though, for the reasons told.

I've not seen any negative impact so far (knock on wood) from the blocked ad-network JavaScript files (AdSense and another one). I think Google can differentiate those from other scripts.

Jim Westergren
msg:4685486 · 8:51 pm on Jul 6, 2014 (gmt 0)

A close contact of mine who is the CTO of a small SEO company just had 22 domains fall in Google, today. We did an audit together and the only reason we could find was that CSS was blocked in robots.txt.

We have now allowed CSS files and I will report back to this thread as soon as we see some kind of change. My contact will not make any changes to these 22 domains.
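An audit like that can be scripted: for each domain, test whether Googlebot may retrieve a representative CSS URL under that domain's robots.txt. A sketch using only the standard library, with inlined robots.txt bodies standing in for live fetches (the domains and rules are hypothetical):

```python
from urllib.robotparser import RobotFileParser

def css_blocked(robots_txt: str, css_url: str) -> bool:
    """Return True if Googlebot is disallowed from fetching css_url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", css_url)

# Stand-ins for robots.txt bodies fetched from each audited domain.
audit = {
    "https://example-a.com": "User-agent: *\nDisallow: /css/\n",
    "https://example-b.com": "User-agent: *\nDisallow: /private/\n",
}

for site, robots in audit.items():
    flag = css_blocked(robots, site + "/css/main.css")
    print(site, "CSS blocked" if flag else "CSS crawlable")
```

In a real audit you would download each domain's robots.txt (e.g. with urllib.request) rather than inline it.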

lucy24
msg:4685752 · 9:47 pm on Jul 7, 2014 (gmt 0)

Earlier in this thread I said
often the googlebot includes a referer when requesting css

It's even more striking if you look at javascript, as I just discovered while --stop me if you've heard this one-- looking for something else.

On one site I've got a slew of js files that were created within the past 6 months. To date, the googlebot has never requested them without a humanoid referer (that is, the page that actually uses the javascript). The only referer-less js requests on this site are for older files that now meet with a 301 to a different site.

Equally interesting is that the scripts are always requested as a package: if a game uses five scripts, Googlebot will ask for all 5 at once.

It's part of the "executing js" pattern, isn't it? You need to have the page in hand to see what the scripts do with it. The only quirk is that the requests for js packages are not immediately preceded by requests for the page they belong to; there's a lag of up to 12 hours.
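The referer pattern described above can be measured from an access log. A sketch that tallies Googlebot's .js requests by whether a referer was sent, assuming Apache combined log format; the sample lines are invented for illustration:

```python
import re
from collections import Counter

# Minimal combined-log pattern: request path, status, size, referer, user-agent.
LOG = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+)[^"]*" \d+ \S+ '
    r'"(?P<referer>[^"]*)" "(?P<ua>[^"]*)"'
)

# Hypothetical sample lines standing in for a real access log.
lines = [
    '66.249.66.1 - - [07/Jul/2014:09:00:00 +0000] "GET /js/game.js HTTP/1.1" '
    '200 4096 "http://example.com/game.html" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [07/Jul/2014:09:00:01 +0000] "GET /js/old.js HTTP/1.1" '
    '301 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]

counts = Counter()
for line in lines:
    m = LOG.search(line)
    if m and "Googlebot" in m.group("ua") and m.group("path").endswith(".js"):
        counts["with referer" if m.group("referer") != "-" else "no referer"] += 1

print(dict(counts))  # {'with referer': 1, 'no referer': 1}
```

Run against a real log, a high "with referer" count for current scripts (and "no referer" only for stale URLs) would match the pattern reported in this post.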
