
Google SEO News and Discussion Forum

Matt Cutts asks webmasters: let googlebot crawl js and css
tedster

posted 12:55 am on Mar 27, 2012 (gmt 0)

In a new "public service announcement" video, Matt Cutts asks webmasters to remove robots.txt disallow rules for JS and CSS files. He says that Google will understand your pages better and be able to rank them more appropriately.

[youtube.com...]
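
For anyone who hasn't seen these, the sort of rules he wants removed look something like this (the paths are just examples):

  User-agent: Googlebot
  Disallow: /js/
  Disallow: /css/

Dropping those Disallow lines is all he's asking for.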

 

martinibuster

posted 1:33 am on Mar 27, 2012 (gmt 0)

Better for who? hehe

I don't have anything in my JS files that would help any of my sites rank better.

mrguy

posted 2:21 am on Mar 27, 2012 (gmt 0)

I still remember him telling us how to PageRank sculpt. That didn't turn out too well either.

Yes Matt, I'll rush and get those unblocked for you..

Marshall

posted 2:28 am on Mar 27, 2012 (gmt 0)

Say you have a series of <h> tags on your page: what difference does it make how they are defined in your CSS, unless of course display:none comes into play? Sounds to me like they are hunting for display:none properties, IMHO.
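
To illustrate, a rule as innocent-looking as this (a made-up example):

  .promo-text { display: none; }

could be hiding keyword-stuffed text from visitors, or doing something perfectly legitimate. The CSS alone doesn't tell you which.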

Marshall

tedster

posted 3:41 am on Mar 27, 2012 (gmt 0)

Source-ordered content could also be part of the hunt. If their algo is any good, then seeing the actual visual content order "might" help. But if their algo misfires, it could hurt.

I never blocked CSS or JS files on the sites I work with, so I won't have a dog in this race. I do block some redirect scripts, but I do it by blocking the directory where those files live.
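
For what it's worth, that kind of directory block is a one-liner in robots.txt. Assuming the scripts sit in a hypothetical /redirect/ directory:

  User-agent: *
  Disallow: /redirect/

The individual script files never need to be named.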

lucy24

posted 4:37 am on Mar 27, 2012 (gmt 0)

If g### doesn't like what it sees in robots.txt, why doesn't it just ignore it? Preview, Translate, Goggles and the Wireless Transcoder already do. So do the plainclothes msnbot and everything from Yahoo.

I've got certain well-known robots blocked from piwik.js via htaccess because they refuse to honor instructions to keep the ### out. What do they think? That instead of a well-known analytics program, I've got a secret stash of extra formatting in there?
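
The blocking itself is nothing exotic, by the way. A sketch of the sort of htaccess I mean, with the bot names as placeholders:

  # tag requests whose User-Agent matches a misbehaving crawler
  SetEnvIfNoCase User-Agent "SomeBot" block_bot
  SetEnvIfNoCase User-Agent "OtherBot" block_bot

  # refuse those requests for the analytics script only
  <Files "piwik.js">
    Order Allow,Deny
    Allow from all
    Deny from env=block_bot
  </Files>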

realmaverick

posted 11:27 am on Mar 27, 2012 (gmt 0)

I don't like Matt Cutts. There, I said it. I don't trust him one bit. I guess we're supposed to take what he says as gospel and run around quickly making whatever changes he tells us to.

It's quite worrying though, as it suggests that JS and CSS are ranking factors, or at least that websites Google "believes" are using questionable JS or CSS have been or will be penalised.

I don't disallow JS or CSS anyway. Google knows exactly what they are; they're not web pages, so why index them or keep crawling them?

Will we see some new ebooks tomorrow? "Turbo charge your rankings: SEO your CSS files".

Blah

zdgn

posted 11:52 am on Mar 27, 2012 (gmt 0)

Matt Cutts asks webmasters to remove robots.txt disallow rules for js and css files


I will robots-allow my prized hardworked-and-created js and css ONLY IF Google *promises* NOT to index them and NOT to make the source code readily searchable for open scraper theft with some 'awesome' future Labs component dreamt up by an 'awesome' engineer during his/her allotted playtime at the Plex.

I have bad, bad memories of finding my code readily available at the old Code Search compo in the past... Took me ages to obfuscate and expunge my code off the SERPs and off scrapers' radars... so I'm paranoid.

Besides, Googlebot seems to read DOM-ready renditions now, so please ask the bot to focus on the outcome and not what's under the hood, thanks.

netmeg

posted 2:24 pm on Mar 27, 2012 (gmt 0)

Tell em they can't have your css or your js (or your cell phone number for that matter - they keep asking for that too) till we get our keywords back.

kb5nju

posted 3:05 pm on Mar 27, 2012 (gmt 0)

I wonder if this has something to do with their "natural language search" efforts. Perhaps Googlebot wants to view the page as a human does in order to learn more about how humans interact with the pages? That would make sense to me.

I've never blocked CSS or JavaScript because I've never had a reason to do so. I can see where some might find it objectionable; for instance, I completely understand zdgn's reasoning.

@netmeg: Good one!

Samizdata

posted 4:20 pm on Mar 27, 2012 (gmt 0)

I wonder if this has something to do with their "natural language search" efforts

I would say not.

I suspect it is entirely about the fact that Google SERPs are still far too easily gamed.

I wouldn't expect Matt Cutts to admit it, but I'd have more respect if he did.

There is no other reason to crawl CSS and JavaScript files.

And yes, I would like my search terms back too.

...

agent_x



 
posted 4:44 pm on Mar 27, 2012 (gmt 0)

I wonder if this has something to do with their "natural language search" efforts. Perhaps Googlebot wants to view the page as a human does in order to learn more about how humans interact with the pages? That would make sense to me.


I've thought this for a while. Google doesn't just need to render your page to provide the preview image in the search results; it needs to know what your page looks like in order to know where the content is, how many ads you have above the fold, and so on. Surely?

tedster

posted 4:51 pm on Mar 27, 2012 (gmt 0)

Google definitely does do an algorithmic rendering of your page these days. How else would they measure "above the fold"? Or give more weight to links in the content area compared to the footer or side panel?

Actually, Microsoft was the first to file a patent on algorithmic rendering of web pages. So Bing may be in the same situation as Google here.

freejung

posted 5:04 pm on Mar 27, 2012 (gmt 0)

It seems odd to just ask people to do it like this. If they really want to see everyone's JS/CSS, why not just introduce an obvious penalty for disallowing it? Everyone would allow it in a big hurry.

martinibuster

posted 5:10 pm on Mar 27, 2012 (gmt 0)

We're supposed to build pages for site visitors, not search engines. But Google is also requiring web publishers to build their sites the way Google believes is better, regardless of the impact on earnings, because Google feels it makes a better experience for our site visitors. It already feels like an overreach for Google to dictate to publishers what their sites should be (Panda friendly), and asking publishers not to block JS/CSS files inches us even further toward building websites for search engines. It feels increasingly intrusive and presumptuous to ask web publishers to alter how they build their sites so that they are building for the search engines, not the site visitor.


tedster

posted 5:12 pm on Mar 27, 2012 (gmt 0)

LOL! Google has already generated several PR disasters in recent times, some even with good intentions. A move like that would bring about major reactions, I think!

PCInk

posted 5:19 pm on Mar 27, 2012 (gmt 0)

If you are hiding CSS/JS from search engines, you probably have something to hide. Possibly blackhat.

I can't see why Google hasn't been penalising this for years. Perhaps this message from Matt Cutts is a warning that they soon will.

wildbest

posted 6:13 pm on Mar 27, 2012 (gmt 0)

If you are hiding CSS/JS from search engines, you probably have something to hide. Possibly blackhat.


If Google are hiding their search index algo, they probably have something to hide. Possibly illegal.

martinibuster

posted 6:23 pm on Mar 27, 2012 (gmt 0)

If you are hiding CSS/JS from search engines, you probably have something to hide.


No. Hiding JS and CSS is about bandwidth, about speeding up the search engine crawl, and most importantly about ensuring that the site visitor's experience is not impaired by crawler activity. As web publishers we want the search engines to focus on the content and then get out, without disturbing our visitors.

It is also an ethical problem: Google is overstepping a boundary here.

webastronaut

posted 7:01 pm on Mar 27, 2012 (gmt 0)

So true, martinibuster! GoogleGuy, I am really busy taking down banner ads and such from quite a few partnerships I have made over the years, because those partners are going out of business. I'm seeing so many sectors dropping like flies in Goog search results, being replaced by your own services, by very spammy sites, and by the big guys that hold a lot of shares with Goog.

zeus

posted 7:16 pm on Mar 27, 2012 (gmt 0)

Didn't he once say to make sites for the users, not the search engines? Yet now at least 50% of the work is done for Google.

g1smd

posted 7:26 pm on Mar 27, 2012 (gmt 0)

Google have said on so many occasions "we're not the internet police", but more and more of their recent moves look like they want to be; they just can't admit it.

I block JS on some sites and will continue to do so. Google's getting too nosey.

Marshall

posted 8:30 pm on Mar 27, 2012 (gmt 0)

If you are hiding CSS/JS from search engines, you probably have something to hide. Possibly blackhat.


Never assume!

This goes back to what I said about searching for display:none. I will freely admit I use that on elements which the user cannot access, but not for black hat purposes. Occasionally I want to hide something that should only appear when the page is printed, say an alternate header. Am I to be penalized for not wanting my visitors to have to print a large colorful graphic when a small black and white one will do? Is Google going to compare my screen CSS to my print CSS? Or maybe I want to hide something when a handheld device is used. Am I to be penalized for trying to make my regular page fit a small screen? My sites are built for the convenience of my visitors, not Google bots.
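
To be concrete, the kind of thing I mean looks like this (class names invented for the example):

  /* screen gets the full-colour banner; print gets a small one */
  .print-header { display: none; }

  @media print {
    .screen-header { display: none; }
    .print-header { display: block; }
  }

Nothing deceptive about it: it just serves the right header to the right medium.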

And if people want to black hat, that is still easily accomplished with detection scripts.

Marshall

freejung

posted 8:38 pm on Mar 27, 2012 (gmt 0)

A move like that would bring about major reactions, I think!

Right, but then why mention it at all? People who are deliberately blocking JS/CSS for a reason aren't likely to stop just because Matt asks them politely to do so, not unless there's teeth behind it - especially if they're doing it for shady purposes! On the other hand, people who are likely to do what Matt says just because he says it are probably not blocking anything anyway. I don't know, it just seems like an odd thing to say... like maybe it's really intended more as a hint that they are parsing JS and CSS more, to scare people away from misusing them. In Google's position it might be wise to imply that you know more than you in fact know.

SEOMike

posted 8:51 pm on Mar 27, 2012 (gmt 0)

I'm not recommending that my clients unblock JS and CSS. Sorry.

I can just see Google weighing the size of your H tags against the rest of the content on the page to see if there is "enough" difference to give the H tags proper credit, or discounting them for only being x% larger than your body text instead of the desired (x+y)%. Or judging whether your links are visible enough to users to pass PageRank. Or whatever else they want, for that matter.

Yeah... no thanks. I'm not letting a robot interpret my clients' design elements.

netmeg

posted 8:58 pm on Mar 27, 2012 (gmt 0)

Actually, it just occurred to me: perhaps I will put a message for Google in a js file they can see. I know *just* what I'm gonna say, too.

lucy24

posted 10:30 pm on Mar 27, 2012 (gmt 0)

Google doesn't just need to render your page to provide the preview image

Preview is irrelevant to this discussion, because Google Preview is not a robot and is not affected by robots.txt.

If you are hiding CSS/JS from search engines, you probably have something to hide.

Would you like to look at my piwik.js file and see what black-hat evils I'm hiding? Be my guest. For obvious reasons it isn't blocked to humans, and so far it isn't blocked to empty referers, though possibly that's what I should be doing.
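
(If I ever do block them, it would only take a couple of lines of mod_rewrite, something along these lines:

  RewriteEngine On
  # refuse requests for the analytics script that arrive with no referer
  RewriteCond %{HTTP_REFERER} ^$
  RewriteRule ^piwik\.js$ - [F]

though that would also catch the occasional visitor whose proxy strips referers.)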

JAB Creations

posted 10:46 pm on Mar 27, 2012 (gmt 0)

Thanks for letting me know I can block Google from crawling that. Blocked!

Also, I highly recommend everyone stop using Google outright! They're totally screwing over webmasters by denying us referrer headers. Google is effectively trying to kill the open nature of the internet by deciding what referrer data we may or may not be allowed to see in their webmaster tools.

Google has officially become the enemy of the internet.

- John

kerrykob



 
posted 10:52 pm on Mar 27, 2012 (gmt 0)

I can see it now: people off spending their time keyword-stuffing their JavaScript and CSS files.

(rolls eyes)

mslina2002

posted 11:04 pm on Mar 27, 2012 (gmt 0)

I wonder if they follow their own suggestions. For example:

www.google.com/robots.txt
maps.google.com/robots.txt
etc., etc.

Spot any Disallow on JS? CSS?

Do as I say, not as I do, dammit!
