
Google SEO News and Discussion Forum

    
Can Google read JavaScript-produced content?
Northstar
msg:3744605
2:28 pm on Sep 14, 2008 (gmt 0)

Can Google read content that is produced via JavaScript? I have a site that produces new content through a JavaScript call:

<script src="http://www.example.com/category/Whats_new.js"></script>
<script type="text/javascript">
<!--
get_Whats_new();
// -->
</script>

Will Google see the produced content, or only this script reference in the HTML?
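
For context, a function like get_Whats_new() in an external .js file typically writes the "what's new" markup into the page as the browser parses it, so the text never appears in the raw HTML that a non-JS crawler downloads. The original file isn't quoted in the thread; the following is only a minimal sketch of how such a script commonly works, with invented placeholder items:

// Whats_new.js - hypothetical sketch, not the poster's actual file
function get_Whats_new() {
    var items = ["New widget range added", "Autumn catalogue now online"];
    var html = "<ul>";
    for (var i = 0; i < items.length; i++) {
        html += "<li>" + items[i] + "</li>";
    }
    html += "</ul>";
    // document.write only works while the page is still being parsed,
    // which matches the inline get_Whats_new() call in the markup above
    document.write(html);
}

Because the list only exists after the browser executes the script, a crawler that doesn't run JavaScript sees nothing but the two script tags.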

 

dannyboy
msg:3744755
12:04 am on Sep 15, 2008 (gmt 0)

I've noticed that Google was able to cache some content that was added dynamically via the DOM. I know there are those who say Googlebot can't read JavaScript, so it doesn't matter what you do in there, but I would exercise extreme caution if I were you.

It's not worth losing your search engine rankings because you decide to add a bunch of stuff via JavaScript.

restless
msg:3744760
12:29 am on Sep 15, 2008 (gmt 0)

> It's not worth losing your search engine rankings because you decide to add a bunch of stuff via JavaScript.

But if you can create a better user experience because of the JavaScript content, then I would say go for it. Are you assuming the JavaScript is going to add rubbish content to the page?

dannyboy
msg:3744761
12:36 am on Sep 15, 2008 (gmt 0)

restless, what I experienced could have been a fluke, but it did appear that Google did in fact recognize the content added via the DOM and treated it negatively. It isn't clear whether they're using the dynamically added text to change the "weight" of the page, but in my experience it was a net negative.

I'm not saying the OP shouldn't do it, but I would caution him to be careful about it and not to go overboard. For instance, if he's planning to add some dynamic content across a vast swath of pages, he should consider adding it to just a few pages first and waiting a couple of days to see whether Google notices it or drops his rankings. If it turns out to be all good, then go ahead and add it to the remaining pages across the site.

By the way, I'm distinguishing between content and behavior. If the OP is just planning to add some action hooks (i.e. click, hover, etc.), then I don't see that causing a problem. But if he's going to dynamically add actual textual or HTML content to the page, then he should be more careful.
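
To illustrate the distinction dannyboy is drawing, here are two hypothetical snippets (the element ids and wording are invented): behaviour-only JavaScript attaches actions to markup that already exists in the HTML, while content-adding JavaScript creates new text through the DOM.

// Behaviour only: the heading is already in the HTML; the script just reacts to clicks
document.getElementById("price-heading").onclick = function () {
    alert("All prices include VAT");
};

// Content: this paragraph exists only after the script runs, so its text is
// invisible to any crawler that doesn't execute JavaScript
var p = document.createElement("p");
p.appendChild(document.createTextNode("Free delivery on orders over 50 widgets."));
document.getElementById("offers").appendChild(p);

The first snippet changes nothing a crawler would index; the second adds indexable text that only a JavaScript-capable crawler could ever see.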

dstiles
msg:3745307
9:59 pm on Sep 15, 2008 (gmt 0)

> "not worth losing your search engine rankings..."

And this is going to be an ongoing problem as Google manages to read more and more of a site's contents. Yesterday you could include form links to content and be sure Google would not read it. Today they are likely to read it and then penalise you for inappropriate / duplicate / whatever content. Same with Flash. Next it will be JavaScript.

We try to stick to Google's rules. How about Google sticking to them as well?

dannyboy
msg:3745309
10:01 pm on Sep 15, 2008 (gmt 0)

"form links to content"?

dstiles, could you please clarify what you mean by that?

Quadrille
msg:3745316
10:17 pm on Sep 15, 2008 (gmt 0)

> How about Google sticking to them as well?

Google has to evolve or die, just like everything else on the web. It's in searchers' interest that Google can parse JS successfully - and Flash - so we can be sure that day will come.

Then the rules will change, and we'll have to adapt. That's show biz! :)

For the moment, I agree that JS content is largely ignored, which means you probably get no SEO benefit (or damage) from JS-delivered content.

So if that's the key content of your page, then I'd use a different method if possible, such as a server-side include for 'what's new' type content.

[edited by: Quadrille at 10:18 pm (utc) on Sep. 15, 2008]
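
For the 'what's new' case in the opening post, a server-side include is one way to get the same block into the HTML that is actually delivered. A minimal sketch, assuming the server has SSI enabled and that /category/whats_new.html is a plain HTML fragment (both are assumptions, not details from the thread):

<div id="whats-new">
  <!--#include virtual="/category/whats_new.html" -->
</div>

The included fragment arrives with the rest of the page, so nothing depends on the visitor - or the crawler - running JavaScript.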

dstiles
msg:3745868
7:38 pm on Sep 16, 2008 (gmt 0)

Dannyboy - Google can now, apparently, follow certain types of form links, such as those used for selecting pages in a drop-down. There was a thread about it somewhere here a few weeks ago.

Quadrille - why is it in searchers' interest? If I wanted Google to index my JavaScript/form-based content, I would provide a proper link to it - which I do on some sites that use a drop-down Select. Until now I haven't had to worry about Google reading two sets of links, because they said they couldn't/wouldn't. Now I have to go in and modify the sites; more wasted time.

Quadrille
msg:3746158
8:40 am on Sep 17, 2008 (gmt 0)

Google is not thinking only of your needs and preferences, but of those of other webmasters too, and - much more importantly - of web users.

When someone searches for, say, electric trains, they mostly do not mind whether that information is in iframes, JS, Flash, or plain vanilla HTML (or any other yet-to-be-imagined format, for that matter).

And as Google's mission (and that of all other SEs) is to find what people are searching for, then, logically, all SEs will try to find ways to read every system that displays information on the web.

On balance, I suspect most webmasters (especially those who like Flash) will welcome that.

I'm not sure Google ever said they "couldn't" read JS - though they've admitted not being able to read it 100% - but I'm 100% sure they never said they "wouldn't". Quite the opposite - they've always made it clear that they hoped to crack that nut.

There are plenty of ways you can tell SEs not to index your stuff if you so desire - but holding back the inevitable progress of search technology is not among those options. Sorry. ;)

It may be 'wasted time' to you, but it's usually called progress. While we can all make our sites as 'future-proof' as possible, it is unrealistic not to expect to have to move with the times; it's not only Google that advances - every technology on the web is either advancing, planning to, or looking forward to obsolescence.

Best to go with the flow, if you want to benefit from change. :)

[edited by: Quadrille at 8:52 am (utc) on Sep. 17, 2008]

dstiles
msg:3746305
2:14 pm on Sep 17, 2008 (gmt 0)

I can see your point but I can also see several problems.

As I said, if we wanted to expose the content to Google we would do so with plain links. What comes up through other means is not relevant to searches - for example, pop-up Terms & Conditions, Help and so forth.

In any case, how does/would Google present the link to JS content? Certainly not in a pop-up window, I would hope! And if it went to the link's originating page, how would people without JavaScript find the content?

Quadrille
msg:3746439
4:16 pm on Sep 17, 2008 (gmt 0)

Interesting questions (I never said it would be easy!).

If and when Google gets to read JS effectively (I doubt they'll ever reach 100% - but my suspicion is that they are getting closer), then it'll only be a matter of time before we learn the implications.

If you *don't* want to expose the content, then you'll simply not be able to use JS as a blocker - but as I said, there are other ways.

If you *do* want to use JS with content - and from my reading, many do - then people will experiment. Personally, I wouldn't mind if they do follow to pop-up boxes, as it will force everyone to address that as an issue in itself (but that's just me!).

We'll see when it happens ... but the key is to see such developments from the *searchers'* POV, because that's how Google will, and that's how JS developers will, even if they dress it up to please web developers ;)

And some who currently don't use JS may have to think again, as it will be used in different ways, while some who never blocked it before may start to. Who knows?

Don't forget it will give Google problems too; they may have to rethink how they set up AdWords/AdSense. It'll be fun ...

[edited by: Quadrille at 4:17 pm (utc) on Sep. 17, 2008]

dstiles
msg:3746496
5:13 pm on Sep 17, 2008 (gmt 0)

I think a lot of people will soon be blocking JavaScript anyway. I added the Firefox script blocker yesterday and have recommended it to others. It's getting a dangerous world out there. :(

Preventing Google from reading JS content isn't too difficult if you know it's a possibility AND if they (and other SEs) honour the nofollow convention. It probably needs a new meta tag, though. Actually, it's way past time (by many years) that the robots.txt standard was updated.

My other point was the possibility of Google reading drop-down Select navigation forms, which can really only be blocked (as far as I know) by detecting the SE's user agent and dropping the form. I can do it, but I'm sure a lot of webmasters can't. See notes re: meta tags/robots.txt above.
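
For readers who haven't used one, the kind of drop-down Select navigation being described usually looks something like the sketch below (the URLs and labels are placeholders); it is the onchange handler, not an ordinary anchor, that takes the visitor to the chosen page:

<form action="">
  <select name="section" onchange="if (this.value) window.location.href = this.value;">
    <option value="">Choose a section...</option>
    <option value="/widgets/">Widgets</option>
    <option value="/gadgets/">Gadgets</option>
  </select>
</form>

If a spider starts pulling the option values out of forms like this, it can reach pages that were otherwise only linked through other URLs - which is where the duplicate-content worry below comes from.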

But as I said, if one doesn't (a) know that all of this is a problem and (b) do something about it, then will Google penalise for the (suddenly) new duplicate content via alternative URLs, presented through no fault of the webmaster?

As to "it will give google problems too"... they always manage to pass the hassle on to us, have you noticed? :)

Quadrille
msg:3746802
1:09 am on Sep 18, 2008 (gmt 0)

Up to a point; but it's the webmaster's job to plan ahead. Hopefully, a thread like this will help people do just that.

And I'm sure that regular readers in these forums will be among the first to know of any changes in SEs and JS - so they'll have an advantage.

I still believe it's a minority who'll be seeking to hide content; most of us want to shout our content from the rooftops, and get it seen by as many visitors as possible :)

dannyboy
msg:3746835
2:55 am on Sep 18, 2008 (gmt 0)

Quadrille, I don't think the potential problem is for those who are using JS to "hide" content. I think it's those who are innocently using JS to add textual content to the page who will have their rankings rocked out of the blue if Google implements more advanced filters.

Dynamically added content could change the keyword "weight" of the page, which could be damaging for many sites.

This is one of those things that we may become aware of when folks begin to report how their rankings have mysteriously dropped. Folks should be aware of the possible future ramifications of adding content using JS.

Quadrille
msg:3746863
5:05 am on Sep 18, 2008 (gmt 0)

> add textual content to the page that will have their rankings rocked out of the blue

Fair comment; the time will come when those who expect to be affected will need to review their pages.

I'm not sure that JS has ever been the ideal way to manage 'keyword weight', and I'm sure it's almost time for folk who do that routinely to start thinking again.

It's already a policy that will be confusing to non-JS users, if not making pages impossible to follow.

Of course, it'll be high noon for those who do it 'non-innocently' ;)

Northstar
msg:3753545
10:17 pm on Sep 27, 2008 (gmt 0)

> It's not worth losing your search engine rankings because you decide to add a bunch of stuff via JavaScript.

It is not my choice to use JavaScript to show this content; it is just the way the program produces it. So is the general consensus that Googlebot can't see it?

Quadrille
msg:3753546
10:30 pm on Sep 27, 2008 (gmt 0)

Not at all.

Google can see it; what we don't know is how much Google can READ it, and how much Google CHOOSES to read it (or not).

There's a fair bit of evidence that Google is beginning to get to grips with JS, and no reason to think that the process will stop at the beginning. Indeed, as JS is widely used for SE-unfriendly acts, there's EVERY reason to believe that research and development will continue in that area.

Personally, I suspect that Google (and other SEs) can make much more sense of JS than they are letting on, and that's simply because their JS readers are not yet reliable enough for them to go public. But that is pure speculation on my part, based only on my interpretation of how These Things Work. ;)

tedster
msg:3753557
11:04 pm on Sep 27, 2008 (gmt 0)

Right on, Quadrille. Especially with the mushrooming use of AJAX (which Google loves in their own applications), you can bet that all search engines are hard at work in this area.

Reading JavaScript is one thing, and evaluating it is another. If you want Google to use some content in evaluating your page's relevance, I'd still stay away from JavaScript to produce it. JavaScript to change its visibility is fine - but the content had still better be there in the default source code and not require a click or a server call to be generated.
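
A minimal sketch of the pattern tedster describes (the id and wording are invented for illustration): the content ships in the delivered HTML, and JavaScript only toggles whether it is displayed.

<div id="details" style="display: none;">
  Full specifications: 240V, 3-year warranty, ships within 48 hours.
</div>
<a href="#" onclick="document.getElementById('details').style.display = 'block'; return false;">
  Show details
</a>

The text is in the source whether or not the script runs; the click changes only its visibility, which is the difference between producing content with JavaScript and merely revealing it.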

asgdrive
msg:3753631
5:53 am on Sep 28, 2008 (gmt 0)

I hope this sheds some light on this issue!

We have a very trusted, authoritative 9-year-old ecommerce site with over 300,000 pages indexed in Google (non-supplemental).

Because of the size and breadth of the categories and products, we use a left-hand (LH) category hover menu with over 200 category and subcategory links, which until this past week was handled with JavaScript.

Just about 2 weeks ago we began losing thousands of pages and lost 50% of our traffic. I quickly realized that pages that had ranked in the top 3 were hit with the -950 penalty.

I spent over 20 hours trying to figure out why our site, which for the past 5 years had never been affected by any update or algo change, was now failing.

After a quick look at a cached page on Google, there it was: our LH navigation, as clear as day.

As a note: we have the JS directory disallowed in the robots.txt file, and we have used all the current on-page techniques to keep robots from rendering the JavaScript. It appears that Google has indeed cracked the code, and furthermore ignores all requests to stay OUT!
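
For anyone following along, "the JS directory disallowed in the robots.txt file" would normally look something like this (the directory name here is a guess, not the poster's actual path):

User-agent: *
Disallow: /js/

The point being made is that even with the script files blocked from crawling, the rendered navigation still showed up in Google's cached copies of the pages.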

There has been conjecture in the past as to whether or not JS links pass PR. Based on what I now see, yes, Google does in fact count those links and does pass PR through the internal link structure.

The killer for us is that rendering the JS on all pages has over-optimized them and created too many on-page links, putting us into the -950 penalty as well as kicking in the duplicate content filters.

How do I know this? Well, we quickly switched to AJAX instead of inline JS to render the LH navigation, and the pages that have since been spidered are back to the top, with no LH nav to be found in the source.
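
In 2008 terms, "using AJAX to render the LH navigation" generally meant fetching the menu markup in a separate request after the page had loaded and injecting it into a placeholder. A rough sketch of that approach - the URL and element id are illustrative, not taken from the post:

// Fetch the navigation fragment after page load and inject it into a placeholder div
window.onload = function () {
    var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                    : new ActiveXObject("Microsoft.XMLHTTP");
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("lh-nav").innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", "/nav/categories.html", true);
    xhr.send(null);
};

Because the menu arrives via a separate request after load, it never appears in the page's own source, which is why the poster no longer sees the LH nav when viewing the raw or cached HTML.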

Again the JS code and the files on our site have not changed for some time. So my deduction is that Google just implemented something that is much better at accessing hidden code and files using JS.

If you are hiding links or content using JS, I would quickly recommend using AJAX before you get hammered, as we did.

Hope this helps

B

Quadrille
msg:3753697
10:13 am on Sep 28, 2008 (gmt 0)

Thanks for sharing that - but I suspect that your solution is not exactly future-proof; if your conclusions are correct (and they seem to fit the facts), then don't you think that many aspects of AJAX will be next in line?

As tedster says, SEs are sure to be busy in that area. And even now, no one knows AJAX like Google knows AJAX!

Why not review your policy of over-optimizing, which would then enable you to be 100% future-proof - and probably make for a better visitor experience (it sounds to me like you have very, very cluttered pages!).

Small Website Guy
msg:3754430
1:16 pm on Sep 29, 2008 (gmt 0)

Google has some of the world's smartest programmers. If they can program a web browser that reads JavaScript, OF COURSE they can program their spiders to read it!

The issue is probably more about CPU time. Interpreting JavaScript has to use up a lot more resources than simply reading HTML. But with computers getting more and more powerful, it would make sense for Google to start reading JavaScript on a certain percentage of web pages.
