| 12:36 am on Sep 15, 2008 (gmt 0)|
restless, what I experienced could have been a fluke, but it did appear that Google did in fact recognize the content added via the DOM and treated it negatively. It isn't clear if they're using the dynamically added text to change the "weight" of the page, but in my experience it was a net negative.
I'm not saying the OP shouldn't do it, but I would caution him to be careful about it and not to go overboard. For instance, if he's planning to add some dynamic content across a vast swath of pages, then he should consider just adding it to a few pages and wait a couple of days to see if google notices it or drops his rankings. If it turns out to be all good, then go ahead and add it to the remaining pages across the site.
By the way, I'm distinguishing between content and behavior. If the OP is just planning to add some action hooks (i.e., click, hover, etc.), then I don't see that causing a problem. But if he's going to dynamically add actual textual or HTML content to the page, then be more careful.
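To make the distinction concrete, here's a minimal sketch of the two cases (function names and markup are made up for illustration). The first only wires up behavior; the second generates real text that a JS-aware crawler could treat as part of the page's copy:

```javascript
// Case 1: behavior only -- attaching an event handler adds no
// indexable text to the page.
function attachHoverHighlight(el) {
  el.addEventListener('mouseover', function () {
    el.className += ' highlighted';
  });
}

// Case 2: content -- this builds real markup that, once injected,
// becomes part of the page's visible text.
function renderWhatsNew(items) {
  return '<ul>' + items.map(function (t) {
    return '<li>' + t + '</li>';
  }).join('') + '</ul>';
}

// In the browser you'd then inject it, e.g.:
// document.getElementById('whats-new').innerHTML =
//     renderWhatsNew(['New widgets page', 'Shipping update']);
```

It's the second pattern that could shift a page's keyword weight if the engine starts counting the injected text.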
| 9:59 pm on Sep 15, 2008 (gmt 0)|
> "not worth losing your search engine rankings..."
We try to stick to google's rules. How about google sticking to them as well?
| 10:01 pm on Sep 15, 2008 (gmt 0)|
"form links to content"?
dstiles, could you please clarify what you mean by that?
| 10:17 pm on Sep 15, 2008 (gmt 0)|
|How about google sticking to them as well? |
Google has to evolve or die, just like everything else on the web. It's in searchers' interest that Google can parse js successfully - and flash - so we can be sure that day will come.
Then the rules will change, and we'll have to adapt. That's show biz! :)
For the moment, I agree that JS content is largely ignored, which means you probably get no SEO benefit (or damage) from JS-delivered content.
So if that's the key content of your page, then I'd use a different method, if possible, such as a server side include for 'what's new' type content.
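For instance, a server-side include along these lines keeps the 'what's new' text in the HTML the server actually delivers, so crawlers see it without needing JS (this assumes Apache with mod_include enabled; the path is illustrative):

```html
<!-- Included before the page leaves the server,
     so the text appears in the plain HTML source -->
<!--#include virtual="/includes/whats-new.html" -->
```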
[edited by: Quadrille at 10:18 pm (utc) on Sep. 15, 2008]
| 7:38 pm on Sep 16, 2008 (gmt 0)|
Dannyboy - Google can, apparently, now follow certain types of form links, such as those used for selecting pages in a drop-down. There was a thread about it somewhere here a few weeks ago.
| 8:40 am on Sep 17, 2008 (gmt 0)|
Google is not thinking only of your needs and preferences, but of those of other webmasters too, and - much more important - web users.
When someone searches for, say, electric trains, they mostly do not mind if that information is in iframes, js, flash, or plain vanilla HTML (or any other yet to be imagined form, for that matter).
And since Google's mission (like that of all other SEs) is to find what people are searching for, then, logically, all SEs will try to find ways to read every system that displays information on the web.
On balance, I suspect most webmasters (especially those who like flash), will welcome that.
I'm not sure if Google ever said they "couldn't" read js - though they've admitted not being able to read it 100% - but I'm 100% sure they never said they "wouldn't" - quite the opposite - they've always made it clear that they hoped to crack that nut.
There are plenty of ways you can tell SEs not to index your stuff if you so desire - but holding back the inevitable progress of search technology is not among those options. Sorry. ;)
It may be 'wasted time' to you, but it's usually called progress. While we can all make our sites as 'future proof' as possible, it is unrealistic not to expect to have to move with the times; it's not only Google that advances - every technology on the web is either advancing, planning to, or looking forward to obsolescence.
Best to go with the flow, if you want to benefit from change. :)
[edited by: Quadrille at 8:52 am (utc) on Sep. 17, 2008]
| 2:14 pm on Sep 17, 2008 (gmt 0)|
I can see your point but I can also see several problems.
As I said, if we wanted to expose the content to google we would do so with plain links. What comes up through other means is not relevant to searches - for example, popup Terms & Conditions, Help and so forth.
| 4:16 pm on Sep 17, 2008 (gmt 0)|
Interesting questions (I never said it would be easy!).
If and when Google gets to read JS effectively (I doubt they'll ever reach 100% - but my suspicion is that they are getting closer), then it'll only be a matter of time before we learn the implications.
If you *don't* want to expose the content, then you'll simply not be able to use JS as a blocker - but as I said, there are other ways.
If you *do* want to use JS with content - and from my reading, many do - then people will experiment. Personally, I wouldn't mind if they do follow to pop-up boxes, as it will force everyone to address that as an issue in itself (but that's just me!).
We'll see when it happens ... but the key is to see such developments from the *searchers'* POV; because that's how Google will, and that's how js developers will, even if they dress it up to please web developers ;)
And some who currently don't use js may have to think again, as it will be used in different ways, while some who never blocked it before may start to. Who knows?
Don't forget it will give Google problems too; they may have to rethink how they set up Adwords/Adsense. It'll be fun ...
[edited by: Quadrille at 4:17 pm (utc) on Sep. 17, 2008]
| 5:13 pm on Sep 17, 2008 (gmt 0)|
Preventing Google from reading JS content isn't too difficult if you know it's a possibility AND if they (and other SEs) honor the nofollow convention. It probably needs a new meta tag, though. Actually, it's way past time (by many years) that the robots.txt standard was updated.
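For reference, the existing mechanisms look like this - both are long-standing conventions, and neither is JS-specific, which is the point about the standards needing an update (the blocked path is illustrative):

```html
<!-- Per-page: ask compliant engines not to index the page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

```
# robots.txt: block compliant crawlers from a whole directory
User-agent: *
Disallow: /scripts/
```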
My other point was the possibility of reading drop-down Select navigation forms, which can really only be blocked (as far as I know) by detecting the SE's UA and dropping the form. I can do it but I'm sure a lot of webmasters can't. See notes re: meta tags/robots.txt above.
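A rough sketch of that UA-detection approach, written here as JavaScript for illustration (the bot patterns are a tiny illustrative sample, not an exhaustive list - and note that serving crawlers different markup than users is exactly the kind of thing engines may treat as cloaking, so this is a sketch of the tactic described, not a recommendation):

```javascript
// Illustrative sample of crawler UA fragments; real lists are much longer.
var BOT_PATTERNS = [/Googlebot/i, /Slurp/i, /msnbot/i];

function isSearchEngine(userAgent) {
  return BOT_PATTERNS.some(function (re) { return re.test(userAgent); });
}

// Decide server-side which navigation to emit: plain links for
// crawlers, the drop-down <select> form for everyone else.
function navigationHtml(userAgent, options) {
  if (isSearchEngine(userAgent)) {
    return options.map(function (o) {
      return '<a href="' + o.url + '">' + o.label + '</a>';
    }).join(' ');
  }
  var items = options.map(function (o) {
    return '<option value="' + o.url + '">' + o.label + '</option>';
  }).join('');
  return '<select onchange="location=this.value">' + items + '</select>';
}
```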
But as I said, if one doesn't a) know that all of this is a problem, and b) do something about it, then will Google penalise for the (suddenly) new duplicate content via alternative URLs, presented through no fault of the webmaster's?
As to "it will give google problems too"... they always manage to pass the hassle on to us, have you noticed? :)
| 1:09 am on Sep 18, 2008 (gmt 0)|
Up to a point; but it's the webmaster's job to plan ahead. Hopefully, a thread like this will help people do just that.
And I'm sure that regular readers in these forums will be among the first to know of any changes in SEs and JS - so they'll have an advantage.
I still believe it's a minority who'll be seeking to hide content; most of us want to shout our content from the rooftops, and get it seen by as many visitors as possible :)
| 2:55 am on Sep 18, 2008 (gmt 0)|
Quadrille, I don't think the potential problem is for those that are using JS to "hide" content. I think it's those that are using JS to innocently add textual content to the page that will have their rankings rocked out of the blue if google implements more advanced filters.
Dynamically added content could change the keyword "weight" of the page, which could be damaging for many sites.
This is one of those things that we may become aware of when folks begin to report how their rankings have mysteriously dropped. Folks should be aware of the possible future ramifications of adding content using JS.
| 5:05 am on Sep 18, 2008 (gmt 0)|
|add textual content to the page that will have their rankings rocked out of the blue |
Fair comment; the time will come when those who expect to be affected will need to review their pages.
I'm not sure that js has ever been the ideal way to manage 'keyword weight', and I'm sure it's almost time for folk who do that routinely to start thinking again.
It's already a practice that confuses non-JS users, if it doesn't make pages impossible to follow outright.
Of course, it'll be high noon for those who do it 'non-innocently' ;)
| 10:30 pm on Sep 27, 2008 (gmt 0)|
Not at all.
Google can see it; what we don't know is how much Google can READ it, and how much Google CHOOSES to read it (or not).
There's a fair bit of evidence that Google is beginning to get to grips with JS, and no reason to think that the process will stop at the beginning. Indeed, as JS is widely used for SE-unfriendly acts, there's EVERY reason to believe that research and development will continue in that area.
Personally, I suspect that Google (and other SEs) can make much more sense of JS than they are letting on, and that's simply because their JS readers are not yet reliable enough for them to go public. But that is pure speculation on my part, based only on my interpretation of how These Things Work. ;)
| 11:04 pm on Sep 27, 2008 (gmt 0)|
Right on, Quadrille. Especially with the mushrooming use of AJAX (which Google loves in their own applications) you can bet that all search engines are hard at work in this area.
| 5:53 am on Sep 28, 2008 (gmt 0)|
I hope this sheds some light on this issue!
We have a well-trusted, authoritative 9-year-old ecommerce site with over 300,000 pages indexed in Google (NON SUPPLEMENTAL).
Just about 2 weeks ago we began losing thousands of pages and lost 50% of our traffic. I quickly realized that pages which had ranked in the top 3 had fallen under the -950 penalty.
I spent over 20 hours trying to figure out why our site, which for the past 5 years had never been affected by any update or algo change, was now failing.
A quick look at a cached page on Google, and there it was: our LH navigation, clear as day.
There has been conjecture in the past as to whether or not JS links pass PR. Based on what I now see, yes, in fact Google does count those links and does pass PR through the internal link structure.
The killer for us is that rendering the JS on all pages has over-optimized them and added too many on-page links, putting us into the -950 penalty as well as kicking in the duplicate content filters.
How do I know this? Well, we quickly switched to AJAX instead of inline JS to render the LH navigation, and the pages that have since been spidered are back to the top, with no LH nav to be found in the source.
Again, the JS code and the files on our site had not changed for some time. So my deduction is that Google just implemented something that is much better at accessing code and content hidden using JS.
If you are hiding links or content using JS, I would quickly recommend using AJAX before you get hammered, as we did.
Hope this helps
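The approach described above amounts to fetching the navigation markup from a separate URL after the page loads, so it never appears in the page's HTML source. A minimal sketch, assuming a browser environment (the URL and element id are made up for illustration):

```javascript
// Fetch navigation markup after page load and inject it into the
// page, keeping it out of the delivered HTML source.
function loadNavigation(url, targetId) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(targetId).innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
}

// e.g.:
// window.onload = function () { loadNavigation('/nav.html', 'lh-nav'); };
```

As Quadrille notes in the reply below-thread, there's no guarantee content fetched this way stays invisible to crawlers forever.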
| 10:13 am on Sep 28, 2008 (gmt 0)|
Thanks for sharing that - but I suspect that your solution is not exactly future proof; if your conclusions are correct (and they seem to fit the facts), then don't you think that many aspects of AJAX will be next in line?
As Tedster says, SEs are sure to be busy in that area. And even now, no-one knows AJAX like Google knows AJAX!
Why not review your policy of over-optimizing, which would make you 100% future proof - and probably make for a better visitor experience? (Sounds to me like you have very, very cluttered pages!)