Ajax/JS and SEO implications

dustin999

10:08 pm on Aug 11, 2006 (gmt 0)

10+ Year Member



I'm working on a site where most of my content will be obtained from an XML datafeed (using XMLHTTP) in JavaScript and written into div sections on my page via DIV.innerHTML. The reason is that I want to build a data structure for all of the data that populates this DIV, as it's mostly tables, and I want the flexibility of JavaScript so that users can sort rows by clicking the column headers and things like that.
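
Roughly what I'm doing, simplified (the feed URL, element names, and div id below are just placeholders):

    // create the request object (IE needs ActiveX, other browsers use XMLHttpRequest)
    var req = window.XMLHttpRequest ? new XMLHttpRequest()
                                    : new ActiveXObject("Microsoft.XMLHTTP");

    req.open("GET", "/datafeed.xml", true);
    req.onreadystatechange = function () {
      if (req.readyState == 4 && req.status == 200) {
        // build the table from the XML and drop it into the div
        var rows = req.responseXML.getElementsByTagName("row");
        var html = "<table><tr><th>Name</th><th>Price</th></tr>";
        for (var i = 0; i < rows.length; i++) {
          html += "<tr><td>" + rows[i].getAttribute("name") + "</td><td>"
                + rows[i].getAttribute("price") + "</td></tr>";
        }
        html += "</table>";
        // the content never appears in the page source, which is the SEO worry
        document.getElementById("dataDiv").innerHTML = html;
      }
    };
    req.send(null);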

So my question is, what's the best way to approach SEO with a site like this? I'm concerned that spiders won't see the majority of my content, since it's being pulled from XML and pushed into the DIV section... i.e. if you view the source of my page, you don't even see the content.

Should I make use of the NOSCRIPT feature and have a static version of my site that contains the content, just to accommodate SEO concerns? Or can I just ignore this because spiders are intelligent enough to handle JavaScript/Ajax? I'm guessing the first answer is the more appropriate one, but I wanted to get others' opinions.

Thanks,
Dustin

supermoi

4:47 pm on Aug 12, 2006 (gmt 0)

10+ Year Member



Anything done with JavaScript is not accessible to bots. They will see everything inside the <noscript> tag, though.
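
For example, something along these lines (the div id and table contents are just an illustration):

    <div id="dataDiv">
      <!-- javascript writes the sortable table in here -->
    </div>
    <noscript>
      <!-- static copy of the same data, visible to bots and to visitors without javascript -->
      <table>
        <tr><th>Name</th><th>Price</th></tr>
        <tr><td>Widget A</td><td>9.99</td></tr>
        <tr><td>Widget B</td><td>14.50</td></tr>
      </table>
    </noscript>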

daveVk

2:20 am on Aug 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Rather than use noscript tags, put that content in the div you intend to overwrite from JavaScript. Having to generate duplicate static content is a pain, so consider doing it all server-side. Also think about how much of the data you actually want indexed: unless you have a popular site, don't expect Google to index large numbers of database-generated pages.
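
Something like this, as a rough sketch (the id, markup, and loadFeedAndRender function are just placeholders for whatever your server-side code emits):

    <!-- the server writes the static table straight into the div, so spiders index it -->
    <div id="dataDiv">
      <table>
        <tr><th>Name</th><th>Price</th></tr>
        <tr><td>Widget A</td><td>9.99</td></tr>
        <tr><td>Widget B</td><td>14.50</td></tr>
      </table>
    </div>
    <script type="text/javascript">
      // after the page loads, fetch the feed and overwrite the div with the
      // sortable version; bots and visitors without javascript keep the table above
      window.onload = function () {
        loadFeedAndRender("dataDiv");  // hypothetical wrapper around the XMLHTTP code above
      };
    </script>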