Msg#: 3231385 posted 1:41 pm on Jan 26, 2007 (gmt 0)
We're all supposed to develop sites and content with the visitor's experience in mind, but I get cold feet doing anything with Ajax that might be mistaken for a spam attempt, such as show/hide panels with content inside them.
Is graceful degradation, where non-JS clients are delivered all the content that would otherwise be hidden, an SEO-friendly option?
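For instance, something like this (a minimal sketch; the id, class, and link text are just placeholders), where the full content sits in the markup and JS only toggles its visibility:

    <div id="panel-1" class="panel">
      <h3>Panel heading</h3>
      <p>The full content is in the markup, so non-JS clients and crawlers always get it.</p>
    </div>

    <script type="text/javascript">
    // Collapse the panel and add a toggle link only once we know
    // scripting is available, so no-JS clients never see a hidden page.
    window.onload = function () {
      var panel = document.getElementById('panel-1');
      panel.style.display = 'none';
      var toggle = document.createElement('a');
      toggle.href = '#panel-1';
      toggle.appendChild(document.createTextNode('Show details'));
      toggle.onclick = function () {
        panel.style.display = (panel.style.display === 'none') ? '' : 'none';
        return false; // the href stays as a plain fallback
      };
      panel.parentNode.insertBefore(toggle, panel);
    };
    </script>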
Apart from forms, I don't have any experience delivering user content through Ajax.
Msg#: 3231385 posted 1:46 pm on Jan 26, 2007 (gmt 0)
While I have no idea what Google "thinks" of XML/Ajax/RSS content in terms of indexing it, I have been visited by the googlebot (v2.1) doing what appears to be "sampling" of the content in the <enclosure> tag of one of my RSS 2.0/podcast feeds.
The sampling appears organized: the bot reads roughly 1.5 MB of the audio/mpeg content, then exits, waits around 15 seconds, and reads again. It repeats this as many times as necessary until it has covered the full content-length of the enclosure.
This seemed to start once I added the feed to my personalized Google page, which caused the bot to refresh the feed every three hours or so. The feed has a <ttl> of 10 minutes, so the googlebot is clearly ignoring that.
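For reference, the relevant parts of the feed look roughly like this (the URL and length are made up):

    <channel>
      <ttl>10</ttl>
      <item>
        <title>Episode 42</title>
        <enclosure url="http://example.com/audio/ep42.mp3"
                   length="14680064"
                   type="audio/mpeg" />
      </item>
    </channel>

Per RSS 2.0, <ttl> is in minutes and length is the enclosure's size in bytes, which is presumably what the bot uses to know when it has sampled the whole file.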
I guess that means the googlebot can read and understand XML. What it actually does with the content is anyone's guess at this point.
Msg#: 3231385 posted 8:15 pm on Feb 14, 2007 (gmt 0)
Good discussion and I think this answers my question.
My concern is that Google will not see anything on this page. It will not be googleable at all.
It sounds like this is the case. But many sites (including the Google homepage) build content the same way. Will Google be able to evolve? Or is there a workaround on our end that will make the content spiderable?
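One workaround that gets discussed is the so-called "Hijax" pattern: give every Ajax-loaded view a real URL in the href, then intercept the click with JS. Spiders and non-JS clients simply follow the link to a plain HTML page with the same content. A rough sketch (the URL, panel id, and loadPanel name are hypothetical):

    <a href="/products/widgets.html" onclick="return loadPanel(this.href);">Widgets</a>
    <div id="panel"></div>

    <script type="text/javascript">
    // Fetch the plain HTML page and inject it into the panel.
    function loadPanel(url) {
      var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                      : new ActiveXObject('Microsoft.XMLHTTP');
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById('panel').innerHTML = xhr.responseText;
        }
      };
      xhr.open('GET', url, true);
      xhr.send(null);
      return false; // cancel normal navigation only after the request starts
    }
    </script>

Either way the same content lives at a crawlable URL, so nothing depends on the spider executing JS.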