Msg#: 4385282 posted 3:39 pm on Nov 9, 2011 (gmt 0)
Need some help with best practices for linking, redirects and canonical tag usage on an Ajax-driven website. I'm looking at Twitter as a reference because they have good Google rankings.
Redirects: Should you 301 your escaped frag pages to the hashbang versions? We currently don't redirect our escaped frag pages anywhere. I see Twitter using both a 301 redirect and a window.location redirect to send http://example.com/?_escaped_fragment_=/andersoncooper to the /#!/andersoncooper version.
I would have thought that a 301 on the escaped frag version would stop Googlebot from crawling it, and the whole point of having an escaped frag version is to serve a "static" version of the page so Google has something crawlable. The window.location redirect on the escaped frag makes sense, since you want users to land on the usable version of the page, but I don't get the 301.
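For reference, the mapping between the two URL forms comes from Google's AJAX crawling scheme: the crawler rewrites the #! URL into the _escaped_fragment_ form before fetching, and the server can map it back. A rough sketch (example.com stands in for the real domain, and the helper names are my own):

```javascript
// #! URL as seen by users  ->  URL the crawler actually requests.
// The scheme requires percent-escaping %, #, & and + in the fragment.
function hashBangToEscapedFragment(url) {
  var parts = url.split('#!');
  if (parts.length < 2) return url; // not an AJAX URL, fetched as-is
  var frag = parts[1].replace(/%/g, '%25')
                     .replace(/#/g, '%23')
                     .replace(/&/g, '%26')
                     .replace(/\+/g, '%2B');
  var sep = parts[0].indexOf('?') === -1 ? '?' : '&';
  return parts[0] + sep + '_escaped_fragment_=' + frag;
}

// Inverse mapping, as a server might apply it to decide what to render
function escapedFragmentToHashBang(url) {
  var m = url.match(/^([^?]*)\?_escaped_fragment_=(.*)$/);
  if (!m) return url;
  return m[1] + '#!' + decodeURIComponent(m[2]);
}
```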
Then there's the "pretty URL": we 301 redirect ours to the hashbang version. For reference, on http://example.com/andersoncooper they are using window.location to redirect to the #! version.
I would have thought that a 301 from the pretty URL to the hashbang would make more sense…
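The window.location approach on the pretty URL amounts to something like the sketch below. hashBangFor is a hypothetical helper, and the path-to-fragment rule is just inferred from the Twitter-style URLs above, not a documented contract:

```javascript
// Map a "pretty" URL to its #! equivalent, e.g.
// http://example.com/andersoncooper -> http://example.com/#!/andersoncooper
function hashBangFor(prettyUrl) {
  var m = prettyUrl.match(/^(https?:\/\/[^\/]+)(\/.*)$/);
  if (!m) return prettyUrl; // leave anything unexpected alone
  return m[1] + '/#!' + m[2];
}

// In the page itself this would run as a client-side redirect:
// if (window.location.hash.indexOf('#!') !== 0) {
//   window.location.replace(hashBangFor(window.location.href));
// }
```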
Linking: Is it best practice to link to the #! URLs or to the pretty URLs throughout the site? We currently link to the hashbang versions, and I see Twitter also linking to hashbang URLs.
Canonical tags: Should pretty URLs canonicalize to the #! versions, or should the #! versions canonicalize to the pretty URLs? And what, if anything, should the escaped frag canonicalize to? We canonicalize our escaped frag versions to the hashbang versions. Looking at our example site, they are using a base URL tag coupled with a canonical pointing to the root "/" on all pages.
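To make our current setup concrete, the canonical tag on an escaped frag page pointing at its hashbang version could be generated like this (hypothetical helper; whether #! is the right canonical target is exactly what I'm asking):

```javascript
// Build the canonical <link> for an escaped-fragment page, pointing
// it back at the hashbang version of the same page.
function canonicalTagFor(escapedFragUrl) {
  var m = escapedFragUrl.match(/^([^?]*)\?_escaped_fragment_=(.*)$/);
  var href = m ? m[1] + '#!' + decodeURIComponent(m[2]) : escapedFragUrl;
  return '<link rel="canonical" href="' + href + '">';
}
```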
It's very hard to find specific info from Google on best practice for these issues, so any comments or help would be much appreciated.
Msg#: 4385282 posted 2:13 pm on Nov 13, 2011 (gmt 0)
IMO, for the time being, serve static HTML content to all spiders. I neither expect nor want them to start messing with JS. In general I use jQuery, so the Ajax requests are basically invisible to spiders: if you view the HTML source there is no window.location or any other trace of JS mixed in with the HTML tags.
Also, in the server logs, which I check regularly, there is no trace of any spider accessing the JS files. They don't fetch scripts so far, and I don't expect that to change anytime soon. Just because a site uses Ajax doesn't mean it can't serve static HTML when it needs to.
Now, if you've figured out a hole or a bug in a spider that boosts SEO one way or another by injecting JS into the HTML in some way, let us know.