


Ajax Linking, Redirects & Canonicalization

Looking for Best Practices for SEO

3:39 pm on Nov 9, 2011 (gmt 0)

Full Member

10+ Year Member

joined:Oct 25, 2005
posts: 307
votes: 0

Hey guys,

Need some help with best practices for linking, redirects and canonical tag usage on an Ajax-driven website. I'm looking at Twitter as a reference because they have good Google rankings.

Should you 301 your escaped frag pages to the hash bang versions?
We currently don't redirect our escaped frag pages anywhere.
I see Twitter using both a 301 redirect and a window.location redirect to send http://example.com/?_escaped_fragment_=/andersoncooper to the /#!/andersoncooper version.

I would have thought that if you used a 301 on the escaped frag version, Googlebot would not be able to crawl it; the whole point of having an escaped frag version is to serve a "static" version of the page to Google so they have something crawlable. I can see the window.location redirect making sense on the escaped frag, since you want to serve users the usable version of the page, but I don't get the 301.
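For anyone following along, the mapping the Ajax crawling scheme defines between the two URL forms can be sketched roughly like this (function names are mine, and the escaping is simplified to match the Twitter example above):

```javascript
// Sketch of the URL mapping in Google's Ajax crawling scheme: a crawler
// rewrites "#!" URLs into "?_escaped_fragment_=" URLs before fetching.
// "http://example.com/#!/andersoncooper"
//   -> "http://example.com/?_escaped_fragment_=/andersoncooper"
function toEscapedFragment(hashBangUrl) {
  var i = hashBangUrl.indexOf('#!');
  if (i === -1) return hashBangUrl; // not an Ajax URL, leave as-is
  var base = hashBangUrl.slice(0, i);
  var fragment = hashBangUrl.slice(i + 2);
  // Simplified escaping: %, #, & and + in the fragment are percent-escaped;
  // other characters (like "/") are left alone, as in Twitter's URLs.
  fragment = fragment.replace(/%/g, '%25').replace(/#/g, '%23')
                     .replace(/&/g, '%26').replace(/\+/g, '%2B');
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}

// Reverse mapping, e.g. for redirecting a crawler-style URL back for users.
function toHashBang(escapedUrl) {
  var m = escapedUrl.match(/^(.*?)[?&]_escaped_fragment_=([^&]*)$/);
  if (!m) return escapedUrl;
  return m[1] + '#!' + decodeURIComponent(m[2]);
}
```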

Then on the "pretty URL" we 301 redirect to the hashbang version. For comparison, on http://example.com/andersoncooper Twitter uses window.location to redirect to the #! version.

I would have thought that using a 301 from the pretty URL to the hashbang would make more sense…
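To be concrete, the window.location approach on the pretty URL presumably looks something like this (my sketch, not Twitter's actual code; the helper name is made up, and it's kept as a pure function so the mapping itself is easy to check):

```javascript
// Hypothetical client-side redirect from a "pretty" URL path
// (e.g. "/andersoncooper") to its hashbang equivalent ("/#!/andersoncooper").
function hashBangLocation(pathname) {
  if (pathname === '/' || pathname.indexOf('/#!') === 0) {
    return null; // already at the root or on a #! URL: no redirect needed
  }
  return '/#!' + pathname;
}

// In the page itself it would be applied roughly like this:
// var target = hashBangLocation(window.location.pathname);
// if (target) window.location.replace(target);
```

One possible reason for preferring window.location over a 301 here is that everything after the # never reaches the server, and browser handling of fragments in redirect Location headers was historically inconsistent; that's speculation on my part, though.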

Is it best practice to link to the #! URLs or to the pretty URLs throughout the site? We currently link to the hashbang versions, and I see Twitter also linking to hashbang URLs.

Canonical tags:
Should pretty URLs canonicalize to the #! versions, or should the #! URLs canonicalize to the pretty URLs? And what should the escaped frag canonicalize to, if anything? We canonicalize our escaped frag versions to the hashbang versions. Looking at Twitter, they are using the base URL tag, coupled with canonicalizing to the root "/" on all pages.
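To make our current arrangement concrete, here's roughly how the canonical tag on an escaped-fragment page could be generated, pointing at its #! version (purely illustrative, and our own setup rather than any recommendation from Google):

```javascript
// Illustrative only: build the <link rel="canonical"> tag that an
// escaped-fragment page would carry, pointing at its hashbang version.
function canonicalTagFor(escapedFragmentUrl) {
  var m = escapedFragmentUrl.match(/^(.*?)[?&]_escaped_fragment_=([^&]*)$/);
  if (!m) return null; // not an escaped-fragment URL
  var hashBangUrl = m[1] + '#!' + decodeURIComponent(m[2]);
  return '<link rel="canonical" href="' + hashBangUrl + '">';
}
```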

It's very hard to find specific info from Google on best practice for these issues, so any comments or help would be much appreciated.
11:23 am on Nov 11, 2011 (gmt 0)

Full Member

10+ Year Member

joined:Oct 25, 2005
posts: 307
votes: 0

Anyone? :-)
2:13 pm on Nov 13, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 30, 2007
votes: 0

IMO, for the time being, serve static HTML content to all spiders. I don't expect them to start messing with JS, nor do I want them to. In general I use jQuery, so Ajax requests are basically invisible to spiders: there's no window.location in the HTML source, nor any other trace of JS mixed into the HTML tags.

Also, in the server logs, which I check regularly, there is no trace of any spider accessing the JS scripts. They haven't accessed scripts so far, and I don't expect that to change anytime soon. Just because a site uses Ajax doesn't mean it can't serve static HTML when needed.
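To sketch what I mean (illustrative, framework-free pseudocode with made-up names): the server decides per request whether to return the static HTML snapshot or the normal Ajax shell, so no JS ever needs to run for the spider:

```javascript
// Serve pre-rendered static HTML when the request looks like a crawler's
// _escaped_fragment_ fetch, and the normal Ajax-driven page otherwise.
function chooseResponse(requestUrl) {
  var m = requestUrl.match(/[?&]_escaped_fragment_=([^&]*)/);
  if (m) {
    // Crawler: render a full HTML snapshot for this fragment path.
    return { kind: 'static-html', fragment: decodeURIComponent(m[1]) };
  }
  // Regular browser: serve the shell page; jQuery/Ajax fills it in.
  return { kind: 'ajax-shell' };
}
```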

Now, if you've figured out a hole or a bug in a spider that somehow boosts SEO one way or another by injecting JS into the HTML, let us know.
