
JavaScript and AJAX Forum

Ajax Linking, Redirects & Canonicalization
Looking for Best Practices for SEO

5+ Year Member

Msg#: 4385282 posted 3:39 pm on Nov 9, 2011 (gmt 0)

Hey guys,

Need some help with respect to best practices for linking, redirects, and canonical tag usage on an Ajax-driven website. I'm looking at Twitter as a reference because they have good Google rankings.

Should you 301 your escaped frag pages to the hash bang versions?
We currently don't redirect our escaped frag pages anywhere.
I see Twitter using both a 301 redirect and a window.location redirect on http://example.com/?_escaped_fragment_=/andersoncooper to the /#!/andersoncooper version.

I would have thought that if you used a 301 on the escaped frag version, Googlebot would not be able to crawl it; the whole point of having an escaped frag version is to serve a "static" version of the page to Google so they have something crawlable. I can see the window.location redirect making sense on the escaped frag, since you want to serve users the usable version of the page, but I don't get the 301.
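For context, the mapping Googlebot applies here is mechanical: a #! URL becomes an _escaped_fragment_ query parameter, and the server is expected to answer that request with an HTML snapshot. A rough sketch of both directions in plain JavaScript (encoding details simplified, and the function names are mine):

```javascript
// Rewrite a hashbang URL into the form Googlebot requests it as.
// Sketch only: the real scheme percent-encodes specific characters;
// encodeURIComponent is a close approximation.
function toEscapedFragment(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url;
  var base = url.slice(0, i);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(url.slice(i + 2));
}

// Reverse: recover the hashbang URL from an escaped-fragment request,
// e.g. for a window.location redirect back to the usable page.
function toHashBang(url) {
  var m = url.match(/^(.*?)[?&]_escaped_fragment_=([^&]*)$/);
  return m ? m[1] + '#!' + decodeURIComponent(m[2]) : url;
}
```

So http://example.com/?_escaped_fragment_=%2Fandersoncooper round-trips to http://example.com/#!/andersoncooper, which is what the window.location redirect on the escaped frag page appears to be doing.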

Then on the "pretty URL" we 301 redirect to the hashbang version. For reference, on http://example.com/andersoncooper they are using window.location to redirect to the #! version.

I would have thought that using a 301 from the pretty URL to the hashbang would make more sense…

Is it best practice to link to the #! URLs or to the pretty URLs throughout the site? We currently link to the hashbang versions, and I see Twitter also linking to hashbang URLs.

Canonical tags:
Should pretty URLs canonicalize to the #! versions, or should the #! versions canonicalize to the pretty URLs? And what should the escaped frag canonicalize to, if anything? We canonicalize our escaped frag versions to the hashbang versions. Looking at our example site, they are using the base URL tag, coupled with canonicalizing to the root "/" on all pages.
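One practical note, offered as an assumption rather than gospel: a fragment (the #!... part) is never sent to the server, so a rel=canonical can only realistically live in the snapshot or pretty-URL HTML, and pointing it at the pretty URL keeps the canonical on an address every client can actually request. A hypothetical helper for a snapshot template (the helper name and example.com host are mine):

```javascript
// Hypothetical helper for a snapshot template: emit a canonical tag
// pointing at the pretty URL (no #!, no _escaped_fragment_).
function canonicalTag(prettyPath) {
  return '<link rel="canonical" href="http://example.com' + prettyPath + '">';
}
```

For example, canonicalTag('/andersoncooper') yields <link rel="canonical" href="http://example.com/andersoncooper">.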

Very hard to find specific info from Google on what’s best practice with these issues – so any comments or help would be much appreciated.



5+ Year Member

Msg#: 4385282 posted 11:23 am on Nov 11, 2011 (gmt 0)

Anyone? :-)


WebmasterWorld Senior Member 5+ Year Member

Msg#: 4385282 posted 2:13 pm on Nov 13, 2011 (gmt 0)

IMO, for the time being, serve static HTML content to all spiders. I do not expect them to start messing with JS, nor do I want them to. In general I use jQuery, so Ajax requests are basically invisible to spiders: there is no window.location in the HTML source, nor any other trace of JS mangled in with the HTML tags.

Also, in the server logs, which I check regularly, there is no trace of any spider accessing the JS scripts. They do not access scripts so far, and I do not expect this to change anytime soon. If a site uses Ajax, that doesn't mean it cannot serve static HTML if need be.
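The approach described above (static HTML first, with Ajax layered on top) can be sketched roughly as follows; the partial=1 parameter and the .ajax-nav class are made up for illustration, and the jQuery wiring is browser-only:

```javascript
// Derive the URL of an HTML-fragment version of a page, so one pretty URL
// can serve full static HTML to spiders and a partial document to XHR.
// The partial=1 convention is hypothetical.
function partialUrl(href) {
  return href + (href.indexOf('?') === -1 ? '?' : '&') + 'partial=1';
}

// Browser-only wiring (jQuery assumed); spiders never run this and simply
// follow the plain hrefs, which is why no trace of JS shows up for them:
//   $(document).on('click', 'a.ajax-nav', function (e) {
//     e.preventDefault();
//     $('#content').load(partialUrl(this.getAttribute('href')));
//     history.pushState(null, '', this.getAttribute('href'));
//   });
```

The key point is that the href in the markup is an ordinary crawlable URL; the script only changes how JS-enabled browsers fetch it.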

Now, if you have figured out a hole or a bug in a spider that boosts SEO one way or another by injecting JS into the HTML in some way, let us know.
