|AJAX site, JS redirects and noscript: could Google see this as spam?|
| 10:26 pm on May 6, 2009 (gmt 0)|
These mirrored flat pages pull from the same source material as the real site (the content comes in via PHP includes), so the content is exactly the same.
In each of these flat pages we have a JS redirect in the head that points to the 'real' page on our AJAX site.
On the 'real' pages, we have noscript code that pushes people off to the flat pages.
Overall we're pretty happy with it: the flat site works well for users running NoScript or who have JS disabled for some reason, and when JS is turned back on, the redirect kicks in and the 'real' site works fine.
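To make sure I'm describing it clearly, here's a minimal sketch of the arrangement (URLs and file names are just placeholders, not our actual paths):

```html
<!-- Flat mirror page (e.g. /flat/about.html): a JS redirect in the head
     sends script-capable browsers to the AJAX version of the same page -->
<head>
  <script type="text/javascript">
    window.location.replace("/#about");
  </script>
</head>

<!-- 'Real' AJAX page: a noscript fallback pushes non-JS visitors
     back to the flat copy via a meta refresh -->
<noscript>
  <meta http-equiv="refresh" content="0; url=/flat/about.html">
</noscript>
```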
My main worry is Google mistaking this approach as cloaking or a sneaky redirect. I'm aware that similar tactics are part of the black-hat playbook. Any thoughts on how we might mitigate this risk, or will Google be smart enough to see what we're doing? Has anyone seen Google mistake this approach as black hat sneakiness?
| 11:02 pm on May 6, 2009 (gmt 0)|
I think you've a good chance of escaping algorithmic filtering, but without a great deal of care, pure AJAX implementations can end up like framesets - they fundamentally distort the model of the web that many users (and search engines) expect. And performance can suffer heavily as a result.
| 11:09 pm on May 6, 2009 (gmt 0)|
Tricky, but I would make sure only one version can be spidered.
| 11:19 pm on May 6, 2009 (gmt 0)|
We actually have deeplinking and all of that sorted out - even the back/forward buttons work. So in that respect we've kept the typical web functionality people have come to expect, whilst maintaining a UI that we think is genuinely different from the norm.
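For anyone wondering how that's possible in an AJAX site, the usual trick is to key everything off the URL fragment. A rough sketch of the idea (not our exact code - `loadPage()` here is a hypothetical content-loading function):

```html
<script type="text/javascript">
// Poll the URL fragment; when it changes (user clicks a link, or uses
// back/forward), load the matching content. This keeps deeplinks and
// the history buttons working even though the page never reloads.
var lastHash = window.location.hash;
setInterval(function () {
  if (window.location.hash !== lastHash) {
    lastHash = window.location.hash;
    loadPage(lastHash.replace('#', '') || 'home'); // e.g. '#about' -> 'about'
  }
}, 100);
</script>
```

Newer browsers can fire an event on fragment changes instead of polling, but polling was the lowest common denominator.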
And speaking of UI, that's the reason for the flat mirror. It's kind of hard to describe without showing you (it's not public yet), but you can think of it as an app-like interface. The site is for a creative agency so standing out from the crowd was very important.
That UI is what led us to AJAX...new page loads would have mucked up the interface, and also would have made the content more difficult to manage.
So you think we have a good chance of escaping the filters? That's my main worry. I know Google has always treated mirrored content as a grey area, simultaneously recommending it at the conferences but then keeping a close eye on abuses...
| 11:29 pm on May 6, 2009 (gmt 0)|
Incoming links are going to be a major issue. They'll likely point at the wrong version, because people copy them from their URL bar and paste them into other sites.
You need the new "canonical" tag here for sure, but if you use that, you need search engines to spider both copies of the site.
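For reference, the canonical tag is just a link element in the head of the duplicate page pointing at the version you want indexed (example.com is a placeholder here). Bear in mind it's a hint to the engines, not a directive:

```html
<!-- On each flat mirror page, declare the preferred URL -->
<link rel="canonical" href="http://www.example.com/about" />
```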
I made a pact to never duplicate a site for different browser versions back in 1997, and I have resisted duplicating things for any other reason ever since.