Google SEO News and Discussion Forum
AJAX site, JS redirects and noscript: could Google see this as spam?
Illah

10+ Year Member
Msg#: 3908295 posted 10:26 pm on May 6, 2009 (gmt 0)

We're developing an AJAX site (a corporate site for a creative agency, with an AJAX UI), and to make the deeper content reachable we have a mirror of the site that's fully accessible to search engines and to users without JavaScript.

These mirrored flat pages pull from the same source material as the real site (the content comes in via PHP includes), so the content is exactly the same.

In each of these flat pages we have a JS redirect in the head that points to the 'real' page on our AJAX site.
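To illustrate, each flat page looks roughly like this (the paths and the #fragment scheme below are made-up placeholders, not our real URLs):

    <!-- /flat/work/project.php - a hypothetical mirrored flat page -->
    <html>
    <head>
      <title>Project</title>
      <script type="text/javascript">
        // JS is available: send the visitor to the AJAX version of this page.
        // The target URL and fragment scheme are illustrative only.
        window.location.replace('/#/work/project');
      </script>
    </head>
    <body>
      <?php include 'content/project.html'; /* same source the AJAX site pulls from */ ?>
    </body>
    </html>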

On the 'real' pages, we have <noscript> code that pushes non-JS visitors off to the flat pages.
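And the reverse direction, roughly (again, a placeholder URL):

    <!-- In the head of the AJAX page: with JS off, the meta refresh
         inside noscript fires and sends the visitor to the flat copy. -->
    <noscript>
      <meta http-equiv="refresh" content="0; url=/flat/work/project.php">
    </noscript>

A plain link inside a <noscript> block in the body would do the same job if the meta-refresh-inside-noscript trick bothers your validator.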

Overall we're pretty happy with it: the flat site works great for users running NoScript or who have JS off for some other reason. When JS is turned back on, the redirect kicks in and the 'real' site works fine.

My main worry is Google mistaking this approach for cloaking or a sneaky redirect. I'm aware that similar tactics are part of the black-hat playbook. Any thoughts on how we might mitigate this risk, or will Google be smart enough to see what we're doing? Has anyone seen Google mistake this approach for black-hat sneakiness?

--Illah

 

Receptional Andy
Msg#: 3908295 posted 11:02 pm on May 6, 2009 (gmt 0)

Forgive me for being blunt, Illah, but it seems like javascript spaghetti to me. Why can't you have one URL and use javascript to update relevant sections for users with JS enabled? Why the need to redirect?
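By that I mean the usual progressive-enhancement pattern: real URLs in the markup, with javascript intercepting the clicks. A rough sketch - every name here is hypothetical, and it assumes the server can return a bare HTML fragment on request and that the page has a container with id="content":

    <a href="/work/project" onclick="return loadSection(this.href);">Project</a>

    <script type="text/javascript">
    // Fetch the linked page's content and swap it into the page; fall back
    // to a normal page load if XMLHttpRequest isn't available.
    function loadSection(url) {
      if (!window.XMLHttpRequest) return true; // no AJAX: follow the link normally
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url + '?fragment=1', true); // assumed server hook returning bare HTML
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById('content').innerHTML = xhr.responseText;
        }
      };
      xhr.send(null);
      return false; // cancel the default navigation
    }
    </script>

One URL per piece of content, no redirect, and the href still works for spiders and non-JS users.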

This type of implementation relies on javascript redirects not triggering algorithmic filters (and that will likely be the case). But what happens if someone wants to bookmark a URL on the "real" site? What about incoming links? Presumably, user-created links will not point to a spiderable URL for the content being linked to, but could point at any entry point on the AJAX-ed site.

I think you've a good chance of escaping algorithmic filtering, but without a great deal of care, pure AJAX implementations can end up like framesets - they fundamentally distort the model of the web that many users (and search engines) expect. And performance can suffer heavily as a result.

g1smd

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 3908295 posted 11:09 pm on May 6, 2009 (gmt 0)

Tricky, but I would make sure only one version can be spidered.
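For example, a robots meta tag on every page of the copy you don't want indexed (which copy to block depends on your setup - this is only a sketch):

    <meta name="robots" content="noindex, follow">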

Illah

10+ Year Member
Msg#: 3908295 posted 11:19 pm on May 6, 2009 (gmt 0)

We have deep linking and all that sorted out, actually - even the back/forward button functionality is working. So in that respect we've kept the typical web functionality that people have come to expect, whilst maintaining a unique UI that we think stands well apart from the norm.
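For anyone curious, the standard pattern for this - I'm sketching the generic technique here, not pasting our actual code - is to poll location.hash, so bookmarks and the back button keep working:

    <script type="text/javascript">
    // Poll the URL fragment; when it changes (back/forward, bookmark,
    // typed URL), load the matching section. loadSection() is a
    // hypothetical loader function.
    var currentHash = window.location.hash;
    setInterval(function () {
      if (window.location.hash !== currentHash) {
        currentHash = window.location.hash;
        loadSection(currentHash.replace(/^#\/?/, ''));
      }
    }, 100);
    </script>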

And speaking of UI, that's the reason for the flat mirror. It's kind of hard to describe without showing you (it's not public yet), but you can think of it as an app-like interface. The site is for a creative agency so standing out from the crowd was very important.

That UI is what led us to AJAX...new page loads would have mucked up the interface, and also would have made the content more difficult to manage.

So you think we have a good chance of escaping the filters? That's my main worry. I know Google has always treated mirrored content as a grey area, simultaneously recommending it at the conferences but then keeping a close eye on abuses...

--Illah

g1smd

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 3908295 posted 11:29 pm on May 6, 2009 (gmt 0)

Incoming links are going to be a major issue. They'll likely point at the wrong version, because people copy them from their URL bar and paste them into other sites.

You need the new rel="canonical" tag here for sure, but if you use that, you need search engines to be able to spider both copies of the site.
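That is, in the head of each duplicate, pointing at the URL you want indexed (the domain and path are placeholders):

    <link rel="canonical" href="http://www.example.com/work/project" />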

I made a pact to never duplicate a site for different browser versions back in 1997, and I have resisted duplicating things for any other reason ever since.
