|Hashbang AJAX cloaking for non-AJAX pages, How will Google respond?|
| 2:28 pm on Apr 23, 2013 (gmt 0)|
Please bear with me, as this one is quite technical and complicated:
A client of mine insists on presenting a major section ('/section/') of her website as one long list of pages. When a user clicks on a navigation item (eg '/section/#page'), he is scrolled down to the appropriate page within this list (a form, a slideshow, a content bit, anything really). In essence, the site section is one long page scroll with anchors.
Obviously, our client still wants SEO results. She therefore ideally wishes to have '/section/#page' indexed separately, so that she can assign this page distinct title tags and meta descriptions and build links to it. As luck would have it, the pages are actually present in the CMS separately, so it's perfectly possible to give each page its own title and description. Since the pages are loaded into one long list, however, we cannot currently present these titles and descriptions to the search engines.
Enter my idea to make this happen:
- We use Google's #! ('hashbang') AJAX crawling scheme (see https://developers.google.com/webmasters/ajax-crawling/docs/getting-started) to link to the anchors. In practice, this means we change the link from '/section/#page' to '/section/#!page', which triggers Google to re-request the page as '/section/?_escaped_fragment_=page'
- We intercept the '_escaped_fragment_' query string server side and output the page separately from the CMS
- Note that the hashbang is the only workable solution here, since using pushState would cause the user to land on the separate page instead of at the anchor within the section.
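For what it's worth, the URL transformation and the server-side intercept can be sketched in a few lines of JavaScript. This is only an illustration of the idea described above; the '/section/' URL shape and the template names are placeholders, not code from a real site:

```javascript
// What Googlebot does under the AJAX crawling scheme:
// '/section/#!page' -> '/section/?_escaped_fragment_=page'
function toEscapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang: nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

// What the server would do: detect the snapshot request and pick a template.
function chooseTemplate(queryString) {
  const fragment = new URLSearchParams(queryString).get('_escaped_fragment_');
  // null: a normal visitor, so serve the long scrolling page.
  // non-null: a crawler snapshot request, so serve that page alone,
  // with its own title tag and meta description pulled from the CMS.
  return fragment === null
    ? { template: 'super-scroll' }
    : { template: 'single-page', page: fragment };
}
```

A real implementation would hook `chooseTemplate` into whatever routing layer the CMS uses; the point is just that the snapshot decision is a single query-string check.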
Under this solution, Google SHOULD index the '#!page', '#!page2' anchor sections as separate pages, giving us the freedom to assign them separate title tags etc.
My main question is: is this a 'bannable' offence?
I know the hashbang scheme is mainly intended for crawling genuine AJAX applications, but this off-label usage seems legitimate to me as well. We are not spamming the search engines in any way and are not breaking the user experience; we are just providing the search engines with some extra context in the form of title tags and meta descriptions.
| 4:47 pm on Apr 23, 2013 (gmt 0)|
I don't think anyone can tell you what impact it will have, or how long it would keep working if it does, because it's not something many people do. Who knows how Google will handle a non-standard implementation today, tomorrow, next month or a year from now.
And the idea of the "super scroll" page really makes no sense to me. If the client gets what they want and the pages are indexed separately, then that's where visitors would land from the search engines, and they would never see the "super scroll" page at all. So imo there's nothing to this beyond the client liking their idea more than most people, search engines or SEOs are likely to.
Never mind where other webmasters are likely to link -- some to the "super scroll" URL with no fragment, some to the "super scroll" URL with the fragment, some to the individual page -- and how split that link weight could get. Since the pages aren't even close to exact duplicates, the URLs are not likely to be grouped with the "best" one shown.
Personally, I'd fire the client if they insisted on doing it that way. I've worked with enough clients to know whose fault it will be for doing what they said, the way they said it had to be done, even after you told them it wouldn't work or was very risky -- and it's never theirs.
| 7:19 pm on Apr 23, 2013 (gmt 0)|
... and never, ever let your client find out that Google search already uses fragments. I don't think anyone has clear data on how common this is -- but I've personally seen it in logs, so it can't be all that rare ;)
The trick of course is to make sure they end up at the right fragment.
After you've cashed your final paycheck, ask the (ex)client how the USER benefits from loading one vast page instead of a much smaller targeted page. Inquiring minds want to know.
| 11:13 pm on Apr 23, 2013 (gmt 0)|
Thanks for that. Unfortunately, I am not in the luxury position of being able to fire a client, nor am I in a position to change what they're going to do. What I can do, though, is have them sign off on a memorandum stating what the limits and risks are.
OptimizationIdiot, as for your comment (paraphrased) "users would land on the separately indexed page": I don't intend to use pushState for this very reason. Instead, if I use the hashbang, the correct fragment URL should be indexed ('?_escaped_fragment_=' URLs do not get indexed).
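To make that last point concrete: under the scheme, Google attributes the snapshot's content to the pretty '#!' URL, so the snapshot response would want to point back to it (for example in a canonical link). A small sketch of that inverse mapping, with illustrative URL shapes only:

```javascript
// Reconstruct the pretty '#!' URL that actually gets indexed from the
// '_escaped_fragment_' snapshot URL the crawler requests.
function toHashbangUrl(url) {
  const [base, query = ''] = url.split('?');
  const params = new URLSearchParams(query);
  const fragment = params.get('_escaped_fragment_');
  if (fragment === null) return url; // not a snapshot URL: leave untouched
  params.delete('_escaped_fragment_');
  const rest = params.toString();
  return base + (rest ? '?' + rest : '') + '#!' + fragment;
}
```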
Lucy, as for your comment: the client is a very frivolous, 'happy' brand and wants their website to be flashy and fun. I know there are other AJAX-enabled solutions that would make more sense, but I'm afraid those are outside the scope of their budget.
So my question still stands: if I have the client sign off on the risks, should I include the risk of getting banned over this? How does Google respond to legitimate off-label usage of the AJAX hashbang scheme? Technically it's cloaking, but for benevolent purposes (much like any AJAX solution).
| 12:02 am on Apr 24, 2013 (gmt 0)|
Actually, it's not cloaking if you show Googlebot the same info on the "super scroll" page as you show the user, and the same info on the individual content page as the user would see if they requested it the same way Googlebot does.
What it is though is really odd duplication.
I see your point on the #!, but I still wouldn't do it, lol. Mainly because there's really no telling how they're going to handle that type of duplication, now or in the future.
The answer to your question is: unless you can find someone who's actually done it exactly the same way, you're on your own to find out. You're in a micro-percentage of the web doing something like that, so there's not much chance Google has anything specifically coded into the algorithm to handle it. Not enough people do it, so their time is better spent on other things than worrying about how the algo deals with what could be a "one off" solution like yours, unless it becomes a problem for them.
But all that said, if you get them to sign off, then go for it and find out what happens.
| 12:44 am on Apr 24, 2013 (gmt 0)|
Duplication is one of the reasons (the other being load time) why I am pushing for them to at least implement an AJAX lazy-loading script.
Assuming they don't do this, I wonder what would happen if I 'noindex' the main super-scroller page yet explicitly 'index' the fragment pages. Technically this should be possible, but I wonder which would take precedence.
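Mechanically, that split is easy to serve; whether Google honors it as intended is exactly the open question, since the snapshot's content is attributed back to the '#!' URL. A sketch of what the server would emit, assuming the same '_escaped_fragment_' detection as before:

```javascript
// Proposed (untested) split: noindex the long scrolling page, but allow
// indexing on the per-fragment snapshots served to the crawler.
function robotsMetaFor(queryString) {
  const isSnapshot = new URLSearchParams(queryString).has('_escaped_fragment_');
  return isSnapshot
    ? '<meta name="robots" content="index,follow">'   // fragment snapshot
    : '<meta name="robots" content="noindex,follow">'; // "super scroll" page
}
```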
| 12:49 am on Apr 24, 2013 (gmt 0)|
I would definitely push for the "lazy load" at the very least, and you could even do some cool stuff if they go for it, like moving the "fragment" the visitor requests to the top of the page rather than scrolling to it, and then loading the rest as the user scrolls. (It should help speed up the page load/info display that way too.)
If they won't load the rest of the page "onscroll", I might try to talk them into showing a "snippet/paragraph" of each "fragment" initially and then loading the rest via AJAX on a click of the "read more" link, much the same way many sites do for the index of an article directory. So basically: show a summary initially, speeding the page load up and shortening the page at the same time, then load the rest if the visitor really wants to see it.
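The "summary first" variant could be sketched as a render step that emits only teasers plus a marker for where to fetch the full text later. All of the section data, slugs and the '/ajax/...' URL pattern here are made up for illustration:

```javascript
// Render teaser markup for each section of the "super scroll" page.
// A client-side script would later fetch the URL in data-full via AJAX
// when the visitor clicks "read more" (or scrolls the section into view).
function renderSummaries(sections) {
  return sections.map(s =>
    '<section id="' + s.slug + '">' +
      '<h2>' + s.title + '</h2>' +
      '<p>' + s.summary + '</p>' +
      '<a class="read-more" href="/section/#!' + s.slug +
        '" data-full="/ajax/section/' + s.slug + '">read more</a>' +
    '</section>'
  ).join('\n');
}
```

Linking the "read more" anchor to the '#!' URL keeps the hashbang scheme and the lazy load pointing at the same fragment names.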
It sounds like a horrible user experience to me; I would leave without even bothering with it the way she wants it. (I do that all the time when people put too much stuff I don't want on a page. Personally, I want one article on one topic on one page, and if I want more I'll click through to it. I don't want everything on one page, except maybe as a brief summary that loads or takes me to the rest when I click.)
| 5:57 am on Apr 24, 2013 (gmt 0)|
Matt Cutts on pushState instead of #! - [youtube.com...]
| 8:33 am on Apr 24, 2013 (gmt 0)|
FranticFish, I have already seen that. Unfortunately, pushState is not an option for us, because the URL indexed under the pushState approach would present the user with a single page instead of the full experience the client desires.
| 3:00 pm on Apr 24, 2013 (gmt 0)|
I would push for having the client sign a memo stating that the novel nature of the request carries a significant risk of negative SEO results, that they might be banned or drop in rank on Google, and that any loss of business revenue and reputation is their responsibility. And have a plan B to recommend that minimizes their cost (fits the budget) and maximizes their chances of maintaining or improving rankings. IMHO.