|Functional cloaking allowed?|
| 2:03 pm on Mar 7, 2007 (gmt 0)|
I'm wondering whether functional cloaking would be allowed by the major
search engines, as opposed to content cloaking.
The situation is as follows:
There is a site built in an Ajax framework. The site is an RIA with a
single page load. To allow people to bookmark and share different
'pages' (they're really states of the application), and to use the
browser's back and forward buttons, a hash (#) plus extra state
information is used in the URL.
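The hash-based state handling described above can be sketched roughly like this (the state format and function names are illustrative, not taken from the actual site):

```javascript
// Minimal sketch of hash-based application state, assuming a fragment
// like "#page=products&item=42". Everything after the '#' encodes the
// current state of the single-page application.
function parseStateFromHash(hash) {
  var state = {};
  // Strip the leading '#' and split into key=value pairs.
  var pairs = hash.replace(/^#/, "").split("&").filter(Boolean);
  for (var i = 0; i < pairs.length; i++) {
    var parts = pairs[i].split("=");
    state[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1] || "");
  }
  return state;
}

// In the browser, the application would re-render on every hash change,
// which is what makes bookmarking and the back button work:
// window.addEventListener("hashchange", function () {
//   render(parseStateFromHash(window.location.hash));
// });
```

The catch, as noted above, is that crawlers request only the part of the URL before the `#`, so every one of these states looks like the same page to a search engine.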
There are two problems here from an SEO perspective:
the site isn't indexable, since the content is loaded with JS, and
the URLs aren't crawlable because of the # in them.
So what I'm thinking at this moment is building a 'shadow site' that
uses the exact same content, only in plain HTML with crawlable URLs.
Then I'd check whether a visitor is a search engine or a real visitor,
and rewrite the URL accordingly.
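The user-agent check being proposed would look something like this sketch (the bot signatures are illustrative examples, not a complete or current list):

```javascript
// Illustrative sketch of the search-engine check described above.
// These substrings are examples only; real bot detection in 2007
// typically combined user-agent strings with known crawler IP ranges.
var BOT_SIGNATURES = ["googlebot", "slurp", "msnbot"];

function isSearchEngine(userAgent) {
  var ua = (userAgent || "").toLowerCase();
  return BOT_SIGNATURES.some(function (sig) {
    return ua.indexOf(sig) !== -1;
  });
}

// On the server, a request would then be routed to the HTML shadow
// site when isSearchEngine(request user-agent) is true. Serving
// different responses based on this check is exactly the user-agent
// cloaking that the replies below warn against.
```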
Obviously this would be a cloaking method, which isn't allowed. But I
believe the search engines' terms state that you are not allowed to
show the search engines different content.
Since we'd be using the same content, only presented in a different
technical way, would that be OK, do you think?
| 5:36 pm on Mar 7, 2007 (gmt 0)|
Sounds fairly suicidal, for all the reasons you have given.
It's rarely a good idea to think you can outsmart the SEs - you may 'get away with it' for a while, but then when the catastrophe happens, you won't even know what it was that they sussed out.
SEO = design = SEO; if you get that wrong, most else is chasing your tail.
| 3:43 am on Mar 8, 2007 (gmt 0)|
|Since we'd be using the same content... |
What's he getting away with?
Not that I'm against cloaking, but what I don't understand is why people design sites that aren't spider-friendly in the first place. If it can be done in a spider-friendly manner (and it can since you are able to cloak a spider-friendly version with the same content), why not convert the whole site to that version?
| 4:23 am on Mar 8, 2007 (gmt 0)|
Why not simply design a great site for visitors, in the full knowledge that SEs will follow?
Life is complicated enough, without going looking for trouble ...
| 4:55 pm on Mar 9, 2007 (gmt 0)|
I'd say a site redesign would be a better idea than cloaking. I can't really comment on whether "you suck for not making it SE friendly" applies, because I don't really know what content you're serving up, but cloaking is not a good idea in this case.
| 6:33 pm on Mar 9, 2007 (gmt 0)|
Some BBs accomplish this by having a "printable" version of their discussion threads. While a dodge like this might make it possible for Google to spider your site, it seems like a recipe for permanent suboptimal rankings.
Why not just build your proposed shadow site, and forget about the version that has all the state information in the URL?
| 11:15 am on Mar 12, 2007 (gmt 0)|
Thanks for the replies..
I disagree with the 'just design a spiderable site' mantra though.
I think a website should be designed for the users; the visitors are most important! We are doing so by building an AJAX website, which is user-friendly from an interaction design point of view. The downside of this method is that the search engines can't crawl or index it.
That's why we are thinking of implementing the suggested method...
What I thought of just now is this:
Checking on the server whether the visitor's browser supports the AJAX framework, and serving either the AJAX site or the HTML one accordingly.
This would result in Opera users seeing the HTML site, but also in the search engines being directed to that HTML site.
This should be ok.. shouldn't it?
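That capability check could be sketched like this (the browser list is an illustrative assumption; the post only mentions Opera as unsupported, and a client-side feature test for `XMLHttpRequest` would be more reliable than user-agent sniffing):

```javascript
// Sketch of the server-side capability check described above: decide
// from the User-Agent header whether to serve the Ajax version or the
// plain HTML one. The patterns here are assumptions for illustration.
function supportsAjaxFramework(userAgent) {
  var ua = (userAgent || "").toLowerCase();
  // Known crawlers get the HTML version, same as unsupported browsers.
  if (/googlebot|slurp|msnbot/.test(ua)) return false;
  // Per the post, Opera is assumed not to support this framework.
  if (ua.indexOf("opera") !== -1) return false;
  // Treat mainstream browser engines as Ajax-capable.
  return /mozilla|applewebkit|gecko/.test(ua);
}
```

Note that even framed as a capability check, the effect is the same as the shadow-site idea: crawlers and real visitors receive different responses for the same URL, which is the behavior the earlier replies flag as risky.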