|'Search sniffer' technology |
Is our search sniffer technology cloaking?
| 8:00 pm on Aug 11, 2009 (gmt 0)|
Would love some thoughts on this...
We've built a "search sniffer": we detect that a visit is coming from Google and serve up a few lines of text at the top of the page that insert the visitor's search keyword (for relevance) and explain what our site is (for context). This is a partial attempt to improve the bounce rate from search visits.
It's clearer with an example: in Google, type 'widget'. The first listing will be ours, and you will see the box at the top of the page. The box is inserted only when the visit comes from search; if you reach the page from within our own site, it isn't there.
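The sniffer described above might look something like the following server-side check. This is a minimal sketch, not the poster's actual code: the function name is hypothetical, and it assumes 2009-era Google referrers, which still carried the search query in the `q` parameter.

```python
from urllib.parse import urlparse, parse_qs

def search_keyword(referrer):
    """Return the search keyword if the Referer header points at a
    Google results page, else None (hypothetical helper)."""
    if not referrer:
        return None
    parts = urlparse(referrer)
    # Only treat the visit as a search visit if it came from Google.
    if "google." not in parts.netloc:
        return None
    # Circa 2009, Google passed the query string in the 'q' parameter.
    query = parse_qs(parts.query).get("q")
    return query[0] if query else None

# The page template would then show the box only when a keyword is found:
# keyword = search_keyword(request.headers.get("Referer"))
```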
In the three weeks since it launched we have had a slight but noticeable downturn in search traffic, and it continues to decline each day. We appear to have dropped in rankings (detailed keyword analysis is difficult, as we have thousands of unique low-volume tail terms).
My obvious question is: is this cloaking? We have no intention of doing black-hat SEO. Since the decrease has only been slight, it makes me think Google isn't penalizing us, or could they be, just a little?
[edited by: incrediBILL at 8:17 pm (utc) on Aug. 11, 2009]
[edit reason] removed specifics, see TOS #13 [/edit]
| 8:24 pm on Aug 11, 2009 (gmt 0)|
It could be called cloaking -- or more accurately, IP- or user-agent-based content delivery.
However, it does not sound like it is "cloaking with intent to deceive" either search engine spiders or visitors. Generally, when search engines talk about cloaking penalties, they mean penalties for "cloaking with intent to deceive".
It would make no sense to punish a site (punish its URLs, really) just because the content changes based on the User-agent or requesting IP address. Imagine for example if Google punished sites for serving different content to mobile devices, gaming devices, netbooks, tablets, notebooks, and PCs/Macs. Or if they punished sites for using IP- or User-agent based delivery to serve different-language content to different areas of the world. They would punish most of the major sites on the Web, including themselves!
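The User-agent-based delivery described above can be sketched in a few lines. This is a generic illustration, not anything from the thread; the template names and substring checks are hypothetical.

```python
def choose_template(user_agent):
    """Pick a page template based on the requesting User-Agent
    (hypothetical template names)."""
    ua = (user_agent or "").lower()
    # Mobile and gaming devices get a lighter layout.
    if "iphone" in ua or "mobile" in ua or "nintendo" in ua:
        return "mobile.html"
    # Spiders and desktop browsers see the same page, so there is
    # no "intent to deceive" in serving device-specific content.
    return "default.html"
```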
I doubt that a few extra lines of text are causing you trouble, unless they contain a bunch of gibberish or excessively repeated keywords. You can bet that the search engines are aware of your alternate content (unless you subscribe to one of the "real-time robot IP address identifier services" that cloakers use to prevent that), but I doubt that this is the cause of your trouble. I use an approach similar to yours on a couple of pages to help visitors find what they are actually looking for (as opposed to what they initially searched for), and it doesn't seem to affect those pages' rankings.
| 11:32 pm on Aug 12, 2009 (gmt 0)|
Sounds like you are doing referrer-based cloaking. I've done the same for years on high-profile sites, with excellent results, and have never been penalized.
Your declining rankings are more likely due to some other factor.
| 5:05 pm on Aug 17, 2009 (gmt 0)|
Thx very much for the replies!