
When is cloaking fine with google?

   
4:59 pm on Mar 20, 2014 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I am seeing several popular sites ranking highly on Google even though they seem to show different pages to human visitors and to bots. A popular site that provides answers to short, quick questions uses a slideshow to show visitors the answer when they click on it, but what is indexed in Google is a different page without the slideshow. Yet this seems to be fine with Google!

1) So what is bad and what is not when it comes to cloaking?
2) Are certain sites exceptions, and if so, why? If they are exceptions, how does Google actually recognize them?
5:05 pm on Mar 20, 2014 (gmt 0)

WebmasterWorld Administrator brotherhood_of_lan is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



mods note: i think this is one case where we want to avoid using specific sites as examples

Haven't heard much on this subject for a while. I think the basic idea is that Google doesn't like any kind of cloaking at all; no site gets special dispensation on that. I'm sure someone can update us on what Google currently says about it.
6:19 pm on Mar 20, 2014 (gmt 0)

WebmasterWorld Senior Member Top Contributors Of The Month



I would guess the slideshow is JavaScript-driven and made to "gracefully upgrade" to a slideshow when JavaScript is available, while the information is presented in a non-slideshow manner in the HTML for bots and browsers without JavaScript.

I often do something similar to make sure sites are accessible: anything requiring JavaScript is loaded via JavaScript, so all the information is still accessible without it, even though the functionality and "cool stuff" that makes the pages/site more readable or usable isn't there.

EG

On one directory I use JS to collapse the listings down to a business name and address/phone number, without showing the full list of services, then add a "toggle" link to show the list of services. That way, if someone's looking for a specific name, they can easily scan the list. Without JS, the listings all stay expanded. It's the same information either way, but the two views look different, and the JS version is "smoother" and more "interactable" than the version without.
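As a rough sketch of that pattern (hypothetical markup and class names -- this is not the actual directory's code): the full services list ships in the HTML for every visitor, and JS, when it runs, collapses it and adds the toggle link.

```javascript
// Pure helper: the fields a listing shows when collapsed by JS.
// Nothing is removed from the HTML; the extra detail is only hidden.
function collapsedView(listing) {
  return { name: listing.name, phone: listing.phone };
}

// DOM wiring -- runs only in a browser. Bots and no-JS browsers
// simply see the fully expanded markup.
if (typeof document !== 'undefined') {
  document.querySelectorAll('.listing .services').forEach(function (ul) {
    ul.hidden = true; // collapse, but leave the markup in the page
    const toggle = document.createElement('a');
    toggle.href = '#';
    toggle.textContent = 'show services';
    toggle.addEventListener('click', function (e) {
      e.preventDefault();
      ul.hidden = !ul.hidden;
      toggle.textContent = ul.hidden ? 'show services' : 'hide services';
    });
    ul.parentNode.insertBefore(toggle, ul);
  });
}
```

The key point is that the same content is served to everyone; only the presentation differs depending on whether JS is available.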

Cloaking is when you show two different pages to a SE -- EG a SE visits and the page is about apples, but when a regular person lands it's about widgets.

It's not "cloaking" to show two different levels of interactivity or different presentations, especially if you're doing it to make a site more accessible to everyone.
5:10 pm on Mar 21, 2014 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



No, it isn't JavaScript that's used. It's a simple "click to get the answer" type of site, and a popular one too.

But yes, they aren't really showing two different sets of content. Though the content is the same, they induce clicks from real human users. I think it's an attempt to increase visit duration and engagement metrics in the eyes of Google. They use background images behind these short answers (which aren't relevant most of the time) to induce clicks from visitors. Most of the textual content is hidden behind images for users, but what is shown to Googlebot is a different design, without background images, and more textual.
5:27 pm on Mar 21, 2014 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Thanks @JD_Toims.

Cloaking is when you show two different pages to a SE -- EG a SE visits and the page is about apples, but when a regular person lands it's about widgets.


To be clearer, what they do is show related questions with answers (via "Explore this topic") to Googlebot, but for real human users they show the related questions (without the answers) with background images and links that lead to the respective pages for the answers. Isn't this cloaking, when they show only questions to real users but questions and answers to Googlebot?
6:53 pm on Mar 21, 2014 (gmt 0)

WebmasterWorld Senior Member Top Contributors Of The Month



If it's the site I think you're talking about, with a "slideshow box" and red left/right boxes with arrows to "navigate", then it's done with JS. If you turn your JS off and refresh the page, you'll see exactly the type of page you're describing in the results.
7:15 pm on Mar 21, 2014 (gmt 0)

WebmasterWorld Senior Member Top Contributors Of The Month



Looking a bit more, the site I'm looking at almost has to be the one you're talking about, and I don't think what they're doing is cloaking.

What they're doing is actually really accommodating for a screen reader, because a screen reader doesn't need all the "fancy nav" or the background images on the related stuff. The images either "clutter up the reading of the topics" if they have alts, or they're not going to be "read" as part of the page if the alts are missing or empty. So there's no real reason I can think of to slow down the page-load speed by having them present for a screen reader, since all they do is get in the way or get ignored.
5:25 am on Mar 22, 2014 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Maybe you're right that we're talking about the same site. But if you notice, the answers for the related questions are not present in the page made visible to human users. Yes, the answer to the main question is presented to the human user with a JS slideshow, but I'm referring to what they do with the related questions, since they seem to be doing things differently for the bot and for human users.
5:44 am on Mar 22, 2014 (gmt 0)

WebmasterWorld Senior Member Top Contributors Of The Month



I was off a bit with my guess as to exactly which site you were talking about. Now that I know exactly where to look, it looks to me like they're cloaking too -- interesting.
5:51 am on Mar 22, 2014 (gmt 0)

WebmasterWorld Senior Member Top Contributors Of The Month



Not sure if I'm saying too much publicly, so if I am the mods should feel free to snip.

Yeah, it's definitely cloaked, and it's not even done well. The page displays the same with or without JavaScript, but all I had to do was switch my user agent to Googlebot and I got the cached version of the page rather than the version a searcher sees -- WOW!
2:22 pm on Mar 22, 2014 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month



1) So what is bad and what is not when it comes to cloaking?
2) Are certain sites exceptions and if so why? if these are exceptions, how does google actually recognize them?

This should answer these questions:

Cloaking
https://support.google.com/webmasters/answer/66355 [support.google.com]
Some examples of cloaking include:

- Serving a page of HTML text to search engines, while showing a page of images or Flash to users
- Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor

On this page are listed some exceptions:

Hidden text and links
https://support.google.com/webmasters/answer/66353 [support.google.com]
However, not all hidden text is considered deceptive. For example, if your site includes technologies that search engines have difficulty accessing, like JavaScript, images, or Flash files, using descriptive text for these items can improve the accessibility of your site.
(...)
JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what’s contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.
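For the slideshow case discussed earlier, that guidance might look something like this (hypothetical markup; `renderSlideshow` is an assumed, illustrative function -- the point is only that the `<noscript>` text matches what the script renders):

```html
<!-- Answer rendered as a JS slideshow for browsers with JS enabled -->
<div id="answer"></div>
<script>
  // Hypothetical: builds the slideshow UI from the same answer text
  renderSlideshow('answer', 'Widgets are assembled from three parts.');
</script>

<!-- Exactly the same text for bots and no-JS visitors -->
<noscript>
  <p>Widgets are assembled from three parts.</p>
</noscript>
```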

Being penalised for cloaking is just a waiting game: Google can find that a page is cloaking during a manual review.

Google could also request the page(s) with a different user agent and compare the textual content, which would scale much better, but it would require additional resources and carry the risk of that UA being blocked. I don't know whether they do anything along these lines or not.
 
