|brotherhood of LAN|
| 5:05 pm on Mar 20, 2014 (gmt 0)|
Mod's note: I think this is one case where we want to avoid using specific sites as examples.
Haven't heard about this subject for a while. I think the base idea is that Google doesn't like any kind of cloaking at all -- no sites would get special dispensation on that. I'm sure someone can update us on what Google says about it.
| 6:19 pm on Mar 20, 2014 (gmt 0)|
On one directory I use JS to collapse the listings down to a business name and address/phone number, without showing the full list of services, then add a "toggle" link to show the list of services. That way, if someone's looking for a specific name they can easily scan the list. Without JS, the listings all stay expanded. It's the same information either way, but the two views look different, and the JS version is "smoother" and more "interactable" than the version without.
Cloaking is when you show two different pages to a SE -- e.g. a SE visits and the page is about apples, but when a regular person lands it's about widgets.
It's not "cloaking" to show two different levels of interactivity or different presentations, especially if you're doing it to make a site more accessible to everyone.
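The split described above can be sketched as plain logic -- hypothetical names, invented for this sketch; the point is that both views carry the same information and only the presentation differs:

```javascript
// Hypothetical sketch of progressive enhancement: the HTML ships fully
// expanded (the no-JS view); with JS, services stay collapsed until the
// visitor clicks the toggle. Names here are illustrative, not from a
// real directory.

// A listing always carries the same information.
function makeListing(name, phone, services) {
  return { name, phone, services };
}

// What a given visitor actually sees: without JS everything is shown;
// with JS, the services list is hidden until toggled open.
function visibleFields(listing, jsEnabled, toggledOpen) {
  const fields = { name: listing.name, phone: listing.phone };
  if (!jsEnabled || toggledOpen) {
    fields.services = listing.services; // same content, different timing
  }
  return fields;
}

const listing = makeListing("Acme Widgets", "555-0100", ["repair", "install"]);

console.log(visibleFields(listing, false, false).services); // full list, no JS
console.log(visibleFields(listing, true, false).services);  // hidden until toggled
console.log(visibleFields(listing, true, true).services);   // same list after toggle
```

Nothing is ever absent from the markup for one audience and present for another; the toggle only changes when the visitor sees it, which is why this reads as presentation rather than cloaking.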
| 5:10 pm on Mar 21, 2014 (gmt 0)|
But yes, they aren't really showing two different sets of content. Though the content is the same, they induce clicks from real human users. I think it's an attempt to increase visit duration and engagement metrics in the eyes of Google: they put background images behind these short answers (which aren't relevant most of the time) to induce clicks from visitors. Most of the textual content is hidden behind images for users, but what is shown to Googlebot is a different design, without the background images, and more textual.
| 5:27 pm on Mar 21, 2014 (gmt 0)|
|Cloaking is when you show two different pages to a SE -- e.g. a SE visits and the page is about apples, but when a regular person lands it's about widgets. |
To be clearer, what they do is show the related questions with their answers, by way of "Explore this topic", to Googlebot. But for real human users, they show the related questions (without the answers) with background images and links that lead to the respective answer pages. Isn't this cloaking, when they show only questions to real users but questions and answers to Googlebot?
| 6:53 pm on Mar 21, 2014 (gmt 0)|
If it's the site I think you're talking about, with a "slideshow box" and red left/right boxes with arrows to "navigate", then it's done with JS. If you turn your JS off and refresh the page, you'll see exactly the type of page you're describing in the results.
| 7:15 pm on Mar 21, 2014 (gmt 0)|
Looking a bit more, the site I'm looking at almost has to be the one you're talking about, and I don't think what they're doing is cloaking.
What they're doing is actually really accommodating for a screen reader, because a screen reader doesn't need all the "fancy nav" or the background images on the related stuff. The images either "clutter up the reading of the topics" if they have alts, or they won't be "read" as part of the page if the alts are missing or empty. So there's no real reason I can think of to slow down the page load speed by having them present for a screen reader, since all they do is get in the way or get ignored.
| 5:25 am on Mar 22, 2014 (gmt 0)|
Maybe you are right that we are talking about the same site. But if you notice, the answers to the related questions are not present in the page made visible to human users. The answer to the main question, yes, is presented to the human user with a JS slideshow. I am referring to what they do with the related questions, as they seem to be doing things differently for the bot and for human users.
| 5:44 am on Mar 22, 2014 (gmt 0)|
I was off a bit with my guess as to exactly which site you were talking about. Now that I know exactly where to look, I think it looks like they're cloaking too -- interesting.
| 5:51 am on Mar 22, 2014 (gmt 0)|
Not sure if I'm saying too much publicly, so if I am the mods should feel free to snip.
| 2:22 pm on Mar 22, 2014 (gmt 0)|
|1) So what is bad and what is not when it comes to cloaking? |
|2) Are certain sites exceptions, and if so, why? If they are exceptions, how does Google actually recognize them? |
This should answer these questions:
|Some examples of cloaking include: |
- Serving a page of HTML text to search engines, while showing a page of images or Flash to users
- Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor
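The second bullet -- keying content off the requesting User-agent -- boils down to logic like this. To be clear, this is an illustration of the pattern the guidelines describe as cloaking, i.e. what NOT to do, and the names and crawler tokens are made up for the sketch:

```javascript
// Illustrative only: User-agent-keyed serving is exactly what the
// quoted guideline calls cloaking. Do not do this on a real site.

// Decide which page variant a request would get, keyed off the User-agent.
function selectVariant(userAgent) {
  const crawlerTokens = ["googlebot", "bingbot"]; // assumed tokens for the sketch
  const ua = (userAgent || "").toLowerCase();
  const isCrawler = crawlerTokens.some(token => ua.includes(token));
  // Text plus extra keywords for crawlers, an image/Flash page for
  // people: both example bullets reduce to this branch.
  return isCrawler ? "text-heavy page with extra keywords" : "image/Flash page";
}

console.log(selectVariant("Mozilla/5.0 (compatible; Googlebot/2.1)"));
console.log(selectVariant("Mozilla/5.0 (Windows NT 10.0) Chrome/120"));
```

Contrast this with the JS toggle discussed earlier: there, every visitor is sent the same markup and only client-side presentation differs; here, the server sends materially different content depending on who is asking.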
Some exceptions are listed on this page:
Hidden text and links
Being penalised for cloaking is just a waiting game: Google can find that a page is cloaking during a manual review.
Google could also request the page(s) with a different user agent and compare the textual content, which would definitely scale better, but it would require additional resources and carry the risk of that UA being blocked. I don't know whether they do anything along those lines or not.
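The comparison step could be sketched like this. The fetching itself is omitted, and the two HTML strings below stand in for responses served under different user agents; the tag-stripping and similarity measure are my own assumptions, not anything Google has documented:

```javascript
// Hypothetical sketch: strip markup from two copies of a page (one
// fetched as a browser, one as a crawler UA) and compare the visible
// vocabulary. A low score would flag the page for a closer look.

// Collect the set of visible words in an HTML string.
function visibleWords(html) {
  const text = html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, " ") // drop script/style bodies
    .replace(/<[^>]+>/g, " ");                        // drop remaining tags
  return new Set(text.toLowerCase().split(/\W+/).filter(Boolean));
}

// Jaccard similarity of the two word sets: 1 means identical vocabulary.
function similarity(htmlA, htmlB) {
  const a = visibleWords(htmlA), b = visibleWords(htmlB);
  const common = [...a].filter(w => b.has(w)).length;
  const total = new Set([...a, ...b]).size;
  return total === 0 ? 1 : common / total;
}

// Made-up example: the bot copy carries the answer text, the user copy doesn't.
const htmlForBot = "<p>What is a widget? A widget is a small gadget.</p>";
const htmlForUsers = "<p>What is a widget?</p><img src='bg.jpg'>";

console.log(similarity(htmlForBot, htmlForBot));        // 1 -- same page
console.log(similarity(htmlForBot, htmlForUsers) < 1);  // true -- content differs
```

A check like this would tolerate cosmetic differences (images, layout, scripts) while still catching the case discussed in this thread, where whole answers are present in one copy and absent from the other.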