
Page Degradation For Specific Device Support is Cloaking?

Big Name Googler Claims It's Cloaking!

     

incrediBILL

1:06 am on Jul 4, 2008 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



One of my pet peeves is all the tools people use to spy on other people's sites, not to mention scrapers just ripping things off, so I came up with a variety of methods to combat the situation and hide part of my SEO from prying eyes.

The first step is using NOARCHIVE on all the pages to eliminate the search engine cache so prying eyes can't look elsewhere.

The second step was to send search engine directives only to the search engines, not to end users. This makes perfectly good sense on many levels because browsers don't support search engine META tags, nor rel=NOFOLLOW on links, so you can eliminate a bunch of bloat and protect your SEO methods from prying eyes and from tools that can rip all that information in seconds and present pretty reports detailing all your keywords and such.
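In rough terms, the mechanism boils down to something like the sketch below (a simplified illustration of the general idea, not the exact setup; the crawler tokens and page markup are placeholders): emit the robots META directives only when the user agent looks like a known crawler, and serve the same visible content to everyone.

# Minimal sketch: include robots META directives only for known crawlers.
# Crawler tokens and page markup below are examples, not a real site's setup.
from wsgiref.simple_server import make_server

BOT_TOKENS = ("googlebot", "slurp", "msnbot", "teoma")  # example crawler substrings

def is_search_engine(user_agent):
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "")
    # Directives only a crawler acts on; browsers ignore them anyway.
    robots_meta = '<meta name="robots" content="noarchive">' if is_search_engine(ua) else ""
    body = ("<html><head>" + robots_meta +
            "<title>Example page</title></head>"
            "<body>Same visible content for everyone.</body></html>")
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body.encode("utf-8")]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()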

Anyway, I was told by a Googler that this is cloaking.

How can degrading the page content to only contain what the target device is capable of using be cloaking?

It's common practice not to send JavaScript or Flash to devices that can't understand it; it's simple content degradation based on the capabilities of the device.

The browser doesn't do anything with most META tags, except the redirect command, so why bloat your pages sending META keywords and descriptions to anything but a search engine?

Does a browser use NOINDEX? NOFOLLOW? NOARCHIVE? KEYWORDS? META?

Of course not, so why would not giving the browser this information be deemed cloaking?

It's not sneaky or deceptive, just protecting our investment from prying eyes trying to make a quick buck off our hard work.

What do you think, is Google overstepping here or am I overreacting?

Receptional Andy

1:54 pm on Jul 5, 2008 (gmt 0)



I agree with you, Bill, but Google pays the piper...

It seems to hinge on whether cloaking means showing something different to Google, or deceptively showing different content to Google. The other issue is whether or not you might trip algorithmic filters by hiding nofollow and the like.

I hear a lot of people advising to allow crawling of CSS/javascript and the like too. Seems like a waste of bandwidth to me. If they come along with a browser, they can look at design and functionality all they want, but it's not for spiders if you ask me.

To be frank, I don't cloak nofollow and the like though: seems like it might be risky.

jdMorgan

3:42 pm on Jul 5, 2008 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Yeah, the operative word is deceptively.

I suspect that your Googler was just parroting their simplified public policy of "no cloaking;" SEs have a history of using "hard" statements about "soft" and subjective issues. If you think about it, that's really all they can do, since the issues *are* subjective, and any attempt to comprehensively describe what they do and do not allow would become the "SEO Manual for Google." It would also become obsolete quite quickly, I suspect.

I believe that they simply want to declare a public policy that everyone can understand, so that the days of "buy car, buy car, buy car" pages never return.

Jim

incrediBILL

8:28 pm on Jul 5, 2008 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



If they come along with a browser, they can look at design and functionality all they want, but it's not for spiders if you ask me.

Design can be (and is) used to cloak - for example, keyword stuffing combined with CSS that hides the content from visitors by matching text and background colors - so I can understand the SE wanting to see whether your color scheme is deceptive.

But visitors NEVER see the META tags so where's the deception there?

Receptional Andy

8:42 pm on Jul 5, 2008 (gmt 0)



But visitors NEVER see the META tags so where's the deception there?

Meta information might (in some cases should) be used by the client, so I don't see much mileage in saving the extra bytes.

I agree regarding design, but no search engine has anything like reasonable capabilities in interpreting design/functionality by examining files, so I don't like to serve such content to search engines. They can come by with a browser if they're concerned ;)

Clark

11:32 am on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Although it's hinted at in this thread, let's state specifically what might legitimately cause Google consternation, plus what some browsers CAN see if the practice isn't cloaked.

The nofollow. If you cloak the nofollow, some visitors who have a plugin that shows nofollow'ed links in a different color might give you back a link thinking that your link to them passes juice along, when in fact it doesn't. That can be considered deceptive.

Of course, Google can't say that outright, because it would be like saying it is legitimate for people to give out links based on whether a link to you is nofollowed or not, that is, for reasons that are related to SE's rather than for "natural" reasons.

OTOH, they can argue that using that plugin, you can judge if a site is using nofollow in a so called "legitimate" manner, and is therefore "worthy" of linking to, or if a site is using it for SEO purposes to gain links in an unfair manner.

The truth is, cloaking nofollows is a big hole in the principle of nofollow IMO, because you have no clue whether a site is truly linking to you or not.

I think we should debate this nofollow cloaking topic more vigorously. (maybe we have, but I check out Front Page stories religiously, and miss 99.99% of the others, so I haven't seen it talked about...)
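As an aside, the check that such a plugin performs can be approximated in a few lines of script. A rough sketch follows (the URLs in the example are hypothetical, and of course it only tells you anything if the nofollow isn't also being cloaked against your IP):

# Rough sketch: fetch a page with a browser-like user agent and report whether
# its links to a given domain carry rel="nofollow". Example URLs are hypothetical.
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class LinkChecker(HTMLParser):
    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.results = []  # list of (href, is_nofollowed) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.target_domain in href:
            rel = (attrs.get("rel") or "").lower()
            self.results.append((href, "nofollow" in rel))

def check_links(page_url, target_domain):
    req = Request(page_url, headers={"User-Agent": "Mozilla/5.0"})  # browser-like UA
    html = urlopen(req).read().decode("utf-8", errors="replace")
    parser = LinkChecker(target_domain)
    parser.feed(html)
    return parser.results

# Example usage (hypothetical URLs):
# for href, nofollowed in check_links("http://example.com/links.html", "example.org"):
#     print(href, "nofollow" if nofollowed else "followed")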

Now how about cloaking NOARCHIVE? Seems silly to call that deceptive, since you can easily tell a site is NOARCHIVE'd if the SEs don't cache its pages.

Lord Majestic

11:44 am on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, the operative word is deceptively.

I think the operative word is "different" - bots can't understand deception as well as humans (hell, in many cases most humans won't understand it until they are scammed), so bots have to rely on checking whether the content served to the SE is different from what is shown to the user. That said, I'd expect only a change in visible elements to trigger it, though in this case the NOFOLLOW games were probably not taken lightly - but that's just a guess.

henry0

12:04 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member henry0 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



The second step was to send search engine directives only to the search engines, not to end users. This makes perfectly good sense on many levels because browsers don't support search engine META tags, nor rel=NOFOLLOW on links, so you can eliminate a bunch of bloat and protect your SEO methods from prying eyes and from tools that can rip all that information in seconds and present pretty reports detailing all your keywords and such.

Obviously this is not my field, but it really piques my curiosity, and I don't have the slightest idea how you achieved it :)
Could someone translate the above into layman's terms?
Thanks

Alcoholico

12:08 pm on Jul 7, 2008 (gmt 0)

5+ Year Member



Wouldn't it be less controversial to do it using the X-Robots-Tag protocol, fully removing the meta tags for both humans and bots and feeding this info only to bots in HTTP headers? Humans needn't read HTTP headers, and the same HTML content will be fed to both.
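For illustration, a minimal sketch of that approach (the crawler tokens are just examples): every client gets identical HTML, and the directive travels only to bots as an X-Robots-Tag response header.

# Minimal sketch: identical HTML for everyone; robots directives sent to bots
# only, via the X-Robots-Tag HTTP header. Crawler tokens below are examples.
from wsgiref.simple_server import make_server

def app(environ, start_response):
    headers = [("Content-Type", "text/html")]
    ua = environ.get("HTTP_USER_AGENT", "").lower()
    if "googlebot" in ua or "slurp" in ua or "msnbot" in ua:
        headers.append(("X-Robots-Tag", "noarchive"))
    body = b"<html><head><title>Same HTML for everyone</title></head><body>...</body></html>"
    start_response("200 OK", headers)
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()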

[edited by: Alcoholico at 12:12 pm (utc) on July 7, 2008]

g1smd

12:53 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Browsers make no use of the description tag, so it is irrelevant whether they see it or not.

Showing Google, Yahoo, Ask, and Live all the same meta description as each other, one that truly represents what that page is all about, is just about as far as a site should need to go to be compliant. Browsers make absolutely no use of that meta data whatsoever, so it does not matter what they are given, or not.

This is the second time in a week that I have disagreed with Google's stance on something -- the other was when they said that blocking whole areas of the world by IP was against their guidelines. They later revised that, to say that blocking whole areas of the world was OK just as long as all users and bots from that area were equally blocked.

At the top of the page, I have accessibility links to "skip navigation" etc. On some newer sites, those are only included on pages served to real browsers, and are omitted when Google/Yahoo/Ask/Live call by. Seems like that might be frowned upon too.

Additionally, Mozilla, Opera, Safari, all other browsers that are not (IE5 or IE6 or IE7), and all bots that are allowed by .htaccess, get one style sheet, and anything that really is (IE5 or IE6 or IE7) gets a different stylesheet with all the IE-specific kludges within. Opera or Safari claiming to be IE get fed the non-IE stylesheet too.
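The "really is IE" test amounts to something like the sketch below (the stylesheet paths are hypothetical; the key detail is that Opera's IE-compatibility user agent still carries an "Opera" token, so it can be filtered out):

# Sketch of a "really is IE5/6/7" check; stylesheet paths are hypothetical.
import re

def stylesheet_for(user_agent):
    # Genuine IE 5/6/7 announces "MSIE 5.x/6.x/7.x"; Opera spoofing IE keeps an
    # "Opera" token in its string, so it falls through to the standard sheet.
    really_ie = (re.search(r"MSIE [567]\.", user_agent) is not None
                 and "Opera" not in user_agent)
    return "/css/ie-kludges.css" if really_ie else "/css/standard.css"

print(stylesheet_for("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))            # IE sheet
print(stylesheet_for("Opera/9.27 (Windows NT 5.1; U; en)"))                            # standard
print(stylesheet_for("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1) Opera 9.27")) # standard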

rogerd

1:56 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I tend to concur that cloaking a nofollow attribute seems a bit dicey even if your intentions are good. You may be legit, but someone else might be trading links under false pretenses or otherwise abusing the attribute.

Indeed, I think the operative question is, "Can what I'm doing for a good purpose be abused for more nefarious purposes?" This is, of course, a gray area, but the more the answer is "yes," then, in my opinion, the greater the risk. Google has shown itself willing to crack down on techniques used by those whom they consider to be abusers even if there's some collateral damage.

If I were Google, I'd target questionable techniques but use other site indicators to determine who gets the hammer dropped on them. A site that appears to be cloaking but has superb linkage from authority sites might get a bit more latitude than a site that lacks those links and seems to target a commercial keyword set (for example).

longen

3:18 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google could expand on their Webmaster Tools facility to let site owners import their META's for all their sites, which Google could then use for indexing. I suppose that if the "rel=nofollow" was in Tools as well it would kill off link buying except for traffic.

incrediBILL

3:44 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



X-Robots-Tag protocol

That's a good point as there's nothing in the Google guidelines whatsoever about cloaking HTTP headers, only the content. Still doesn't address META keywords and descriptions which were my primary concern, but it's certainly less controversial and already somewhat ordained.

seonet

4:02 pm on Jul 7, 2008 (gmt 0)

5+ Year Member



What do you think, is Google overstepping here or am I overreacting?

I don't think you are overreacting. Google is, has been, and will be using its position to rule the world until some big competitor arrives and changes that picture. Until that time, the one thing to do is follow the big guy.

[edited by: incrediBILL at 4:04 pm (utc) on July 7, 2008]
[edit reason] added quote formatting [/edit]

ConnieS

4:36 pm on Jul 7, 2008 (gmt 0)

5+ Year Member



I don't think what you're doing with the meta tags is cloaking. It seems to me Google has a lot more serious issues to deal with than something that adds to site security.

How many average users even look at the source of a page? I suspect your actions are directed at rogue bots anyway.

If you're doing something with nofollow, that might be cloaking.

koan

4:39 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Browsers make no use of the description tag, so it is irrelevant whether they see it or not.

Sure they do: Firefox uses it when you bookmark a site to keep a description of the page, so you can search your bookmarks by keywords. Makes sense.

madmatt69

6:27 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Seems dumb - by that logic anyone using phpBB3 is cloaking and being evil. It ships with slightly different templates for logged-in users, guests, and even bots, enabled by default.

Google also says to build sites for the users...that's what we do!

ConnieS

8:15 pm on Jul 7, 2008 (gmt 0)

5+ Year Member



"Firefox uses it when you bookmark a site to keep a description of the page"

Your version of Firefox must be a lot different from mine. The only thing I see Firefox, or any other browser, recording when a page is bookmarked is the Title.

paybacksa

8:17 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you cloak anything, then rendering tools used to compare Google-served pages with everyone-else-served pages will show a difference. That is bad.

Of course Google renders pages. They may not do it en masse, and may do it through human quality checkers with tools, but of course they do it, and anything cloaked causes them grief.

And if you disagree that they render some things, wait until next week and then see if you still disagree.
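A crude version of that comparison is easy to script: fetch the same URL once with a Googlebot-like user agent and once with a browser user agent, then diff the two responses. (The URL below is hypothetical, and IP-based cloaking obviously won't show up this way, since the requests don't come from Google's own addresses.)

# Crude cloaking check: fetch a URL as a "Googlebot-like" client and as a
# browser, then diff the two responses. Hypothetical URL; IP-based cloaking
# is invisible to this test because the requests come from your own address.
import difflib
from urllib.request import Request, urlopen

def fetch(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    return urlopen(req).read().decode("utf-8", errors="replace").splitlines()

def compare(url):
    as_bot = fetch(url, "Mozilla/5.0 (compatible; Googlebot/2.1)")
    as_browser = fetch(url, "Mozilla/5.0 (Windows; U; Windows NT 5.1)")
    return list(difflib.unified_diff(as_bot, as_browser,
                                     "as_googlebot", "as_browser", lineterm=""))

# Example usage (hypothetical URL):
# for line in compare("http://example.com/"):
#     print(line)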

incrediBILL

9:54 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



If you cloak anything, then rendering tools used to compare Google-served pages with everyone-else-served pages will show a difference. That is bad.

I've been cloaking certain aspects of my sites for years, adding honeypot tracking codes into my text to trap content theft, with no ill effect.
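For reference, one common way this sort of honeypot is implemented (a generic sketch, not a description of any particular site's setup): stamp each served copy of a page with a unique, innocuous-looking token, log which request received which token, and later search for those tokens to trace scraped copies back to a specific visitor.

# Generic honeypot-token sketch: each served copy of a page carries a token
# derived from the requesting IP and time; finding that token elsewhere later
# identifies which request the copy was scraped from. The secret is a placeholder.
import hashlib
import time

def honeypot_token(client_ip, secret="replace-with-a-private-secret"):
    raw = "{}|{}|{}".format(client_ip, int(time.time()), secret).encode("utf-8")
    return hashlib.sha1(raw).hexdigest()[:10]

def stamp_page(html, client_ip):
    token = honeypot_token(client_ip)
    # Tuck the token somewhere a scraper will copy but a reader won't notice,
    # e.g. an HTML comment near the end of the body.
    return html.replace("</body>", "<!-- ref:" + token + " --></body>"), token

page, token = stamp_page("<html><body>Article text...</body></html>", "203.0.113.7")
print(token)  # log this against the request; search for it later to find copies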

Oliver Henniges

10:06 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Another side note on design and CSS:

I believe the times are long gone when Google only interpreted mere text, Lynx-like: it makes a huge difference to on-page evaluation whether a text block is placed in the triangle of attention or somewhere at the bottom of a page. The same holds true for the size of the letters and many other factors. And it is pretty clear that Google needs your CSS file to interpret this.

Seb7

10:54 pm on Jul 7, 2008 (gmt 0)

5+ Year Member



From what I remember of the Google pages, changing content based on user agent is not cloaking; changing content based on IP is.

I change the content I output based on the type of browser, mainly for mobile devices.

g1smd

10:57 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



*** changing content based on user agent is not cloaking; changing content based on IP is. ***

That's too wide a definition, and in many cases not true.
It's wrong on several levels.

koan

11:37 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Your version of Firefox must be a lot different from mine. The only thing I see Firefox, or any other browser, recording when a page is bookmarked is the Title.

I just upgraded to version 3, but version 2 was also doing it. You don't see it when you bookmark a page; you see it if you check a bookmark's properties (right-click on a bookmark entry and select Properties, or go to "Organize bookmarks...").

Anyway, I just wanted to point out the false assumption that browsers don't use meta tags. I don't really have an opinion on the rest of the matter, although I do think cloaking the "nofollow" is pretty sneaky and would open a can of worms if you used it on legitimate link exchanges (especially if cloaked by IP instead of User-Agent and the page isn't archived by Google, since then it would be impossible to know).

Seb7

9:03 pm on Jul 8, 2008 (gmt 0)

5+ Year Member



incrediBILL, I do agree with you, however living with Google doesn't present an ideal world.

Google has to check that people are not trying to fool it, and the only way it can do this is to compare what's being indexed with what the general user is getting. The problem is that it's not clever enough to look only for bad changes; it can only tell that changes exist.

I think your main objective, to avoid being flagged for cloaking, should be to make your default output the same as the search engine output.

I personally try to keep the several versions of output very similar - at least on the binary front - and save any big changes for the minority of users.

Google guidelines on cloaking:

[google.com...]

[google.com...]

[edited by: Seb7 at 9:21 pm (utc) on July 8, 2008]

 
