
hiding text via css


proxyHunter

2:28 am on May 21, 2003 (gmt 0)




Is this method (trick) of hiding text via CSS being picked up by Google yet?

<example>

<style> .xyz { display: none; } </style>

<a href="http://www.xyz.com" class="xyz">Xyz link for bots only</a>

</example>
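For reference, the external-file variant the question is asking about would look something like this (the filename is hypothetical):

<example>

<!-- in the page -->
<link rel="stylesheet" href="/css/site.css">
<a href="http://www.xyz.com" class="xyz">Xyz link for bots only</a>

/* in /css/site.css */
.xyz { display: none; }

</example>

The rendered result is identical; the only difference is that the hiding rule is not present in the fetched HTML itself.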

Does using an external CSS file work better in terms of stopping Google from flagging this as an attempt to hide text?

deft_spyder

2:33 am on May 21, 2003 (gmt 0)




Seems damn obvious to me. Try spending the time making a page of content. If it doesn't pick it up now, it will soon.

eaden

2:45 am on May 21, 2003 (gmt 0)




If it doesn't pick it up now, it will soon.

This trick has been around for how long? As long as CSS itself. And "soon" is so vague it can mean months or years. That seems to be the default WW answer: not a good idea, and one day Google will ban you for spamming. Which is true. But that doesn't mean it will ban you right now.

The question was whether putting your code in a separate CSS file would make it less obvious.

Well, does Googlebot look at external CSS files? I haven't seen Googlebot request an external CSS file, so without accessing the file, it can't tell whether anything is hidden or not. Sure, there is still the human touch, but not on 3 billion web pages... And if Googlebot does start requesting your CSS file, you could always put it in a directory that is disallowed in robots.txt.
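That robots.txt idea would look something like this (the directory name is hypothetical):

<example>

User-agent: *
Disallow: /css/

</example>

A crawler that obeys the Robots Exclusion Protocol would then never fetch anything under /css/, including the stylesheet.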

(Note: I don't think this is a good idea, and I don't practice it; I'm just trying to answer the question.)

Edit: Actually, I do use display:none, but for legitimate reasons, e.g.

<span style="display:none">Your browser doesn't seem to support CSS. Please upgrade to Mozilla</span>

[edited by: eaden at 2:55 am (utc) on May 21, 2003]

BigDave

2:52 am on May 21, 2003 (gmt 0)




Google's new hidden text checker will look at CSS, including external .css files, and most likely it is not a robot, so it does not have to obey robots.txt.

You should be safe as long as your site gets lousy results. At least to start with, they will only be using it to help process spam reports. If you do not rank well enough for anyone to care, then you should be safe.

proxyHunter

2:57 am on May 21, 2003 (gmt 0)




>>deft_spyder: try spending the time making a page of content

This is not about hiding content, but rather about hiding external/internal links so that users will not click them. Apart from this being unethical, I was just asking: can Google pick this up and penalize it?

>>eaden

It has been mentioned on this forum that disallowing access to a CSS file could itself be flagged by Google, so using that technique could penalize your domain.

proxyHunter

3:03 am on May 21, 2003 (gmt 0)




>>BigDave You should be safe as long as your site gets lousy results

So the spam reports will be reviewed by humans who manually check these pages, and then act if the page in question is found to be using a spam technique?

pageoneresults

3:07 am on May 21, 2003 (gmt 0)




Program the bot to look for any of this in an external file or on the page...

display:none;

visibility:hidden;

left:-800px;top:-800px;

With CSS becoming more mainstream, the abuse is sure to follow. It is out there now, but not even close to becoming a problem.

As smart as Googlebot and other spiders are, detecting funny stuff in CSS should be fairly simple. All good things usually end up being abused over time.
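A bot-side check along the lines described above could be sketched as follows. This is a hypothetical illustration in Python, not anything Google has confirmed using; the pattern list comes straight from the declarations quoted earlier in the thread.

```python
import re

# Hypothetical list of CSS declarations worth flagging,
# taken from the patterns listed above.
SUSPECT_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"(?:left|top)\s*:\s*-\d+px",  # large negative offsets push content off-screen
]

def flag_hidden_css(css_text):
    """Return every suspicious declaration found in a stylesheet or
    inline style attribute. A real checker would still have to decide
    whether the hidden content is a legitimate use of the property."""
    hits = []
    for pattern in SUSPECT_PATTERNS:
        hits.extend(re.findall(pattern, css_text, flags=re.IGNORECASE))
    return hits

print(flag_hidden_css(".xyz { display: none; } #nav { left:-800px; top:-800px; }"))
# ['display: none', 'left:-800px', 'top:-800px']
```

Note that a naive scan like this would also flag the legitimate uses of display:none discussed elsewhere in this thread, which is exactly why human review would still be needed.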

Penalty for abusing css...

Permanent ban with no chance of reinclusion!

[edited by: pageoneresults at 3:08 am (utc) on May 21, 2003]

BigDave

3:07 am on May 21, 2003 (gmt 0)




That is the way it currently works. Google apparently now has some sort of browser technology that will flag invisible text/links to greatly speed up the process. My guess is that it is as close to being a bot as possible without crossing that line.

John_Creed

3:44 am on May 21, 2003 (gmt 0)




>><span style="display:none">Your browser doesn't seem to support CSS. Please upgrade to Mozilla</span>

What's stopping Google from counting this as hidden text and banning the site?

Nothing.

gilli

4:02 am on May 21, 2003 (gmt 0)

10+ Year Member



What's stopping Google from counting this as hidden text and banning the site?

There are a bunch of legitimate uses for display:none - it was not specifically designed for spamming. For example, until recently Zeldman had a collection of links in a hidden div; to browse the links you clicked a "show" link. While I don't think this is very good interface design, the intention certainly wasn't spam.

I think the best policy for Google would be to simply ignore anything that's hidden, rather than banning sites (unless reviewed by a human). As far as I can see, this equates to JavaScript drop-down navigation: it's not spam, but it's not followed. If you want Googlebot to follow links, then provide some alternative navigation.

On the topic of robots.txt blocking access to external CSS: if the CSS is linked from a page that is not blocked by robots.txt, then the CSS must be considered part of the page and downloaded regardless.

pageoneresults

4:05 am on May 21, 2003 (gmt 0)




Your browser doesn't seem to support CSS. Please upgrade to Mozilla.

I would say the part between the <span> tags. A bot could easily determine when certain CSS attributes are being used that way.

Now, if the bot encountered a <span style="display:none;"> tag, it would then perform an analysis on the content between the <span> tags. How it would determine whether or not that was viable content is beyond me. But with technology where it is today, these offenses should be easy to detect.