
Google News Archive Forum

This 117-message thread spans 4 pages; this is page 3.
Detection Of Hidden Stuff
seeking pub conference feedback

 10:26 pm on Apr 29, 2003 (gmt 0)

From a post in another thread:

>>For those who get to the pubconference and talk about this new hidden text algo, please post it here<<

OK party people, anyone sober enough yet to write an intelligible response on this?



 1:03 pm on May 17, 2003 (gmt 0)

Can I just ask:

Hidden divs: for example, those used for a rollover effect on a menu. Will they be penalised automatically by the spambot?


 8:11 pm on May 17, 2003 (gmt 0)

I wanted to rekindle this thread, as it appears that sj has just gone live today (or perhaps yesterday at some point).

In connection with everything that was said above on hidden "entities" (text, links, layers etc.), what sorts of penalties is everyone noting have actually now been assessed to sites using such tricks?



 5:52 pm on May 19, 2003 (gmt 0)

Seems the more I read the more questions I have...

Here's the first one... are words in comment tags (HTML) considered hidden text, ignored by bots, or what?

I would hate for a well-commented piece of code to get nailed for being disciplined.


 11:34 am on May 20, 2003 (gmt 0)

I really hope Google will simply ignore the hidden text. I use display:none for accessibility reasons on link separators, e.g. to hide the "][" on link menus.

And of course there are some display:none DIVs in my print style sheet to hide menus and other irrelevant parts of the web page when printing it.

I trust Google to be smart enough to differentiate between spam DIVs and legitimate reasons to use display:none.

Then there's z-index. I use it to make sure the most important division (content) stays on top when resizing the browser window to very small, e.g., 200px width.

And there is JavaScript code (I don't use JavaScript) to re-use screen estate when clicking different tabs, by hiding one DIV and showing another. Advantage: it's faster than loading another web page.
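That tab trick can be sketched in a few lines. This is a generic illustration, not code from the thread, and the ids are made up; keeping the show/hide decision in a pure function makes it easy to check without a browser:

```javascript
// Decide each tab panel's CSS display value given the selected tab.
// Pure function: no DOM access, so it can be tested standalone.
function tabDisplayStates(tabIds, selectedId) {
  const states = {};
  for (const id of tabIds) {
    states[id] = (id === selectedId) ? "block" : "none";
  }
  return states;
}

// In a real page you would apply the result to elements, e.g.:
//   const states = tabDisplayStates(["tab-a", "tab-b"], "tab-b");
//   for (const id in states) {
//     document.getElementById(id).style.display = states[id];
//   }
```

Exactly this pattern is what makes display:none so hard for a search engine to judge: the same toggle serves a legitimate tab interface and a keyword-stuffed hidden DIV.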

All this can be abused but there are also valid reasons (and even accessibility requirements) for using display:none, z-index, ...

I don't think Google will stupidly ban a web page because it uses any of these techniques. It would punish the wrong crowd.


 4:54 am on May 21, 2003 (gmt 0)

Well, 64 posts, and we are still pretty much at the level of supposition, postulation and conjecture... So here is mine, which will make it 65:

Google wants to retain the 'market share' of searches submitted. To do that, their results need to be valid. If they ban half the world's websites for using legitimate DHTML methods for things like menus (be that implemented by javascript, CSS, or a combination), then their results will no longer be valid, and searchers will look to other search engines. So I have confidence that Google won't be that stupid, and any algorithm they deploy would first be tested fairly thoroughly to make sure it doesn't catch innocents in the crossfire.

Apart from not wishing to lose market share, I also have confidence that Google wants to promote good web development practices. Creating a situation where only inexperienced and ignorant web developers thrive, because producing great DHTML sites using JavaScript and CSS gets you penalised, would not serve that interest. It would also be a great shame, given all the effort and years of work that have gone into developing the HTML, CSS and ECMA/JavaScript standards so that they can work together to allow developers to create good DHTML websites.

So on two counts, I have confidence that Google won't be stupid about it.



 4:58 am on May 21, 2003 (gmt 0)

Interesting comment here:
[webmasterworld.com...] msg7


 5:06 am on May 21, 2003 (gmt 0)

Thanks RC, the comment about the 1x1 pic helps. Still, it doesn't answer the whole question. So now we are up to 66 posts, and we are still pretty much at the level of supposition, postulation and conjecture...

The closest we have to anything definite is a pretty vague statement (msg#34):
>>"..Matt Cutts, during the questions and answers afterwards, seemed rather confident that both css and layers would be taken care of in decent way.."

but then again (msg#7):
"...Matt Cutts (Google) specifically mentioned that hidden anything will be zapped as soon as it is discovered..."

So which is it? Would GoogleGuy or someone who knows like to comment? Would you get penalised for good quality, legitimate DHTML techniques? Would the following site, currently with a PR of 7, suddenly find itself at PR 0 and banned?


[edited by: rcjordan at 5:09 am (utc) on May 21, 2003]
[edit reason] sorry, no specific references to sites. [/edit]


 5:16 am on May 21, 2003 (gmt 0)

I wasn't much interested in the 1x1 pixel comment, but rather the fact that Google has been on a mission to drop big warning flags on hidden text. Now we're seeing a comment that it's been put in service.


 5:40 am on May 21, 2003 (gmt 0)

I'm using 0x0 pixel images to control the layout of tables (using 0x0 rather than 1x1 ensures a broken-image icon is not displayed in IE if the image can't be found) AND to hide e-mail addresses from spam bots...
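A related way to keep addresses away from harvesters, distinct from the 0x0-image trick above, was to assemble the address at runtime so it never appears whole in the raw HTML that address-scraping bots scan. A generic sketch with a placeholder address:

```javascript
// Assemble an address from parts so the complete string never appears
// in the HTML source. Harvesters grepping the markup for foo@bar
// patterns never see it; a browser running the script does.
function assembleEmail(user, domain) {
  return user + "\u0040" + domain; // \u0040 is "@"
}

// A page would then write the link at load time, e.g.:
//   const addr = assembleEmail("info", "example.com");
//   document.write('<a href="mailto:' + addr + '">' + addr + '</a>');
```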

If they zap these, they're getting dangerously close to being like a government tax department that creates so many intricate rules it becomes increasingly less effective at achieving exactly what it sets out to prevent.


 10:05 am on May 28, 2003 (gmt 0)

I just noticed that a site with a huge amount of hidden black text on a black background (along with even a hidden link to a porn site), which I specifically invoked the name of GoogleGuy in reporting, went gray toolbar. GoogleGuy wrote that the filters were being tweaked to automatically pick this up.

But it doesn't work anymore... I have reported a site with a white font over a white background, but the site is still ranking perfectly well due to that hidden text :( and has a PR4.
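For what it's worth, the most naive form of the check such a filter would need is trivial to write, which is why the white-on-white case is so frustrating when it slips through. A minimal sketch (illustrative only; Google's actual filter is unknown, and a real one would also have to resolve external CSS, inherited styles, and background images):

```javascript
// Naive hidden-text check: flag text whose colour matches its
// background. Only handles literal colour strings; a real filter
// would have to normalise formats (#fff vs #ffffff vs white) and
// compute the effective styles from the full cascade.
function looksLikeHiddenText(textColor, backgroundColor) {
  const norm = c => c.trim().toLowerCase();
  return norm(textColor) === norm(backgroundColor);
}
```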


 1:42 am on May 25, 2003 (gmt 0)

Hi all,

I've read various posts about Google's treatment of various tricks such as hidden text... now here is my maybe naive question:

If I've got a page with some text inside a layer (whose overflow is not "hidden"), placed at a negative X and/or Y coordinate (so out of sight), is that all regarded as "hidden text"?

If so, is it still hidden text even when the out-of-sight position is not part of a malicious ranking strategy, but is there for visual purposes (text sliding in after a click, or a few seconds after the page loads, etc.)?

Sorry for my rough English. Bye


 3:10 am on May 25, 2003 (gmt 0)

Read a recent thread about that one, but don't know that I would want to spend the time looking for it...

From what I remember, the answer was something like: if it wouldn't get you banned TODAY, then it likely will soon. In general it probably falls under the "go ahead and do it if you don't care about the site's long-term success" type of advice.



 10:38 am on May 27, 2003 (gmt 0)

Welcome to WebmasterWorld, cromo

Positioning a layer 'offscreen' is just one of many ways to render a layer invisible (and perhaps not the wisest choice - it's an easy thing to spot algo-wise).
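To illustrate just how easy the off-screen variant is to spot, here is a toy check over an inline style string. Everything here is hypothetical: the -100px threshold is an arbitrary illustration, not a known Google value, and a real crawler check would also have to parse external stylesheets.

```javascript
// Toy detector: does an inline style push the element far off-screen
// with a large negative left/top offset?
function isPositionedOffscreen(inlineStyle) {
  const left = /left\s*:\s*(-\d+)px/i.exec(inlineStyle);
  const top = /top\s*:\s*(-\d+)px/i.exec(inlineStyle);
  const far = m => m !== null && Number(m[1]) < -100;
  return far(left) || far(top);
}
```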

There are some very 'creative' ways of making layers invisible that would really need Google to parse external files in quite a detailed manner. They CAN do this, but it would still be difficult to identify which were doing it to 'cheat'.

The whole problem with css positioned layers is that there are valid site design reasons for having them change their status between visible and invisible. For example, there are plenty of navigation systems which work this way. So it would be difficult to justify an outright ban.

Google have stated that they are going after 'hidden text and links', so it pays to tread carefully. I guess the best advice is 'don't use any technique for SEO that doesn't have a valid usage for normal web design'.

I suspect that Google might eventually choose to ignore, or downgrade, text that is not visible on page loading - but an outright ban seems a little unlikely.

Mostly, your risk is from a hand inspection brought on by a spam report. In which case it would come down to how they perceived your intention.


 9:09 am on May 23, 2003 (gmt 0)

G states in its Webmaster Guidelines that one should avoid hidden text.

This sometimes clashes with the need to provide a page with a distinctive design that is also viewable in low-end browsers.

Showing the image containing text to high-end browsers and the same raw text to low-end browsers can be achieved using CSS.

Is this hidden text for Google?


 9:23 am on May 23, 2003 (gmt 0)

I assume you mean that which Douglas Bowman (as some people before him) described here: www.stopdesign.com/articles/css/replace-text

It's an important design principle which can be used on aesthetically oriented sites, such as those of graphic designers. Look at this e.g.:

It is also an important design technique for achieving accessibility.

Unfortunately it can just as well be used for naughty spamming, and this might be what counts to Google. So I doubt strongly they will use the text for ranking. But I am really interested to know whether this would trigger an auto-30-day penalty... that, in this case, would be a bad idea... :(

The only (ugly) solution I see would be to serve normal users a default stylesheet with no replaced text, and give them an alternate stylesheet they can use/set. But most people won't bother/find out etc., so that would be a sad workaround...


 11:48 am on May 30, 2003 (gmt 0)

Hidden text... at bottom, I cannot resist; I must have reported about it 100 times.

Sorry in advance :(

This is pure frustration - the site is ranking higher for all the keywords in its hidden text! I am sorry to put the URL up, but I want the whole world to see what our favorite search engine is up to!

[edited by: shaadi at 11:49 am (utc) on May 30, 2003]


 11:58 am on May 30, 2003 (gmt 0)

Thats the problem.

Google says we are going to detect and punish hidden text:

- 200 posts about I got caught.
- 800 posts about my competitor did not get caught.
- 300 posts about my competitor did not get punished long enough.
- 500 posts about what is hidden text.
- 400 posts about hidden text can be useful.

Most important issues:

- Hidden text is not critical to get high rankings.
- Google will never be able to detect all forms of hidden text.


 12:02 pm on May 30, 2003 (gmt 0)

But the truth is hidden text works, like guest-book submission spam does. So what, do we just forget the user and make sites with hidden content / sign innocent homepages with non-relevant URLs...

After all, what's the use of making a clean, nice site if it cannot attract any SE traffic :(


 12:05 pm on May 30, 2003 (gmt 0)

So report them, and don't moan about it to everyone on the forum, and definitely don't tell anyone who they are.

If all else fails, confront the company and send an e-mail to their webmaster, CCing it to a manager etc. so it definitely gets some immediate attention. That usually works a lot better...!


 12:07 pm on May 30, 2003 (gmt 0)

After all what’s the use of making a clean, nice site - if it cannot attract any SE Traffic :(

Says WHO?

either you aint doing it right, or u aint doing it right.

And on another note, building a site/business purely for SERPs is madness; there are other ways of driving traffic.



 12:12 pm on May 30, 2003 (gmt 0)

And F***ing! What is that foolish thing ON EARTH called the Spam Report Form, and what is that stupid email address for, if it doesn't work at all?

I put in time & effort to write it all down; if they cannot maintain that thing, please remove it. They've got no right to waste people's time, inducing them to fill in something which goes to the trash, and make fools out of them!

Forget about the competitor I am complaining about, with their multiple sites, sub-domains, hidden text and doorway pages - I have a little cousin who found a porn site using some innocent searches... I am feeling guilty of introducing her to G now!

edit for typo

[edited by: shaadi at 12:30 pm (utc) on May 30, 2003]


 12:13 pm on May 30, 2003 (gmt 0)

If you really think it works and is risk free, why not just do it yourself?

Personally, I don't think it is risk free.


 12:19 pm on May 30, 2003 (gmt 0)

>>>I have little cousin sister who found Porn site using a some innocent searches...I am feeling guilty of introducing her to G now!

That's a different complaint from your first one?

Now if you report that, chances are the site/page will be removed very quickly.
I think they (Google) are very sensitive to that issue.

You are right, it is frustrating not to get "real" spammers removed after reporting them.

You have to take into account how many spam reports Google gets a day.
It is humanly impossible to react to every one.

What would you do if you were Google?

If you remove the spam report page option, webmasters/searchers will complain in an unclustered way via any other email address or form.

Having one central entry port for spam reports most probably lets them find common ground among complaints via some automatic system.


 12:20 pm on May 30, 2003 (gmt 0)

either you aint doing it right, or u aint doing it right.

I have good rankings on the SERPs (over 4000 page views daily) and am not completely dependent on SEO; in fact AdWords gave us a runaway success... Also we have permanent links on major portals (Y!, MSN etc.) and a complete in-house affiliate program with 800+ affiliates...

But it feels bad when someone just spams and tops the ranks while, on the other hand, we have to put in so much effort to get there... it's a shame - one feels like a fool when someone just makes 14-20 websites and links them together to get a PR6, while we have to exchange links with other affinity sites to get PR6.


 12:23 pm on May 30, 2003 (gmt 0)

Just got no-right to waste people's time inducting them to fill up something which goes to Thrash

I seriously doubt it goes to the trash. No doubt the spam report form on Google's site is heavily used, and naturally it will take time to go through each report and investigate. Any report submitted will get queued and dealt with asap.
It's not like you're paying for a service - Google is free to use. And don't forget, by submitting a spam report you're helping Google improve the quality of the index and its algorithms, and you should not expect the site you've reported to suddenly drop out of the index. As GG has stated many times in the past, they are trying to remove spam through algorithms, not by hand.

I have little cousin sister who found Porn site using a some innocent searches

If I've learnt anything, it's that nothing in this world is ever 100%. Though it's sad to hear that porn was found, it's a good reason why children should be supervised when searching the net. You never know when there's likely to be a popup ad showing unsuitable content. That's the nature of the internet. :(


[edited by: jpjones at 12:25 pm (utc) on May 30, 2003]


 12:23 pm on May 30, 2003 (gmt 0)

What would you do if you were Google?

Create a forum like the ODP project did. One can do many things; after all, if G can create a once-so-relevant SE, why can't they make something to counter these tiny spammers?


 12:56 pm on May 30, 2003 (gmt 0)

I have little cousin sister who found Porn site using a some innocent searches

Did this happen even with the safesearch filter in place?


 1:27 pm on May 30, 2003 (gmt 0)

> - 200 posts about I got caught.
> - 800 posts about my competitor did not get caught.

We don't read hundreds of posts saying "I did not get caught" or "my competitor did get caught" because people love to complain. Such is life.

shaadi, it can seem like there's some kind of global conspiracy when you're looking at your own site, or one competitor. You should read vitaplease's "important issues" for the bigger picture.

If you can find me one highly successful Google optimiser who considers hidden text a crucial (or even useful) part of their work, I'll be amazed. It is a red herring.


 1:39 pm on May 30, 2003 (gmt 0)

I think that hidden-text filters apply together with other factors, e.g. several results from the same sites, or from link farms, on the first page of results.

Sites with hidden text/links that have only one result on the first page seem not to be penalized.

Spammers are banned when there is damage to the results.


 1:55 pm on May 30, 2003 (gmt 0)

Did this happen even with the safesearch filter in place?

safesearch filter? pls explain.


 2:03 pm on May 30, 2003 (gmt 0)

They may have hidden text, but it does very little for their ranking. Title and PR have the highest weight in the SERPs, and you can't use hidden text in either. Get those 2 things right and you will beat out hidden text every time.

