In connection with everything said above about hidden "entities" (text, links, layers, etc.), what sorts of penalties have people actually seen assessed against sites using such tricks?
Best,
Lorraine
Here's the first one... are words in HTML comment tags considered hidden text, ignored by bots, or what?
I would hate for a well-commented piece of code to get nailed for being disciplined.
And of course there are some display:none DIVs in my print style sheet to hide menus and other irrelevant parts of the web page when printing it.
I trust Google to be smart enough to differentiate between spam DIVs and legitimate reasons to use display:none.
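For what it's worth, the print-stylesheet case looks something like this (a minimal sketch; the selectors are made-up examples, not anyone's actual markup):

```css
/* print.css -- loaded only when printing, via:
   <link rel="stylesheet" media="print" href="print.css"> */

/* hide navigation and other screen-only chrome on paper */
#menu,
#searchbox {
  display: none;
}

/* let the content column use the full page width */
#content {
  width: auto;
  margin: 0;
}
```

Note that this `display: none` only ever applies in the `media="print"` context; the on-screen rendering of the page hides nothing at all.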
Then there's z-index. I use it to make sure the most important division (content) stays on top when resizing the browser window to very small, e.g., 200px width.
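The z-index case is equally innocent. Assuming the two-column layout described above (hypothetical IDs), something like:

```css
/* keep the content division stacked above the sidebar when the
   window is squeezed very narrow and the columns overlap */
#content {
  position: relative; /* z-index only applies to positioned boxes */
  z-index: 2;
}

#sidebar {
  position: relative;
  z-index: 1;
}
```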
And there is the JavaScript technique (I don't use JavaScript myself) of re-using screen real estate when clicking different tabs, by hiding one DIV and showing another. Advantage: it's faster than loading another web page.
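That tab trick usually boils down to two CSS classes that a one-line script swaps between the DIVs (class names here are invented for illustration):

```css
/* the currently selected tab panel */
.tab-visible {
  display: block;
}

/* all other panels stay in the document but are not rendered;
   a click handler just swaps the class on the two DIVs, e.g.
   document.getElementById('panel2').className = 'tab-visible'; */
.tab-hidden {
  display: none;
}
```

No page reload, so it is faster for the user; but to a bot reading only the HTML, every panel's text is present while some of it sits under `display: none` -- exactly the ambiguity being discussed.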
All this can be abused but there are also valid reasons (and even accessibility requirements) for using display:none, z-index, ...
I don't think Google will stupidly ban a web page just because it uses any of these techniques. It would punish the wrong crowd.
Google wants to retain the 'market share' of searches submitted. To do that, their results need to be valid. If they ban half the world's websites for using legitimate DHTML methods for things like menus (be that implemented by javascript, CSS, or a combination), then their results will no longer be valid, and searchers will look to other search engines. So I have confidence that Google won't be that stupid, and any algorithm they deploy would first be tested fairly thoroughly to make sure it doesn't catch innocents in the crossfire.
Apart from not wishing to lose market share, I also have confidence that Google wants to promote good web development practices. Creating a situation where only inexperienced and ignorant web developers produce great DHTML sites using Javascript and CSS would not serve that interest. It would also be a great shame, given all the effort and years of work that have gone into developing the HTML, CSS and ECMA/Javascript standards so that they can work together to allow developers to create good DHTML websites.
So on two counts, I have confidence that Google won't be stupid about it.
Shawn
The closest we have to anything definite is a pretty vague statement (msg#34):
>>"..Matt Cutts, during the questions and answers afterwards, seemed rather confident that both css and layers would be taken care of in decent way.."
but then again (msg#7):
"...Matt Cutts (Google) specifically mentioned that hidden anything will be zapped as soon as it is discovered..."
So which is it? Would GoogleGuy or someone who knows like to comment? Would you get penalised for good quality, legitimate DHTML techniques? Would the following site, currently with a PR of 7, suddenly find itself at PR 0 and banned?
[edit]
Shawn
[edited by: rcjordan at 5:09 am (utc) on May 21, 2003]
[edit reason] sorry, no specific references to sites. [/edit]
If they zap these, they're getting dangerously close to being like a government tax department that creates so many intricate rules it becomes increasingly less effective at preventing exactly what it set out to prevent.
I just noticed that a site with a huge amount of hidden black text on a black background (along with a hidden link to a porn site), which I reported while specifically invoking the name of GoogleGuy, went gray toolbar. GoogleGuy wrote that the filters were being tweaked to pick this up automatically.
But it doesn't work anymore... I have reported a site with white text on a white background, but the site is still ranking perfectly well thanks to that hidden text :( and has a PR4.
I've read various posts about Google's treatment of tricks such as hidden text... now here is my maybe naive question:
If I've got a page with some text inside a layer (whose overflow is not "hidden"), placed at a negative _X and/or _Y coordinate (so out of sight), is that all regarded as "hidden text"?
And if so, does it still count even when the out-of-sight position is not part of a malicious ranking strategy, but serves a visual purpose (text flying in after a click, or a few seconds after the page loads, etc.)?
Sorry for my rough english. Bye
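For reference, the technique being asked about above is usually written something like this (a sketch, not a recommendation; the ID is invented):

```css
/* a layer parked outside the viewport: its text is present in the
   markup, but invisible until a script animates it into view */
#flyin {
  position: absolute;
  left: -2000px; /* negative X coordinate, offscreen */
  top: -500px;   /* negative Y coordinate */
}
```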
From what I remember, the answer is something like: even if it wouldn't get you banned TODAY, it likely will soon. In general it probably falls under the "go ahead and do it if you don't care about the site's long-term success" type of advice.
Kevin
Positioning a layer 'offscreen' is just one of many ways to render a layer invisible (and perhaps not the wisest choice; it's an easy thing to spot algo-wise).
There are some very 'creative' ways of making layers invisible that would really need Google to parse external files in quite a detailed manner. They CAN do this, but it would still be difficult to identify which were doing it to 'cheat'.
The whole problem with css positioned layers is that there are valid site design reasons for having them change their status between visible and invisible. For example, there are plenty of navigation systems which work this way. So it would be difficult to justify an outright ban.
Google have stated that they are going after 'hidden text and links', so it pays to tread carefully. I guess the best advice is 'don't use any technique for SEO that doesn't have a valid usage for normal web design'.
I suspect that Google might eventually choose to ignore, or downgrade, text that is not visible on page loading - but an outright ban seems a little unlikely.
Mostly, your risk is from a hand inspection brought on by a spam report. In that case it would come down to how they perceived your intention.
This sometimes clashes with the need to provide a page with a distinctive design that is also viewable in low-end browsers.
Showing an image containing text to high-end browsers, and the same raw text to low-end browsers, can be achieved using CSS.
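The technique described here is classic CSS image replacement; a minimal sketch (element names are examples only):

```css
/* CSS-capable browsers see the styled logo image; browsers without
   CSS support (and text readers) see the raw <h1> text instead */
h1#sitename {
  width: 300px;
  height: 75px;
  background: url(sitename.gif) no-repeat;
}

/* the heading's actual text sits in a span that gets hidden:
   <h1 id="sitename"><span>Acme Widgets</span></h1> */
h1#sitename span {
  display: none;
}
```

Mechanically this is identical to hiding spam text, which is precisely why the question matters.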
Is this hidden text for Google?
It's an important design principle which can be used on aesthetically oriented sites, such as those of graphic designers. Look at this, for example:
www.mezzoblue.com/zengarden
It is also an important design technique for achieving accessibility.
Unfortunately it can just as well be used for naughty spamming, and that might be what counts to Google. So I strongly doubt they will use the hidden text for ranking. But I am really interested to know whether this would trigger an auto-30-day penalty... which, in this case, would be a bad idea... :(
The only (ugly) solution I see would be to serve normal users a default stylesheet with no replaced text, and give them an alternate stylesheet they can switch to. But most people won't bother, or won't find out, so that would be a sad workaround...
Sorry in advance :(
This is pure frustration: the site is ranking higher for all the keywords in its hidden text! I am sorry to put the URL up, but I want the whole world to see what our favorite search engine is up to!
[edited by: shaadi at 11:49 am (utc) on May 30, 2003]
Google says we are going to detect and punish hidden text:
- 200 posts saying "I got caught."
- 800 posts saying "my competitor did not get caught."
- 300 posts saying "my competitor was not punished long enough."
- 500 posts asking what hidden text actually is.
- 400 posts saying hidden text can be useful.
Most important issues:
- Hidden text is not critical to get high rankings.
- Google will never be able to detect all forms of hidden text.
I put in time and effort to write it all down; if they cannot maintain that thing, please remove it. They have no right to waste people's time by inducing them to fill out something that goes straight to the trash, making fools of them!
Forget about my competitor, whom I am complaining about for using multiple sites, sub-domains, hidden text, and doorway pages. I have a little cousin who found a porn site using some innocent searches... I feel guilty for introducing her to G now!
edit for typo
[edited by: shaadi at 12:30 pm (utc) on May 30, 2003]
That's a different complaint from your first one?
Now if you report that, chances are the site/page will be removed very quickly.
I think they (Google) are very sensitive to that issue.
You are right it is frustrating to not get "real" spammers removed after reporting them.
You have to take into account how many spamreports Google gets a day.
It is humanly impossible to react to every one.
What would you do if you were Google?
If you remove the spam report page option, webmasters and searchers will just complain in an unstructured way through whatever other email address or form they can find.
Having one central entry point for spam reports most probably lets them find common ground among complaints via some automatic system.
Either you ain't doing it right, or you ain't doing it right.
I have good rankings in the SERPs, over 4,000 page views daily, and I'm not completely dependent on SEO; in fact, AdWords has been a runaway success for us. We also have permanent links on major portals (Y!, MSN, etc.) and a complete in-house affiliate program with 800+ affiliates.
But it feels bad when someone just spams and tops the rankings, while we have to put in so much effort to get there. It's a shame; one feels like a fool when someone just makes 14-20 websites and links them together to get a PR6, while we have to exchange links with other affinity sites to get the same.
Just got no-right to waste people's time inducting them to fill up something which goes to Thrash
I have little cousin sister who found Porn site using a some innocent searches
JP
[edited by: jpjones at 12:25 pm (utc) on May 30, 2003]
We don't read hundreds of posts saying "I did not get caught" or "my competitor did get caught" because people love to complain. Such is life.
shaadi, it can seem like there's some kind of global conspiracy when you're looking at your own site, or one competitor. You should read vitaplease's "important issues" for the bigger picture.
If you can find me one highly successful Google optimiser who considers hidden text a crucial (or even useful) part of their work, I'll be amazed. It is a red herring.
Sites with hidden text or links that occupy only one result on the first page seem not to be penalized.
Spammers are banned when there is real damage to the results.