Forum Moderators: open
It's a little senseless to argue that IE, Mozilla, Opera, Safari etc. know better than the author of a page how a heading should look on that page and I'm sure that Google wouldn't defend this idea...
For some reason it seems like Google doesn't follow the @import directive. E.g.:
<link rel="stylesheet" href="style1.css" />
<style type="text/css">
@import url("style2.css");
</style>
Here Google will only crawl style1.css and not style2.css.
I hope they don't start to drop sites using hidden text; there are several accessibility issues that would be affected.
Precisely!
It's difficult to argue any case of penalisation when there's an equal, if not greater, justification for using anything that CSS allows... Even negatively positioned divs have a "legal" and innocent use to people.
Interesting, if there's any mileage in this anyway!
I think the JS will be the hardest, as you can effect redirects in countless constructs.
Looking for "display:none" or negative positioning would be trivial for Google I would think.
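To illustrate how trivial such a check would be: the declarations in question are easy to spot in any stylesheet. (The class names below are made up; only the property values matter.)

```css
/* Hypothetical class names; the declarations are what a scanner would flag. */
.stuffed-keywords { display: none; }               /* removed from rendering entirely */
.offscreen { position: absolute; left: -9999px; }  /* pushed off-canvas */
.invisible { visibility: hidden; }                 /* hidden but still occupies space */
```

Of course, every one of these also has a legitimate use, which is exactly the problem being discussed here.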
The reason they haven't done it till now is probably the bandwidth involved or server resources it would drain from normal crawling.
Imagine what it would take to download every css or js file...and not index any of it. Just for the sake of battling spam....
Purpose and reasoning aside, couldn't you simply block GoogleBot from your CSS with robots.txt?
The problem is not as easy as it seems, in my opinion.
I'd imagine they would treat the text in any display:none layer similarly: use it only in "I can't rank this page otherwise" situations.
Navigation (unless it's extremely content-rich navigation) rarely makes a difference to on-page relevancy anyhow.
[meyerweb.com...]
I'm not getting rid of it; the people I'm building the site for really like it. But it uses display:none. Is Googlebot smart enough to distinguish text that is visible only when you hover over it from truly hidden text?
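For reference, a hover-reveal like the one described usually boils down to something like this (selector names are made up):

```css
/* The nested span is hidden by default and only shown
   while the parent link is hovered. */
a.hint span { display: none; }
a.hint:hover span { display: inline; }
```

From the stylesheet alone, this is indistinguishable from spam-style hiding unless the bot also evaluates the :hover rule, which is exactly the question being asked.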
How could Google use this information?
- a subtle downgrade on pages with significant hidden text, whatever the reason.
- as one indicator of possible dodginess, to be correlated with other indicators or spam reports.
- as something to watch for should a manual review be necessary for some reason.
I really like the idea of decorative image heading swapping via CSS, but the spam implications make me nervous.
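For anyone unfamiliar with the technique, decorative image heading swapping is typically done along these lines (the selector and image URL are placeholders):

```css
/* The heading text stays in the markup for search engines and
   screen readers, but is pushed out of view behind the image. */
h1#heading {
  width: 200px;
  height: 50px;
  background: url(heading.png) no-repeat;
  text-indent: -9999px;
  overflow: hidden;
}
```

The off-screen text-indent is precisely the sort of pattern a naive hidden-text filter would flag, despite the innocent intent.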
I may be missing something really simple here, as to date I've only been using CSS for basic formating, but couldn't we simply block GoogleBot from the CSS with robots.txt?
If Googlebot wants the layout and formatting information, it would be just a little bit fishy to block it. A better solution is to write super-safe CSS, along with the appropriate alternative stylesheets to take care of accessibility issues. It is possible to take care of all the hidden navigation information with the :before and :after pseudo-elements.
Everyone's looking for things that Google might be penalizing for. I think Google might also be giving sites a slight boost if their css validates. Web standards are becoming increasingly important as the Internet becomes a true mass medium.
A site I'm in the process of updating uses pics for headings. I hate this but have no choice in removing it. What I wanted to do was put a title in a <h1> tag as well and then hide it using CSS, either off the page or by placing the title image over it with CSS positioning.
I don't believe this to be spamming, as I'm just letting Google read what the viewer can see... although agreed, it's kinda showing one thing to Google and another to viewers, but only kinda ;)
Do you consider this spamming? Would google disapprove? Can you suggest a better alternative?
From the sounds of it, putting this into a CSS file should stop Google noticing what I'm doing, but I don't want to do anything wrong as I'm not into cheating!
Your advice would be much appreciated
Thanks
Google does not usually attack problems with a sledgehammer, they're usually quite subtle with things like this.
I don't think they will penalise for using display:none or any other legitimate CSS techniques. Nor do I think they will give sites a boost if the CSS validates. Not that what I think matters, but I think we may need to wait for a bit to see how it pans out.
The first thing which occurred to me was that they may be taking the CSS file and trying to use it to clean up the way the cache looks for heavily CSS'd sites (because they can look awful when you look at the cache if they use absolute positioning).
Purpose and reasoning aside, couldn't you simply block GoogleBot from your CSS with robots.txt?
There is nothing to stop you adding instructions to the robots.txt file to block CSS files, however, Google may either simply disregard such instructions or assume (rightly) that the block is to hide dubious code.
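For what it's worth, such a block would only take a couple of lines in robots.txt. Whether Googlebot would actually honour it for stylesheets is the open question; the paths below are examples only:

```
# Hypothetical robots.txt entries; Googlebot's handling of
# blocked CSS files is not documented.
User-agent: Googlebot
Disallow: /css/
Disallow: /style1.css
```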
If the decision were mine, I would ignore a block on CSS files taking the view that they are an intrinsic part of the HTML pages being spidered. However, only CSS files called by pages being spidered are likely to be scanned.
Add the issue of the cache and things start to get really interesting.
Kaled.
This would just remove lots of legit sites from Google, but spammers would stay anyways!
I think it's a very bad idea!
It's difficult to argue any case of penalisation when there's an equal, if not greater, justification for using anything that CSS allows... Even negatively positioned divs have a "legal" and innocent use to people.
There are legitimate uses for many techniques that are used for spam. I use negatively positioned divs to centre-align fixed elements. This is the same as having keywords stuffed in alt text or any other technique that has the possibility for SE manipulation. There are viable techniques, and Google will handle the problem while protecting as much of the innocent as possible.
Google does not usually attack problems with a sledgehammer, they're usually quite subtle with things like this.