Did GoogleGuy say they are testing automatic checking for hidden text?
We have a white background, blue tables and white text. Do you think Google will be smart enough to see the text is not hidden?
The above seems to have gotten lost in the thread, but I am curious about this too. Hope it's OK to use the above quote in a new thread. If not, sorry to all.
I did a site search and came up with some old answers (2001), none of which were very definitive.
I tend to use white page backgrounds with dark table cells that have white text in them, mostly for nav bars, etc. Can this be construed as "hidden" text in today's se world? I hope not, but I'd like to get up to date ideas on this. The text is meant to be seen, even though it is white, hence the dark table cell background.
Just for background about me, I am new to WW (lurking for quite awhile now) but have been doing Web site development since 1998. I don't claim to be an seo'r, just came here to learn more about se's in general to keep myself out of trouble.
I don't believe that there is any problem with using white text on a dark background. Think about whether the site would pass a hand check on its content. If a human reader can read all the text and it is not just keyword stuffing then you should be okay.
As Brandi said, there does not seem to be a definite answer with regard to an "automated" SE check for this.
We have very similar table sets, used for purely artistic reasons.
I'd hate to think we all may be forced to make our sites vanilla plain in order to not risk such a penalty.
Yes, every bit of text is meant to be seen. Nothing hidden. I even try to make sure color-blind users don't have difficulty (hence the white on black, dark blue, etc.; no reds, yellows, etc.). Any human review would be no problem because there is no problem; all text is visible.
I was more wondering about SlyOldDog's question of automated checking for hidden text flagging this as possible spam. I know SEs have gotten smarter; I just want to be safe.
As for cases where the text and/or background colors are set via CSS, a background image, etc., it is my personal belief that Google will not penalize a site without a manual check. But I don't hold that belief strongly enough to test it out. There's almost always a safe way to implement any design.
I disagree about the manual penalties, I think that Google and other SEs are working hard on automating SPAM removal algorithms. I just think that they have the integrity not to implement them fully until they have been properly tested.
some of these will be difficult to find automatically.
take background colours and text colours for example. you can set background and text colours in the HTML using <font> tags or bgcolor="" attributes. you could set them using inline CSS or using an external CSS file. the combination of methods of setting values for text and background colours makes it difficult enough to check for hidden text.
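to make that concrete, here's a minimal sketch (in Python, purely hypothetical - no engine has published its checker) of the normalization step any such check would need first: reducing the many ways a colour can be declared to one RGB triple. the named-colour table and regex coverage here are assumptions; real HTML and CSS allow far more forms.

```python
import re

# Tiny stand-in for the full HTML/CSS named-colour table (assumption:
# a real checker would carry the complete list).
NAMED = {"white": (255, 255, 255), "black": (0, 0, 0), "navy": (0, 0, 128)}

def to_rgb(value):
    """Parse '#fff', 'ffffff', or a handful of named colours to (r, g, b)."""
    value = value.strip().lower()
    if value in NAMED:
        return NAMED[value]
    m = re.fullmatch(r"#?([0-9a-f]{3}|[0-9a-f]{6})", value)
    if not m:
        return None  # unknown format - a real checker handles many more
    h = m.group(1)
    if len(h) == 3:
        h = "".join(c * 2 for c in h)  # expand '#fff' to 'ffffff'
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))
```

only once `color="white"`, `bgcolor="#FFFFFF"` and `color: #fff` all reduce to the same triple can the checker even begin comparing text against background.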
what about using background images? that just makes it even more difficult. it's perfectly possible for a search engine to download and parse image data to determine colours and to build a "table" showing which colours go in which cells etc. if the image is all or mostly one colour or close colour range, and the text is the same, then great, they've spotted hidden text.
but what if the image is half white and half blue, and the blue text you use is placed above the white part of your image? so positioning of text also needs to be looked at. that means calculations on placement of images and text and of table cell sizes - remember to take into account that background images can extend beyond the size of the cell and that screen resolutions will change cell sizes. don't forget there are a few nice little tricks you can perform with table cells and that you can also set your layout using CSS instead of tables. it's getting more and more difficult to spot the hidden text!
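the core comparison itself is the easy part, and it shows exactly why white-on-dark-cell should survive: what matters is the *effective* background, with the nearest enclosing cell winning over the page body. a naive sketch (hypothetical, my own threshold value):

```python
def color_distance(a, b):
    """Euclidean distance between two RGB triples (0 = identical)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def looks_hidden(text_rgb, cell_rgb, page_rgb, threshold=32):
    """Flag text whose colour nearly matches its effective background.
    The threshold is an assumption; nobody knows what a real filter uses."""
    effective_bg = cell_rgb if cell_rgb is not None else page_rgb
    return color_distance(text_rgb, effective_bg) < threshold

WHITE, DARK_BLUE = (255, 255, 255), (0, 0, 139)

# white text in a dark-blue cell on a white page: clearly visible
# white text in no cell at all on a white page: suspicious
```

any filter that only compared text colour against the page bgcolor, ignoring the cell, would wrongly flag every reversed-out nav bar - which is presumably why nobody believes such a crude filter is running.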
now imagine if you hide your CSS file "below web level" so it cannot be spidered directly ...... that'll make it virtually impossible to detect hidden text automatically. but, read on ...
if you look at your log files, when a web browser (internet explorer etc) hits a page on your site, you will see one line entry for the page and one line for each image and one line for the CSS file. just that one page view creates a lot of log entries. when a spider hits the same page, you see one entry for the page, nothing more. that means the spiders don't get your CSS files and your images and therefore the search engine cannot automatically check for all hidden text.
but, if a search engine can get their spider to act like a standard browser, it could download all images and the CSS file with just one hit. all they need to do then is parse the pages and other files to calculate what goes where and what colour it is or should be. extremely complicated as i've pointed out, and will use a lot of processing power for each check, but not impossible.
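the "act like a browser" step boils down to this: given the page HTML, work out which extra URLs a browser would also request. a rough sketch (the regexes are a deliberate simplification - real pages need a proper parser):

```python
import re

def extra_requests(html):
    """Return stylesheet hrefs and image srcs a browser would also fetch.
    Simplified: assumes rel= appears before href= and quoted attributes."""
    css = re.findall(
        r'<link[^>]+rel=["\']stylesheet["\'][^>]*href=["\']([^"\']+)', html, re.I)
    imgs = re.findall(r'<img[^>]+src=["\']([^"\']+)', html, re.I)
    return css + imgs

page = '<link rel="stylesheet" href="/style.css"><img src="logo.gif">'
```

a spider that fetched that list alongside the page would finally see the colours a plain crawler never does - at the processing cost described above.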
i'd guess any search engine attempting to automate checking for hidden text etc will create a special spider acting like IE to download files like IE does. it might even have a special browser tag to make it look like IE so that webmasters don't ban it. i reckon that due to the processing requirements, they would start by only hitting a handful of sites ranked highly for common search terms.
even once they've refined the system and they're sure it works, i reckon they'll only hit pages maybe once every 3 or 4 months and it'll be totally separate from the normal spidering. hitting the pages more regularly would give the game away to webmasters that check their log files regularly.
so we know the difficulties the search engines have in automatically spotting hidden text. we also know they make manual checks for all forms of spam, including hidden text, and that they can and will remove spammers. what we don't know is how far advanced their automatic hidden-text spotting is. one thing's for sure - if you deliberately use hidden text to boost your rankings, eventually you'll get caught.
Okay, this would only make the spammers work harder to avoid the penalty, but if the automated tool were also being developed, it would eventually be able to identify many cases with good accuracy and eliminate them.
Here's the problem: when you have a cell that is black and white text reversed out in that cell, you've created a usability issue. Ever see what white text looks like when printed? It's not pretty; in fact, it's not, period.
I believe (not sure) the default for IE is to leave print background colors and images unchecked. This means that all backgrounds, including those within cells do not print.
So, big deal, you say? Very big deal if the person needs that information from your page. Some may go through their options and check the box to print background colors and get the information they were after. Others may not, because they don't know it's there.
The above reason is enough for me to not use white text if I absolutely don't have to. I usually end up giving it a color like #efefef which is a light gray and still provides enough contrast that it almost looks white.
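That advice holds up if you check it against the (later-standardized) WCAG relative-luminance formula: #efefef on a dark background still has enormous contrast, while surviving printers that drop background colors. A quick sketch:

```python
def luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB colour given as (r, g, b)."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(fg, bg):
    """WCAG contrast ratio, from 1:1 (identical) up to 21:1 (black on white)."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

NEAR_WHITE, BLACK = (0xEF, 0xEF, 0xEF), (0, 0, 0)
```

#efefef on black comes out well above 15:1, against the 21:1 maximum for pure white, so the reader loses essentially nothing.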
Who really knows if they have a filter for this type of design practice? The thing I would look at is usability when it comes to white text. If you have important information that the user needs to print, don't make it white if you don't have to. ;)
Remember that Google is full of a lot of very smart people, who have thought about this issue longer and harder than any of us. They realize how many problems there are with finding hidden text automatically. They will not just run a filter that will remove sites that use white text.
I do think that they will start running a program that will be downloading some high ranking pages with a special version of mozilla that will look for cloaking, redirects and hidden text. A few tweaks to the source code and you would have a mighty fine spam detector that would only have to pop up and ask a human about what action to take on borderline cases.
Example: BGCOLOR of page is WHITE. Table cell TD is DARK BLUE. Text inside of dark blue table cell is WHITE. Main text on page is black on white (default). White text on dark BG is usually ONLY used for navigation and other parts of the page, not the main body of the page (I understand there "could" be other issues still with the white text for nav, but that's not the question here :)).
I am not trying to "trick" the se's or anyone else in any way. ALL text on the page is meant to be seen. The original question had to do with automated color checking by se's and whether this would be flagged as spam.
Everyone has given wonderful opinions, and I thank you for all the ideas. I may just consider changing to a very light shade of gray instead of white in the future just to be safe.
Thank you all very much for your input.
<body link="white" alink="white" vlink="white" bgcolor="white">
They basically have text in different colors, and an image that says "click here to enter". Then they have about 70 links to their individual pages that are like land mines sitting on the bottom of the page. This has been like this for years. I've reported them to Google twice, yet they are still there. If the engines should be able to do ANY recognition, they could start there.
p.s. Sorry to Brette to bring up an old grudge again, I just had to show this.
A variation on this question... when using html font color and bgcolor attributes...
I've assumed that the engines weren't smart enough to detect the table cell background color... so I'd always advised clients, to avoid hidden text penalties, not to use white (or near white) font color attributes with a white page bgcolor attribute, no matter what color the cell background was.
With the advent of css, where clients have wanted light text against a dark cell background, I've advised that they use css to make the text white and simply not specify the font color in html.
But, if the engines are in fact now smart enough to detect the dark cell background, not using the font color="#ffffff" specification in a dark cell may make the engines think the text is being hidden.
You can maybe go around in circles on this forever....