Welcome to WebmasterWorld
Just a question on hidden text and Google: is there a way Google can be tricked by this?
I use CSS to do it, but I'm not sure I should.
I set the font size to 0.2em and the colour very similar (but not identical) to the background colour.
Is this OK, or should I expect a penalty?
There have been reports of Googlebot reading external CSS files, but as far as I am aware this is not standard practice yet.
If your hidden text is simply SEO, I don't have a major problem with it, but if you are trying to attract visitors to a site with no content or irrelevant content in the hope that they'll click your adverts, I would be happy to see your whole site wiped out. I suspect that view is widely shared around here.
Hidden text is wrong no matter what content is on the page.
Not always; sometimes it's legitimate. For example, some people write what they really meant, but did not want everyone to see, in HTML comments (so that you need to view the source), or use the same colour for the background and the font so that you need to select the text with your mouse to see the REAL text.
The purpose of that is the opposite of what spammers do: spammers don't want their hidden text to be seen, whereas these legitimate uses actually want people to find the hidden text; it's just not for everyone.
Weird? Perhaps. Not legitimate? No. What would most search engines do? Most likely drop the page, though I hope they do additional analysis to put things into perspective, i.e. total hidden words vs. visible words.
I certainly don't want to trick people, and I don't have banners on the site.
I read on an SEO site that hidden links were the way to go. Although I had some doubts, the guy seemed to know what he was talking about and made one good point: Google cannot see if a link is hidden if you don't use the exact colour of the background (i.e. #cccccc as the background and #dedede as the font colour).
Although I am pretty new to the website-building thing, I have already had some results with this technique.
The thing is, there are a lot of search terms out there that get very few results and do fit my products, but I can't put all of them in the site, because it would sound very silly.
How would you guys suggest I go about this?
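For what it's worth, the "Google cannot see it" claim is easy to sanity-check: two hex colours can be compared by machine in a few lines. A minimal Python sketch using the exact colours from the post (the Euclidean RGB distance is just one plausible measure, not necessarily what any search engine uses):

```python
def hex_to_rgb(h):
    """Convert a hex colour like 'cccccc' to an (r, g, b) tuple."""
    h = h.lstrip('#')
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def colour_distance(a, b):
    """Euclidean distance in RGB space between two hex colours."""
    ra, ga, ba = hex_to_rgb(a)
    rb, gb, bb = hex_to_rgb(b)
    return ((ra - rb) ** 2 + (ga - gb) ** 2 + (ba - bb) ** 2) ** 0.5

# The colours from the post: background #cccccc, font #dedede.
dist = colour_distance('cccccc', 'dedede')
# The maximum possible distance (black vs. white) is about 441.7.
print(round(dist, 1))  # 31.2 -- only about 7% of the full range
```

A distance that small is exactly the kind of thing a simple threshold test would flag, which is the point made further down the thread.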
In fact, why not do what more and more people seem to be doing today: just create a section on each page called 'keywords' and fill it with a comma-separated list of all terms even remotely relevant to what you want people to search on.
Or even just create a robot to grab text off other people's sites that rank well and use it as your own.
I wonder why google is now increasing the relevance of on-topic inbound links? I just can't work it out!
create a comma-separated list of all terms even remotely relevant to what you want people to search on.
Good text analysis should catch it, because people don't write in the form "keyword1, keyword2, keyword3" etc. I also think this is where outsourcing to India and other countries with inexpensive labour could be useful. Just think about it:
One person looking at one URL for one minute (being generous here; it should take a lot less time to determine whether a page is spam or not) will check ~500 URLs in an eight-hour working day, or about 120,000 a year.
1,000 people there will cost $10,000 each (at most!) per year, so the total cost is $10 million per year for checking 120,000,000 suspicious (flagged-by-software) URLs.
Of course, going this route is hard for a company with overpriced shares, because they would have to admit that sometimes a cheap hammer can do the job without having to hire ten PhDs.
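The back-of-the-envelope numbers above check out if you assume roughly 250 working days a year (an assumption; the post doesn't state it, but it's what makes the 120,000 figure come out):

```python
# One reviewer: 1 URL per minute, 8-hour day.
urls_per_day = 8 * 60                # 480, roughly the "~500" in the post
working_days = 250                   # assumed; yields the post's annual figure
urls_per_year = urls_per_day * working_days   # 120,000 per reviewer

reviewers = 1_000
cost_per_reviewer = 10_000           # USD per year, the post's "max" figure
total_cost = reviewers * cost_per_reviewer    # $10,000,000
total_urls = reviewers * urls_per_year        # 120,000,000 URLs reviewed

print(urls_per_year, total_cost, total_urls)
```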
Now asking whether or not you'll get caught doing something you already know is wrong is a fair question :)
But would you trust another human being to rank your site against others?
I would not trust them to "rank", but I would trust them to make a simple yes-or-no decision about whether a particular URL is spam or not; it does not take a genius to see whether a page is cloaking. In DMOZ the decisions are not that simple, as it's easy to introduce bias, but spam is spam: when I get spam emails I have no difficulty recognising which ones are spam, even when I can't see who sent them!
Who will watch the watchmen? Easy: every URL gets tested by different people at different locations, and if they fail to reach the same decision, a higher-level review takes a look. $10,000 a year is very good money in India; is it worth losing a nice, easy job that pays well?
$10 million is nothing to Google, and the extra eyeballs could do a lot of harm to spammers; in fact, enough could be done to make it very hard to spam at all.
Talking of DMOZ - I was analysing their dataset recently and found that at least 1% of all URLs point to "adult" websites :o
Hidden text is wrong no matter what content is on the page.
I've always taken this approach: do whatever you want to your site; it's yours. If Google doesn't like it and bans you, so be it. I don't think we should ever put people down for doing that kind of stuff. It is their property, and it is not their fault that Google can't figure it out. I mean, we have a multi-billion-dollar company more worried about playing games with a few hundred SEOs than about fixing clear problems like this in their SERPs.
If Google doesn't like it and bans you, so be it.
This may be easy for you to say, but when you rely on the traffic Google generates through one website for your business, "so be it" is not an option.
Google delivers about 80% of my traffic because my target market uses Google. I cannot afford to lose this.
There are a few of us in this position.
the guy seemed to know what he was talking about and made one good point: google cannot see if a link is hidden if you don't use the exact colour of the background
it may work, but it's a risky technique. Google won't like it IF they figure out you're doing it, and it could get you banned.
Now you know the risk - it is up to you whether or not to decide to take it. Everyone has their own "risk comfort level" - you have to decide what yours is and work within it.
What about this scenario: a site has a lot of content, all in images. Is it wrong to use CSS to include the *exact same text* in a display:none block? Should Google penalize that?
Ethically there is nothing wrong with this; however, I believe that Google would and should penalize it, because it could be very easily abused.
If the SERP really is competitive, then it's a safe bet that if you rocket to the top in Google, other webmasters competing for that SERP will rat you out to Google.
Ratting on people to Google seldom works. I did this on a couple of occasions months ago and the sites I reported are still there and still using the same black hat techniques.
It is only by exception that Google makes manual changes. They may adjust their algo in an attempt to root out offending sites, but it does not always work. Think about it: how many such reports per day do you think Google gets? 100? 1,000? 10,000?
[edited by: ciml at 5:29 pm (utc) on Sep. 2, 2004]
Black hat or white hat notwithstanding, I think most of us would agree that if for ANY reason, Google ever put human eyes on a site doing what is described, they would definitely say it was not "OK".
I don't ever want to be in the position of having had a site penalized and then trying to explain my practices to G. Been there, done that, didn't like it! (And yes, I did find forgiveness in G's heart.)
Should he expect a penalty? If caught, I would say expect a heavy hit if not a complete ban.
Just my 1.75 cents worth of opinion.
google cannot see if a link is hidden if you don't use the exact colour of the background
The guy who told you this knows nothing about computer science.
I've programmed a tool that detects, among other things, hidden text (and almost hidden text) in web pages. You can set background and foreground colors using CSS, images, HTML tags and attributes....
If the invisibility is achieved by using the same colour for foreground and background (or with other simple methods, such as the CSS visibility:hidden declaration), the tool will detect it.
Hidden text detection is not easy to program, but it is possible.
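To illustrate why it is possible: here is a deliberately simplified Python sketch of such a checker. Assumptions are loud here — it only looks at inline styles with hex colours, the 50-unit distance threshold is arbitrary, and a real tool would also have to resolve external stylesheets, background images, and inherited styles:

```python
import re

def hex_to_rgb(h):
    """Convert '#cccccc' or '#ccc' to an (r, g, b) tuple."""
    h = h.lstrip('#')
    if len(h) == 3:  # expand shorthand like #fff
        h = ''.join(c * 2 for c in h)
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def looks_hidden(style, background='#ffffff', threshold=50):
    """Flag inline styles that hide text: display:none, visibility:hidden,
    tiny font sizes, or a text colour too close to the page background."""
    s = style.lower()
    compact = s.replace(' ', '')
    if 'display:none' in compact or 'visibility:hidden' in compact:
        return True
    m = re.search(r'font-size\s*:\s*([\d.]+)(px|em|pt)', s)
    if m:
        size, unit = float(m.group(1)), m.group(2)
        if (unit == 'em' and size < 0.5) or (unit in ('px', 'pt') and size < 6):
            return True
    # (?<!-) avoids matching 'background-color'.
    m = re.search(r'(?<!-)color\s*:\s*(#[0-9a-f]{3,6})', s)
    if m:
        fg, bg = hex_to_rgb(m.group(1)), hex_to_rgb(background)
        dist = sum((a - b) ** 2 for a, b in zip(fg, bg)) ** 0.5
        if dist < threshold:
            return True
    return False

print(looks_hidden('color: #dedede', background='#cccccc'))  # True
print(looks_hidden('font-size: 0.2em'))                      # True
print(looks_hidden('color: #000000'))                        # False
```

Note that it flags both techniques proposed earlier in this thread (the 0.2em font and the #dedede-on-#cccccc link) on the first pass — which is the point: "almost hidden" is just as detectable as hidden.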