Forum Moderators: open
Needless to say, our site was recently banned. Very frustrating since our site has 7000 pages of PURE content and we never spammed or did anything to purposely trick the search engines. We worked hard over 5 years to get the site where it is for the VISITOR to love it, and they do love it, but now it is much harder for people to find... even though we have the content that many people are looking for and can't find elsewhere. (Ugh. Can't sleep...)
Thanks,
Ryan
<...>
[edited by: ciml at 10:58 am (utc) on May 3, 2003]
[edit reason] No .sigs please. [/edit]
It may be worth reading the Webmaster Guidelines [google.com] again. Also, you may find some other reason [google.com] for being removed.
Ryan
Thanks. I understand if you are too busy. You are a very important person.
Ryan
I have seen a lot of sites that use this background image technique to separate parts of their pages. I just can't imagine Google would ban all these sites, but I can't imagine how Google would be able to "tell" that they weren't using hidden text or links. I just wish I knew exactly why we were banned and when we will be back. AAAH!
Thanks!
Ryan
What part of that is not hidden text? You've taken it off your home page now, but do you think we can't check cached pages? So yes, I think you did have hidden text on your main page, and yes, I think you got caught. I would go over everything with a fine-toothed comb to make sure you've removed every bit of hidden text from your site. Within a month or so, if all of the hidden text is gone, you should show back up in the index again.
It's not hard to write those keywords into a small paragraph and make them viewable by everyone.. an introduction for visitors, helpful to spiders, and not illegal.
Also, if you have to change every page of your site manually, now is the time to make it template driven so you never have to do it this way again.
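To illustrate the advice above, here is a hypothetical visible paragraph (the site name and topics are invented for the example) that works the same keywords into normal, readable copy instead of hiding them:

```html
<!-- Hypothetical sketch: the keywords appear in an ordinary
     introduction that is visible to every visitor and spider alike. -->
<p>
  Welcome to ExampleSite, with articles, training calculators, and
  contest coverage from over 100 international writers.
</p>
```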
I am unbelievably frustrated and I can only pray that we get back in soon. The bad thing is that we didn't even need ANY type of text on that page since our site is all about content. We get a ton of links to us as well from fans of the site. We have over 100 international writers too. Many of our visitors are e-mailing me saying they were frustrated that they couldn't find it in Google.
Oh well, I guess I need to wait and just keep updating our articles, calculators, databases and contest coverage. I just wish Google could scan the site again now to see that it is ALL good. I am also frustrated that one page out of 7000 could cause the entire site to be banned, rather than just that one page.
Thanks for your help Googleguy. I am embarrassed that this could happen. I have always HATED spam with all my heart and still do. We don't even send e-mails to our past customers, even though people think we could make money doing that. I refuse!
Sincerely,
Ryan
[edited by: bodybuilding at 9:19 pm (utc) on May 6, 2003]
Sort of a hidden text question - but you might understand why I have done it.
If I have an iframe that returns the following text for browsers that do not support iframes (e.g. Googlebot), would it be considered hidden text:-
Information on Keyword should appear here - if you do not receive information on Keyword this is because your browser does not support this function. You can upgrade your browser at.......
I have wondered about this a long time :)
I guess if I don't use Keyword, related Keyword, Keywords, etc. I should be OK?
Don't have to answer this one - just wondering!
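For what it's worth, a minimal sketch of the kind of markup being described (the filename and keyword are placeholders, not from the original post): browsers without iframe support render the fallback text between the tags, while iframe-capable browsers show the framed page instead.

```html
<!-- Hypothetical sketch of an iframe with fallback text.
     Browsers (and crawlers) that do not support iframes render
     the content between the tags instead of the framed page. -->
<iframe src="keyword-info.html" width="400" height="300">
  Information on Keyword should appear here. If you do not receive
  information on Keyword, your browser does not support this
  function. You can upgrade your browser.
</iframe>
```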
Bodybuilding - sounds like you have a good site and it will do well in time if it is kept clean:)
Most people don't even know that it's "illegal" and when they get caught....they're banned.
What about the MANY sites that stuff huge loads of visible keywords on the bottom of their pages?
I don't see where hidden text is that big of a deal. Any competent webmaster can simply write that same (obviously important) text in the context of their webpages.
Yes, I fully understand why Google feels the need to do this, I just think the penalty is a bit harsh.
Maybe something like a reduction of 2 or 3 PR points would be more fair.
There is an "expert", who is an advisor to Google, who loaded up the bottom of a bio page with microtext misspellings to grab all the search engine spots for his name. Google didn't seem to have a problem with it for a very long time.
Just looking at the site now I see that the microtext is removed but only recently. I wonder if it was asked to be removed in order to prevent a backlash from other sites that were penalized for "hiding" text as well.
At PubCon, when Matt Cutts was talking about the new hidden text filters, he specifically stated that he was aware of a certain site owned by a member of the Google Technical Advisory Council that was using hidden text. He basically said that if the hidden text was still present when the filter went live, the site would be removed.
Ryan, I'm pretty confident that this thread is a learning experience for quite a lot of readers. At least you know how to fix the problem and get back in your wife's good books.
This fella, the tech adviser, had all his microtext set by CSS if I'm not mistaken. The CSS style in question still remains in the CSS file.
Must be some filter in action and sounds like quite an analysis breakthrough to me.
The style set the text at 10% of the default size. So theoretically, if he had his body text defaulted to 100 pixels, the microtext would be 10 pixels and therefore not hidden.
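A hypothetical stylesheet sketch of the relative sizing being described (the class name and pixel values are assumptions, not taken from the site in question): in CSS, a percentage font-size is computed against the inherited size, so 10% of a 100-pixel body comes out to 10 pixels.

```html
<style>
  /* Hypothetical sketch: a percentage font-size is relative to
     the inherited size, so this "microtext" computes to 10px. */
  body       { font-size: 100px; }
  .microtext { font-size: 10%; }
</style>
<p class="microtext">keyword keyword keyword</p>
```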
I've checked my logs on a whole spectrum of sites and have not seen anything closely resembling a browser or crawler from Google grab a CSS file.
I checked the logs of a few sites last weekend, but I too have not seen Googlebot or any other crawler walking into external CSS files, yet some of the sites with hidden text have disappeared from -sj. Whether that is due to spam reports sent by competing sites or to some algo change is not clear yet.
If it is an algo change, it will hurt some of the older sites that are used to the gains from hidden links and other established SEO techniques of 1998-99. I think this is the only positive change I have observed in the update this time.
Although the question is not directed at me but still...
IMHO, if Googlebot is going to check external CSS files for any significant number of sites, then it will be honest enough to identify itself correctly, or else it will find robots.txt blocking it from those who keep an eye on bots (including their IPs) for various reasons that include but are not limited to cloaking.
If it is only for checking sites reported in spam reports then it may be another matter.
If it is a hacked up mozilla, gecko or konqueror, it should identify itself as whatever rendering engine that it is based on.
<added>And there is no reason to believe that the automated checking code is a "bot" rather than a customized browser. If it isn't a bot, it does not have to obey robots.txt</added>
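As a sketch of the robots.txt idea mentioned above (the user-agent string is invented for illustration), a site that spotted an unfamiliar CSS-fetching crawler could block it like this - though, as noted, a checker that is not a bot need not obey it:

```
# Hypothetical robots.txt entry blocking a suspected CSS-checking crawler
User-agent: SomeCssChecker
Disallow: /
```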
When I say that, I mean that unless Google is going around using some browser in North Dakota or Australia, typing in a search phrase and visiting my site, then it is NOT Google. Some of these sites hardly get any traffic other than from a search engine, and judging from the surfing pattern, the visits that didn't come from a search engine aren't Google either.
It would take A LOT of resources to achieve something that looked like a normal browser acting like a normal user, just to surf around and grab the CSS file without anyone getting suspicious.
By the way, thanks for taking so much time, you're really an invaluable resource for myself, and undoubtedly many others.
I have a question. A client of mine wants a site done with a Flash menu system. In order for a spider to see everything properly, I z-positioned a text menu behind the Flash. Yes, this is hidden, but it's not underhanded, since the text menu is identical to the Flash menu, just without the Flash.
Would this get his site banned? Thanks,
Greg
PS: There is absolutely no way to change my client's mind about Flash. Already tried that, many a time.
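A rough sketch of the layering Greg describes, assuming absolute positioning and z-index stacking (the filenames, links, and dimensions are invented for the example): the plain text menu sits on the lower layer, duplicating the links in the Flash movie stacked above it.

```html
<!-- Hypothetical sketch: a text menu layered behind a Flash movie.
     The element with the higher z-index (the Flash) is drawn on top. -->
<div style="position: relative;">
  <ul style="position: absolute; top: 0; left: 0; z-index: 1;">
    <li><a href="/products.html">Products</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
  <div style="position: absolute; top: 0; left: 0; z-index: 2;">
    <object data="menu.swf" type="application/x-shockwave-flash"
            width="200" height="300"></object>
  </div>
</div>
```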