|Banned from Google for a Hidden DIV?|
After just two days...
We launched a new site three days ago. We'd only had around 200–300 unique visitors until today, and I've been watching the logs for Googlebot. I spotted it once a couple of days ago, and Google detected and indexed the site.
Is it worth waiting for google to return or is our site forever banned from google?
I'm a novice in the SEO world, so any advice would be great.
Is this the way it went?
You put the site up a few days ago. Saw gbot in the logs, saw your site show up in Google, didn't see more bot hits, and then the site disappeared from Google, all within several days?
If it was just picked up a few days ago, it is possibly just fluxing in and out, and your bot-visit situation might be that it has found you, has you on a to-do list, and will soon come back. One would think that if the hidden div were a problem, the site wouldn't have been listed in the first place.
Be patient. The bot and listing could be back, to stay, before too long.
I've had the flux thing with most new sites. I wouldn't worry.
Thanks guys, that was just the case. The site is back in the google index (phew!)
I've got a friend who got hit by Google a few years ago for putting a link in a hidden div. With something like that, ask yourself if what's in that div would pass human review.
Is it legitimate to place a script for a toplist's counter in a hidden div?
And, by the way, how could Googlebot detect it if, for example, you have a hidden div but your CSS is external and placed in a directory that robots.txt forbids to all bots?
> And, by the way, how could Googlebot detect it if, for example, you have a hidden div but your CSS is external and placed in a directory that robots.txt forbids to all bots?
It probably can't - at the moment. If Google or other engines decide to check for spam by reading the CSS file, then one of two things could happen if you block the CSS with robots.txt. First, they could ignore your robots file for the CSS (on the assumption that since the HTML is meant to be indexed, they need to check its CSS for spam), or second, they could respect your robots file and possibly drop the pages using that CSS file, assuming that if you are blocking it, you don't want it (or anything that uses it) indexed.
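To make the scenario being discussed concrete, here's a minimal sketch. All the file and class names (/css/, main.css, .offscreen) are made up for illustration:

```html
<!-- robots.txt at the site root would block every bot from the stylesheet
     directory:
       User-agent: *
       Disallow: /css/
-->

<!-- In the page itself, the div looks like ordinary markup; only the
     blocked stylesheet reveals that it is hidden from visitors: -->
<link rel="stylesheet" href="/css/main.css">
<div class="offscreen">Text that human visitors never see</div>

<!-- /css/main.css (unreachable by a crawler honoring robots.txt):
       .offscreen { display: none; }
-->
```

A crawler that fetches only the HTML sees the div's text as normal page content, which is exactly the detection gap the question raises.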
Is it worth the risk?
> they need to check the CSS for spam
The question is: what would count as spam in the CSS file? If it's visibility: hidden or display: none, then would all the beautiful pull-out menus be spam too? And things like tooltips, etc.?
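The same property serves both cases, which is the point. A sketch of the contrast (selector names are illustrative, not from the thread):

```css
/* Legitimate: a submenu hidden until hover. The content is meant for
   users and becomes visible through normal interaction. */
.menu .submenu { display: none; }
.menu li:hover .submenu { display: block; }

/* Risky: a block of keyword text that no interaction ever reveals. */
.keywords { display: none; }
```

The CSS alone can't distinguish them; any engine-side check would have to consider whether something on the page ever makes the hidden content visible.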
I once had a one-pixel link that filtered me right out, and the site disappeared. I had another domain with backlinks that was ranking OK; I moved the old site over to it, and that one dropped out as well. I didn't know the link was there at the time. Once I found it and removed it, everything went back to normal. I put all the content back where it was, and everything is peachy everywhere now.