| 1:05 pm on Mar 27, 2007 (gmt 0)|
That is the very reason why I have refused to use that form...
| 3:35 pm on Mar 27, 2007 (gmt 0)|
>>Do you think that it is reasonable for Google to demand an admission of liability for them to look into what may be an error on their part?
No. I think it's a rather arrogant way to approach this problem.
| 3:56 pm on Mar 27, 2007 (gmt 0)|
I understand what you're saying, but if absolutely nothing has been identified or changed is a reinclusion request going to do any good?
| 4:00 am on Mar 28, 2007 (gmt 0)|
Yea ... I hate that they make you admit to something that you either didn't do or didn't mean to do, etc. However, I've had to do it once, and I did put a very, VERY long explanation of how/why there were dupe pages and how it was fixed on our end.
| 9:52 am on Mar 28, 2007 (gmt 0)|
The problem I have is that I don't know what has caused the sites to drop out. If I could identify a duplicate content or link manipulation issue (or whatever) I would be happy to make changes and file a reinclusion request, but I have no idea what the cause was and, as Bewenched says, the request requires an appropriate explanation and details of changes made.
theBear (many thanks) offered the following possibilities (my responses in capitals):
Are there any inbound links to those four subdomains from pages that aren't on that domain?
YES, NOT THOUSANDS, BUT ENOUGH.
Is a common robots.txt file in use and is it by any chance blocking access by Google?
TESTED AND FINE.
Is Google still visiting the subdomains?
VERY SELDOM NOW, BUT YES & TO PARTICULAR PAGES.
Are the subdomains sharing a single IP address and are they interlinked?
THEY DO SHARE AN IP AND ARE NOW INTERLINKED VIA A ROOT INDEX PAGE, BUT WERE NOT INTERLINKED AT THE TIME OF EXCLUSION.
Are the subdomains making use of ODP data?
Do the subdomains make extensive use of rel="nofollow", which could look like link manipulation?
ALMOST NO USAGE, CERTAINLY NOT INAPPROPRIATE
What was on the domain prior to it being used?
NOTHING, THIS WAS A NEW DOMAIN
I'd appreciate any further ideas.
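For the robots.txt question above, the check can be done programmatically rather than by eye; here's a minimal sketch using Python's standard library (the robots.txt content and subdomain URLs are illustrative placeholders, not the actual sites):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; for a live check you would instead do
# rp.set_url("http://sub1.example.com/robots.txt"); rp.read()
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

def googlebot_allowed(robots_txt: str, url: str) -> bool:
    """Return True if Googlebot may crawl the given URL under this robots.txt."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", url)

# Placeholder URLs standing in for pages on the four subdomains.
for u in ("http://sub1.example.com/", "http://sub1.example.com/private/page"):
    print(u, "->", "allowed" if googlebot_allowed(ROBOTS_TXT, u) else "BLOCKED")
```

Running this against each subdomain's actual robots.txt would confirm the "TESTED AND FINE" answer above.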
| 10:02 am on Mar 28, 2007 (gmt 0)|
Are you using AS?
| 10:21 am on Mar 28, 2007 (gmt 0)|
Action Script? No, not at all.
| 11:45 am on Mar 28, 2007 (gmt 0)|
A ban of all four subs sounds a bit Draconian and I'd have expected to see some pretty wild and obvious linkage spam to warrant that.
I'd get a cynical third party to check it for "cleanliness"; if it passes, I'd look for a possible indexing problem. How are they doing in Y!, BTW?
| 11:46 am on Mar 28, 2007 (gmt 0)|
|Do you think that it is reasonable for Google to demand an admission of liability for them to look into what may be an error on their part? |
I think it's unlikely there was an error on Google's part ... but at the same time, I think it is rather draconian to require anyone to sign their name to that statement prior to determining it is true.
If you submit a reinclusion request and the powers that be at Google look at the site and then determine that you are indeed guilty of violating Google's quality guidelines ... then fair enough, sign the document and make your amends.
This requirement is more than a bit high-handed though. Lots of people make honest mistakes without knowing they have done anything wrong. In law, ignorance of the law is no excuse ... but at least a person is allowed their day in court before being found guilty. The last time I checked though, Google doesn't write laws, just "guidelines".
If this is really important to you, I would ask a lawyer what this admission could mean to your business down the road. Who knows how this could affect your business in ten years' time should Google change their guidelines (without your knowledge) and you once again find yourself on the outside looking in.
The only reason I bring this up is because I once found myself on the outside for reasons I truly didn't understand. Google tweaked their algo and suddenly my site was penalized severely. It was nowhere! The day before, my site was in the top 5 for most of my important keywords, then suddenly, it was nowhere to be found. Long story short, it seems my keyword density was too high for the new algo. I eventually figured it out myself, fixed the site and it bounced right back, but I nearly went bankrupt in the meantime.
Personally, I would never sign that document because I know (in my case) that it just isn't true regardless of how Google changes their guidelines or their algo. The problem is that what is acceptable today may not necessarily be acceptable tomorrow.
I suggest you get an "expert" to have a look at the sites before you make any mistakes you may regret down the road. None of us knows how Google may use that prior admission against you in the future should they arbitrarily change their algo or guidelines. The two are not mutually exclusive.
[edited by: Liane at 11:48 am (utc) on Mar. 28, 2007]
| 12:17 pm on Mar 28, 2007 (gmt 0)|
You have to find the reason why Google thinks your site is spammy. Did you spot check the source code to see if you were hacked?
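A quick way to do that spot check is to scan the served HTML for patterns commonly left behind by hacks (hidden links, injected iframes, obfuscated scripts). A minimal Python sketch; the patterns and file layout are illustrative assumptions, not an exhaustive test:

```python
import re
from pathlib import Path

# Patterns that often indicate injected spam or hidden content.
# Illustrative only; a real audit should also diff against a known-good backup.
SUSPECT = re.compile(
    r'display\s*:\s*none|visibility\s*:\s*hidden|<iframe|eval\s*\(|document\.write',
    re.IGNORECASE,
)

def scan(root: str) -> list:
    """Return (path, line_number) pairs where a suspicious pattern appears."""
    hits = []
    for path in Path(root).rglob("*.htm*"):
        for n, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if SUSPECT.search(line):
                hits.append((str(path), n))
    return hits
```

Any hit is only a lead to inspect by hand, since `display:none` also has plenty of legitimate uses.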
| 1:36 pm on Mar 28, 2007 (gmt 0)|
Will the submit fail if you just opt out of checking that box? It might go through with it unchecked.
| 3:23 pm on Mar 28, 2007 (gmt 0)|
@trinorthlighting: yes, looks fine.
@Glengara: Not too bad in Y!. Where would be best to go for such critical review?
| 3:46 pm on Mar 28, 2007 (gmt 0)|
His site is good and full of information.
I would say that the way the URLs and subdomains were used caught him in a subdomain spam filter, which is more than likely automated. After filing a reinclusion request, a Google employee should lift the penalty.
| 4:46 pm on Mar 28, 2007 (gmt 0)|
*Where would be best to go for such critical review? *
I'd probably go to either SEW or [groups.google.com...]
Since you're doing OK on Y! it doesn't sound like an indexing glitch, if it's the way the subs are setup that's causing the problem you may want to reevaluate that strategy...
| 5:13 pm on Mar 28, 2007 (gmt 0)|
OK, I'll play the devil's advocate here:
|[... We] recently launched four small sites around a topic. The first site launched in June, two in August, and one in October. They are created as subdomains of a new domain [...] |
If I were Google, the question I would ask is this:
Do all four sites cover the same topic? Why are four sites necessary?
If all four are owned/operated by the same entity, and all four cover the same basic topic, then that --in Google's eyes-- is spamming the index with four sites when one would be sufficient to provide their users with information on that subject. Four sites take four times the space on their search servers, four times as much work by Googlebot and the back-end semantic-processing, ranking, and filtering programs, and potentially four times as much space on the first page of search results. For this reason, Google wants a compelling reason to list pages from four sites.
They would see this situation as an attempt to gain a 'bigger footprint' in search.
Taking off the devil costume and assuming that yours is at worst a marginal and not blatant case of identical/similar site topics, I'm surprised that they didn't just filter all but one site out. I will follow this thread with interest.
| 9:27 pm on Mar 28, 2007 (gmt 0)|
I think you're approaching dealing with Google the wrong way (no offence).
We'd all like to be able to run off to Google whenever a site stops ranking and say, "I know my lovely site is OK - please do some voodoo to make it rank properly", but there are so many of us that they don't have the resources to hold all our hands.
They have picked a particular exception to this: where a site has previously been in violation of their rules, you can do a check-a-box thing to get them to re-review it.
I suspect this is an automated method that looks at what they have recorded as problems for the site and sees if those items are still there.
I repeat - this process is specifically designed for when there have been problems on your site, and you have fixed them.
Your case? Because you have not done anything that is against the rules, a reinclusion request is not going to be useful to you. This, I think, is why they have that nasty statement in the reinclusion process: to tell people what it is there for.
What should you do?
Google groups, as mentioned above, would seem to be an obvious one.
I would also agree with someone else above: get someone independent to have a look through your sites - it's possible that one of your team members is hiding nasty stuff on the company site - it does happen :(
If none of that helps, I'm thinking the 'subdomain spam' diagnosis sounds possible. You might want to review whether separate subdomains are really justified for these sites.
You could also ask the mods permission to put your domain names in this thread for members to have a peek at, but I don't think they normally allow that.
You know what would be nice for webmaster tools? A listing of the rules a site is in breach of. But I suppose the spammers would abuse that to run close to the wind :(
| 2:37 am on Mar 29, 2007 (gmt 0)|
Admitting intention to breach rules is not required, just that you breached them.
| 4:24 am on Mar 29, 2007 (gmt 0)|
As I understand it, checking that box means you acknowledge which part of Google's guidelines you breached, and that you actually fixed it.
Failing to do so makes a reinclusion request useless, as Google's software engineers will never tell you what problem(s) your site might have.
| 8:00 am on Mar 29, 2007 (gmt 0)|
@jdMorgan: "Do all four sites cover the same topic? Why are four sites necessary?"
No, they cover four very distinct areas of a loose topic. Without posting the URLs it is hard to explain, but I think that the specific topic that each covers is valid and they hang off the same domain because they vaguely 'fit' together. Certainly, whilst the style of each is similar, the content and subjects are not shared.
@leadegroot: Who should I ask permission from?
| 9:50 am on Mar 29, 2007 (gmt 0)|
Sign the "confession."
| 12:27 pm on Mar 29, 2007 (gmt 0)|
I think the issue here is trust.
It's a new domain and right from the beginning you're creating four sub-domains.
Unfortunately for you, Google probably doesn't manually review most sites prior to banning them, so it has to recognise patterns that spammers use. It identified a pattern, had no trust or faith that you weren't a spammer, and banned you.
| 5:17 pm on Mar 29, 2007 (gmt 0)|
The URLs that were used are very spammy-looking .info URLs, similar to what MFA sites use.
The content is good; it's just the URLs.
| 8:01 pm on Mar 29, 2007 (gmt 0)|
|Unfortunately for you, Google probably doesn't manually review most sites prior to banning them, so it has to recognise patterns that spammers use. It identified a pattern, had no trust or faith that you weren't a spammer, and banned you. |
I think they figure that if we ban you and you say nothing, well, you're guilty since you are not fighting the ban. That's why they don't manually review a site before banning. (It also saves manpower.)
| 9:56 pm on Mar 29, 2007 (gmt 0)|
Quit whining and go build some links.
| 10:07 pm on Mar 29, 2007 (gmt 0)|
I'd just register four domains, delist any remnants of the old sites, and put them up again.
Even if you got the subdomain ban lifted this time, if there's nothing on the main domain, a second and a third wave of penalties is almost guaranteed. I wouldn't bet on the Google employee reviewing the site being able to permanently flag it as "invulnerable to future subdomain-related penalties because it's actually a great site, really".
I've had some fun with Google not being able to remember its own decisions on bans and un-bans. I had to go through the same procedure on an annual basis, mailing them back their own decisions on the same issue every twelve months. Not sure I'd advise this route.
I'd say get the four sites to look like four sites on the domain level, and forget all your troubles. Including the reinclusion request.
[edited to say] ...what Bennie said.
[edited by: Miamacs at 10:09 pm (utc) on Mar. 29, 2007]
| 10:27 pm on Mar 29, 2007 (gmt 0)|
|I'm surprised that they didn't just filter all but one site out. |
I am too.
Is there any interlinking between the sites?
| 10:33 pm on Mar 29, 2007 (gmt 0)|
"I believe this site has violated Google's quality guidelines in the past."
I thought they removed that garbage.
I thought Google Guy or Matt Cutts even publicly announced the change to stop favoring spammers.