The site that still ranks very well in the new SERPs has some pages that should not have been indexed, because they are only linked to through a form. These pages don't have much content on them, but Google has indexed them with a title, description, cached version, etc., which I can see when I do an allinurl search on this site.
The other 8 sites also have similar pages that should not have been crawled, but those pages have not been fully indexed like the ones on the good site: they just show the URL and have no cached version when I do an allinurl search.
My two main questions are:
1. Is it likely that the pages on the 8 not-so-good sites will eventually be fully indexed, as they have been on the good site, even though the links to them are through a form, which I thought Googlebot could not follow?
2. Has anybody come across this type of thing adversely affecting their SERPs before?
I really hope somebody has some feedback on this, because this is the ONLY difference between my sites that are doing VERY well and those doing less than average.
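For anyone seeing the same thing, a minimal way to keep form-result pages like these out of Google's index is a robots meta tag on each page (a sketch, and the page shown is hypothetical, not one of the actual sites):

    <html>
    <head>
      <title>Form results</title>
      <!-- Tells compliant crawlers not to index this page or follow
           its links. Unlike a robots.txt Disallow, this lets Googlebot
           fetch the page, see the tag, and drop it from the index. -->
      <meta name="robots" content="noindex, nofollow">
    </head>
    <body>
      ...form results here...
    </body>
    </html>

Worth noting: URL-only, no-cache listings like those on the 8 sites usually mean Google knows the URL exists (from a link somewhere, or from a fetch blocked by robots.txt) but hasn't indexed the page content itself, so a robots.txt Disallow on its own can leave exactly that kind of listing in place.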
If they are all just different colors of the same widget, you should put them all on the same site, rather than making a unique site for each color and cross-linking them.
I know of one site that goes much further and has been #1 forever. They have a lead site, and then a whole series of sub-sites that could easily have been subpages but were broken out as separate sites with keyword1-keyword2.com types of URLs. To add to it, the content on the sub-sites is essentially duplicated, almost exactly, from pages on the main site that cover the same material. To me, *that* is spam, but apparently not to Google, and it's far worse than the example noted above.
If I run sites called widgets.com, moppers.com, wiffles.com, gorpees.com and 5 more, and each is comparable in style and design and organization, but the products are not related, why not link them? If consumers like my style of organization and service, and some of those who buy widgets also buy moppers, then it's a service to the widget consumer to let them know we also have a store for moppers.
I've read the cross-linking threads in here at length, and I also run some mini-networks. I have never had a problem, because we *only* do what would intuitively make sense for a consumer; we don't spam the cross-links (homepages only, and they use the other sites' names, not keywords).
Seems to me that there *could* be some other explanation?
even though the links to them are through a form, which I thought Googlebot could not follow.
Depending on the "form" you used to link to the other 8 sites, could Google have seen it as hidden links to those other sites and penalized them for it? Googlebot can follow many more kinds of links than most people realize, even if you are trying to disguise them.
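To illustrate that point, a plain GET form resolves to an ordinary URL the moment it is submitted (a sketch; the /results path and the color parameter are made up for the example):

    <!-- Submitting this form just requests /results?color=red,
         an ordinary URL like any other link target. -->
    <form action="/results" method="get">
      <select name="color">
        <option value="red">Red</option>
        <option value="blue">Blue</option>
      </select>
      <input type="submit" value="Show widgets">
    </form>

Once a URL like /results?color=red exists, anything that exposes it (a visitor bookmarking and linking to it, a published log file, a sitemap) can hand it to Googlebot, so "linked only through a form" is weaker protection than it sounds. A POST form doesn't produce a linkable URL in the same way, but the result pages can still leak if anything else links to them.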