|Non Cached pages and Google Ranking|
is there a relationship...
I have 9 sites, all linking together, all with unique content, each one specific to a certain colour of widgets. They are all optimized the same and use no illegal content or spam techniques.
One site still does VERY well in the new SERPs the other 8 sites dropped quite badly.
The only difference I can see is as follows:
The only difference I can see is this: the site which still ranks very well in the new SERPs has had some pages indexed which should not have been, because they are only linked to through a form. These pages have little content on them, but Google has indexed them with a title, description, cached version etc., which I can see when I do an allinurl command on this site.
The other 8 sites also show similar pages which should not have been crawled, but they have not been indexed like on the good site; they just show the URL and have no cached version when I do an allinurl command.
My two main questions are:
1. Is it likely that the pages on the 8 underperforming sites will be fully indexed, as they have been on the good site, even though the links to them are through a form, which I thought Googlebot could not follow?
2. Has anybody come across this type of thing adversely affecting their SERPs before?
I really hope somebody has some feedback on this, because this is the ONLY difference between my sites doing VERY well and less than average.
2- yeppers, it's called cross linking... search for it.
Why shouldn't I link these sites together? They are all connected in some way.
|... they are all connected in some way. |
If the overlap is almost zero and it is perfectly logical to link all these sites to each other, why don't you put all the content on one site? Heavy linking is no problem within a site.
I'm not heavily cross linking these sites; they only link to each other from the homepage, and that is all.
There is NO WAY this problem has anything to do with cross linking. If Google decided to penalize me for linking my 9 sites together from their respective homepages, then I must be the unluckiest person around.
Does anybody else out there have any comments on this problem?
I agree it is probably cross linking. You can usually get away with it with 2 or 3 sites, but interlinking 9 sites from their homepages is probably enough to trip a cross-linking filter with Google.
If they are all just different colors of the same widget, you should put them all on the same site, rather than making a unique site for each color and cross linking them.
Is everybody trying to wind me up or am I having a "bad trip" here!
I have 9 sites which are specific to UK locations, all with unique content, and you're saying I can't link their homepages together... surely that can't be the case.
If anybody would like to see the sites just stickymail me.
Brett, I'm surprised a little too. If the sites in question are related (in theme, or in approach) but not in content ... and there are only nine ... I don't see it ... especially if he only links off of the homepages. Those sorts of mini networks *dominate* the SERPs in any number of categories.
I know of one site that goes much further and has been #1 forever. They have a lead site, and then a whole series of sub-sites that could easily have been subpages, but were broken out as separate sites with keyword1-keyword2.com types of URLs. To add to it, the content on the subsites is essentially duplicated, almost exactly, from other pages on the main site that cover the same content. To me, *that* is spam, but apparently not to Google, and it's far worse than the example noted above.
If I run sites called widgets.com, moppers.com, wiffles.com, gorpees.com and 5 more, and each is comparable in style and design and organization, but the products are not related, why not link them? If consumers like my style of organization and service, and some of those who buy widgets also buy moppers, then it's a service to the widget consumer to let them know we also have a store for moppers.
I've read the cross linking strings in here at length, and also run some mini networks. Have never had a problem, because we *only* do what would intuitively make sense for a consumer, we don't spam the cross links (homepages only, and they use the other site names, not keywords).
Seems to me that there *could* be some other explanation?
|even though the links to them are through a form which i thought Googlebot could not follow. |
Depending on the "form" you used to link to the other 8 sites, could Google have seen it as hidden links to those other sites and penalized them for it? Googlebot can follow many more kinds of links than most people realize, even if you are trying to disguise them.
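To illustrate that last point: a form's target URL sits in plain view in the page source, so any crawler that parses `form action` attributes can discover it just as easily as an `a href` link. Here is a minimal sketch in Python (the markup below is made up for illustration, not taken from the poster's sites):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects every URL a crawler could discover in a page's HTML,
    including <form action="..."> targets, not just <a href="..."> links."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.urls.append(attrs["href"])
        elif tag == "form" and "action" in attrs:
            # A GET form's action URL is just another discoverable link.
            self.urls.append(attrs["action"])

# Hypothetical page fragment with one normal link and one form "link"
page = """
<a href="/widgets/blue.html">Blue widgets</a>
<form action="/search-results.html" method="get">
  <input type="submit" value="Find widgets">
</form>
"""

finder = LinkFinder()
finder.feed(page)
print(finder.urls)  # → ['/widgets/blue.html', '/search-results.html']
```

So even if a human can only reach those pages by submitting the form, the URLs themselves are not hidden from a bot that reads the raw HTML.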