I have a 13-year-old domain with about a dozen sub-directory-based websites underneath it:
www.example.com/subdir1/index.htm
www.example.com/subdir2/index.htm
...
www.example.com/subdirN/index.htm
My domain shows up #1 in Google if I type my domain name or main page title into Google. 10 of the 12 sub-directory-based websites rank well overall, but 2 of them, which used to rank well, have tanked since June 2008.
For these 2 sites, if I type the exact name of the site into Google, I never rank better than #35 or so. The same goes for any long-tail phrases pertinent to those 2 sites.
One thing I observed is that during Google's shuffling or filter application every couple of days or so, there is a relatively short period of time (maybe 10-15 minutes) where I will rank at #4 or #5 in Google (when I type the exact title of these sites into Google). But once a "penalty" or "filter" is applied to this site during this shuffle period, I'm back down to #35 or so. I am tipped off to this occurrence by seeing a flood of weblog activity for these sites during these short durations.
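For anyone wondering how I spot these windows, I simply watch for spikes in Google-referred hits in my raw logs. Something like this rough sketch would do it (Python; the log path and combined log format are assumptions for illustration, not my actual setup):

import re
from collections import Counter

# Rough sketch: assumes a combined-format Apache access log at this path
LOG_PATH = "access.log"

# Capture day/month/year:hour:minute plus the referer field
line_re = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}):\d{2}\s[^\]]*\]\s"[^"]*"\s\d+\s\S+\s"([^"]*)"'
)

hits = Counter()
with open(LOG_PATH) as f:
    for line in f:
        m = line_re.search(line)
        if m and "google." in m.group(2):  # referred by a Google domain
            hits[m.group(1)] += 1          # bucket hits by minute

for minute, count in sorted(hits.items()):  # lexical sort; fine within a day
    print(minute, count)

A sudden per-minute spike, then a drop back to nothing, matches the pattern I described above.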
OK, I'll get to the point now. It looks like some sort of filter or perhaps penalty is being applied to these 2 sites, but my domain as a whole has been otherwise unaffected.
Given the above, is it worth my while to put in for a Reconsideration Request to Google? I'm unsure of why I'm being hit on these 2 sites, however. Or is it not worth using my silver bullet here, better saved for something more worthwhile? Those 2 sites were big revenue generators for me, and have since fallen off the proverbial revenue map.
Thoughts on this would be appreciated.
Regards,
Doug
You are describing a kind of rapid "yo-yo" that I'm hearing more and more reports about recently. Wish I could help you pin down the causes. So far, all I suspect is relatively weak (not very diverse) backlink profiles.
As always, thank you for your reply. I am working on some theories, but as you know, testing can be a slow and arduous process.
I guess, as a general rule, I was wondering whether a "-30" filter (my name for it) as described above should be used as a basis for a reconsideration request, or whether this type of request is more applicable to webmasters whose domains have been banned outright. I recognize that anyone can file one, but I never have, and was wondering whether anyone with these types of penalties/filters (and not total bans) has filed one, and had any success doing so. I want to use it as a last recourse; if Google is going to dismiss this as a frivolous request, or if it could potentially open up a can of worms for me, then I'd certainly want to avoid it outright.
Thanks again,
Doug
If the main domain and the subdomains have good independent link profiles, this may not be the issue. If they're heavily dependent on each other but have previously ranked well, it's possible that Google is reassessing this area.
It's also possible that Google may be further adjusting how it treats subdomains. In that regard, are these rankings for queries containing similar or related terms?
These are sub-directory-based sites, not sub-domain sites. Not sure, as you suggest, whether Google is looking at these more closely these days. I don't quite understand your question, but the 2 sub-directory sites that are affected cover different subject matter from those that rank well, and the long-tail terms that used to rank well for many years have been affected as I mentioned above. The terms that have been affected are subject-related, in that they pertain to the content material of the sites.
These are sub-directory-based sites, not sub-domain sites.
Thanks. Yes, my answer did switch to subdomains. On the same subdomain, Google limits the number of results for a given query to two. But, particularly on long-tail searches, domains with multiple subdomains previously did sometimes show more results. A while back, Google took steps to make multiple results more difficult to get. The thrust of my question was whether this was the situation with your site(s). Clearly it's not.
I'd still look at common linking sources to the multiple subdirectory sites, and at dependence on cross-linking, if any, between these sites, as a possible cause of the problem.
Having multiple sites set up as you've described is unusual. I'd think the arrangement would only work if each "site" had a truly independent linking profile.
Thank you for your reply. I had NEVER heard of the limit of 2 results across subdomains for a given search (which clearly could be extended to my sub-directory architecture). Based on this revelation, I now understand your original question. Do you know approximately when this went into effect? I don't think this is a problem for me, but I need to think about it some. By limiting the "number of results," do you mean the TOTAL # of results returned on a query (which could be millions), or just within the first couple of pages of results?
I am currently in the process of breaking the linking interdependencies between each of my sub-directory-based sites. I picked up on this in another thread. These sites have always had this dependency, and perhaps Google is now recognizing this inter-linking as links between unrelated sites content-wise (which they are), whereas before they had been considered part of the same domain and effectively ignored (from a penalty standpoint) by Google. I dunno, but the test I'm currently running should be interesting enough. Very astute of you to consider this, given my somewhat uncommon site architecture.
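For anyone curious, step one for me is just finding every cross-link between the sub-directory sites. A rough sketch of that audit (Python; the local folder layout and the example.com/subdirN pattern are made up for illustration, and it only catches root-relative and absolute links):

import os
import re

ROOT = "site"  # assumed: local copy of the site, one folder per sub-directory site

# Matches root-relative or absolute links into a /something/ path;
# as a sketch, it may also flag shared paths like /images/
link_re = re.compile(r'href="(?:https?://www\.example\.com)?/([^/"]+)/', re.I)

for dirpath, _, files in os.walk(ROOT):
    for name in files:
        if not name.endswith((".htm", ".html")):
            continue
        path = os.path.join(dirpath, name)
        this_site = os.path.relpath(path, ROOT).split(os.sep)[0]
        with open(path, encoding="utf-8", errors="ignore") as f:
            for target in link_re.findall(f.read()):
                if target != this_site:
                    print(f"{path}: links across to /{target}/")

Once I have the full list, I can remove the cross-links systematically rather than hunting page by page.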
Thanks for your input.
Regards,
Doug
By limiting the "number of results," do you mean the TOTAL # of results returned on a query (which could be millions), or just within the first couple of pages of results?
For many years now, for a given query, Google has displayed at most two results per host/subdomain (cumulatively, across all SERP pages). If the results occur on the same results page, Google clusters them, as I note in this post...
[webmasterworld.com...]
You can get Google to display more than two results (if they exist) by disabling the dupe filter (i.e., by adding &filter=0 to the Google search URL).
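For example (the query here is made up purely for illustration; the filter parameter is as described):

www.google.com/search?q=blue+widgets&filter=0

With filter=0 appended, the duplicate-results filter is off, and any additional results from the same host that would normally be suppressed or clustered can appear.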
As I suggested in this thread, subdomains were often a way around this. At a Q&A session at the December 2007 PubCon in Las Vegas, Matt Cutts discussed Google's new treatment of subdomains, which generally limits multiple results across subdomains. Soon after, he posted this on his blog....
Subdomains and subdirectories [mattcutts.com]
December 10, 2007
For several years Google has used something called "host crowding," which means that Google will show up to two results from each hostname/subdomain of a domain name. That approach works very well to show 1-2 results from a subdomain, but we did hear complaints that for some types of searches (e.g. esoteric or long-tail searches), Google could return a search page with lots of results all from one domain. In the last few weeks we changed our algorithms to make that less likely to happen....
When I drifted away from your original question and commented on subdomains, the above is what I had in mind. I can't imagine that you were ever seeing more than two results from your combined subdirectory sites for any one query.
...and perhaps Google is now recognizing this inter-linking as links between unrelated sites content-wise (which they are), whereas before they had been considered part of the same domain and effectively ignored (from a penalty standpoint) by Google.
I doubt that this is the case. More likely (and to oversimplify)... because of your site architecture, you were in effect linking to yourself, and Google now probably wants more outside confirmation, not unlike the situation as I see it in this current discussion...
Linking self-owned sites bad? (per Google)
[webmasterworld.com...]
There are of course many factors that might be affecting your situation.