Well, clearly the four wwww's is a mistake. However, according to Matt Cutts here at PubCon, in just a few weeks all subdomains will cluster as one in a search result. So the most any search will show then is two results for any domain name. Then the four wwww's will drop out for you.
In the meantime, it sure looks like somehow a wire got crossed, doesn't it? I'd double check that the wwww to www redirect is really delivering a 301 status.
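One quick way to verify that: request the URL without following the redirect and look at the status code yourself. Here's a minimal Python sketch; the hostnames are placeholders, and the little local server only stands in for a misconfigured site so the example is self-contained - point check_redirect at your own wwww host instead.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def check_redirect(host, path="/", port=80):
    """Request path from host WITHOUT following redirects; return (status, Location)."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

# A tiny local stand-in server that issues a 302 -- exactly the
# misconfiguration to look for in a canonical-host redirect.
class TempRedirectHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(302)  # a wwww-to-www redirect should be 301
        self.send_header("Location", "http://www.example.com/")
        self.end_headers()
    def log_message(self, fmt, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), TempRedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_redirect("127.0.0.1", port=server.server_address[1])
print(status, location)
print("OK: permanent (301)" if status == 301 else "Problem: not a 301")
server.shutdown()
```

If the first number printed is 302 (as it is against this deliberately broken demo server), the redirect is temporary and the engines are entitled to keep the old host indexed.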
|I'd double check that the wwww to www redirect is really delivering a 301 status. |
Based on some new info, I might have an issue here; I will let you know.
|all subdomains will cluster as one in a search result. So the most any search will show then is two results for any domain name |
Are you saying that all subdomain links will be included in the 'Sitelinks'?
Sitelinks is a separate thing, a set of expanded internal links just for the #1 spot on certain specific searches. But that doesn't enter into this picture. Here's what I am saying.
Here's what happens now. The first step of results retrieval for any single search still has no limit on how many urls can be returned from a domain. In the early days of Google, a domain could even have all 10 first page spots and still keep on going. It could even be embarrassing!
Today, the preliminary, raw retrieval of roughly 1,000 results still puts no limit on how many urls can be returned from a given domain. But there's a further processing step - a filter kicks in. That filter is supposed to ensure that only 2 urls maximum from any domain will actually be shown.
If those two urls happen to be on the same page, then they will cluster together on that page rather than show at their "true" algorithmically determined position. But through all the total pages of any search result, any single domain is supposed to show up a maximum of 2 times.
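The mechanics of that filter, as I understand them, can be sketched in a few lines. This is only my own rough reading of the described behavior, not Google's actual code - and it simplifies by always clustering a domain's second result under its first, whereas the description above says clustering only happens when both land on the same results page:

```python
from collections import defaultdict

MAX_PER_DOMAIN = 2  # the limit described above

def crowd_filter(ranked_results):
    """ranked_results: (domain, url) pairs in raw ranked order.
    Keep at most MAX_PER_DOMAIN results per domain, and pull a domain's
    second result up to sit directly under its first."""
    shown = []
    count = defaultdict(int)
    first_pos = {}
    for domain, url in ranked_results:
        if count[domain] >= MAX_PER_DOMAIN:
            continue  # third-and-later urls are filtered out of the display
        count[domain] += 1
        if count[domain] == 1:
            first_pos[domain] = len(shown)
            shown.append((domain, url))
        else:
            pos = first_pos[domain] + 1
            shown.insert(pos, (domain, url))   # cluster under the first
            for d in first_pos:                # later firsts shift down one
                if first_pos[d] >= pos:
                    first_pos[d] += 1
    return shown

raw = [("a.com", "a1"), ("b.com", "b1"), ("a.com", "a2"),
       ("c.com", "c1"), ("a.com", "a3"), ("b.com", "b2")]
result = crowd_filter(raw)
print(result)
```

Note how a.com's third url ("a3") simply disappears from the displayed list, while its second url jumps up to cluster with the first.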
Now here's where we've been able to game the current situation. Subdomains are treated like a separate domain, and so you can get two results for www.example.com, two more for sub1.example.com, two more for sub2.example.com, and so on.
Matt Cutts mentioned that Google is working on code to eliminate that possibility for most domains. That is, Google plans to treat most subdomains essentially like any other url on the main domain, and they will limit that domain, INCLUDING all its subdomains, to two positions total on any given search.
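Put another way, the change amounts to swapping the key the crowding filter counts against. A hedged illustration (the naive last-two-labels split below is only for show - real registrable-domain extraction needs the Public Suffix List, and this version is wrong for hosts like example.co.uk):

```python
def filter_key_current(host):
    # Today: the filter keys on the full hostname, so sub1.example.com and
    # www.example.com each get their own two-result quota.
    return host

def filter_key_planned(host):
    # Planned: key on the registrable domain, so all subdomains share one
    # quota. Illustrative only -- see the caveat in the lead-in.
    return ".".join(host.split(".")[-2:])

print(filter_key_current("sub1.example.com"))  # sub1.example.com
print(filter_key_planned("sub1.example.com"))  # example.com
print(filter_key_planned("www.example.com"))   # example.com
```

Under the planned key, two results for www.example.com plus two for sub1.example.com would exceed the same single quota instead of two separate ones.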
At that point, the whole subdomain vs. subdirectory decision will lose most of its importance - and your wwww urls will not show up, even though they may still be causing you trouble behind the scenes.
When checking the current handling of subdomains, I did a query on Google for site:yahoo.com
(I guess it's all right, since the subdomain thread mentions a query for Google already; we need to be fair with these two.)
The first result was:
Yes, that's an @ in the subdomain ( host... hehe )
It disappeared within a few days' time, and while it outranked www.yahoo.com in the site: search, it never made it to the regular SERPs.
I wonder what that was.
Does this mean that a certain auction site and a certain bookshop will be treated in this way? I certainly hope so.
What are the odds on there being some exceptions?
What about subdomains like .org.uk or .co.uk or say .us.com where the websites on these subdomains are separate websites in their own right?
This is going to be fun!
kamikaze Optimizer, you might want to check on the wildcard cases. I have a suspicion that your site will also resolve at asdf.yourdomain.tld and thisainthere.yourdomain.tld, which I would assume would be treated as dup content if it resolves with the same content.
We had an issue like this a while back and it took out a healthy chunk of our site from the index.
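A quick way to probe for a wildcard record is just to ask DNS about a made-up label. A small sketch - yourdomain.tld is of course a placeholder, and the demonstration below uses names that behave predictably everywhere rather than hitting anyone's real domain:

```python
import socket

def resolves(hostname):
    """True if DNS resolves the hostname to an address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# Names with predictable behavior, for demonstration:
print(resolves("localhost"))     # True
print(resolves("asdf.invalid"))  # False: the reserved .invalid TLD never resolves

# The real check (placeholder domain) would be:
#   resolves("asdf.yourdomain.tld")
# If a made-up label resolves AND serves the same pages as www, then every
# possible hostname is a potential duplicate-content URL in the index.
```

If the made-up label does resolve, the usual fix is either removing the wildcard DNS record or redirecting unknown hosts to the canonical www host.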
I should mention that I posted what I heard here at PubCon reluctantly, because I tend not to like discussing vaporware in any form - and until this change actually happens it's only hearsay or vaporware, even if this source should know what's being planned.
The clear need for some exceptions for some domains could make this change quite problematic and delay or even negate its "launch".
|Are you saying that all subdomain links will be included in the 'Sitelinks'? |
Subdomain links can already show up as Sitelinks. This happens for several of my clients who have a key subdomain with a prominent link on their home page. But it will never be "all" subdomains. Sitelinks has a limit of 8 total.
Matt Cutts on a Sphinn story:
"This isn't a correct characterization of what Google is looking at doing. What I was trying to say is that in some circumstances, Google may move closer to treating subdomains as we do with subdirectories. I'll talk about this more at some point after I get back from PubCon."
Clear as mud then as usual!
Sure it is.
It's gonna be manual.
And evaluated on a case by case basis. *smirk*
I bet they already have a few targets in mind.
Matt pulled me aside at the pub this afternoon to add some information. Here's what I understood:
This change will NOT mean that it's 100% impossible to rank subdomain urls in addition to urls from the main domain. The current plans are to make it harder to rank a third url, then even harder to rank a fourth, and so on with an increasing "damping factor".
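If it helps to picture it, an "increasing damping factor" might look something like this - the 0.5 base and the whole shape of the formula are my own illustrative numbers, not anything Matt stated:

```python
def damped_score(raw_score, nth_from_domain, base=0.5):
    """Leave a domain's first two results untouched, then multiply each
    additional result's score down harder and harder.
    Illustrative only -- base=0.5 is an assumption, not Google's number."""
    extra = max(0, nth_from_domain - 2)
    return raw_score * (base ** extra)

for n in range(1, 5):
    print(n, damped_score(100, n))  # 1st and 2nd keep 100; 3rd 50.0; 4th 25.0
```

The point is that a third url from one domain could still rank if its raw score were strong enough to survive the penalty, unlike today's hard two-result cutoff.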
Matt also did a video interview with Michael McDonald of WebProNews this afternoon, where he planned to bring more clarity to this issue. When that video goes live, we'll have even more direct information.
My apologies for getting the details a bit messed up the first time around.
Thank you for the input, I am looking into it.
It appears that my issue is due to the fact that I have setup 302 redirects (temporary) and not 301 redirects (permanent).
I intend to fix that this weekend.
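For anyone else in the same boat, a minimal sketch of the fix, assuming Apache with mod_rewrite (example.com is a placeholder; adjust the hosts and syntax for your own server):

```apache
# Send the stray wwww host to www with a permanent (301) status,
# not a temporary (302) one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wwww\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what tells the engines the move is permanent; mod_rewrite defaults to 302 if you write a bare R.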
On the whole subdomain / Matt Cutts comments (which is worthy of a new thread on its own), I welcome this change.
I would love to see Google start with *.blogspot.com
Granted, BlogSpot has some great sites, but it has also been a breeding ground for some awful spam.
Maybe some kind of a Trust/PR factor would work well here.
[edited by: tedster at 6:33 am (utc) on Dec. 8, 2007]
I'm really confused on this new Google concept myself, because of all the blog hosts (including blogspot) that treat a subdomain as a new user website account, as do the auction sites and whatnot. What about all of Google's, Yahoo's and MSN's subdomains?
I have sites 10+ years old that have many "mysubdomains.mymain.com" because I thought it was logical and easy for the web surfer (10 years ago) to build these large sites with the subdomain structure.
Many clients have been enjoying their number 1 listings on Google.com with 6 to 8 subdomains from their site listed as those fancy "extra links" added. So are these sites going to get the shaft from Google because they have more than 2 subdomains?
I didn't hear anything about getting the shaft - just lowering the current subdomain advantage at getting more than 2 positions on any one search result. Sitelinks are all in the #1 spot - so that counts as the first position with one more left before the damping would kick in, as I understand it.
That makes more sense to me, tedster, as so many sites also get #2 listings after the expanded #1 listing. I really want to watch this new Google approach to subdomains evolve, because it can indeed clean up duplicated pages in a big way - but at the same time it may start killing off the innocent, like any war does.
Matt Cutts has now written about this on his blog.
I love the notion of 'really relevant' results as opposed to 'relevant' results - so does this mean that the plethora of results from a certain auction site for millions of long tail terms have ALREADY disappeared?
You tell me!
Looks like we have some manual manipulation of search results - am I wrong?
Well, algorithms are written manually, so the line between manual and algorithmic does get blurred a bit. The thing I note is that this change has already happened. So whatever parts of the sky will fall because of it -- well, they've already fallen.
Here's what Matt wrote [mattcutts.com]:
|In the last few weeks we changed our algorithms to make that less likely to happen in the future. |