Forum Moderators: Robert Charlton & goodroi


Treatment of a Subdomain Compared to a Domain


cangoou

3:17 pm on Nov 20, 2007 (gmt 0)

10+ Year Member



Hi, what happens if I create a subdomain to a well-ranked domain? Will it be treated by Google like a folder of that domain or more like a new domain (concerning ranking, internal links, pr and so on)?

Fiver

5:23 pm on Dec 6, 2007 (gmt 0)

10+ Year Member



Link juice doesn't flow at the domain level - it flows at the unique URL level by links....

Correct, but penalties may flow where PR juice may not - and maybe we hijacked this thread a little by bringing penalties into it. Sorry if that's the case, but I think it's worth clarifying, and I'd love feedback on my understanding of penalty flow with respect to subdomains/subdirectories (I've put a rough sketch of these rules after the list below).

> A main domain which has incurred a penalty can pass that penalty to any of its subdomains or subdirectories regardless of links. (hoping for a challenge on this one)

> A subdomain which has incurred a penalty will not pass that penalty to the main domain unless the main domain acknowledges the subdomain with a link. Or, complementarily put:

> A main domain which has not incurred a penalty cannot be affected by penalties placed on any subdomains, unless that main domain links to the penalized subdomain.
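Here's the rough sketch I mentioned - just my own mental model of those three rules, in Python, and nothing Google has confirmed. The hostnames and the "main links to subdomain" sets are made up purely for illustration:

```
MAIN = "www.example.com"   # hypothetical main domain

def is_penalized(host, main_penalized, penalized_subs, main_links_to):
    """Return True if `host` should be treated as penalized under the rules above."""
    # Rule 1: a penalty on the main domain reaches every subdomain/subdirectory.
    if main_penalized:
        return True
    # Rules 2 and 3: a penalized subdomain only drags the main domain down
    # if the main domain links to (acknowledges) that subdomain.
    if host == MAIN:
        return any(sub in main_links_to for sub in penalized_subs)
    # Otherwise a subdomain carries only its own penalty.
    return host in penalized_subs

# sub1 is penalized; the main domain only suffers if it links to sub1.
print(is_penalized(MAIN, False, {"sub1.example.com"}, set()))                  # False
print(is_penalized(MAIN, False, {"sub1.example.com"}, {"sub1.example.com"}))   # True
print(is_penalized("sub2.example.com", True, set(), set()))                    # True
```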

tedster

1:16 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



News flash from Las Vegas PubCon. Matt Cutts informed us that Google will very soon begin treating subdomains and subdirectories the same in this fashion: there will be only 2 total urls from a domain in any set of search results, so no more getting 3, 4 or however many spots via subdomains. We didn't get any more information than just that basic heads-up.

pleeker

3:58 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks for sharing that, tedster.

Did he really mean in a SET of search results -- or did he just mean in a grouping of 10 search results (i.e., per page)?

tedster

5:42 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We were talking about the way that you can currently have two spots for urls in the main domain and still other spots for a subdomain under that main domain. That's what will end. Subdomain urls from any given domain will be clustered with other subdomain urls from the same top domain in any given search.

Jane_Doe

5:56 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I attempted to contact a couple of the advertisers to request that they not target my site for the whole domain but got nowhere.

That happens even without separate subdomains. In my experience advertisers will target your site sometimes even if you only have one or two pages that are a good match for their product.

callivert

9:28 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google will very soon begin treating subdomains and subdirectories the same

Will it be the end of blogspot.com, typepad.com, and wordpress.com in the SERPs? They each have thousands of blogs as subdomains.

tedster

9:51 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm sure there will be exceptions, for instance the sites you mentioned. Of course, how often do two different blogspot subdomains make it into the same search, anyway?

But still, the need for some exceptions is what makes this a project for Google to code, rather than a simple change to the filter.

tedster

10:00 am on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's a quote from another post that I just made:

Here's what happens now. The first step of results retrieval for any single search still has no limit on how many urls can be returned from a domain. In the early days of Google, a domain could even have all 10 first page spots and still keep on going. It could even be embarrassing!

Today, the preliminary, raw retrieval of roughly 1,000 results still puts no limit on how many urls can be returned from a given domain. But there's a further processing step - a filter kicks in. That filter is supposed to ensure that only 2 urls maximum from any domain will actually be shown.

If those two urls happen to be on the same page, then they will cluster together on that page rather than show at their "true" algorithmically determined position. But through all the total pages of any search result, any single domain is supposed to show up a maximum of 2 times.
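If it helps to picture the mechanics, here's a rough Python sketch of how I imagine that filter step works - purely my own guess at the logic, not anything Google has shown us:

```
from collections import defaultdict

def host_crowding(ranked_urls, max_per_host=2):
    """Toy version of the '2 urls per hostname' cap applied to the raw ~1,000 results.
    ranked_urls is a list of (url, hostname) pairs already in algorithmic order."""
    kept = []
    count = defaultdict(int)
    for url, host in ranked_urls:
        if count[host] < max_per_host:
            kept.append((url, host))
            count[host] += 1
    # The clustering step (pulling a host's second url up under its first when
    # both land on the same results page) is left out here for brevity.
    return kept
```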

Now here's where we've been able to game the current situation. Subdomains are treated like a separate domain, and so you can get two results for www.example.com, two more for sub1.example.com, two more for sub2.example.com, and so on.

Matt Cutts mentioned that Google is working on code to eliminate that possibility for most domains. That is, Google plans to treat most subdomains essentially like any other url on the main domain, and they will limit that domain, INCLUDING all its subdomains, to two positions total on any given search.
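If that guess is anywhere near right, the change Matt described could be as small as swapping the grouping key from the full hostname to the top domain. Again, just a sketch, with a deliberately naive way of finding that top domain:

```
def top_domain(host):
    # Naive: keep the last two labels. Fine for example.com, but a real
    # implementation would need the public suffix list (co.uk and friends).
    return ".".join(host.split(".")[-2:])

def domain_crowding(ranked_urls, max_per_domain=2):
    """Same cap as before, but www.example.com, sub1.example.com and
    sub2.example.com now all count against one shared limit of two."""
    kept = []
    count = {}
    for url, host in ranked_urls:
        key = top_domain(host)
        if count.get(key, 0) < max_per_domain:
            kept.append((url, host))
            count[key] = count.get(key, 0) + 1
    return kept
```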

At that point, the whole subdomain vs. subdirectory decision will lose most of its importance - and your www urls will not show up, even though they may still be causing you trouble behind the scenes.

BradleyT

1:37 pm on Dec 7, 2007 (gmt 0)

10+ Year Member



"For most domains".

I bet Apple will continue their subdomain dominance when you search for Apple. Although that really is an example where it does make sense for them to have many listings.

Fiver

3:53 pm on Dec 7, 2007 (gmt 0)

10+ Year Member



If this is simply a filter applied after a results set is pulled, does it affect any of the previously understood answers to the questions of link and penalty spread this thread has been about?

jimbeetle

4:08 pm on Dec 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Wow, very big and -- as a user -- very welcome news.

Now, as an SEO, time to come up with a new strategy ;-).

tedster

2:43 am on Dec 8, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Matt pulled me aside at the pub this afternoon to add some information. Here's what I understood:

This change will NOT mean that it's 100% impossible to rank subdomain urls in addition to urls from the main domain. The current plans are to make it harder to rank a third url, then even harder to rank a fourth, and so on with an increasing "damping factor".
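To picture what an increasing "damping factor" might look like, here's a tiny sketch. The 0.5 is invented purely for illustration - Matt gave no numbers:

```
def damped_score(raw_score, position_within_domain, damping=0.5):
    """Hypothetical demotion of the 3rd, 4th, ... result from one domain.
    The first two urls keep their raw score; each extra one is damped harder."""
    extra = max(0, position_within_domain - 2)   # 0 for the first two urls
    return raw_score * (damping ** extra)

# With raw scores of 10 for the 1st..4th urls from the same domain:
# 10.0, 10.0, 5.0, 2.5 - still possible to rank them, just progressively harder.
```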

Matt also did a video interview with Michael McDonald of WebProNews this afternoon, where he planned to bring more clarity to this issue. When that video goes live, we'll have even more direct information.

My apologies for getting the details a bit messed up the first time around.

reseller

8:08 am on Dec 10, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Matt Cutts has just posted a very informative post about the current subject. For the benefit of future discussion, I'm quoting what Matt wrote:

For several years Google has used something called “host crowding,” which means that Google will show up to two results from each hostname/subdomain of a domain name. That approach works very well to show 1-2 results from a subdomain, but we did hear complaints that for some types of searches (e.g. esoteric or long-tail searches), Google could return a search page with lots of results all from one domain. In the last few weeks we changed our algorithms to make that less likely to happen in the future.

This change doesn’t apply across the board; if a particular domain is really relevant, we may still return several results from that domain. For example, with a search query like [ibm] the user probably likes/wants to see several results from ibm.com. ...this change has been live for a couple weeks or so now and no one noticed. :)

Subdomains and Subdirectories [mattcutts.com]

[edited by: tedster at 4:36 pm (utc) on Dec. 10, 2007]
[edit reason] use a shorter quote [/edit]

tedster

7:03 pm on Dec 10, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Since we now know that this change is already in effect, I thought I'd check out results for some clients when searching on their business name - and indeed some of the subdomains that previously ranked high on page one have slipped a bit - such as from #3 to #8 in one case. And for some businesses I see no change at all.

So this really is a minor tweak, and not a big deal.

KVeil

12:27 am on Dec 11, 2007 (gmt 0)

10+ Year Member



Fiver has nicely outlined the possible link juice rules for subdomains/subdirectories.

Can I ask a similar question about how Google treats pages? E.g. if I have a three-year-old, well-established example.com/widgets.htm that has a PR of 5 and is getting good SERPs - and then I add example.com/new-widgets.htm, how does this new page benefit from example.com and example.com/widgets.htm with respect to sandboxing, link juice, etc.?

Another way of asking the question is: If I want to get good PR and SERPs for new-widgets, is it better to:

a) add a new-widgets.htm page to my well-established example.com site (eg. example.com/new-widgets.htm) OR

b) get a new domain "new-widgets.com" (eg. new-widgets.com/index.htm)?

K

tedster

12:37 am on Dec 11, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



New domains go through a rather intense trial period - we used to call it the sandbox - so quicker rankings are possible by featuring a new product on an established domain. PageRank and "link juice" are all about links, whether internal or external in origin. So if your new page gets one of the 25 links on your home page, it gets a nice shot of juice.

If it's only linked from inner pages, and none of those inner pages have as much PR as the domain root, then it gets less power. Unless, of course, you somehow attract some high PR links from another site - then that's a new source.

Subdomains and subdirectories do not, on their own, have PR. PageRank is all about the page, not the domain name. Some people do talk about a domain as a "PR5 domain" or whatever, but they are really talking about the PR of the domain's home page. From there on, it's all about the link structure.
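For anyone who wants to see the "one of 25 links on your home page" arithmetic worked through, here's the standard textbook PageRank split in a toy script. The pages and link graph are made up; nothing here is Google-specific:

```
def pagerank(links, iterations=50, d=0.85):
    """Tiny textbook PageRank: each page splits its score evenly among its outlinks."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / len(pages) for p in pages}
        for page, outs in links.items():
            for target in outs:
                new[target] += d * pr[page] / len(outs)   # one of 25 links = 1/25 of the share
        pr = new
    return pr

# Made-up site: the new page is one of 25 links on the home page, so each round
# it inherits roughly d * PR(home) / 25 from that single internal link.
links = {
    "home": ["widgets", "new-widgets"] + ["other%d" % i for i in range(23)],  # 25 outlinks
    "widgets": ["home"],
    "new-widgets": ["home"],
}
pr = pagerank(links)
print(round(pr["home"], 4), round(pr["new-widgets"], 4))
```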

moftary

1:34 am on Dec 14, 2007 (gmt 0)

10+ Year Member



Personally, I am positive that if a domain were banned for any reason, all the subdomains would be banned as well, but not vice versa.

KVeil

2:07 am on Dec 14, 2007 (gmt 0)

10+ Year Member



This is good to know :)

K

kwasher

3:23 am on Dec 21, 2007 (gmt 0)

10+ Year Member



Personally, I am positive that if a domain were banned for any reason, all the subdomains would be banned as well, but not vice versa.

That just does not seem logical to me.
If webmasterworld.com were banned, should tedster.webmasterworld.com be punished? It's just a host.

steveb

3:35 am on Dec 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Looks like little if any change. Subdomains and dupe www and non-www stuff still ranking as before.

More Google vaporware apparently.

tedster

3:55 am on Dec 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My observation is that this has been the way it works. If a subdomain gets banned, the parent domain does not necessarily get banned too. But when the main domain gets banned, so do the subdomains.

I suppose that the primary domain for a major host of personal sites (like wordpress) might get banned (how would that happen?) and not have the personally owned subdomains get banned. But as far as I know, there is no example of that kind of thing in reality.

g1smd

12:53 am on Dec 22, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think that Matt made some comments about a week ago, that hinted at which direction a penalty or ban might spread.

I can't remember whether it was from domain to sub-domain, or vice-versa.
