Forum Moderators: Robert Charlton & goodroi
Link juice doesn't flow at the domain level - it flows at the unique URL level by links....
Correct, but penalties may flow where PR juice may not - and maybe we hijacked this thread a little by bringing penalties into it. Sorry if that's the case, but I think it's worth clarifying, and I'd love feedback on my understanding of penalty flow with respect to subdomains/subdirectories.
> A main domain which has incurred a penalty can pass that penalty to any of its subdomains or subdirectories regardless of links. (hoping for a challenge on this one)
> A subdomain which has incurred a penalty will not pass that penalty to the main domain unless the main domain acknowledges the subdomain with a link. Or, complementarily put:
> A main domain which has not incurred a penalty cannot be affected by penalties placed on any subdomains, unless that main domain links to the penalized subdomain.
I attempted to contact a couple of the advertisers to request that they not target my site for the whole domain but got nowhere.
That happens even without separate subdomains. In my experience advertisers will target your site sometimes even if you only have one or two pages that are a good match for their product.
Here's what happens now. The first step of results retrieval for any single search still has no limit on how many urls can be returned from a domain. In the early days of Google, a domain could even take all 10 first page spots and still keep on going. It could even be embarrassing! Today, the preliminary, raw retrieval of roughly 1,000 results still puts no limit on how many urls can be returned from a given domain. But then a further processing step kicks in - a filter that is supposed to ensure that a maximum of 2 urls from any domain will actually be shown.
If those two urls happen to be on the same page, then they will cluster together on that page rather than show at their "true" algorithmically determined position. But through all the total pages of any search result, any single domain is supposed to show up a maximum of 2 times.
Now here's where we've been able to game the current situation. Subdomains are treated like a separate domain, and so you can get two results for www.example.com, two more for sub1.example.com, two more for sub2.example.com, and so on.
Matt Cutts mentioned that Google is working on code to eliminate that possibility for most domains. That is, Google plans to treat most subdomains essentially like any other url on the main domain, and they will limit that domain, INCLUDING all its subdomains, to two positions total on any given search.
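To make the mechanics concrete, here's a minimal sketch (in Python) of a host-crowding filter as described above: keep at most 2 urls per domain from an already-ranked result list, with subdomains folded into their parent domain, as the planned change would do. The domain-reduction rule here is a naive two-label heuristic invented for illustration; nothing below is Google's actual code.

```python
# Hypothetical sketch of the "host crowding" filter described above.
# Assumption: results arrive already ranked; we only decide what to show.
from urllib.parse import urlparse

def registrable_domain(url):
    """Reduce a hostname to its parent domain (naive last-two-labels rule;
    a real implementation would consult the Public Suffix List)."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def host_crowd(results, per_domain=2):
    """Keep at most `per_domain` urls per (folded) domain, in rank order."""
    counts = {}
    kept = []
    for url in results:
        dom = registrable_domain(url)
        if counts.get(dom, 0) < per_domain:
            kept.append(url)
            counts[dom] = counts.get(dom, 0) + 1
    return kept

results = [
    "http://www.example.com/a",
    "http://sub1.example.com/b",
    "http://other.com/x",
    "http://sub2.example.com/c",
    "http://www.example.com/d",
]
print(host_crowd(results))
```

Because sub1.example.com and sub2.example.com collapse into example.com here, only the first two example.com urls survive - exactly the difference between today's per-hostname treatment and the planned per-domain treatment.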
At that point, the whole subdomain vs. subdirectory decision will lose most of its importance - and your subdomain urls will not show up, even though they may still be causing you trouble behind the scenes.
This change will NOT mean that it's 100% impossible to rank subdomain urls in addition to urls from the main domain. The current plans are to make it harder to rank a third url, then even harder to rank a fourth, and so on with an increasing "damping factor".
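A toy illustration of that "damping factor" idea: instead of a hard cap of two urls per domain, each additional url from the same domain gets its score progressively discounted, so a third result is harder to rank and a fourth harder still. The factor value (0.5) and the scoring are invented for illustration only.

```python
# Toy sketch of a per-domain "damping factor", as described above.
# Assumption: results is a list of (url, domain, score) tuples, ranked or not.
def damp_scores(results, factor=0.5):
    """Discount the 3rd, 4th, ... url from any one domain, then re-rank."""
    counts = {}
    damped = []
    for url, domain, score in results:
        n = counts.get(domain, 0)
        # first two urls keep full score; each later one is damped harder
        penalty = factor ** max(0, n - 1)
        damped.append((url, score * penalty))
        counts[domain] = n + 1
    return sorted(damped, key=lambda t: -t[1])

serp = [
    ("a.com/1", "a.com", 10),
    ("a.com/2", "a.com", 9),
    ("a.com/3", "a.com", 8),
    ("b.com/1", "b.com", 7),
]
print(damp_scores(serp))
```

With these numbers, a.com's third url drops from a raw score of 8 to 4, letting b.com's page outrank it - a soft limit rather than a hard two-url cutoff.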
Matt also did a video interview with Michael McDonald of WebProNews this afternoon, where he planned to bring more clarity to this issue. When that video goes live, we'll have even more direct information.
My apologies for getting the details a bit messed up the first time around.
For several years Google has used something called “host crowding,” which means that Google will show up to two results from each hostname/subdomain of a domain name. That approach works very well to show 1-2 results from a subdomain, but we did hear complaints that for some types of searches (e.g. esoteric or long-tail searches), Google could return a search page with lots of results all from one domain. In the last few weeks we changed our algorithms to make that less likely to happen in the future.
This change doesn’t apply across the board; if a particular domain is really relevant, we may still return several results from that domain. For example, with a search query like [ibm] the user probably likes/wants to see several results from ibm.com. ...this change has been live for a couple weeks or so now and no one noticed. :)
Subdomains and Subdirectories [mattcutts.com]
[edited by: tedster at 4:36 pm (utc) on Dec. 10, 2007]
[edit reason] use a shorter quote [/edit]
So this really is a minor tweak, and not a big deal.
Can I ask a similar question about how Google treats pages? E.g. if I have a three-year-old, well-established example.com/widgets.htm that has a PR of 5 and is getting good SERPs, and then I add example.com/new-widgets.htm, how does this new page benefit from example.com and example.com/widgets.htm in respect of sandboxing, link juice, etc.?
Another way of asking the question is: If I want to get good PR and SERPs for new-widgets, is it better to:
a) add a new-widgets.htm page to my well-established example.com site (eg. example.com/new-widgets.htm) OR
b) get a new domain "new-widgets.com" (eg. new-widgets.com/index.htm)?
K
If it's only linked from inner pages, and none of those inner pages have as much PR as the domain root, then it gets less power. Unless, of course, you somehow attract some high PR links from another site - then that's a new source.
Subdomains and subdirectories do not, on their own, have PR. PageRank is all about the page, not the domain name. Some people do talk about a domain as a "PR5 domain" or whatever, but they are really talking about the PR of the domain's home page. From there on, it's all about the link structure.
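The point above - that PR belongs to individual pages and flows along links regardless of which domain or subdirectory a page lives on - can be sketched with a minimal PageRank iteration. The link graph and damping value below are invented for illustration; note the subdomain page is just another node in the graph.

```python
# Minimal PageRank sketch: scores attach to pages, not domains, and
# flow along links. Graph and parameters are illustrative only.
def pagerank(links, iterations=50, d=0.85):
    """links: {page: [pages it links to]}. Returns {page: score}."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            share = d * pr[p] / len(outs)  # each outlink gets an equal share
            for q in outs:
                new[q] += share
        pr = new
    return pr

site = {
    "example.com/": ["example.com/widgets.htm"],
    "example.com/widgets.htm": ["example.com/", "sub.example.com/page.htm"],
    "sub.example.com/page.htm": ["example.com/"],
}
scores = pagerank(site)
print(scores)
```

In this toy graph the home page ends up with the highest score simply because the most links point at it - the subdomain boundary plays no role in the computation, which is the whole point.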
I suppose that the primary domain of a major host of personal sites (like WordPress) might get banned (how would that happen?) without the personally owned subdomains getting banned. But as far as I know, there is no real-world example of that kind of thing.