
Google SEO News and Discussion Forum

    
Multiple level subdomains
i.e. neighborhood.city.county.state.country.example.com
Sharper

10+ Year Member



 
Msg#: 28640 posted 5:33 am on Mar 19, 2005 (gmt 0)

Does anyone have any experience with using deep subdomains?

In a "traditional" setup you'd go with subdirectories instead, like example.com/country/state/county/city/neighborhood/, but
how about something like neighborhood.city.county.state.country.example.com, with each subdomain linked to the hierchy and each with unique content?
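
To make the comparison concrete, here's a minimal sketch in Python (the geographic labels and example.com are just placeholders) that builds both URL forms from the same hierarchy; the only structural differences are the separator and the label order:

    # One geographic hierarchy, broadest to narrowest (placeholder labels).
    hierarchy = ["country", "state", "county", "city", "neighborhood"]

    # Traditional subdirectory form: broadest label first, "/" as separator.
    subdir_url = "http://example.com/" + "/".join(hierarchy) + "/"

    # Deep-subdomain form: narrowest label first, "." as separator.
    subdomain_url = "http://" + ".".join(reversed(hierarchy)) + ".example.com/"

    print(subdir_url)     # http://example.com/country/state/county/city/neighborhood/
    print(subdomain_url)  # http://neighborhood.city.county.state.country.example.com/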

I'm considering doing some experimenting to satisfy my curiosity about how Google treats these, but I thought I'd ask for others' experiences first.

I know there may be a problem with Google returning all of them separately in the results for the same (usually rare) search term. Any other problems you've seen with a site organized this way would be helpful.

---------------------------------------------------
Notes:

1. Yes, I know that PR depends on pages/links and not the mere existence of domains, so we don't need to discuss that as a tangent. :)
2. No, my test site won't actually be geography-related; that's just an obvious example for demonstration purposes.

 

RichTC

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 28640 posted 11:48 am on Mar 20, 2005 (gmt 0)

I think Google will bar sites that overuse sub-sub-subdomains. I can give you a perfect example of one I found if you sticky me.

However, I have seen them make an absolute killing in Yahoo, which loves spammy subdomain sites.

I'm very interested to know whether a subdomain is treated as a totally separate website, and if so, can I conclude that linking to a subdomain from your home page is in effect giving PageRank away?

Any thoughts on this, anyone?

moishe

10+ Year Member



 
Msg#: 28640 posted 11:55 am on Mar 20, 2005 (gmt 0)

I was just looking at buying a text link on a site that uses sub.sub.domains. It's very well established and ranks very well in Google: a PR7 home page with lots of PR6s on the subdomain pages. So it would seem that Google can deal with these sites.

BTW, they are not advertising to sell text links; I contacted them about it.

Wizard

5+ Year Member



 
Msg#: 28640 posted 8:10 pm on Mar 20, 2005 (gmt 0)

If I were a user and found such a disgustingly spammy sub-sub-subdomain in the results, I wouldn't click it. Google doesn't like them either.

And how would you avoid crosslinking them? Google penalizes heavy crosslinking (except crosslinking inside a site, obviously), so if you're spreading the site across many subdomains, you'll have to crosslink them just to provide reasonable internal navigation.

And are you going to put each subdomain on a different IP, ideally in a different C-block? If not, Google will easily see that they belong to one site.
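
A "C-block" here means the first three octets of an IPv4 address, i.e. a /24. A minimal Python sketch of such a shared-block check, with made-up addresses:

    from ipaddress import ip_address

    def same_c_block(ip_a: str, ip_b: str) -> bool:
        """Return True if two IPv4 addresses share a /24 ("C-block")."""
        # Dropping the last 8 bits leaves just the first three octets.
        return int(ip_address(ip_a)) >> 8 == int(ip_address(ip_b)) >> 8

    # Two subdomains hosted in the same /24 are trivially linkable:
    print(same_c_block("192.0.2.10", "192.0.2.200"))   # True
    print(same_c_block("192.0.2.10", "198.51.100.7"))  # False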

moishe

10+ Year Member



 
Msg#: 28640 posted 1:28 am on Mar 21, 2005 (gmt 0)

Wizard,
I have a site with 3 subdomains, all on one server with the same IP: PR5 with lots of PR4s. I'm aware of over 1000 keyword phrases that rank in the top 20 on the three big search engines, with 60% of the traffic coming from Google.

This site is 3 years old and has just recently been listed in DMOZ.

How would you explain this?

The largest non-government national park site on the web uses sub-subdomains; search Google for national parks and look at the first non-government site....

jcoronella

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28640 posted 1:52 am on Mar 21, 2005 (gmt 0)

I'm pretty sure it's OK unless you are crosslinking excessively. I've never tried it beyond the third level personally, but the government sites go pretty deep.

RichTC

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 28640 posted 10:12 am on Mar 21, 2005 (gmt 0)

I don't think you should include .gov, and perhaps not even .ac; I can't be sure, but I think these get special treatment in the Google algo.

Certainly I think excessive crosslinking with sub-subdomains is a dangerous game.

Sharper

10+ Year Member



 
Msg#: 28640 posted 8:58 pm on Mar 21, 2005 (gmt 0)

If not, Google will easily see that they belong to one site.

No, all the subdomains would be on the same server, using the same IP. The point of the question isn't to try to fool Google into thinking they are different sites; the point is how well Google will deal with spidering/listing them.

My current understanding is that Google won't "penalize" for crosslinking subdomains, but there wouldn't be any special benefit over not using subdomains, either. Is that incorrect? Does anyone have an example of such a penalty?

disgustingly spammy sub-sub-subdomain

I'm not sure what's inherently spammy about subdomains vs. subdirectories. Perhaps you could elaborate? There is no difference in length; the biggest difference is using a . instead of a / as the separator and putting the most relevant term first instead of last in the URL. How does that make one method inherently more spammy than the other? Just because one is a more traditional method of organization?

Does anyone have more information about sites that go deeper than sub.sub.domain.tld?

mrMister

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 28640 posted 5:34 pm on Mar 22, 2005 (gmt 0)

I don't think you should include .gov, and perhaps not even .ac; I can't be sure, but I think these get special treatment in the Google algo

.ac? Google could give special preference to sites on Ascension Island?

I think you mean .ac.uk there ;-)

mrMister

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 28640 posted 5:37 pm on Mar 22, 2005 (gmt 0)


disgustingly spammy sub-sub-subdomain

I'm not sure what's inherently spammy about subdomains vs. subdirectories. Perhaps you could elaborate?

He wasn't talking about subdomains per se; he was talking about sub-sub-subdomains.

"http://powerful.blue.widgets.america.domain.com" does look inherently spammy.

Wizard

5+ Year Member



 
Msg#: 28640 posted 10:10 pm on Mar 22, 2005 (gmt 0)

He wasn't talking about subdomains per se; he was talking about sub-sub-subdomains

Exactly. I have read some theories on this forum that placing keywords in the domain name can be dangerous in itself. Perhaps that's an exaggeration, but it's possible that having too many keywords in the domain name may raise a 'red flag' in Google.

The best way to do SEO is to simulate a 'naturally high-ranking' site as much as possible. As for domain names, a 'natural' site would, in most cases, use its brand name in the domain, not its main keywords. If subdomains are used, they should be informative, not just filled with keywords. At the same time, there are many spammy sites using plenty of keywords in domain names, so I guess the Google team is already trying to detect and filter them.

I wonder if Bayesian filtering, about which mrMister made some interesting remarks in another topic, could also lead to problems with sub-sub-subdomains. If they do such filtering and include the URL in the compared data, perhaps it can.

However, if I were a Google algo designer, I'd just compare the words in the domain with the keywords of the page, and if there were too many matches, I'd treat it as suspected spam.
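
A minimal Python sketch of that heuristic; the tokenization, keyword list, and threshold are all invented for illustration:

    def looks_keyword_stuffed(hostname, page_keywords, max_matches=2):
        """Hypothetical check: flag a host whose domain labels overlap
        too heavily with the page's own keywords."""
        # Split the hostname into its dot-separated labels.
        labels = set(hostname.lower().split("."))
        matches = sum(1 for kw in page_keywords if kw.lower() in labels)
        return matches > max_matches

    # The example hostname quoted earlier in the thread:
    host = "powerful.blue.widgets.america.domain.com"
    print(looks_keyword_stuffed(host, ["powerful", "blue", "widgets"]))  # True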

Detecting black hat SEO can be quite easy: just check several other factors like this one, and if more of them can be found on a site, apply a penalty. To avoid the penalty, it's better to avoid things that obviously look spammy, even if there isn't much proof that they hurt rankings.

mrMister

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 28640 posted 10:39 pm on Mar 22, 2005 (gmt 0)

I wonder if Bayesian filtering, about which mrMister made some interesting remarks in another topic, could also lead to problems with sub-sub-subdomains.

I'm convinced that Google is using Bayesian spam filtering. A lot of things I see mentioned often on WebmasterWorld (the over-optimisation penalty, specific penalties targeting certain site sectors) are exactly the kind of things that Bayesian filtering would appear to do.

Imagine this scenario (figures made up, I'm just guessing):

If 1% of spam pages had more than 3 subdomains,
and only 0.001% of legitimate sites had more than 3 subdomains, then I'd expect Bayesian spam filtering to be able to pick up on that and regard any site with more than 3 subdomains as spam.
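
Plugging those made-up figures into Bayes' theorem shows how strong such a signal would be. A sketch, assuming (again, purely for illustration) that 1% of all pages are spam:

    # All figures invented, per the scenario above.
    p_deep_given_spam = 0.01     # 1% of spam pages have > 3 subdomains
    p_deep_given_ham = 0.00001   # 0.001% of legitimate pages do
    p_spam = 0.01                # assumed prior: 1% of all pages are spam

    # Bayes: P(spam | deep) = P(deep|spam)P(spam) /
    #        [P(deep|spam)P(spam) + P(deep|ham)P(ham)]
    numerator = p_deep_given_spam * p_spam
    posterior = numerator / (numerator + p_deep_given_ham * (1 - p_spam))
    print(f"P(spam | > 3 subdomains) = {posterior:.3f}")  # ~0.910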

Of course, Bayesian spam filtering would only be one of the 100+ items in Google's ranking criteria, so it alone wouldn't be enough to seriously downgrade a site. However, if it is combined with other spammy techniques, they could add up to lower rankings.
