In a "traditional" setup you'd go with subdirectories instead, like example.com/country/state/county/city/neighborhood/, but
how about something like neighborhood.city.county.state.country.example.com, with each subdomain linked into the hierarchy and each with unique content?
I'm considering doing some experimenting to satisfy my curiosity about how Google treats these, but I thought I'd ask for others' experiences first.
I know there may be a problem with Google returning all of them separately in the results for the same search term when it's a common (or, usually, rare) term. I'd appreciate hearing about any other problems you've seen with a site organized this way.
1. Yes, I know that PR depends on pages/links and not the mere existence of domains, so we don't need to discuss that as a tangent. :)
2. No, my test site won't actually be geography related, that's just an obvious example for demonstration purposes.
However, I have seen them make an absolute killing in Yahoo, which loves spammy subdomain sites.
I'm very interested to know whether a subdomain is treated as a totally separate website or not, and if so, can I conclude that linking to a subdomain from your home page is in effect giving PageRank away?
Any thoughts anyone on this?
BTW, they are not advertising to sell text links, I contacted them about it.
And how would you avoid crosslinking them? Google penalizes heavy crosslinking (crosslinking within a single site excepted, obviously), so if you spread the site across many subdomains, you'll have to crosslink them just to provide reasonable internal navigation.
And are you going to put each subdomain on a different IP, and better yet in a different C-block? If not, Google will easily see that they belong to one site.
This site is 3 years old, has just recently been listed in DMOZ.
How would you explain this?
The largest non-government national park site on the web uses sub-subdomains; search Google for "national parks" and look at the first non-government site...
If not, Google will easily see they belong to one site.
No, all the subdomains would be on the same server, using the same IP. The point of the question isn't to try to fool Google into thinking they are different sites; the point is how well Google will deal with spidering/listing them.
My current understanding is that Google won't "penalize" for crosslinking subdomains, but that there also wouldn't be any special benefit over not using subdomains. Is that incorrect? Does anyone have an example of such a penalty?
disgustingly spammy sub-sub-subdomain
I'm not sure what's inherently spammy about subdomains vs. subdirectories. Perhaps you could elaborate? I mean, there is no difference in length; the biggest differences are using a . instead of a / as the separator and putting the most relevant term first instead of last in the URL. How does that make one method inherently more spammy than another? Just because one is a more traditional method of organization?
Anyone with more information about any sites with more than sub.sub.domain.tld?
disgustingly spammy sub-sub-subdomain
I'm not sure what's inherently spammy about subdomains vs. subdirectories. Perhaps you could elaborate?
He wasn't talking about subdomains per se, he was talking about sub-sub-subdomains.
"http://powerful.blue.widgets.america.domain.com" does look inherently spammy.
He wasn't talking about subdomains per se, he was talking about sub-sub-subdomains
Exactly. I have read some theories on this forum that placing keywords in the domain name can be risky in itself. Perhaps that's an exaggeration, but it's possible that having too many keywords in the domain name may raise a 'red flag' in Google.
The best way to do SEO is to simulate a 'naturally high-ranking' site as much as possible. As for domain names, a 'natural' site would, in most cases, use its brand name in the domain, not its main keywords. If subdomains are used, they should be informative, not just stuffed with keywords. At the same time, there are many spammy sites using plenty of keywords in their domain names, so I'd guess the Google team is already trying to detect and filter them.
I wonder if Bayesian filtering, about which MrMister made some interesting remarks in another topic, could also lead to problems with sub-sub-subdomains. If they do such filtering and include the URL in the compared data, perhaps it can.
However, if I were a Google algorithm designer, I'd just compare the words in the domain with the keywords of the page, and if there were too many matches, I'd treat it as suspected spam.
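Purely as a sketch of the heuristic suggested above (not anything Google has confirmed doing), the comparison could look like this: split the hostname into its labels, then measure how many of those words also appear in the page text. The function name and scoring are my own invention for illustration.

```python
import re

def domain_keyword_overlap(hostname: str, page_text: str) -> float:
    """Fraction of hostname words that also appear in the page text.

    A high score means the domain is densely packed with the page's
    own keywords, which the post above suggests could look spammy.
    """
    labels = hostname.lower().split(".")[:-1]  # drop the TLD
    domain_words = set()
    for label in labels:
        domain_words.update(label.split("-"))  # split hyphenated labels too
    page_words = set(re.findall(r"[a-z0-9]+", page_text.lower()))
    if not domain_words:
        return 0.0
    return len(domain_words & page_words) / len(domain_words)

# The earlier example URL scores 1.0 against a page stuffed with
# the same words: every word in the hostname appears on the page.
score = domain_keyword_overlap(
    "powerful.blue.widgets.america.domain.com",
    "Buy powerful blue widgets in America from domain dot com!",
)
```

A real filter would presumably stem words and weight by keyword prominence, but even this crude overlap ratio separates "brand.example.com" from "powerful.blue.widgets.america.domain.com".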
Detecting black-hat SEO can be quite easy: just check several other factors like this one, and if enough of them are found on a site, apply a penalty. To avoid the penalty, it's better to avoid things that obviously look spammy, even if there isn't much proof that they hurt rankings.
I wonder if Bayesian filtering, about which MrMister made some interesting remarks in another topic, could also lead to problems with sub-sub-subdomains.
I'm convinced that Google is using Bayesian spam filtering. A lot of things I see mentioned often in WebmasterWorld (the over-optimisation penalty, specific penalties targeting certain site sectors) are exactly the kind of things that Bayesian filtering would appear to do.
Imagine this scenario (figures made up, I'm just guessing):
If 1% of spam pages had more than 3 subdomains,
and only 0.001% of legitimate sites had more than 3 subdomains, then I'd expect Bayesian spam filtering to be able to pick up on that and regard any site with more than 3 subdomains as likely spam.
Of course, Bayesian spam filtering would be only one of the 100+ items in Google's ranking criteria, so it alone wouldn't be enough to seriously downgrade a site. But if it is used alongside other spammy techniques, the signals could combine to cause lower rankings.
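The made-up figures in that scenario can be plugged straight into Bayes' rule. This is just a worked illustration of the reasoning in the post, assuming (arbitrarily) an equal prior for spam and legitimate pages; nothing here reflects how Google actually weights signals.

```python
def bayes_spam_posterior(p_feature_given_spam: float,
                         p_feature_given_ham: float,
                         prior_spam: float = 0.5) -> float:
    """Posterior probability of spam given one binary feature.

    Bayes' rule: P(spam | feature) =
        P(feature | spam) * P(spam) / P(feature)
    where P(feature) sums over the spam and legitimate cases.
    """
    p_feature = (p_feature_given_spam * prior_spam
                 + p_feature_given_ham * (1.0 - prior_spam))
    return p_feature_given_spam * prior_spam / p_feature

# The scenario above: 1% of spam pages, but only 0.001% of legitimate
# pages, have more than 3 subdomains. With a 50/50 prior, seeing that
# feature pushes the spam probability above 99.9%.
posterior = bayes_spam_posterior(0.01, 0.00001)
```

That lopsided posterior is exactly why a feature this rare among legitimate sites would be a strong signal on its own, even before combining it with the other factors the post mentions.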