| 1:13 pm on Jan 16, 2004 (gmt 0)|
Subdomains are very strong in our SERPs also: 14 subdomains from 6 companies in the top 3 pages.
| 1:28 pm on Jan 16, 2004 (gmt 0)|
Couldn't agree more. What surprises me is that I thought this was a "been there done that" sort of thing, meaning that G had in the past (apparently) already taken steps to handle this issue.
Maybe they forgot the subdomain filter ingredient when preparing the algo du jour...
| 1:33 pm on Jan 16, 2004 (gmt 0)|
P.S. Actually, I should stop being so facetious. My guess is that since many large hub and directory sites use this method, it could have been dialed up by G in an effort to goose the rankings of those kinds of sites, something much speculated about since Florida.
But of course, the other market segment that uses this tactic frequently = aggressive spammers. :-(
| 1:46 pm on Jan 16, 2004 (gmt 0)|
It would seem so simple to treat subs as a single domain and serve the two most relevant pages across all subs. Almost a no-brainer!
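For illustration only, a minimal sketch of that idea in Python, assuming each result carries a full URL: group results by a naively derived registered domain and keep only the top two per group. The function names, the two-label grouping rule, and the sample URLs are all hypothetical, not anything Google has documented.

```python
from collections import defaultdict
from urllib.parse import urlparse

def registered_domain(url):
    """Naive grouping key: last two labels of the hostname.
    (A real engine would need a public-suffix list for e.g. .co.uk.)"""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def crowd_limit(ranked_urls, per_domain=2):
    """Keep at most `per_domain` results per registered domain,
    preserving the original ranking order."""
    seen = defaultdict(int)
    kept = []
    for url in ranked_urls:
        key = registered_domain(url)
        if seen[key] < per_domain:
            seen[key] += 1
            kept.append(url)
    return kept

results = [
    "http://a.example.com/page",
    "http://b.example.com/page",
    "http://c.example.com/page",   # third subdomain of example.com: dropped
    "http://other.org/page",
]
print(crowd_limit(results))
```

With this rule, the third example.com subdomain is crowded out and other.org moves up, which is exactly the "treat subs as one domain" behavior being asked for.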
| 3:16 pm on Jan 16, 2004 (gmt 0)|
I wish sub-domains worked for me! I have 2 sub-domains for my site and it's hard to find them on Google, as well as my main domain (unless I search for the URL)!
| 3:23 pm on Jan 16, 2004 (gmt 0)|
I think you need a bunch of them.
| 4:45 pm on Jan 16, 2004 (gmt 0)|
Right, a bunch of them, all interlinked and swapped together in linking campaigns. That way your keywords become a significant part of the linking page, making it seem that all the linking pages are on topic.
| 5:26 pm on Jan 16, 2004 (gmt 0)|
Makes sense. My site isn't that big, so I'll just keep my 2 sub-domains for now.
| 5:38 pm on Jan 16, 2004 (gmt 0)|
>why does Google add weight for sub domains?
IT DOESN'T! It's always been like that. If you flood Google with subdomains or regular domains, they will stay in the index until the Google techies figure it out, either by themselves or through spam reports from users. Subdomains are nothing special and are NOT treated specially by Google. No special weight is given. Anything you can do with subdomains you can also do with regular domains.
If you find double listings, i.e. different subdomains from the same domain within the same SERPs for the same search, we're talking about duplicates. Multiple SERP listings are just bad SEO and short-lived.
This is nothing special that only happens with subdomains. Interlink X "regular" domains and you'll get the same effect. It's just not as obvious as with subdomains.
| 6:50 pm on Jan 16, 2004 (gmt 0)|
|would seem so simple to treat subs as a single domain and serve the two most relevant pages across all subs, almost a no-brainer! |
And there goes something like dyndns.
| 7:30 pm on Jan 16, 2004 (gmt 0)|
I'm not so sure that subs are treated just like other domains. It seems subs get the best of both worlds. It's pretty well known that you can do pretty much as you like onsite without being penalised, while doing the same between sites would get filtered or penalised. This would include things like identical or near-identical content and crosslinking. It seems subs are treated as one site for the grey-area stuff and as different domains in terms of returning SERPs or ****ing links. This would make subs far more valuable than normal domains. But even beyond this, I see an unexplained bonus for crosslinking subs that you don't get with real domains.
| 10:29 pm on Jan 16, 2004 (gmt 0)|
I was having a whinge about this last week. In the areas I am in, there is one company that gets the best positions in the best locations because of a load of cross-linked subdomains. I have specific domains for each location that got wiped out in Florida. I think the Google techs will sort this one out.
| 8:11 am on Jan 17, 2004 (gmt 0)|
>>> why does Google add weight for sub domains?
Subs giving preference?
No, Google does not. That subdomains appear to be given higher preference is only the EFFECT, not the cause.
What Google has added to the relevancy weight is the size of the website and the internal pages such as the hub/node pages. Size, good internal link structure, and incoming links to the internal node pages count greatly.
I tend to believe that subdomains and subdirectories are counted the same by Google. However, it appears that subdomains are more successful. This could be due to the fact that webmasters can get more incoming links to subdomains than to subdirectories. In other words, it is easier for a webmaster to ask a link partner for 10 incoming links pointing to 10 subdomains than to ask for 10 links pointing to 10 subdirectories of the same domain. That is why subdomains show higher success rates!
| 9:27 am on Jan 17, 2004 (gmt 0)|
Subdomains are gaining higher success rates because you can get more than two results from the same domain onto one SERP. I'm not seeing them get any extra weighting other than that which comes from being part of a larger site. I certainly concur on that part. However, their multiple listings on single result pages are both annoying.about.com and aggravating.worldweb.com. However, maybe this isn't a bug. It could just be 'the way he wants it... well, he gets it. I don't like it any more than you men.' Cool Hand Luke
OT - If there were to be an update/adjustment/tweak/DCU in January, wouldn't it likely be this weekend?
| 4:37 pm on Jan 17, 2004 (gmt 0)|
"OT - If there were to be an update/adjustment/tweak/DCU in January, wouldn't it likely be this weekend?"
I think that every weekend after Florida but it never comes...
| 4:51 pm on Jan 17, 2004 (gmt 0)|
I would say that Google gives far too much weight to keywords in urls whether they are in subdomains, page names or dir names.
I'm giving serious consideration to changing my site, "if you can't beat them, join them".
I would also say that this has become progressively worse in the last six months. My site has slipped slightly in the SERPs over the last few months, yet I have perhaps three times as many backlinks as before. Most of the sites above me seem to get their positions almost entirely from their URLs. Some have very few backlinks.
| 8:06 pm on Jan 17, 2004 (gmt 0)|
For one ROI keyword I track, the top 3 SERP results are full of subdomains of the same site, and furthermore all the subdomains are almost mirrored and resemble link farms.
Maybe Google gives them a preference, assuming they're a directory which has subdomains and outbound links.
Moreover, I have been submitting spam reports about it for the last 2 months; still no action taken.
| 11:39 pm on Jan 18, 2004 (gmt 0)|
The best theory I've seen to explain why subdomains do well is that the new algo uses relevance when awarding credit to links.
If you do a link swap with someone and their page is off topic, this theory would predict you would get little credit for the link.
However, if you have a site on bananas with a subdomain for each class of banana, and you link each subdomain to the others, then all the links will be on topic because they all come from banana pages. Those subdomains are, as far as Google is concerned, the same as separate domains, and therefore those "relevant" links will receive full credit from Google.
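As a toy illustration of that theory (purely hypothetical, not Google's actual formula), link credit could be scaled by the term overlap between the linking page and the linked page, so on-topic subdomain crosslinks pass full-ish credit while off-topic swaps pass little:

```python
def topical_credit(source_terms, target_terms, base_credit=1.0):
    """Toy model of relevance-weighted link credit: scale the credit
    a link passes by the Jaccard term overlap between the two pages."""
    source, target = set(source_terms), set(target_terms)
    if not source or not target:
        return 0.0
    overlap = len(source & target) / len(source | target)
    return base_credit * overlap

# Links between banana subdomains are on topic and pass credit...
on_topic = topical_credit(["banana", "cavendish"], ["banana", "plantain"])
# ...while an off-topic link swap passes nothing under this model.
off_topic = topical_credit(["mortgage", "loans"], ["banana", "plantain"])
print(on_topic, off_topic)
```

Under a model like this, crosslinked banana subdomains would indeed look like a cluster of independent, relevant endorsements.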
I like that theory. It makes sense to me.
| 1:39 am on Jan 19, 2004 (gmt 0)|
>I would say that Google gives far too much weight to keywords in urls whether they are in subdomains, page names or dir names.
This may just be an artifact of the high weighting on anchor text. If I own widgets.tld and have all the stuff about blue widgets on blue.widgets.tld, then if anyone links with a straight [blue.widgets.tld...] I get credit for both "blue" and "widgets".
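A hedged sketch of how that could work: if an engine tokenizes a bare URL the same way it tokenizes anchor text, the subdomain labels become keywords for free. The splitter and stopword list below are assumptions for illustration, not a known Google mechanism.

```python
import re

def url_keywords(url):
    """Extract keyword-like tokens from a URL, roughly as a bare link's
    visible text might be credited as anchor text."""
    tokens = re.split(r"[./:\-_]+", url.lower())
    # Drop scheme, common host prefixes/suffixes, and file extensions.
    stop = {"http", "https", "www", "com", "net", "org", "tld", "html"}
    return [t for t in tokens if t and t not in stop]

print(url_keywords("http://blue.widgets.tld/cheap-widgets.html"))
```

So a plain link to blue.widgets.tld/cheap-widgets.html would contribute "blue", "widgets", and "cheap" as ranking terms, with "widgets" counted twice.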
| 8:17 am on Jan 19, 2004 (gmt 0)|
Yup, I suggested the bananas idea before, but I think the benefit for subs goes much deeper than that. It seems to me that among the factors is that you get the best of both worlds. You can crosslink and have identical content as though it were a single domain, but you also get the inbound benefits as though they were standalone domains. I also wonder if links to subs are seen by Google as links to inner pages, making it look like you have links to many deep pages. I assume this would have the effect of being seen as an information site with deep content.
| 8:33 am on Jan 19, 2004 (gmt 0)|
Subdomains appear together because they have almost exactly the same algorithmic weight as each other... same number of links, same style of anchor text, same titling, etc.
For some strange reason Google looks at subdomains differently than subdirectories. This makes no sense, and hopefully will be something they change sometime soon. It's silly for topic.tripod.com, topic1.tripod.com and topic2.tripod.com to all get ranked while geocities.com/topic/ is listed geocities.com/topic1/ is indented and geocities.com/topic2/ is not shown except under a "more results" link.
| 9:37 am on Jan 19, 2004 (gmt 0)|
|It's silly for topic.tripod.com, topic1.tripod.com and topic2.tripod.com to all get ranked while geocities.com/topic/ is listed geocities.com/topic1/ is indented and geocities.com/topic2/ is not shown except under a "more results" link. |
Well put. Another important question (and perhaps THE question here) is whether topic.tripod.com is getting more mileage than tripod.com/topic. I'm guessing it depends on what the domain and 'topic' are relative to the search query.
| 10:43 am on Jan 19, 2004 (gmt 0)|
The trouble is, plenty of free web hosts use subdomains for each separate web page.
How would Google distinguish those, which are really separately owned pages, from the subdomains of a single site?
| 11:04 am on Jan 19, 2004 (gmt 0)|
I've seen keyword-stuffed subdomains going about ten deep. Despite reporting the worst offender (a pure spam site) several times, it's still in the index.
If Google can't be bothered to ban trash like that, don't expect them to change their algos to level the playing field.
| 6:55 pm on Jan 25, 2004 (gmt 0)|
This practice will most likely get you into trouble in the long run.
Too many SEOs are abusing this with:
When I see a page like this while searching for something, I never click it, as I know it's just spam.
When the first two pages of results are pure crap because of this, people notice and complain.
This has got worse.worse.worse.com as time goes by. Just a matter of time before it gets whacked.
It used to be that on the free sites your www.freesite/mysite got listed. The SEs did not care when their new algo wiped those out, and they won't care when these get wiped out by this filter either.
It's only a matter of time. It's a big problem and everyone I know hates getting these kind of search results.
Make sure the domain at the end of this chain is a throwaway domain, because it will be rendered worthless when they decide to move on this issue!
| 11:32 pm on Jan 25, 2004 (gmt 0)|
There is a company making a killing using subdomains... in fact, for some search results returned for certain phrases you may find that their series of subdomain URLs is listed from position #8 all the way out to page (yes, that's "page") 14... pretty stunning spam, if you ask me.
Same stuff happening at MSN
If you want some links and keywords to look at...send me a private message and I will get this to you..
Whew! Scary stuff showing up in the SERPs these days.