Maybe they forgot the subdomain filter ingredient when preparing the algo du jour...
But of course, the other market segment that uses this tactic frequently = aggressive spammers. :-(
IT DOESN'T! It's always been like that. If you flood Google with subdomains or regular domains, they will stay in the index until the Google techies figure it out, either by themselves or through spam reports from users. Subdomains are nothing special and are NOT treated specially by Google. No special weight is given. Everything you can do with subdomains you can also do with regular domains.
If you find double listings (different subdomains from the same domain within the same SERPs for the same search), talk about duplicates. Multiple SERP listings are just bad SEO and short-lived.
This is nothing special that only happens with subdomains. Interlink X "regular" domains and you'll get the same effect. It's just not as obvious as with subdomains.
No, Google does not. That subdomains appear to be given higher preference is only the EFFECT, not the cause.
What Google has added to the relevancy weighting is the size of the website and its internal hub/node pages. Size, good internal link structure, and incoming links to internal node pages count greatly.
I tend to believe that subdomains and subdirectories are counted the same by Google. However, it appears that subdomains are more successful. This could be because webmasters can get more incoming links to subdomains than to subdirectories. In other words, it is easier for a webmaster to ask a link partner for 10 incoming links pointing to 10 subdomains, but much harder to ask for 10 links pointing to 10 subdirectories of the same domain. That is why subdomains see higher success rates!
OT - If there were to be an update/adjustment/tweak/DCU in January, wouldn't it likely be this weekend?
I'm giving serious consideration to changing my site, "if you can't beat them, join them".
I would also say that this has become progressively worse in the last six months. My site has slipped in SERPS slightly over the last few months yet I have perhaps three times as many backlinks as before. Most of the sites above me seem to get their positions almost entirely from the urls. Some have very few backlinks.
Maybe Google gives it preference on the assumption that it's a directory site with subdomains and outbound links.
Moreover, I have been submitting spam reports about it for the last 2 months, and still no action has been taken.
If you do a link swap with someone and their page is off topic, this theory would predict you would get little credit for the link.
However - if you have a site on bananas with a subdomain for each class of bananas and you link each subdomain to the others...then all links will be on topic, because they all come from banana pages. Those subdomains are, as far as Google is concerned, the same as separate domains, and therefore those "relevant" links will receive full credit from Google.
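A minimal sketch of that interlinking pattern (the banana varieties and the bananas.example domain are hypothetical stand-ins):

```python
# Hypothetical subdomains, one per banana variety, on an example domain.
varieties = ["cavendish", "plantain", "red", "manzano"]
subdomains = [f"{v}.bananas.example" for v in varieties]

# Full-mesh interlinking: every subdomain links to every other one,
# yielding n * (n - 1) directed, on-topic cross-links in total.
links = [(src, dst) for src in subdomains for dst in subdomains if src != dst]

print(len(links))  # 4 subdomains -> 12 directed links
```

With n subdomains you get n * (n - 1) "relevant" inbound links for the cost of zero external link-building, which is exactly why the tactic is attractive.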
I like that theory. It makes sense to me.
This may just be an artifact of high weighting on anchor text. If I own widgets.tld and have all the stuff about blue widgets on blue.widgets.tld, then if anyone just links with a straight [blue.widgets.tld...] I get credit for both "blue" and "widgets".
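To illustrate that theory, here is a toy sketch (a simplistic hostname tokenizer of my own devising, not anything Google has documented) of how a bare URL used as anchor text could yield both keywords:

```python
def host_keywords(host: str) -> list[str]:
    """Split a hostname into candidate keyword tokens, dropping the TLD.

    A toy model of how a search engine might mine keywords from a
    URL used as its own anchor text.
    """
    labels = host.lower().split(".")
    tokens = []
    for label in labels[:-1]:            # drop the final (TLD) label
        tokens.extend(label.split("-"))  # hyphenated labels yield multiple words
    return tokens

print(host_keywords("blue.widgets.tld"))  # ['blue', 'widgets']
print(host_keywords("blue-widgets.tld"))  # ['blue', 'widgets']
```

Under this model a subdomain and a hyphenated domain hand the engine the same tokens, whereas widgets.tld/blue only surfaces "blue" if the path is tokenized too.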
For some strange reason Google looks at subdomains differently than subdirectories. This makes no sense, and hopefully will be something they change sometime soon. It's silly for topic.tripod.com, topic1.tripod.com and topic2.tripod.com to all get ranked while geocities.com/topic/ is listed geocities.com/topic1/ is indented and geocities.com/topic2/ is not shown except under a "more results" link.
"It's silly for topic.tripod.com, topic1.tripod.com and topic2.tripod.com to all get ranked while geocities.com/topic/ is listed, geocities.com/topic1/ is indented and geocities.com/topic2/ is not shown except under a 'more results' link."

Well put. Another important question (and perhaps THE question here) is whether topic.tripod.com is getting more mileage than tripod.com/topic. I'm guessing it depends on what the domain and 'topic' are relative to the search query.
If Google can't be bothered to ban trash like that, don't expect them to change their algos to level the playing field.
Too many SEOs are abusing this with:
When I see a page like this when I am searching for something I never click it, as I know it's just spam.
When the first two pages of the results are pure crap because of this people notice and complain.
This has got worse.worse.worse.com as time goes by. Just a matter of time before it gets whacked.
Used to be on the free sites your www.freesite/mysite got listed. The SEs did not care when their new algo wiped them out, and they won't care when they get wiped out by this filter either.
It's only a matter of time. It's a big problem and everyone I know hates getting these kind of search results.
Make sure the domain at the end of this chain is a throw away domain because it will be rendered worthless when they decide to move on this issue!
There is a company making a killing using subdomains...in fact for some search results returned for certain phrases you may find that their series of subdomain urls are listed from position #8 all the way out to page (yes that's "page") 14...pretty stunning spam...if you ask me..
Same stuff happening at MSN
If you want some links and keywords to look at...send me a private message and I will get this to you..
Whew! Scary stuff showing up in the SERPs these days..