
Too many subdomain results..

Subdomains are being used to get high rankings.

         

FillDeCube

4:33 am on Nov 5, 2003 (gmt 0)

10+ Year Member



I am seeing more and more affiliates getting high rankings. Many of them are using subdomains and interlinking each subdomain site.

Some sites with PR3 even rank higher than PR6 sites...

Should I follow the trend by creating something similar?

soapystar

10:11 am on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This has annoyed me for many months. You can get every page of a site listed in a single SERP simply by making each page a subdomain. How can that be right? It is basic that a subdomain is part of the single main domain, so why does Google insist on treating them as individual sites?
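For readers following along: Google's visible behaviour at the time was to show at most two results per host on a page. A minimal sketch of that kind of crowding, assuming the grouping key is the full hostname (the function and the data here are illustrative, not Google's actual code), shows why per-page subdomains escape the cap:

```python
# Illustrative sketch only: one plausible way a "two results per host"
# crowding rule could work. Because the key is the full hostname,
# every subdomain counts as a separate "site" and escapes the cap.

def crowd_results(results, per_host=2):
    """Keep at most per_host results for each full hostname."""
    counts = {}
    kept = []
    for url in results:
        host = url.split("/")[2]            # e.g. "a.example.com"
        counts[host] = counts.get(host, 0) + 1
        if counts[host] <= per_host:
            kept.append(url)
    return kept

serp = [
    "http://a.example.com/1",
    "http://a.example.com/2",
    "http://a.example.com/3",   # third hit on a.example.com: dropped
    "http://b.example.com/1",   # new hostname, same site: kept
]
print(crowd_results(serp))
```

With one subdomain per page, every page gets its own hostname and the cap never triggers, which is exactly the complaint above.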

Yidaki

10:23 am on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>why does google insist on treating them as individual sites

I can name thousands that are in fact individual sites.

One of my sites is split into 7 different canonicals (subdomains). All have their own physical machine, IP, database AND their own topic and content. They are all promoted separately, and all have their own legitimate DMOZ entry.

Don't blame subdomains in general.

Subdomain spammers come and go.
My best advice: ignore it.

AthlonInside

12:06 pm on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google will take care of it very soon. I believe it will be a hot topic for the coming months, but it will soon cool down, just like the guestbook spamming we used to talk about months ago.

killroy

12:18 pm on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Erm, all domains are subdomains of the TLD.

It's just a matter of registration procedure (domains require it, subs don't).

But, for example, a dictator could take his country's domain and do the same thing. So discriminating by that is clearly not the answer. Subdomains are separate; they just don't need separate registration. Potential for abuse? Of course. Not legitimate? Absolutely not!

SN

TravelMan

3:08 pm on Nov 5, 2003 (gmt 0)

10+ Year Member



I too think it would be fairer if only 1 or 2 instances of a website were shown for each search.

I recently noticed that a particular keyterm search displayed the same website for the first 5 pages of the SERPs; later pages had a smattering of this website's subdomains too.

Bug? Perhaps. Is it easy for Google to prevent this? Who knows.

FillDeCube

3:54 pm on Nov 5, 2003 (gmt 0)

10+ Year Member



How I wish Google would impose some kind of algo to prevent subdomain spamming..

Yidaki

5:09 pm on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>algo to prevent sub-domain spamming..

Which IS the duplicate content / multiple domain spam detection algo. It's about duplicate content, nothing to do with subdomains. You can do the same thing without using subdomains. Using multiple www domains is more expensive and has exactly the same effect - that's why subdomain duplication / doorwaying is used more than domain duplication.

Btw: did ya know that www.example.com is a subdomain of example.com?
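To make the "it's about duplicate content" argument concrete, here is a toy near-duplicate check using word shingles and Jaccard overlap. Google's actual duplicate filter is not public; this only illustrates the general idea that similarity is measured on page content, not on hostnames:

```python
# Toy near-duplicate check: word shingles plus Jaccard overlap.
# This is an illustration of the general technique, not Google's filter.

def shingles(text, k=3):
    """Return the set of k-word sequences appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b)

page_a = "cheap blue widgets for sale in all sizes and colours"
page_b = "cheap blue widgets for sale in all sizes and shapes"

sim = jaccard(shingles(page_a), shingles(page_b))
print(round(sim, 2))  # high overlap, so the pages look like duplicates
```

Whether the two pages live at sub1.example.com and sub2.example.com, or at two unrelated www domains, the comparison is the same - which is the point being made above.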

soapystar

8:32 pm on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's not about duplicate content. Why does Google show only two pages per site? Maybe 20 pages from one site are more relevant than the first 30 results. The reason is to give a choice to the user.

Why then should subdomains be allowed to populate the top 30 slots, when they are related to a domain just as pages in the domain are? You can argue your subdomains are separate sites, but the fact is they are not. Just because two of my pages have different subjects and content does not make them separate sites. You might not be spamming, but that's not the point.

Google is inviting us to have every page as a subdomain, so that we can have all our pages in the same SERPs. This is clearly not good for the user. If your subdomains are separate sites, then IMHO they should have separate domain names.

woop01

8:43 pm on Nov 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sour grapes.

markus007

7:46 am on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In my area, several spammers owning 4,000+ PR5 subdomains have all been completely booted from the SERPs in the last few days....

TravelMan

8:51 am on Nov 6, 2003 (gmt 0)

10+ Year Member



Exactly! SoapyStar, I couldn't agree more.

Let's face it, setting up subdomains is a piece of cake. What kind of search engine would G be if every webmaster in competitive SERPs did the same thing?

I'd like to think that G will eventually take action on this, but I see no evidence of intent thus far. I'm not saying that domains should be booted, as this would be taking a sledgehammer to crack a nut and is neither fair nor necessary.

Showing multiple subdomains on the same SERP strips out the perception of choice and degrades the user's experience (IMHO).

Can G afford to allow such a situation to proliferate?

FillDeCube

10:01 am on Nov 6, 2003 (gmt 0)

10+ Year Member



How can I report a website that uses subdomain spam to Google?

TravelMan

10:15 am on Nov 6, 2003 (gmt 0)

10+ Year Member



>how can I report website that uses sub-domain spam to google?

Are you sure they are even spamming? I've seen many websites that are doing the same thing, but I wouldn't call it spamming.

It's an algo issue, not a spam issue. IMO

Yidaki

11:41 am on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Stop whining ... yawn.

www.example.com is a subdomain of example.com

AthlonInside

12:53 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



if ($subdomain eq "www") {
    exit;  # treat www as the main site
}
else {
    # ... check whether the subdomain is legitimate
}

soapystar

1:12 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



www.example.com is a subdomain of example.com

While taking your point, this example isn't really what we are talking about, and we all know exactly what we mean by subdomains. We all know the www is just a legacy from having to specify the server. If we take the example example :-) it still fits. Why would you run two sites, one as www.example.com and one as example.com? All I am saying is that these two addresses should be treated as one site, and not have two pages from each served in a single SERP.
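The "treat them as one site" idea amounts to normalising each result's hostname down to its registrable domain before any crowding is applied. A naive sketch of that normalisation (keeping just the last two labels; a real implementation needs a list of public suffixes, since a host like example.co.uk would break this):

```python
# Naive sketch: collapse hostnames to a single "site" key, so that
# www.example.com, example.com and shop.example.com all group together.
# Keeping only the last two labels is wrong for suffixes like .co.uk;
# a real implementation needs a public-suffix list.

def site_key(host):
    """Reduce a hostname to its (assumed) registrable domain."""
    labels = host.lower().strip(".").split(".")
    return ".".join(labels[-2:])

for host in ("www.example.com", "example.com", "shop.example.com"):
    print(host, "->", site_key(host))
```

Crowding results on this key instead of the full hostname would give exactly the behaviour argued for here: two pages per domain, subdomains and all.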

Yidaki

1:54 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>All i am saying is these two addresses should be treated
>as one site and not have 2 pages from each site served in
>a single serps.

This is done by duplicate content filters.

As long as both sites (www.example.com + example.com) return the same content, they ARE duplicates. This has nothing to do with the fact that they both use the same domain.

Saying that they should be treated as one site just because they share the domain name is - well - at least pretty uninformed.

But I see, you would like to get all the subdomains above you removed. If Google removes them, you will have much more fun spotting the duplicate domains www.example.com and www.thesameexample.com and reporting those ...

The truth and the tales about canonicals [google.com].

Spannerworks

2:40 pm on Nov 6, 2003 (gmt 0)

10+ Year Member



I agree with everything said by Yidaki. This has nothing to do with subdomains; it's about duplication. It doesn't matter how or where the duplication is created.

And yes, I'd rather search engines treat subdomains as different from the main domain. For extremely large sites with many topics or regional info, subdomains provide an easy and cheaper way to organise everything.

The fact that 'some' people may use subdomains to serve duplicate spam is neither here nor there...

soapystar

3:03 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Saying that they should be treated as one site just because they share the domain name is - well - at least pretty uninformed.

But i see, you would like to get all the subdomains above you removed. If google will remove them you will have much more fun with spotting the duplicate domains www.example.com www.thesameexample.com and reporting those ...

All I can say is how wrong you are, both in your unnecessary assumption of my motives and in your interpretation of what I was saying. No, this is not about duplicate content; I can only assume you never read my earlier post. It's about a spectrum of choice for the user. If it wasn't, then why would Google limit its results to two pages per domain, and other engines to a single page per domain? Because no matter how many relevant pages a domain has, the engine is looking to serve a cross-section of relevant sites. When I search, I don't want to see one site in the top ten positions. I am not concerned with my own results, as you seem to think, since they all hold top slots; I am talking about when I personally search.

And as for the idea that a duplicate filter takes care of subdomains with the same content, well, I must assume you don't use Google much. One big auction site comes to mind with regular top 6 slots with identical copy, which just happens to be using subdomains.

The bottom line to my point is this: why shouldn't we all use a subdomain per page rather than page names? That way every page would get into a single SERP. Think about it, Yidaki, before making personal remarks when I am simply stating an opinion on a forum.

TravelMan

3:11 pm on Nov 6, 2003 (gmt 0)

10+ Year Member



If the search query is widgets in region and the serps return

sub.samedomain.tld
sub2.samedomain.tld
sub3.samedomain.tld
sub4.samedomain.tld
sub5.samedomain.tld
sub6.samedomain.tld

etc

Then the SERP is not very good.

Does google want this? I very much doubt it.

The issue has bugger all to do with whether www.domain.tld is the same as domain.tld. The issue is about *multiple* subs dominating a SERP. It doesn't matter whether they all have unique content or not; the majority of people (I would think) want to be presented with multiple choices from different suppliers.

People set up subdomains because there is a perception that with google at least, they will receive a competitive advantage. Some serps are dominated by single websites, using multiple subdomains.

Subdomains may well be treated as individual domains in their own right (by Google at least). If people think this is a good thing, then that is their entitlement. I think it's a bad thing; no big deal, just my opinion.

Yidaki

3:26 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>And as for the idea that a duplicate filter takes care of
>sub-domains with the same content, well i must assume
>you dont use Google much.

Please point me to the part where I said that the current duplicate content filter is effective. It's as effective/ineffective for subdomains as it is for other types of duplicates.

>One big auction site comes to mind with regular top 6
>slots with identical copy but just happen to be using
>sub-domains.

And just because of this abuse, you think subdomains should get "penalized"? (Ignoring their independence is a penalty.) Just because some chaps are so dumb that they duplicate / doorway on the same domain but with different subdomains? Just because some heroes abuse canonicals?

Me personally, I'm happy that the spammers use canonicals for their abusive doorway or duplication games. Using examplesauctions.com, auctionsexamples.com, widgetsexamplesauctions.com ... is much harder to spot.

>why shouldnt we all use subdomains per page rather than page names? That way every page would get into a single serps

... or different top-level domains rather than subdomains?
-> Same effect - even better, since you can also cheat with whois.

>before making personal remarks

Nah, I didn't - at least it wasn't my intention.

AthlonInside

3:35 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



if (PagesInSubdomain < 5) {
    # treat it as part of the main site
    # return 2 pages maximum in the SERPs
}
else {
    # treat it as a different site
    # may return it in the SERPs along with the main site, if relevant
}

Google are experts at search; they can come up with a lot of sophisticated algorithms to handle and further reduce most spam.

soapystar

4:15 pm on Nov 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



To whom it may concern.....

Please point me to where I said subdomains should be penalised?

When two pages show from one of my sites of 10,000 pages, it's not because 9,998 pages have been penalised. It's because Google has applied its algo to serve the two most relevant pages. My point is that subdomains should be treated as a single domain and the two most relevant pages served. As a Googler, that's what I would like to see.