I can name thousands that are in fact individual sites.
One of my sites is split into 7 different canonicals (subdomains). Each has its own physical machine, IP, database AND its own topic and content. They are all promoted separately, and all have their own legitimate DMOZ entry.
Don't blame subdomains in general.
Subdomain spammers come and go.
My best advice: ignore it.
It's just a matter of registration procedure (domains require it, subs don't).
But, for example, a dictator could take his country's domain and do the same thing, so discriminating on that basis is clearly not the answer. Subdomains are separate; they just don't need separate registration. Potential for abuse? Of course. Not legitimate? Absolutely not!
SN
I recently noticed that a particular keyterm search displayed the same website for the first 5 pages of the SERPs; later pages had a smattering of this website's subdomains too.
Bug? Perhaps. Is it easy for Google to prevent this? Who knows.
Which IS the duplicate-content / multiple-domain spam-detecting algo. It's about duplicate content, nothing to do with subdomains. You can do the same without using subdomains. Using multiple www domains is more expensive and has exactly the same effect - that's why subdomain duplication/doorwaying is used more often than domain duplication.
Btw: did ya know that www.example.com is a subdomain of example.com?
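To make that concrete, here's a rough Python sketch of how a hostname splits up - my own naive two-label rule, not anything Google actually uses; real code would need the Public Suffix List for things like .co.uk:

def split_host(hostname):
    # Naively treat the last two labels as the registrable domain.
    # Wrong for multi-part suffixes like .co.uk - real code should
    # consult the Public Suffix List.
    labels = hostname.lower().rstrip(".").split(".")
    return ".".join(labels[:-2]), ".".join(labels[-2:])

for host in ["www.example.com", "sub5.example.com", "example.com"]:
    sub, domain = split_host(host)
    print(f"{host}: subdomain={sub!r}, domain={domain!r}")

# www.example.com: subdomain='www', domain='example.com'
# sub5.example.com: subdomain='sub5', domain='example.com'
# example.com: subdomain='', domain='example.com'

As far as the name hierarchy is concerned, www is just another label, same as sub5.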
Let's face it, setting up subdomains is a piece of cake. What kind of search engine would G be if every webmaster in competitive SERPs did the same thing?
I'd like to think that G will eventually take action on this, but I see no evidence of intent thus far. I'm not saying that domains should be booted; that would be taking a sledgehammer to crack a nut, and is neither fair nor necessary.
Showing multiple subdomains on the same SERP strips out the perception of choice and degrades the user's experience (IMHO).
Can G afford to allow such a situation to proliferate?
While taking your point, this example isn't really what we are talking about, and we all know exactly what we mean by subdomains. We all know the www is just a legacy of having to specify the server. If we take the example example :-) it still fits. Why would you set up two sites, one as www.example.com and one as example.com? All I am saying is that these two addresses should be treated as one site, and not have two pages from each served in a single SERP.
This is done by duplicate content filters.
As long as both sites (www.example.com + example.com) return the same content, they ARE duplicates. This has nothing to do with the fact that they are both using the same domain.
Saying that they should be treated as one site just because they share the domain name is - well - at least pretty uninformed.
But I see, you would like to get all the subdomains above you removed. If Google removes them, you will have much more fun spotting the duplicate domains www.example.com and www.thesameexample.com and reporting those ...
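For what it's worth, the core of such a filter is simple to sketch. This is a toy Python illustration with made-up page data, not Google's actual algo - the point is that the grouping key is the content, so the hostnames never enter into it:

import hashlib
from collections import defaultdict

# Hypothetical crawl results (made up for illustration).
pages = {
    "http://example.com/":            "<html>Widgets for sale</html>",
    "http://www.example.com/":        "<html>Widgets for sale</html>",
    "http://sub1.example.com/":       "<html>Widgets for sale</html>",
    "http://www.thesameexample.com/": "<html>Widgets for sale</html>",
    "http://other.com/":              "<html>Something else</html>",
}

def fingerprint(body):
    # Normalize whitespace and case before hashing; a real filter
    # would strip markup and use shingling to catch near-duplicates.
    normalized = " ".join(body.lower().split())
    return hashlib.sha1(normalized.encode()).hexdigest()

groups = defaultdict(list)
for url, body in pages.items():
    groups[fingerprint(body)].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("duplicates:", urls)

Note that it catches www.thesameexample.com just as easily as the subdomains - sharing a domain name is irrelevant.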
The truth and the tales about canonicals [google.com].
And yes, I'd rather search engines treat subdomains as different from the main domain. For extremely large sites with many topics or regional info, subdomains provide an easier and cheaper way to organise everything.
The fact that 'some' people may use subdomains to serve duplicate spam is neither here nor there...
>But I see, you would like to get all the subdomains above you removed. If Google removes them, you will have much more fun spotting the duplicate domains www.example.com and www.thesameexample.com and reporting those ...
All I can say is how wrong you are, both in your unnecessary assumption of my motives and in your interpretation of what I was saying. No, this is not about duplicate content; I can only assume you never read my earlier post. It's about a spectrum of choice for the user. If it wasn't, then why would Google limit its results to two pages per domain, and other engines to a single page per domain? Because no matter how many relevant pages a domain has, the engine is looking to serve a cross-section of relevant sites. When I search, I don't want to see one site in the top ten positions. I am not concerned with my own results, as you seem to think, since they all hold top slots. I am talking about when I personally search. And as for the idea that a duplicate filter takes care of subdomains with the same content, well, I must assume you don't use Google much. One big auction site comes to mind, with regular top-6 slots carrying identical copy that just happen to be using subdomains.
The bottom line to my point is this: why shouldn't we all use subdomains per page rather than page names? That way every page would get into a single SERP. Think about it, Yaki, before making personal remarks when I am simply stating an opinion on a forum.
sub.samedomain.tld
sub2.samedomain.tld
sub3.samedomain.tld
sub4.samedomain.tld
sub5.samedomain.tld
sub6.samedomain.tld
etc
Then the SERP is not very good.
Does Google want this? I very much doubt it.
The issue has bugger all to do with whether www.domain.tld is the same as domain.tld. The issue is about *multiple* subs dominating a SERP. It doesn't matter whether they all have unique content or not; the majority of people (I would think) want to be presented with multiple choices from different suppliers.
People set up subdomains because there is a perception that, with Google at least, they will receive a competitive advantage. Some SERPs are dominated by single websites using multiple subdomains.
Subdomains may well be treated as individual domains in their own right (by Google at least). If people think this is a good thing, then that is their entitlement. I think it's a bad thing; no big deal, just my opinion.
Please point me to the part where I said that the current duplicate-content filter would be effective. It's as effective/ineffective for subdomains as it is for other types of duplicates.
>One big auction site comes to mind with regular top 6
>slots with identical copy but just happen to be using
>sub-domains.
And just because of this abuse, you think subdomains should get "penalized"? (Ignoring their independence is a penalty.) Just because some chaps are so dumb that they duplicate/doorway on the same domain but with different subdomains? Just because some heroes abuse canonicals?
Me personally, I'm happy that the spammers use canonicals for their abusive doorway or duplication games. Using examplesauctions.com, auctionsexamples.com, widgetsexamplesauctions.com ... is much harder to spot.
>why shouldn't we all use subdomains per page rather than page names? That way every page would get into a single SERP
... or different top domains rather than subdomains?
-> same effect - even better since you can also cheat with whois.
>before making personal remarks
Nah, I didn't - at least it wasn't my intention.
Google are experts at search; they can come up with a lot of sophisticated algorithms to handle and further reduce most spam.
Please point me to where I said subdomains should be penalised?
When I have two pages from one site of 10,000 pages show up, it's not because 9,998 pages have been penalised; it's because Google has applied its algo to serve the two most relevant pages. My point is that subdomains should be treated as a single domain and the two most relevant pages served. As a Googler, that's what I would like to see.
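To spell out what I mean, here's a toy Python sketch of that collapse - the ranked list is made up, and the two-label domain rule is my own simplification, not Google's actual algo:

# Hypothetical ranked results, best first (made-up data).
ranked = [
    "sub.samedomain.tld", "sub2.samedomain.tld", "sub3.samedomain.tld",
    "sub4.samedomain.tld", "www.otherdomain.tld", "sub5.samedomain.tld",
    "thirddomain.tld",
]

def registrable(host):
    # Naive two-label rule; real code needs the Public Suffix List.
    return ".".join(host.lower().split(".")[-2:])

def crowd_limit(hosts, per_domain=2):
    # Keep at most per_domain results for each registrable domain,
    # folding all subdomains into their parent.
    seen = {}
    kept = []
    for host in hosts:
        domain = registrable(host)
        if seen.get(domain, 0) < per_domain:
            seen[domain] = seen.get(domain, 0) + 1
            kept.append(host)
    return kept

print(crowd_limit(ranked))
# ['sub.samedomain.tld', 'sub2.samedomain.tld',
#  'www.otherdomain.tld', 'thirddomain.tld']

Collapse the subdomains like that and the rest of the SERP opens up to other suppliers.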