| 9:06 am on Mar 15, 2006 (gmt 0)|
Can you clarify your question a little?
Are you talking about hosting multiple domains on one physical server?
Are you talking about setting up an interlinked network of domains?
Are you talking about having substantially similar content on multiple domains?
| 1:29 pm on Mar 15, 2006 (gmt 0)|
Allow me to clarify. I have two domains. They are set up so that when I make changes to a page, both sites are automatically updated. All the internal links to files and directories are root-relative, e.g. "/directory/file.html". So depending on which site you are on, the root resolves to the different domain names.
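In case it helps, the setup is roughly like this - an Apache sketch, where the domain names and path are placeholders, not my real ones:

```apache
# Rough sketch of the setup (example.com / example.net stand in for my two domains)
<VirtualHost *:80>
    ServerName www.example.com
    # The second domain is served as an alias of the same site
    ServerAlias example.com www.example.net example.net
    # Both hostnames share one document root, so one FTP upload changes both "sites"
    DocumentRoot /var/www/shared-site
</VirtualHost>
```

Because the internal links are root-relative, each page works unchanged under either hostname.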
Does this clarify enough?
| 1:46 pm on Mar 15, 2006 (gmt 0)|
|Does this clarify enough? |
Is the content of the sites exactly the same? Or do you mean templates are replicated between the two sites?
| 2:11 pm on Mar 15, 2006 (gmt 0)|
The pages from both sites are identical. I have an FTP login for one site, and when I update a page both sites are changed.
| 3:57 pm on Mar 15, 2006 (gmt 0)|
|The pages from both sites are identical. |
and do you expect the search engines to spider and list both domains?
| 4:30 pm on Mar 15, 2006 (gmt 0)|
At one point in time, they were both indexed. If I do a site:www.mydomain.com in Google, there is "no information to be found" for both URLs. But under the "find pages from this site" link, there are pages listed for both URLs.
What I really would like to find out is: if I am being penalized, what is the best way to fix it, AND if I am being penalized, what is the exact reason?
| 4:43 pm on Mar 15, 2006 (gmt 0)|
3 words: duplicate content penalty
Google, and presumably the other SEs, don't like duplicate content, especially when it's the exact same content, just on different URLs.
Why? Google limits each site to 2 listings in the SERPs. So if no duplication penalty existed, everyone who held the #1 & #2 slots could just clone their site 4 times and monopolize the entire first page of results, with all the links showing the exact same content. That's worthless to users, so it would quickly devalue Google's results.
| 6:12 pm on Mar 16, 2006 (gmt 0)|
Looks like the dupe filter, as Life said. That said, wait a week or so, one or even both of the domains may return. Either way, pick one and do a 301 redirect, and apply for a reinclusion request.
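For example, a minimal .htaccess sketch for the 301, assuming Apache with mod_rewrite enabled - the domain names are placeholders, so substitute whichever domain you decide to keep:

```apache
RewriteEngine On
# Permanently redirect every request arriving under the duplicate hostname
RewriteCond %{HTTP_HOST} ^(www\.)?duplicate-domain\.com$ [NC]
# R=301 sends a permanent redirect, preserving the requested path
RewriteRule ^(.*)$ http://www.kept-domain.com/$1 [R=301,L]
```

The search engines should then consolidate everything onto the domain you kept.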
A friend of mine who has run an operation similar to yours has not had his sites caught/penalized, but lately the domains had been jumping in and out of the index... so he got the message, I guess, and did the 301.
| 12:04 pm on Mar 17, 2006 (gmt 0)|
I'm monitoring a web site with at least 4 domain names pointing to the same web site; of course the content is 100% identical, and it sometimes ranks with 2-3 different domain names on the same result page. So that's a pretty dangerous game, but it may work until Google sees it and dumps you from the index.
| 3:05 am on Apr 4, 2006 (gmt 0)|
What happens if you want your content seen on multiple regional serps e.g. .co.uk ; .ca ; .com.au ; .co.nz as well as .com?
How does Google view that in terms of duplicate content?
-Have a shared IP address
-Currently we have varied on page content
-Moved the page structures around [ images and text ]
-Don't interlink these english language sites
We are being careful to make sure that only results relevant to e.g. the NZ market appear high up on .co.nz
Of concern is that the sites:
-have the same domain branding name "xyxxyz"
-use the same url structures
-might be reported as SPAM by competitors when sometimes all 5 show up on the same page in the serps - but then that usually only occurs if there's no competition.
What do you think?
| 3:47 am on Apr 4, 2006 (gmt 0)|
They SHOULD only dupe out ONE site. Duplicate Content is not designed to be a penalty, but a FILTER.
Everyone has the right to put up - for example - a copy of the constitution on their website, without fear of getting banned or penalized. You just won't show up in searches for it, but your site won't be delisted or anything either.
If both sites are original content and belong to you, they should list one of them and dupe the other.
Google THEMSELVES have duplicate content. Much of the content on their various google.tld sites is duplicated.
Google is very logical, they would not knowingly ban two sites because they had the same content.
| 4:35 am on Apr 4, 2006 (gmt 0)|
So how would the filter apply to say these 5 sites. Are you saying all but one will be filtered out?
If so, does that mean that some of the regional sites will not show in localised Google searches, and how does Google choose which one stays?
I guess to put this another way - how can a webmaster put up content on multiple domains and not have a filter dampener applied, and are the precautions that we've taken sufficient?
| 5:16 am on Apr 4, 2006 (gmt 0)|
>I guess to put this another way - how can a webmaster put up content on multiple domains and not have a filter dampener applied, and are the precautions that we've taken sufficient?
Tough question - and totally different from what the original poster was asking.
If you have genuinely unique content on each TLD, then you will probably be OK. The amount that has to vary is hard to say, and it isn't a set amount - like you said, if there is no competition, what can you expect?
I don't think anyone can answer your question exactly, but from a theoretical point of view keep in mind that Google likes to look at things from a user's point of view.
With regards to others reporting you:
If you want to stay around long term you have to ask yourself - "If a Google employee saw this - what would they think?"
If the diversification you offer is of value to the user (which it sounds like it might be - or could be made so with some work) - then you will have no problem.
However, much of the time I see things done just in an attempt to get more money/traffic, adding nothing for the user. It is hard to say exactly what will pass muster, but Google's job is not to pass value judgements on sites, but to let its users do it for them.
So you need three things:
1) To change or add something
2) Of value or use
3) To that set of users
As far as the technical aspect goes - with many SEs, Google included - sites hosted on an IP in that country will do better in that country's regional Google search. That is a real PITA for obvious reasons, but it is a reality. Maybe others can comment more on that issue, but that really takes this off topic - which was duplicate content (though I can see how people would confuse them).
| 2:57 pm on Apr 4, 2006 (gmt 0)|
Hmmm, okay. So if I have the .co.uk version up and running and indexed, and I buy the .com version of the domain name to bring in US customers, are you saying I shouldn't point the .com at the .co.uk?
The .co.uk "regional variation" has Google.co.uk SERPs, but doesn't rank in Google.com. But obviously I want the .com version to rank in Google.com. If I 301 it to the .co.uk, then Google sees it as .co.uk and I won't achieve any decent Google.com rankings.
But the content (product data) is the same for both the US and UK markets. So how would one go about ranking the .com on Google.com and the .co.uk on Google.co.uk with similar data?
Surely Google wouldn't penalise the same domain name on a different TLD for containing the same content, since Google simply uses the TLD to determine which regional search it belongs in? Also, Google must realise it would be unfair on the honest layman webmaster who simply provides good content to different markets. Penalising someone for delivering honest content on both widget.com and widget.co.uk, simply to attract consumers from different countries, surely isn't happening?