Forum Moderators: Robert Charlton & goodroi
Will registering/hosting under a different company name and address be a way around this? I've heard some people say that Yahoo etc. can even check the credit card details used to register the domain.
Any tips would be appreciated. What do other multiple site owners do?
If you duplicate, both your sites are at risk - and would equally be at risk if they had no connection whatsoever.
Adding links, of course, will accelerate any problem.
If you have one topic, it's much better to merge the sites - removing all risk, halving your marketing/SEO effort, and concentrating your ranking strength.
Quadrille's 7th law: Divide and be conquered.
Also Matt Cutts talked once about how you might use a different domain for different languages, implying it's perfectly okay to split your site into different URLs. So why shouldn't you do that with different topics?
Again though - I'd keep them together if that can work. Better to have one strong site than 2 half strong ones IMO
I make sure that links between the websites are natural ones - they lead to other pages with related subjects. It works very well for me. The only thing that has hurt me besides the rel="nofollow" problem is when I jump the gun and put adsense on a website too soon after publication. Apparently I get hit by some kind of MFA/spam filter.
In fact, I think G sees this, and treats the other sites to some degree like part of the same site, which is perfect. That way we get different names for different destinations, without losing the strength that comes from having a large site with lots of content.
>>The only thing that has hurt me besides the rel="nofollow" problem is when I jump the gun and put adsense on a website too soon after publication. Apparently I get hit by some kind of MFA/spam filter.
You've had problems from using rel="nofollow"? Is that to pages on the site or going to pages off the site?
I don't see this 'Google won't like it' argument, as what's important is the fact that you want to promote the sites to the viewing public, and let THEM decide what they want to look at.
So many threads have comments about 'good behaviour' with Google, but forget the fact that our sites are designed for the viewing public primarily.
It doesn't necessarily work that way in real life. Promoting a site to users means that there has to be some way for users to find the site, which they have to do in order to be able to decide whether they like the site or not. No rankings, no traffic = no users. So how can THEY decide what they want to look at if they don't ever find the site? Email marketing? Sending out postcards? Ads on TV or bus benches?
Google's opinion doesn't always coincide with users' opinion, incidentally. For example, I've got one site that Google thinks is worth no more than the Supplemental Index, and the users do NOT agree. It's an incredibly good site for steady conversions to sales, which means VERY happy users are finding it and liking it enough to buy stuff, bookmarking it at rates up to 20% some months. They're finding the site at another search engine, which apparently has just exactly what they're looking for - even if the site doesn't have enough links amassed to please Google.
So no, I don't worry too much about what Google likes as long as there are "happy campers" coming in from someplace. Once a site has proven to me that it makes visitors happy, then that's the time to start worrying about amassing the kind of links that Google "thinks" indicate a site that's worthy of finding - which may NOT be an information site; some people out there are actually looking to BUY STUFF.
Nevertheless, if it's something as simple as not overly cross-linking sites, for which there are very valid reasons, and for spreading out hosting, for which there are also very valid reasons, there's no sense in thumbing our noses at any search engine by doing things that can give an erroneous impression.
[edited by: Marcia at 10:55 am (utc) on July 2, 2006]
>>Google's opinion doesn't always coincide with users' opinion, incidentally. For example, I've got one site that Google thinks is worth no more than Supplemental Index, and the users do NOT agree.
Nobody would suggest that Google always gets it right; however, there's really a lot of evidence that 'supplementary' results largely result from duplication - and removing that would make life easier for everyone, not just Google.
Your 'happy users' would be just as happy, though not necessarily at your site - if there was no duplication.
And wouldn't the Internet be a better, uncluttered place?
And there is zero doubt that duplicating your own pages increases marketing effort, reduces ranking, and often confuses customers. There really is no good reason for dividing your efforts.
>>Your 'happy users' would be just as happy, though not necessarily at your site - if there was no duplication.
>>And wouldn't the Internet be a better, uncluttered place?
>>And there is zero doubt that duplicating your own pages increases marketing effort, reduces ranking, and often confuses customers. There really is no good reason for dividing your efforts.
I'm afraid that's mere imaginative gainsaying. There is zero duplication of pages, and no dividing of efforts that we know of, in any of the sites mentioned here - nor have we seen them. Except in my case: my own sites, and I can vouch for the fact that those have no duplication. As far as we're concerned they're all unique, since no one here mentioned duplicating any pages, nor is there any evidence of it. That is one heck of a lot of assumption to be making about a site you've never seen. :P
Duplication and/or even near-duplication are *not* the only reason for pages going supplemental - particularly on non-data driven sites with all uniquely written content on all the pages. And on newer sites as well. Any number of sites have other issues, it's not that confined or narrow. Let's not confuse and/or mislead fellow members and readers with allegations that could be taken the wrong way.
And BTW, if I read the title correctly, this thread is about the topic: "Owning multiple sites penalty."
[edited by: Marcia at 1:14 pm (utc) on July 2, 2006]
Not knowing your site, I could only possibly offer a 'general' response, which would be useful to most people, because most supplementary results do result from duplication.
If, as you say, yours don't (and I really don't know), then why do you think your site has been made supplementary?
The thread is about multiple sites with similar themes, so don't worry, we're right on topic!
As far as "unique" sometimes not being enough:
Define: checksum [google.com]
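To make the checksum idea concrete: a search engine (or a site owner auditing their own pages) can fingerprint each page's text and flag any pages whose fingerprints collide. Here's a minimal sketch in Python - the `pages` dict and URLs are purely hypothetical, and real duplicate detection is of course far more sophisticated (shingling, near-duplicate hashing, etc.), but exact duplicates fall out of a plain checksum:

```python
import hashlib

def content_checksum(text: str) -> str:
    """Checksum of a page's text, normalized so trivial
    whitespace/case differences don't hide a duplicate."""
    normalized = " ".join(text.lower().split())
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> dict:
    """Group URLs by content checksum; any group with more
    than one URL is a set of exact duplicates."""
    groups = {}
    for url, text in pages.items():
        groups.setdefault(content_checksum(text), []).append(url)
    return {h: urls for h, urls in groups.items() if len(urls) > 1}

# Hypothetical pages - two of them differ only in whitespace
pages = {
    "/widgets": "Blue widgets for sale.",
    "/widgets-copy": "Blue  widgets for sale.",
    "/about": "About our widget company.",
}
print(find_duplicates(pages))  # the two /widgets pages collide
```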
It isn't hard and fast, because I've also seen pages with a ton of original content go supplemental, with probably close to 3-400 words worth of text, but again - insufficient IBLs from within the site, especially links from what could be considered the more "important" pages. Particularly if a site doesn't have many inbound links - which can happen when IBLs aren't actively pursued in the usual ways.
In some cases I've seen, pages went supplemental after being in the regular index and ranking well when they were "retired" seasonally and didn't have as many significant IBLs from within the site as they had when in season. When updated and re-linked from "important" pages when the "season" came again, they came out of the Supplemental index.
All in all, with regard to what the OP is asking about putting up a second site, wherever possible it's probably better to add to an existing site if the topics are compatible, if only for economy of effort and energy in promoting, and also because sometimes newer sites can seem "lean" if pages are put up before the site is fully developed and has enough inbound linkage well established to support the whole site. It seems to be necessary nowadays to have enough link juice coming in to a site as a whole in order to have it trickle down through the navigation as value for the inner pages.
There can be any number of reasons for pages going supplemental - or being supplemental when first indexed, especially in cases where pages go up that will be further developed as time goes on. It can sometimes be a case of redundancy rather than duplication. And actually, "unique" may not be good enough in all cases.
I would not dispute a word of that - but the fact remains that in the vast majority of cases, duplication is the main issue - or the only issue.
Sometimes the content is totally unique - but the page is code-heavy, and so the unique stuff (yes, even 3-400 words) is dwarfed by shared copy, such as site promos, navigation and - most commonly - bloated code.
More often, and very, very commonly in the case of dynamic sites, the content may be unique, but there is simply not enough of it on many pages - a paragraph or a few sentences is often still 'a duplicate page' in google's eyes.
Often removing bloat by externalizing code in a css file, and removing boring marketing bumph (which really shouldn't be on every page!) is all that's required.
If internal linking is a contributory factor, then a site map may move mountains.
I'm necessarily generalizing here, and these comments should not be taken to apply to any particular site - do I really need to say that every post? ;)
But they do arise out of looking at scores of sites where the specific problem has been "I've gone supplementary", and they have been found and reported by many others, too.
Part of the problem is that many people look at the finished page and say "it's unique" - while Google looks at the code and says "That's 2% of the page - call that unique? C'mon!"
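That "2% of the page" point is easy to check for yourself: compare the visible text against the raw HTML. A rough sketch using only Python's standard library follows - the `page` string is a made-up example of a bloated page, and this is a crude byte-count heuristic, not whatever Google actually does:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def text_to_code_ratio(html: str) -> float:
    """Fraction of the raw page that is actual visible text."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html) if html else 0.0

# Hypothetical bloated page: 40 characters of copy buried in inline CSS
page = ("<html><head><style>" + "a{color:red;}" * 50 + "</style></head>"
        "<body><p>Forty characters of unique content here.</p></body></html>")
print(f"{text_to_code_ratio(page):.1%}")  # well under 10%
```

Externalizing that inline CSS into a .css file would raise the ratio immediately, which is exactly the "remove bloat" advice above.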
Interestingly, I've found that 'apparent self duplication' is what many people miss - 'external duplication' is widely understood, and people tend to accept the risks - or know what to do.
I have two experiences where many urls from a domain were marked Supplemental. In both these cases, all those urls contained identical meta descriptions. When the site owners made those meta descriptions unique and specific to the page where they appeared, within a few days those urls once again were listed in the regular index.
It's not "proof" but it was highly suggestive, especially because it happened with two different domains. I also know of another person who saw similar activity. So a year ago I would have completely agreed that Google essentially ignores the meta description. But not today.
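Identical meta descriptions are also easy to audit in bulk before they become a problem. A small sketch - the example pages and URLs are invented, and the regex is a simplification that assumes a conventional `<meta name="description" content="...">` attribute order:

```python
import re
from collections import defaultdict

META_DESC = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

def shared_descriptions(pages: dict) -> dict:
    """Map each meta description to the URLs reusing it; only
    descriptions appearing on more than one URL are returned."""
    seen = defaultdict(list)
    for url, html in pages.items():
        m = META_DESC.search(html)
        if m:
            seen[m.group(1).strip()].append(url)
    return {d: urls for d, urls in seen.items() if len(urls) > 1}

# Hypothetical pages - two share one boilerplate description
pages = {
    "/red": '<meta name="description" content="Widgets for sale">',
    "/blue": '<meta name="description" content="Widgets for sale">',
    "/faq": '<meta name="description" content="Widget FAQ">',
}
print(shared_descriptions(pages))
```

Any URL group this turns up is a candidate for a unique, page-specific description, per the fix described above.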
Even when everything else on the pages is unique, if the meta tag descriptions are too similar, pages will end up in the supp index - which happened to another site of mine.
In another thread in the Google forum a poster suggested meta tag descriptions are not necessary for good rankings, I wonder how true that is? I'm not sure if I want to try that :)
>>I have two experiences where many urls from a domain were marked Supplemental. In both these cases, all those urls contained identical meta descriptions. When the site owners made those meta descriptions unique and specific to the page where they appeared, within a few days those urls once again were listed in the regular index.
Absolutely - we had exactly that experience ourselves when I noticed some of the site meta tags were the same for most pages. Changing that brought the pages out of supplemental.
However - that's clearly not the problem everyone is having. It is just the first thing to check, along with any other basic issues of uniqueness, functionality and design of the pages.