Forum Moderators: Robert Charlton & goodroi


Owning multiple sites penalty

What's the way around this?


chrisandsarah

3:15 pm on Jun 30, 2006 (gmt 0)

10+ Year Member



Hello
I'm thinking of starting up a new site, but I'm worried that although its content will be unique compared to my other main site, it will be similarly themed.
Would it be wise, when registering the domain name, to use a different contact name, address, email, etc.? (Is Hotmail OK?)
I will definitely be linking from my main site to the new one, since visitors will find it interesting, but I'm worried the SEs will see them as too closely themed and, while not banning the new site, not rank it as high. Especially if I interlink.

Would registering/hosting under a different company name and address be a way around this? I've heard some people say that Yahoo etc. can even check the credit card details used to register the domain.

Any tips would be appreciated. What do other multiple site owners do?

tedster

5:41 pm on Jun 30, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If the content of each individual site is strong, and they both gain quality natural backlinks from independent sources, then a few cross-links between your own properties are not a problem, in my experience. But if the majority of links are from your own properties, then those links may be devalued or ignored, and so it is more difficult to show up in competitive SERPs.

Quadrille

5:48 pm on Jun 30, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If anything hurts you, it will be links and duplication, not ownership details, hosts or contact emails.

If you duplicate, both your sites are at risk - and would equally be at risk if they had no connection whatsoever.

Adding links, of course, will accelerate any problem.

If you have one topic, it's much better to merge the sites: you remove all risk, halve your marketing/SEO effort, and concentrate your ranking strength.

Quadrille's 7th law: Divide and be conquered.

ScottD

6:17 pm on Jun 30, 2006 (gmt 0)

10+ Year Member



I think if you can, just keep one site. However, we run different sites for different travel destinations, and that seems to be okay. If I had enough content for a global travel site, though, that would be better.

Also, Matt Cutts once talked about how you might use a different domain for different languages, implying it's perfectly okay to split your site across different URLs. So why shouldn't you do that with different topics?

Again, though - I'd keep them together if that can work. Better to have one strong site than two half-strong ones, IMO.

mimmson

6:28 pm on Jun 30, 2006 (gmt 0)

10+ Year Member



This makes me a little nervous also. I have two sites: one features the content and store, the other is solely for the BBS. Based on advice here, I put the BBS on its own domain to be safer against spam, hacks, etc.

Quadrille

6:30 pm on Jun 30, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So why shouldn't you do that with different topics?

No worries doing it for different topics, though you fail to concentrate your ranking strength. You'd need to duplicate your marketing/SEO effort to some extent anyway.

The problems only really arise in splitting your site within the same topic.

gdawg

8:47 pm on Jun 30, 2006 (gmt 0)

10+ Year Member



I am very familiar with this issue, as the company I work for has two websites with very similar content - they actually share the same database. Obviously the sites' designs are completely different, but they do have similar file structures. All the whois info for both sites is exactly the same (parent company). The biggest things to look out for are duplicate content and cross-linking, as mentioned above. I try not to cross-link unless I absolutely have to. I would make sure both sites have separate IP addresses and, if possible, different C-classes. From my experience, the SERPs tend to favor one site over the other, with the larger of the two getting more rankings.
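For anyone wanting to check the "different C-classes" point above: two hostnames share a C-block when the first three octets of their IP addresses match. A minimal sketch in Python (the helper names are my own, and `share_c_block` does a live DNS lookup, so it only works for resolvable hostnames):

```python
import socket

def c_block_of_ip(ip):
    """Return the /24 ("C-class") prefix of a dotted-quad IPv4 address."""
    return ".".join(ip.split(".")[:3])

def share_c_block(host_a, host_b):
    """True if both hostnames resolve to addresses in the same C-block."""
    ip_a = socket.gethostbyname(host_a)
    ip_b = socket.gethostbyname(host_b)
    return c_block_of_ip(ip_a) == c_block_of_ip(ip_b)
```

So `192.168.10.7` and `192.168.10.200` sit in the same C-block (`192.168.10`), while `192.168.11.1` does not.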

JudgeJeffries

11:40 am on Jul 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Wearing my tin foil hat, I would advise different everything. I also suspect that writing styles may be identifiable, so different writers are the order of the day for me.
Just because you're not neurotic doesn't mean they're not looking at you.

DoingItWell

11:55 am on Jul 1, 2006 (gmt 0)

10+ Year Member



I'm doing this balancing act with several travel destination websites that cover nearby destinations and, to a point, the same content. I have links between them when it is natural - related subjects - and a general set of reciprocal links on the front pages: a text heading "Similar sites in this group" and the links with just the destination name as link text. Works like a charm - same WHOIS info, same server for most of the domains, etc. I had a rel="nofollow" on the reciprocal links for a while, but it seemed to hurt the newer sites, so I removed it.

I make sure that links between the websites are natural ones - they lead to other pages with related subjects. It works very well for me. The only thing that has hurt me besides the rel="nofollow" problem is when I jump the gun and put adsense on a website too soon after publication. Apparently I get hit by some kind of MFA/spam filter.

ScottD

9:58 am on Jul 2, 2006 (gmt 0)

10+ Year Member



We're also in travel, and I can't think of anything more natural than offering other destinations on the home page.

In fact, I think G sees this, and treats the other sites to some degree like part of the same site, which is perfect. That way we get different names for different destinations, without losing the strength that comes from having a large site with lots of content.

Marcia

10:06 am on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Aside from anything else, unless there's good reason not to, it can't hurt to have sites on the same or similar topic hosted on different C-blocks.

>>The only thing that has hurt me besides the rel="nofollow" problem is when I jump the gun and put adsense on a website too soon after publication. Apparently I get hit by some kind of MFA/spam filter.

You've had problems from using rel="nofollow"? Is that to pages on the site or going to pages off the site?

Eazygoin

10:40 am on Jul 2, 2006 (gmt 0)

10+ Year Member



If you have more than one site, what is wrong with promoting each site from every other site?

I don't see this 'Google won't like it' argument, as what's important is the fact that you want to promote the sites to the viewing public, and let THEM decide what they want to look at.

So many threads have comments about 'good behaviour' with Google, but forget the fact that our sites are designed for the viewing public primarily.

Marcia

10:51 am on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>I don't see this 'Google won't like it' argument, as what's important is the fact that you want to promote the sites to the viewing public, and let THEM decide what they want to look at.

It doesn't necessarily work that way in real life. Promoting a site to users means that there has to be some way for users to find the site, which they have to do in order to be able to decide whether they like the site or not. No rankings, no traffic = no users. So how can THEY decide what they want to look at if they don't ever find the site? Email marketing? Sending out postcards? Ads on TV or bus benches?

Google's opinion doesn't always coincide with users' opinion, incidentally. For example, I've got one site that Google thinks is worth no more than Supplemental Index, and the users do NOT agree. It's an incredibly good site for steady conversions to sales, which means VERY happy users are finding and liking it enough to buy stuff, and bookmark it up to 20% some months. At another search engine is where they're finding the site, which apparently has just exactly what they're looking for - even if it doesn't have enough links amassed to please Google.

So no, I don't worry too much about what Google likes as long as there are "happy campers" coming in from someplace. Once a site has proven to me that it makes visitors happy, then that's the time to start worrying about amassing the kind of links that Google "thinks" indicate a site that's worthy of finding - which may NOT be an information site; some people out there are actually looking to BUY STUFF.

Nevertheless, if it's something as simple as not overly cross-linking sites, for which there are very valid reasons, and for spreading out hosting, for which there are also very valid reasons, there's no sense in thumbing our noses at any search engine by doing things that can give an erroneous impression.

[edited by: Marcia at 10:55 am (utc) on July 2, 2006]

glengara

10:54 am on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



*In fact, I think G sees this, and treats the other sites to some degree like part of the same site, which is perfect.*

Hope you're right Scott, it would make me somewhat nervous, but I'm not in the travel sector D.G. ;-)

Quadrille

12:40 pm on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google's opinion doesn't always coincide with users' opinion, incidentally. For example, I've got one site that Google thinks is worth no more than Supplemental Index, and the users do NOT agree. It's an incredibly good site for steady conversions to sales, which means VERY happy users are finding and liking it enough to buy stuff, and bookmark it up to 20% some months. At another search engine is where they're finding the site, which apparently has just exactly what they're looking for - even if it doesn't have enough links amassed to please Google.

Nobody would suggest that Google always gets it right; however, there's really a lot of evidence that 'supplementary' results largely stem from duplication - and removing that would make life easier for everyone, not just Google.

Your 'happy users' would be just as happy, though not necessarily at your site - if there was no duplication.

And wouldn't the Internet be a better, uncluttered place?

And there is zero doubt that duplicating your own pages increases marketing effort, reduces ranking, and often confuses customers. There really is no good reason for dividing your efforts.

Marcia

1:02 pm on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Your 'happy users' would be just as happy, though not necessarily at your site - if there was no duplication.

"My happy users" are casting their votes after seeing the pages and reading the text - they are not guessing about pages they've never seen. FYI, if you are referring to MY pages, there is not one single speck of duplication. Nor is there duplication on *many* other pages and on many other sites I've seen that are Supplemental.

And wouldn't the Internet be a better, uncluttered place?

A better place if what? Better based on what facts or conditions, or upon what presumptions?

And there is zero doubt that duplicating your own pages increases marketing effort, reduces ranking, and often confuses customers. There really is no good reason for dividing your efforts.

Whose own pages? Whose pages is that referring to? Which of the sites referred to in this thread would that be about?

I'm afraid that's mere imaginative gainsaying. There is zero duplication of pages, and no dividing of efforts that we know of, in any of the sites mentioned here - nor have we seen them. In my own case I can vouch for the fact that my sites have no duplication. All unique, as far as we're concerned; no one here mentioned having duplicated any pages, nor is there any evidence of it. That is one heck of a lot of assumption to be making about a site you've never seen. :P

Duplication and/or even near-duplication are *not* the only reason for pages going supplemental - particularly on non-data driven sites with all uniquely written content on all the pages. And on newer sites as well. Any number of sites have other issues, it's not that confined or narrow. Let's not confuse and/or mislead fellow members and readers with allegations that could be taken the wrong way.

And BTW, if I read the title correctly, this thread is about the topic: "Owning multiple sites penalty."

[edited by: Marcia at 1:14 pm (utc) on July 2, 2006]

Quadrille

1:12 pm on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Wow, slow down there.

Not knowing your site, I could only offer a 'general' response, which would be useful to most people, because most supplementary results do stem from duplication.

If, as you say, yours don't (and I really don't know), then why do you think your site has been made supplementary?

The thread is about multiple sites with similar themes, so don't worry, we're right on topic!

Marcia

1:39 pm on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There can be any number of reasons for pages going supplemental - or being supplemental when first indexed, especially in cases where pages go up that will be further developed as time goes on. It can sometimes be a case of redundancy rather than duplication. And actually, "unique" may not be good enough in all cases.

As far as "unique" sometimes not being enough:

Define: checksum [google.com]
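To illustrate the checksum idea in the simplest possible terms - this is my sketch, not anything Google has published, and a real engine's duplicate detection is far more sophisticated than a hash of the text - two pages whose copy differs only in whitespace or case can collapse to the same fingerprint:

```python
import hashlib
import re

def page_checksum(text):
    """Checksum of a page's visible text, normalized so that
    whitespace and case differences don't mask identical copy."""
    normalized = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# These two "pages" differ only in formatting,
# so they produce the same checksum - i.e. they look like duplicates.
a = page_checksum("Welcome to   Widget World!\n")
b = page_checksum("welcome to widget world!")
```

The point being: "unique" to the human eye isn't necessarily unique once the text is reduced to a fingerprint.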

It isn't hard and fast: I've also seen pages with a ton of original content - probably close to 300-400 words of text - go supplemental, but again with insufficient IBLs from within the site, especially links from what could be considered the more "important" pages. Particularly if a site doesn't have many inbound links overall - which can happen when IBLs aren't actively pursued in the usual ways.

In some cases I've seen, pages went supplemental after being in the regular index and ranking well when they were "retired" seasonally and didn't have as many significant IBLs from within the site as they had when in season. When updated and re-linked from "important" pages when the "season" came again, they came out of the Supplemental index.

All in all, with regard to what the OP is asking about putting up a second site: wherever possible, it's probably better to add to an existing site if the topics are compatible - if only for economy of effort and energy in promoting, and also because newer sites can seem "lean" if pages go up before the site is fully developed and has enough inbound linkage established to support the whole site. It seems necessary nowadays to have enough link juice coming in to a site as a whole for it to trickle down through the navigation as value for the inner pages.

Quadrille

2:51 pm on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There can be any number of reasons for pages going supplemental - or being supplemental when first indexed, especially in cases where pages go up that will be further developed as time goes on. It can sometimes be a case of redundancy rather than duplication. And actually, "unique" may not be good enough in all cases.

I would not dispute a word of that - but the fact remains that in the vast majority of cases, duplication is the main issue - or the only issue.

Sometimes the content is totally unique - but the page is code-heavy, and so the unique stuff (yes, even 3-400 words) is dwarfed by shared copy, such as site promos, navigation and - most commonly - bloated code.

More often, and very, very commonly in the case of dynamic sites, the content may be unique, but there is simply not enough of it on many pages - a paragraph or a few sentences is often still 'a duplicate page' in google's eyes.

Often removing bloat by externalizing code in a css file, and removing boring marketing bumph (which really shouldn't be on every page!) is all that's required.

If internal linking is a contributory factor, then a site map may move mountains.

I'm necessarily generalizing here, and these comments should not be taken to apply to any particular site - do I really need to say that in every post? ;)

But they do arise out of looking at scores of sites where the specific problem has been "I've gone supplementary", and they have been found and reported by many others, too.

Part of the problem is that many people look at the finished page and say "it's unique" - while Google looks at the code and says "That's 2% of the page - call that unique? C'mon!"
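That "2% of the page" figure is easy to check for yourself. A rough sketch of a text-to-code ratio calculation, using only the standard library (a real check would also filter out `<script>`/`<style>` contents and boilerplate navigation, which this doesn't):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text content of an HTML document as the parser sees it."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_code_ratio(html):
    """Fraction of the raw page source that is text rather than markup.
    A low ratio means the unique copy is dwarfed by code."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html)
```

Run it against a bloated page and a lean one: a page where 400 words of unique copy sit inside a table-heavy, inline-styled template can easily come out under 5%, which is Quadrille's point about externalizing code.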

Interestingly, I've found that 'apparent self duplication' is what many people miss - 'external duplication' is widely understood, and people tend to accept the risks - or know what to do.

Quadrille

2:57 pm on Jul 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OOoops!

I forgot to add that the lack of a meta description / TITLE, or duplication of them, is increasingly a factor; Google has suddenly got serious about meta tags.

Which may be a serious clue for the future ;)

mboydnv

6:55 pm on Jul 2, 2006 (gmt 0)

10+ Year Member Top Contributors Of The Month



I too am in travel, and added adsense and did not have any problems. Why would adsense activate a spam filter?

jackson992

6:46 am on Jul 5, 2006 (gmt 0)

10+ Year Member



Nowhere on any forum will you see it proven that the supplemental index issue is caused by duplicate content. That is only an opinion and should not be presented as fact. There are several reasons pages go into the supp index, and very often they come back out with no action being taken on the webmaster's part. Meta tags are also irrelevant, as it is widely accepted that Google does not base any kind of ranking on them. The only thing Google sometimes does is use the meta description as the description in a search result.

tedster

7:46 am on Jul 5, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do not believe that meta tags are completely irrelevant to the Supplemental Index.

I have two experiences where many urls from a domain were marked Supplemental. In both these cases, all those urls contained identical meta descriptions. When the site owners made those meta descriptions unique and specific to the page where they appeared, within a few days those urls once again were listed in the regular index.

It's not "proof" but it was highly suggestive, especially because it happened with two different domains. I also know of another person who saw similar activity. So a year ago I would have completely agreed that Google essentially ignores the meta description. But not today.
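A quick way to audit a site for the identical-meta-description problem described above - this is my own hypothetical helper, run over a dict of page sources you've already fetched, not any official tool:

```python
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pulls the content of <meta name="description"> from a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

def duplicate_descriptions(pages):
    """Map each repeated meta description to the URLs sharing it.
    `pages` is a dict of {url: html_source}."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = MetaDescriptionParser()
        parser.feed(html)
        if parser.description:
            seen[parser.description].append(url)
    return {desc: urls for desc, urls in seen.items() if len(urls) > 1}
```

Any description that maps to more than one URL is a candidate for rewriting into something unique and specific to its page, per tedster's experience.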

whatcartridge

2:27 am on Jul 6, 2006 (gmt 0)

10+ Year Member



I can back you up there, tedster. I had a couple of sites where the meta descriptions were identical, and the pages all ended up supplemental. Once I made the descriptions unique, the pages were back in the regular index.

Even when meta descriptions are merely too similar, pages can end up in the supp index - which happened to another site of mine.

In another thread in the Google forum, a poster suggested meta descriptions are not necessary for good rankings. I wonder how true that is? I'm not sure I want to try it. :)

ScottD

5:25 pm on Jul 6, 2006 (gmt 0)

10+ Year Member



I have two experiences where many urls from a domain were marked Supplemental. In both these cases, all those urls contained identical meta descriptions. When the site owners made those meta descriptions unique and specific to the page where they appeared, within a few days those urls once again were listed in the regular index.

Absolutely - we had exactly that experience ourselves when I noticed the meta tags were the same for most pages on the site. Changing them brought the pages out of supplemental.

However - that's clearly not the problem everyone is having. It is just the first thing to check, along with any other basic issues of uniqueness, functionality, and design of the pages.