
Google SEO News and Discussion Forum

    
Safely Crosslinking Sites for Users
Is the "rel nofollow" attribute appropriate?
Robert Charlton
msg:734526 - 10:02 am on Jan 26, 2005 (gmt 0)

Over the past several years, there have been numerous forum discussions about the problems that excessive crosslinking can cause.

Some of the discussion has come from the perspective of using crosslinks to give other sites a PR or anchor text boost from a common network, with most questions about "how much crosslinking can you get away with?"

I have a question from another perspective. For some sites of company subsidiaries, I want to keep the crosslinking for users and... to avoid crosslinking complications... get rid of it for PR and anchor text. I'm thinking about the new "rel nofollow" attribute to partially unrelate the sites.
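
(A minimal sketch of what that would look like in the markup... the URL here is made up:)

<!-- hypothetical crosslink kept for users; rel="nofollow" asks engines
     not to pass PR or anchor text credit through the link -->
<a href="http://www.sister-brand-example.com/" rel="nofollow">Our science textbooks division</a>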

Some background for those who may not have observed it... as Google has come down hard on crosslinked networks, there's been a fair amount of collateral damage, in my opinion, to sites under one ownership that crosslink for business and navigation reasons, rather than for SEO reasons. These generally are sites for different brands or product lines or locations, but having some commonality.

What I've seen is that Google often appears to be "clustering" the results of some sites that it sees as related or similar, dropping out returns from all but one of the sites on search queries that in any way overlap.

Different imprints of a textbook publishing company might be a good example. One imprint might be for science textbooks, another for law textbooks, another for history. They're all established as separate companies. They have different public images. But the separate sites for these may all be hosted on the parent company's server, and because they all share the term "textbooks," all but one might disappear.

The site commonalities may often go beyond simply crosslinking. There may be hosting or IPs in common, nameservers in common, similar patterns of inbound and outbound links, or related search targets.

Assuming some of these other issues are taken care of, and the sites exist more or less independently with independent inbound links, I'm wondering whether the new "rel nofollow" attribute might help in maintaining navigation between the sites while preventing at least some of the clustering difficulties.

Or, might this be an inappropriate use of the attribute that could in fact backfire?

Or, might the attribute not be necessary if the sites do have sufficient independent inbounds?

Lots more to be said about the above, about why companies have different product lines and locations in the first place, etc., and how this may also relate to Google's desire to cluster... but given that this is the way many companies are built, can "rel nofollow" help here?

Related reading...

Similar sites on same IP address
Ranking issues
[webmasterworld.com...]

 

caveman
msg:734527 - 6:17 pm on Jan 26, 2005 (gmt 0)

Robert, this is a pet issue of mine. The SE's have come to interpret cross linked networks as necessarily an indication of spam. But in fact there could be all sorts of reasons, like those you mention (especially marketing reasons), why one would want different sites for different user groups and target audiences. Car companies make different cars, cereal companies make different cereals. This is a matter between the marketer and the user. But the SE's have inserted themselves to the detriment of users in at least some cases.

The obvious problem from the SE's point of view is that if a large network of related sites is created for marketing reasons (NOT SE reasons), and the network is cross linked, the network can artificially inflate its standing in the SERP's, and is therefore at risk. There was a time when the issue was simply cross linking. Now it extends to common templates, dup text even if it makes sense for the user, etc.

But I stray.

With respect to your question, I would not use the nofollow tag for anything except user inserted content. Here is one case where following the SE's rules makes sense to me. To do otherwise seems to me to invite unnecessary scrutiny of your site as "overly SEO'd." At least until the SE's tell us otherwise.

The related larger issue you raise is how to overcome what has become an SEO issue when it is really a marketing issue. Ironically, my best solutions so far have been to combine network cross links onto separate "nav only" pages, and then use noindex on those pages. I don't bother with nofollow any more. (And the alternative is to get into redirects of various sorts, which is even more irritating, and comes with its own set of risks.) I say 'ironic' because I'm in essence using a variation of the nofollow tag when I use noindex for a nav page (both tell SE's to avoid something). For me, it is purely to avoid getting in trouble with the SE's.
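
(For anyone wanting to try that workaround: the "nav only" page is just an ordinary page of links carrying a robots meta tag. A rough sketch, all URLs hypothetical:)

<html>
<head>
<title>Our Family of Sites</title>
<!-- noindex keeps this nav-only page out of the index;
     the network crosslinks live here and nowhere else on the site -->
<meta name="robots" content="noindex">
</head>
<body>
<a href="http://www.widget-brand-example.com/">Widget brand</a><br>
<a href="http://www.blammo-brand-example.com/">Blammo brand</a>
</body>
</html>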

The whole thing strikes me as stupid, since in this case I am not creating good, intuitive, useful nav for the user. I am using less than intuitive nav, and artificial workarounds, *for SE reasons*...to the detriment of the user. And it's not just a guess. My users have commented on it in the form of 'how we could improve' suggestions.

stace
msg:734528 - 10:05 pm on Jan 26, 2005 (gmt 0)

I've never been sure just what degree of crosslinking I can get away with, so I've always tried to be cautious about doing it, even though nearly all of my sites are interrelated and share the same e-commerce store. None of my sites share IP addresses, which I figure has to be a good thing, although I don't have a dedicated server for any of them (too expensive...)

I just sort of figure as long as I keep the ratio of links to my own sites relative to other websites pretty even - I'll be ok. I also have a couple of sites that are linked to by only 1 or 2 of the sites, not all of them. Obviously I have no great strategy in place and try to make it up as I go along! Each site does focus on something different - teaching vs. software vs. music vs. blogging - and it seems logical to me that these sites could all be owned by separate people who are affiliates of the same store, so if I'm tripping a filter w/ that, a large percentage of Amazon affiliates, etc. would as well - and I know that isn't happening...

I'm VERY cautious w/ my PR 6 (at one point it was a PR 7) main website - and don't have any outbound links on my homepage going to any of my other sites. Out of 600 or 700 pages, I'd say less than 10 links are going to other sites in my network from that main site.

If anybody has a good set of dos and do nots w/ regards to interlinking your own sites, I'd love to hear it!

Robert Charlton
msg:734529 - 10:56 pm on Jan 26, 2005 (gmt 0)

With respect to your question, I would not use the nofollow tag for anything except user inserted content. Here is one case where following the SE's rules makes sense to me. To do otherwise seems to me to invite unnecessary scrutiny of your site as "overly SEO'd." At least until the SE's tell us otherwise.

caveman - Thanks for your thoughtful answer. I tend to agree about this use of rel nofollow.

What about javascript links? In some cases it would be nice not to have to confine crosslinks to an isolated link page. Downside to javascript... I hate to rely on js for navigation, period; and I'm also guessing at some point that Google will be crawling js links.
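
(By a javascript link I mean something along these lines, where no plain anchor exists in the source for a crawler to pick up... the domain is made up:)

<script type="text/javascript">
// the link only exists once the script runs in a browser
document.write('<a href="http://www.sister-site-example.com/">Our sister site</a>');
</script>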

Car companies make different cars, cereal companies make different cereals. This is a matter between the marketer and the user. But the SE's have inserted themselves to the detriment of users in at least some cases.

This sums up the problem extremely well. Again, I think the effect we're seeing is collateral damage as the engines are trying to root out networks that monopolize the top pages of serps, but it could also be a difference of philosophy.

In the "real" world, marketers using multiple brands are trying to do something that's essentially similar to some using multiple sites... to catch eyeballs in lots of places. In the cases I'm talking about, I would think the sites would survive even manual review, because they're about very different product lines, but maybe not. Maybe there's a philosophical difference, or a difference in say, how Google regards the shelf-space in a store versus the listing space on page one of Google, and they're being extremely strict about it.

I just sort of figure as long as I keep the ratio of links to my own sites relative to other websites pretty even - I'll be ok.

I've thought this too, but it's hard to isolate the factors in a real world situation. I've also thought that a high percentage of genuinely independent inbounds will cure a lot of ills. I'm trying to cover all bases.

caveman
msg:734530 - 12:14 am on Jan 27, 2005 (gmt 0)

I hate to rely on js for navigation, period; and I'm also guessing at some point that Google will be crawling js links.

Me too. Partly because it's not standard and I'd be doing it in this case, again, for SE reasons rather than because it makes any coding sense. And partly because of what you said about the SE's reading js.

marketers using multiple brands are trying to do something that's essentially similar to some using multiple sites... to catch eyeballs in lots of places.

Yes, and not only that. It's also a user thing. One analogy that makes lots of sense to me is magazine publishing. A given publisher may put out a Golf magazine, a Tennis magazine and a Polo magazine. They don't bundle all three and tell you you should buy them as one magazine called "Sports for the Rich." Each is separate for a reason. Different audiences, different interests, different profiles. Yet the SE's would have a site owner build one site and make the Tennis visitors navigate past Polo content. Or risk being penalized. Just plain goofy.

I understand that producing hundreds of hotel sites, one for each city, with identical templates and all interlinked causes problems for the SE's (I don't run any hotel reseller sites btw). But what if they are e-commerce sites and one is about widgets and one is about blammos, and widgets and blammos have nothing to do with each other and have very different user bases?

Or what if widgets and blammos share intersecting but not identical user bases? If users of the widget site like the way the site is organized and like the way the cart works, and then learn by way of a link that the same publisher also makes a site for blammos, they might like that site too. What if there are ten or twenty sites, all featuring very different products/services? Should a user who likes one not be made aware that the others exist too by some sort of nav structure?

There has to be a better way than just forcing them all out of the SERP's because they interlink, even if only conservatively.

======

One common refrain is just not always true: "Build sites for the user." Sometimes, it's all about the SE's. Sadly.

Robert Charlton
msg:734531 - 5:52 am on Jan 28, 2005 (gmt 0)

I'm seeing something interesting on searches for vanished related sites. Try the old trick of inserting negative nonsense words into the query. It seems to be working again, at least for now.

From msg #37 on the Nailing Down the Sandbox thread...
[webmasterworld.com...]

See where your sandboxed site would rank if it weren't sandboxed.
your keyword -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf

(must be at least 13 nonsense words)

One site that I'm using as an example can't be found for its own name, at least not in the top 500. What does rank on this search is its "sister" site, another division of the same company... specifically, the page on the sister site that links to the site in question. Unfortunately, the site has also disappeared for all of its targeted searches, and the sister site doesn't perform nearly as well.

With the negative nonsense words added to the search, the example site comes up #s 1 and 2 for its own name (and also for the rest of its targets). I've tried this on another set of sites I'm just beginning to analyse, and some other old examples... and results are pretty much the same.

These can't be "sandbox" penalties in the traditional sense, as all the sites in question (and their primary inbound links) are well over a year old.

Remembering the discussions about these nonsense words during Florida, it was theorized that there was some sort of link pattern analysis going on, and the nonsense words do something like exhaust the number of link patterns that Google can check. That certainly would fit here. Why age enters into it with the sandbox is another topic.

There has to be a better way than just forcing them all out of the SERP's because they interlink, even if only conservatively.

Well, they interlink, and there are some other relationships.

I do agree that there has to be a better way, but I will say this... Looking at some of the serps with the nonsense words applied (ie, without Google's current filters), I far prefer what we've got now, collateral damage and all. The junk that rises to the top is unbelievable.

caveman
msg:734532 - 6:40 pm on Jan 28, 2005 (gmt 0)

>The junk that rises to the top is unbelievable.

Yeah, undoubtedly this is why they do what they do. Like I said, I understand their motivation. They are probably not sufficiently motivated to figure out ways to separate the wheat from the chaff.

Regarding your comments on kw's lost: It seems to me that when a site is part of a network and is hit this way, the site loses all or almost all of its relevant kw's, even if the site is not overly optimized and is simply part of a network.

I've even seen it happen when the site does not particularly benefit algo-wise by being part of the network (e.g., gets 20 network links, and thousands of non-network links).

This sort of phenomenon became pretty apparent starting with the Florida update. Some people were saying that it was money kw's (and I personally believe that money kw's played a role), but the phenomenon was not limited to money kw's in a lot of cases.

ciml
msg:734533 - 2:11 pm on Jan 29, 2005 (gmt 0)

[google.com...]
From now on, when Google sees the attribute (rel="nofollow") on hyperlinks, those links won't get any credit when we rank websites in our search results.

It would be nice to think that those links would be ignored before any crosslinking or 'too similar anchor text' calculations are made, but in the absence of certainty I think I'll stick to the tried and tested.

Current options include IFRAME with the framed page robots excluded (loses PageRank?), linking to a redirect script with robots excluded (loses PageRank), a server side image map, or JavaScript (in an external file as Google will often follow a URI in a <SCRIPT> element on a page).
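
To illustrate that last option: the link-writing can go in an external .js file. Blocking the file in robots.txt as well is extra caution rather than something I know to be required, and the paths and domain here are hypothetical.

In robots.txt:

User-agent: *
Disallow: /js/

In the page:

<script type="text/javascript" src="/js/crosslinks.js"></script>

In /js/crosslinks.js:

// the sister site's URI never appears in the page source itself
document.write('<a href="http://www.sister-site-example.com/">Sister site</a>');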

caveman
msg:734534 - 7:31 pm on Jan 29, 2005 (gmt 0)

> It would be nice to think that those links would be ignored before any crosslinking or 'too similar anchor text' calculations are made, but in the absence of certainty I think I'll stick to the tried and tested.

Heh, heh. It would be nice if they ignored all sorts of evidence of SEO. ;-) Seems to me however, that their "red flag/SEO" sniffers take into account every little piece of evidence they can find.

Robert Charlton
msg:734535 - 7:44 pm on Feb 17, 2005 (gmt 0)

Regarding your comments on kw's lost: It seems to me that when a site is part of a network and is hit this way, the site loses all or almost all of its relevant kw's, even if the site is not overly optimized and is simply part of a network.

I've even seen it happen when the site does not particularly benefit algo-wise by being part of the network (e.g., gets 20 network links, and thousands of non-network links).

I've waited until Allegra has settled down before posting this....

On most datacenters now, I'm seeing considerable relaxation of this downgrading effect. Legitimately linked sites, as long as crosslinking is not excessive and there are some independent inbounds, seem to have come back. It appears that Google has become much more discriminating about which site in the linked group it's suppressing and for which keywords/phrases.

So... if partner-A mentions partner-B and links to it, partner-B no longer gets nuked for its own name or for other phrases unique to it.

Not completely sure about interlinked sites going after the same keywords... This appears to have been relaxed somewhat. Anyone else seeing this?

I'm also seeing movement upwards on some sites I've been called upon to fix, where the setup is in fact really junk right now and deserves to be nuked. Common inbounds from mirrored links pages from an SEO company in India... all sites with ROS interlinks, etc... and yet these have moved up from roughly #300 to now about #90 (for a very non-competitive 4-word phrase ;) ).

So, I see more relaxation of this filtering than I in fact was even wishing for. It indicates to me that cleaning up the inbound links to sites in a group and getting some independent inbounds would perhaps help more.

I'm still wondering about whether I should let the sites interlink with straight unblocked html links, though. This kind of filtering could happen again.

caveman
msg:734536 - 9:54 pm on Feb 17, 2005 (gmt 0)

Hi Robert,

Out of curiosity, when you refer to post-Allegra loosening of cross linking, are you referring to a "handful" of related sites, or larger networks?

I must say I don't see any loosening for larger networks (e.g., 20+ sites).

If you do, it would prompt me to take a closer look to see if some aspects of cross site linking among related/owned sites have been loosened.

twebdonny
msg:734537 - 11:27 pm on Feb 17, 2005 (gmt 0)

travel.yahoo.com

That's why I believe cross linking has no negative effect at all.

Robert Charlton
msg:734538 - 5:39 am on Feb 18, 2005 (gmt 0)

Out of curiosity, when you refer to post-Allegra loosening of cross linking, are you referring to a "handful" of related sites, or larger networks?

I'm talking about several handfuls I monitor... groups of two, four, six sites. These do have a few independent links (by "independent links," I mean links not coming from the same page or domain the other sites' links are coming from).

With one pair of sites, the site with the slightly lower PR has returned from oblivion to top rankings for its terms.

I must say I don't see any loosening for larger networks (e.g., 20+ sites).

The one larger network I watch is about 40 sites... all independent brick and mortar establishments with completely separate real-world identities under the same corporate umbrella. Here they have basically the same corporate links page on every site but practically no crosslinking elsewhere.

They're on the same server, though, and they often share the same directory category when they're listed in directories, because they're related by location and by business category. To varying degrees, they all have some independent links as well.

I'm seeing that for, say, 15 of these sites that are going after some of the same keywords, one or two are now ranking in the top 10. Once upon a time, they dominated half of the top 20... and, more recently, they'd all dropped way down.

I was surprised to see them return with Allegra. This is the group I was referring to when I said: "Not completely sure about interlinked sites going after the same keywords... This appears to have been relaxed somewhat."

travel.yahoo.com

Haven't looked at it in any detail, but I've got to believe it would be a special case. I remember the first time I had a PR7 site to optimize, I could virtually do no wrong. It's amazing what several thousand really good, solid, independent inbounds will do to make a site credible (which in fact it was). travel.yahoo.com is much more credible than my PR7 was.

twebdonny
msg:734539 - 5:05 am on Feb 21, 2005 (gmt 0)

www.google.com/sitemap.htm

that should be the definitive answer

Spine
msg:734540 - 7:07 pm on Feb 21, 2005 (gmt 0)

One guy in my sector has quite a few sites on slightly different subjects. Most of them are interlinked, but in a circuit: a > b > c > ... He has links incoming from dmoz and random blog, board and other links, but only links out to his own sites and to affiliate pages (about 14 or 15 links a page, half affiliate and half his own sites).

He does quite well in the SERPs, and his IPs are similar enough, with the same whois info, that G would be able to spot his network easily I'd think.

It might depend on the PR of the interlinking sites and / or being in DMOZ.

Robert Charlton
msg:734541 - 7:31 am on Mar 5, 2005 (gmt 0)

I'm noticing that one Google dc, 66.102.7.104, appears to be more likely to cluster results than some of the others. I don't want to go into details of what I'm seeing right now, as my thoughts are based on fleeting observation plus speculation, but I'm wondering if anyone is seeing more clustering on this set of results.

Common inbounds from mirrored links pages from an SEO company in India... all sites with ROS interlinks, etc... and yet these have moved up from roughly #300 to now about #90 (for a very non-competitive 4-word phrase

This site has dropped down now to about #150 for this search... from what I can tell, pretty much across the board (ie, on all data centers).

I should mention that there's another thread that's started up on roughly the same subject, whether to crosslink some company sites for users or not...

Branding Issues
Decided to put all domain names under one "umbrella".
[webmasterworld.com...]

Re twebdonny's comment...
www.google.com/sitemap.htm
that should be the definitive answer

twebdonny - I've got to confess, I don't get your point.

phantombookman
msg:734542 - 9:41 am on Mar 5, 2005 (gmt 0)

I cross link all my sites and there have been (touch wood) no negative effects, quite the contrary in fact.

I do it sparingly and logically for the benefit of the visitor.
It is only natural to promote your other businesses. If you had a B&M store selling red widgets and another selling blue ones, you would surely advertise the fact in both stores; what you would not do is plaster hundreds of posters for one store all over the other.

I believe Google realises and accepts this, hence no problem.

glengara
msg:734543 - 11:18 am on Mar 5, 2005 (gmt 0)

"..for the purpose of improving PR or ranking"

IMO crosslinking is viewed as a potential link scheme; if the rest of the linkage is "clean", G seems fairly tolerant of it in moderation.

Some months ago I noticed quite a number of sites that had dropped had been crosslinking quite happily until they acquired some ROS links, at which point G seems to have decided the linkage pattern was for ranking/PR purposes.

I hope we get some more clarification on how the SEs view and handle the attribute, as it struck me as the ideal answer to crosslinking when it first appeared.

Tropical Island
msg:734544 - 1:43 pm on Mar 5, 2005 (gmt 0)

We have three sites and, for the sake of easy explanation, we will say that one is the "State or Province" site, one is the "City" site and one is a "Business" site within the "City". These 3 sites are hosted by the same hosting company and have separate IP's.

It would be unfair to visitors not to have links between these three sites, and as a result we have seen them affected from time to time by changing Google algos.

We have not changed the basic structure of these sites for the last 3 years, and we are happy to say that all perform very well within their niches in Google and other search engines.

I do not believe that reasonable interlinking is harmful. It is how the Internet is designed. It may be that this interlinking only involves three sites and is acceptable in Google's eyes.

We support the business site with AdWords & Overture to take care of the fluctuations in the algos. Makes it easy to sleep at night.

spaceylacie
msg:734545 - 2:25 pm on Mar 5, 2005 (gmt 0)

My sites are all cross-linked, all related to arts and crafts... one site has a different host (all have different IPs), the rest have the same host.

The cross linking with the site with a different host seems to be much more effective, accounting for many backwards links.

My conclusion: cross linking works better when the sites are hosted by different entities.

Kimkia
msg:734546 - 9:48 pm on Mar 5, 2005 (gmt 0)

Some months ago I noticed quite a number of sites that had dropped had been crosslinking quite happily until they acquired some ROS links, at which point G seems to have decided the linkage pattern was for ranking/PR purposes.

Could you please tell me what ROS links are?

Robert Charlton
msg:734547 - 12:49 am on Mar 6, 2005 (gmt 0)

Could you please tell me what ROS links are?

Hi Kimkia - From the WebmasterWorld Glossary (there's a link to it at the top of your page)...

"Run of Site. An ad that can be placed anywhere on a website without restrictions."

Something interesting I notice about this definition... it automatically assumes such links are "ads," an assumption that search engines undoubtedly make too.

For a discussion of the value of ROS links, see...

What is the Value of Run of Site Text Link Purchase?
How do you use this potential goldmine/ pitfall?
[webmasterworld.com...]

FourDegreez
msg:734548 - 4:04 am on Mar 6, 2005 (gmt 0)

When people talk about cross-linking, it would help if I knew exactly what kind of cross-linking you're talking about. Are you talking about ROS links? Links from the main page? Links from a links page? Links from a few deep content pages? Links embedded in content? All of the above (i.e. the SE doesn't care...a single cross link anywhere on each domain is enough to raise a flag)?

spaceylacie
msg:734549 - 5:59 am on Mar 6, 2005 (gmt 0)

What I mean by "cross-linking" is having a link to the main page of all of my sites on almost all of my pages.

I do it by adding something like this to the bottom of my pages:

Family Of Related Sites: (my sites are arts and crafts so I want it to feel personal)... then I list my sites...
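
(In other words, a small block at the bottom of each page along these lines; the site names here are invented for illustration:)

<p>Family Of Related Sites:
<a href="http://www.example-quilting.com/">Quilting Corner</a> |
<a href="http://www.example-beadwork.com/">Bead Basics</a> |
<a href="http://www.example-scrapbooking.com/">Scrapbook Studio</a>
</p>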

Robert Charlton
msg:734550 - 7:14 am on Mar 6, 2005 (gmt 0)

When people talk about cross-linking, it would help if I knew exactly what kind of cross-linking you're talking about. Are you talking about ROS links? Links from the main page? Links from a links page? Links from a few deep content pages? Links embedded in content? All of the above (i.e. the SE doesn't care...a single cross link anywhere on each domain is enough to raise a flag)?

A reminder that the topic of this thread is not about how much cross-linking you can get away with. That should be another thread. This thread is about how to make cross-linking that's primarily intended for users safe, and, in that regard, caveman and ciml probably answered the question.

A follow up question, though, might be: are these techniques legit? Google clearly doesn't want networks of sites to dominate their serps, and I do understand their problem. So, might sites that "safely cross-link for users" really be trying to have their cake and eat it too... trying to pretend separation to permit overlapping serps?

How much cross linking? I assume it might be any degree of cross-linking. I could give you examples in client sites I've seen that have ranged from...
- two related sites sharing only one link each (ie, one reciprocal link).
- 50 sites with a corporate links page on each site that linked to each one of the 50, but that was the only cross-linking.
- a half-dozen sites cross-linking to each other on every page.
- a half-dozen sites cross-linking only from corporate links pages and some home pages.

It turns out, in my experience anyway, that the sites that don't cross-link a whole lot for users are the ones that aren't trying to game the engines either. ROS (Run Of Site) links are immediately suspect. There quickly arise some shades of grey, though, as some real-world separate brands of related products do end up overlapping in serps somewhat.

I think separation of hosting, separation of inbound link sources, separation of search terms, the competitiveness of search terms, and the insulation of the sites directly from each other are all interrelated.

My concern in this thread is primarily with this last point, because there are clients who like corporate identity to be shared among brands. Google has been variable enough on this lately that I'd just prefer to keep sites as separate as possible. I'd like to do it legitimately, though I suppose even that could be seen as manipulation. ;)

FourDegreez
msg:734551 - 5:07 pm on Mar 6, 2005 (gmt 0)

A reminder that the topic of this thread is not about how much cross-linking you can get away with.

That's not what I want to know. What I want to know is how aggressive Google is being about going after cross-linking, because surely some of the types I listed are more legitimate than others. For instance, let's say I have a site about widgets and a site about parasailing. I have a page on my widgets site about widgets for parasailers, and on that page I put a link to my parasailing site because the content is related and visitors may be interested. Both sites are hosted on the same server. I've done this linking sparingly, but I think it is legitimate.

So when people are talking about penalties for cross-linked sites, I want to know, specifically what kind of cross-linking are they talking about.

- two related sites sharing only one link each (ie, one reciprocal link).
- 50 sites with a corporate links page on each site that linked to each one of the 50, but that was the only cross-linking.
- a half-dozen sites cross-linking to each other on every page.
- a half-dozen sites cross-linking only from corporate links pages and some home pages

How has Google responded to these different scenarios, in your experience?

<added>
Also, I'm curious about deep-content pages from Site A linking to deep-content pages from Site B.

Robert Charlton
msg:734552 - 4:10 am on Mar 7, 2005 (gmt 0)

How has Google responded to these different scenarios, in your experience?

FourDegreez - As I've mentioned in this thread and in the earlier thread I cite, I've seen them all get nuked. I've recently seen the filters (or whatever you want to call them) on the two sites with one link between them apparently get turned off, but I anticipate that this could happen again.

I've also seen some filters on sites with other linking arrangements get relaxed a bit, but I'm also reading reports of people going for the same targets from multiple sites getting hit harder... and not everybody sees the same things I'm seeing.

Again, there are a whole bunch of variables, which I point out in msg #25, giving different Google responses for different conditions. To repeat...

I think separation of hosting, separation of inbound link sources, separation of search terms, the competitiveness of search terms, and the insulation of the sites directly from each other are all interrelated.

Inbound linking patterns are, I believe, a big factor. There is no one answer, though, and the landscape is apparently ever changing.

I optimize sites for clients, often where the domain is the brand name... so I look at this differently from someone who may have twenty widget sites with a hundred more domains in the closet. I don't think it's worth a PR5 link or whatever to cross-link two widgets sites.

What is more important to many clients, though, is navigation among the corporate family. At this point, I'd only put a group of those family links on a "noindex" page.

Whether I'd put additional cross-links on the site more prominently, and use, say, javascript, is something I'm still debating. Haven't yet looked into encoding a link, the way you might encode an email address, but that is another option I'm considering.
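
(By "encoding a link" I mean something like the script trick people use to obscure email addresses, applied to an anchor. A hypothetical sketch, not something I've tested:)

<script type="text/javascript">
// assemble the URL from fragments so it never appears
// as a plain string anywhere in the page source
var h = 'www.' + 'sister-site-example' + '.com';
document.write('<a href="http://' + h + '/">Our corporate family</a>');
</script>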
