I have a network of sites that link to each other. Each site, on every page, links to the country-specific versions of that site, i.e. the UK version (.co.uk), the US version (.com), and the Canadian version (also a .com, but with a slightly different domain name).
Each product has different attachments, instructions, reviews, etc. for each country, which is why they are not all hosted on the same site. But we get visitors to the wrong site from search engines, as the products generally have the same names no matter what country you are in.
So, to me, this is a natural use of sitewide links. I use an image of each country's flag as the outbound link, and each link goes directly to the home page of that country-specific site, as that page has a full sitemap and product search facilities.
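A minimal sketch of the arrangement described above, with hypothetical domain names and image paths; each flag image links straight to the home page of the matching regional site:

```html
<!-- Sitewide country links: one flag per regional site, each pointing
     at that site's home page (all domains and paths hypothetical) -->
<a href="http://www.example.co.uk/"><img src="/img/flag-uk.gif" alt="UK site"></a>
<a href="http://www.example.com/"><img src="/img/flag-us.gif" alt="US site"></a>
<a href="http://www.example-canada.com/"><img src="/img/flag-ca.gif" alt="Canadian site"></a>
```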
1. Could sitewide links trip a filter generally?
2. Could linking between same name domains with different TLDs (e.g. example.com, example.co.uk, example.ca etc.) be fine but it is the .com with a different name that is getting me penalised?
I use iframed urls for sets of "customer service" utility links, for example. That helps to sculpt PageRank without joining the "Cult of the Nofollow" ;)
I also commonly use iframes for large chunks of boilerplate text, such as legal disclaimers. I prefer having some indexable text, rather than using an image file and iframes have served quite well.
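For what it's worth, the iframe approach described above can be as simple as this (the file name and dimensions are made up):

```html
<!-- Parent page: shared boilerplate lives in its own document, pulled
     in with an iframe rather than repeated inline (path hypothetical) -->
<iframe src="/includes/legal-disclaimer.html" width="100%" height="180"
        frameborder="0" scrolling="auto" title="Legal disclaimer"></iframe>
```

The iframed document is a separate page as far as crawlers are concerned, so the text and links inside it are not treated as part of the parent page, which is the effect being described.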
That helps to sculpt PageRank without joining the "Cult of the Nofollow" ;)
Glad I'm not the only one. I've started removing all the nofollow tags from my sites slowly, I no longer feel a leash around my neck. Hopefully one day the web and Google will go the way of Russia and the Romanovs. Keep strong my brothers, freedom is coming ... ;)
Could be that all the bad vibes from iframes were coming from less-than-white SEOs that may have abused them. I'll revisit some threads and re-evaluate.
It disturbs me to do things like iframes or nofollows just to set completely reasonable links.
I agree, I think this is over the top for legit links. I've used a few sitewide links sparingly between related sites and I haven't noticed any penalties. I do put them up gradually, however (not all sites linking to a new one at the same time; I wait a few months).
It just seems unnatural to me to use unfriendly solutions just not to offend some search engines. A cart before the horse type of situation.
I see big companies massively using sitewides to cross-promote their websites.
Big companies tend to have fewer filters applied against them. The rest of us, who don't have Matt Cutts' cell phone number, can get 950'd for a few simple semantic errors!
I use blogrolls and have links on other blogrolls, and I have not noticed any negative effects, although I am aware of other people claiming negative effects from this type of linking.
Is this natural linking?
Yes, I believe it is. Blogging software has been designed to help non-HTML users get on the web and publish. They have the option to add links, and many do. Surely this is natural linking?
<added> Never mind, this only applies when the iframe is pulling third party content and not a page from your own site.
[edited by: pageoneresults at 1:40 pm (utc) on May 20, 2008]
That was entirely my opinion until my income plunged. Google is just too powerful to go against. I've started removing sitewide links between my sites, although I'm still not sure they really get penalized.
The iframe idea sounds good to me.
Since my first website, I have felt that short, concise menus perform best and are friendlier for the user in any case. I also feel that "utility" links should be easy for the user to find but minimized in the signals they send to automated programs. Throw in the cross-domain factor, and my concerns go into overdrive.
I do have some "networked" sites that link cross-domain in a minimal way, without nofollow or iframes, and I never tried to hide the relationship. The cross-links have been in place since the sites launched, and they are there for the user. I always expected Google to at least minimize the link juice between these clearly related sites. After all, the Hilltop algorithm has a long history, and Google acquired it in 2003.
[edited by: tedster at 10:16 pm (utc) on May 22, 2008]
But it has created this on-going problem for sites offering regional variations via country specific domains. The issue is especially bad when it involves the same language across multiple regions, e.g., a US .com and a British .co.uk. But it isn't limited to that.
The result is duplicate-content issues, cross-linking issues, etc. There are a lot of confusing variations to the problem too, involving language, presence of local-specific content, IPs, backlink profiles, authority levels of the site(s), and more.
Unfortunately, site owners with links to sister domains in other countries need to find ways of limiting the visibility of those links to bots, at least at the deep page level (if not completely sitewide), especially when the sites are same language.
Personally, I favor creating a central corporate site in this case, using that site to carry the visible links, and obfuscating any cross-links on the regional sites, just to be safe. To hide the links, I used to recommend JS, but now I'm with tedster on iframes. And/or, I also like redirects through protected dirs.
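One way to read "redirects through protected dirs": route the visible cross-links through a local directory that robots.txt blocks, so compliant bots never follow them through to the sister domain (all paths here are hypothetical):

```
# robots.txt -- keep crawlers out of the redirect directory
User-agent: *
Disallow: /go/
```

```html
<!-- The visible link goes through the blocked directory; the server
     then 301s /go/uk-site on to the sister domain (path hypothetical) -->
<a href="/go/uk-site">Visit our UK site</a>
```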
I also *really* agree with tedster's comment about not trying to hide the connection between the sites. The point isn't to hide it; that can make things look sneaky when that was not the intent. The intent is purely to avoid getting tripped up by automated processes related to selecting and ranking sites for search queries.