| 5:49 pm on May 17, 2008 (gmt 0)|
Try displaying the flag links via an iframe. That way the user sees them on every page but they only appear in the source code of one url - the source of the iframe.
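A minimal sketch of that setup (filenames and domains hypothetical). Every page embeds the same iframe, but the anchor tags themselves appear only in the source of the one iframed URL:

```html
<!-- Repeated on every page of the site: -->
<iframe src="/includes/flag-links.html" width="220" height="30"
        frameborder="0" scrolling="no" title="Site language/region"></iframe>

<!-- /includes/flag-links.html -- the only URL whose source contains the links: -->
<html>
  <body>
    <a href="http://www.example.de/"><img src="/img/flag-de.gif" alt="Deutsch"></a>
    <a href="http://www.example.fr/"><img src="/img/flag-fr.gif" alt="Francais"></a>
  </body>
</html>
```

One caveat: the iframed page is itself a crawlable URL, so some people also add a meta robots noindex to it if they don't want it showing up in the index on its own.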
| 8:24 pm on May 17, 2008 (gmt 0)|
Thought sitewide iframes were an even bigger red flag than sitewide links?
| 8:43 pm on May 17, 2008 (gmt 0)|
I use sitewide iframes frequently, with no problems whatsoever to date. Not sure where you heard that they were a problem -- but if they have been trouble for some people, I guess it depends on the intent and content of those iframes. Certainly iframes are used in injection attacks, to insert cross-domain links for various exploits and the like. Iframes of "zero by zero" size are also used in some spamming attempts. But neither of those are the kind of footprint I'm talking about at all.
I use iframed urls for sets of "customer service" utility links, for example. That helps to sculpt PageRank without joining the "Cult of the Nofollow" ;)
I also commonly use iframes for large chunks of boilerplate text, such as legal disclaimers. I prefer having some indexable text rather than using an image file, and iframes have served quite well.
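The same pattern works for the utility links and boilerplate described above (paths hypothetical). The parent page's source carries no anchor elements for those links at all, and since one iframed file is reused sitewide, editing it updates every page at once:

```html
<!-- Parent page: the block is referenced, not inlined -->
<iframe src="/includes/customer-service.html" width="180" height="220"
        frameborder="0" title="Customer service links"></iframe>

<!-- /includes/customer-service.html holds the actual links
     and legal boilerplate, as real selectable text rather than an image -->
```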
| 7:11 pm on May 18, 2008 (gmt 0)|
|That helps to sculpt PageRank without joining the "Cult of the Nofollow" ;) |
Glad I'm not the only one. I've started slowly removing all the nofollow tags from my sites; I no longer feel a leash around my neck. Hopefully one day the web and Google will go the way of Russia and the Romanovs. Keep strong my brothers, freedom is coming ... ;)
Could be that all the bad vibes from iframes were coming from less-than-white SEOs that may have abused them. I'll revisit some threads and re-evaluate.
| 8:23 pm on May 18, 2008 (gmt 0)|
It disturbs me to do things like iframes or nofollows just to set completely reasonable links. However, in reality, my attitude may lead to suboptimal SERPs from my POV.
| 5:15 am on May 20, 2008 (gmt 0)|
|It disturbs me to do things like iframes or nofollows just to set completely reasonable links. |
I agree, I think this is over the top for legit links. I've used a few sitewide links sparingly between related sites and I haven't noticed any penalties. I do put them up gradually, however (not all sites linking to a new one at the same time; I wait a few months).
It just seems unnatural to me to use user-unfriendly solutions just to avoid offending some search engines. A cart-before-the-horse type of situation.
| 7:41 am on May 20, 2008 (gmt 0)|
I see big companies massively using sitewides to cross-promote their websites.
If it makes sense, I see no reason to avoid it.
Sitewide links are a no-no in terms of link buying, but not in terms of cross-linking between real sites.
| 7:56 am on May 20, 2008 (gmt 0)|
|I see big companies massively using sitewides to cross-promote their websites. |
Big companies tend to have fewer filters applied against them. The rest of us who don't have Matt Cutts' cell phone number can get 950'd for a few simple semantic errors!
| 12:56 pm on May 20, 2008 (gmt 0)|
Google's own Blogger software encourages sitewide links.
Would Google penalise users of its own resources?
I use blogrolls and I have links on blogrolls, and I have not noticed any negative effects. I am aware, though, of other people claiming negative effects from this type of linking.
Is this natural linking?
Yes, I believe it is. Blogging software has been designed to help non-html users get on the web and publish. They have the option to add links, many do. Surely this is natural linking?
| 1:06 pm on May 20, 2008 (gmt 0)|
In reference to <iframes>, won't they cause an alert with security software? You know, that "do you want to allow scripting across frames" type thingy?
<added> Never mind, this only applies when the iframe is pulling third party content and not a page from your own site.
[edited by: pageoneresults at 1:40 pm (utc) on May 20, 2008]
| 1:25 pm on May 20, 2008 (gmt 0)|
"It just seems unnatural to me to use user-unfriendly solutions just to avoid offending some search engines. A cart-before-the-horse type of situation."
That was entirely my opinion until my income plunged. Google is just too powerful to go against. I've started removing sitewide links between my sites, although I'm still not sure they really get penalized.
The iframe idea sounds good to me.
| 1:51 pm on May 20, 2008 (gmt 0)|
Here's my thinking - I try to create the clearest possible relevance signal on every page. So I work to limit any content, including links, that isn't truly page-specific.
Since my first website, I have felt that short, concise menus perform best and are friendlier for the user in any case. I also feel that "utility" links should be easy for the user to find but minimized in the signals they send to automated programs. Throw in the cross-domain factor, and my concerns go into overdrive.
I do have some "networked" sites that link cross-domain in a minimal way without nofollow or iframes, and I never tried to hide the relationship. The cross-links have been in place since the sites launched, and they are there for the user. I always expected Google to at least minimize the link juice between these clearly related sites. After all, the Hilltop algorithm has a long history, and Google acquired it in 2003.
[edited by: tedster at 10:16 pm (utc) on May 22, 2008]
| 9:54 pm on May 22, 2008 (gmt 0)|
Indeed. The kind of cross-linking described in the OP can cause some problems with G's algos. I believe it dates back to all those spammy mini-nets and maxi-nets that used to get created six or seven years ago, and subsequent attempts to minimize or exclude the impact of those setups.
But it has created this on-going problem for sites offering regional variations via country specific domains. The issue is especially bad when it involves the same language across multiple regions, e.g., a US .com and a British .co.uk. But it isn't limited to that.
The result is dup issues, cross-linking issues, etc. There are a lot of confusing variations to the problem too, involving language, presence of local-specific content, IPs, backlink profiles, authority levels of the site(s), and more.
Unfortunately, site owners with links to sister domains in other countries need to find ways of limiting the visibility of those links to bots, at least at the deep page level (if not completely sitewide), especially when the sites are same language.
Personally, I favor creating a central corporate site in this case, using that site to handle the visible links... and obfuscating any cross-links on the regional sites, just to be safe. To hide the links, I used to recommend JS, but now I'm with tedster on iframes. And/or, I also like redirects through protected dirs.
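One common way to do the "redirects through protected dirs" part (directory name and target hypothetical): disallow a redirect directory in robots.txt, then route the visible cross-links through it so compliant bots never follow them.

```html
<!-- robots.txt on the regional site:
     User-agent: *
     Disallow: /go/
-->

<!-- Visible link routes through the blocked directory;
     /go/uk is a server-side redirect to http://www.example.co.uk/ -->
<a href="/go/uk">Visit our UK site</a>
```

The link still works normally for users, while bots that honor robots.txt are kept out of the /go/ directory.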
I also *really* agree with tedster's comment about not trying to hide the connection between the sites. The point isn't to hide it; that can make things look sneaky when that was not the intent. The intent is purely to avoid getting tripped up by automated processes related to selecting and ranking sites for search queries.