Forum Moderators: Robert Charlton & goodroi
So, as a solution, I started to link to Google instead. I would come up with a carefully crafted Google search that would surface the page(s) I wanted to link to. By doing this, when one of my previously linked-to pages disappeared, Google would automatically present the next best page(s).
My perceived advantages are:
1. I don't have to spend hours and hours checking for link rot,
2. Google doesn't see any dead links when they index my pages, and
3. I certainly don't have to worry about "no follow" (ha-ha). I don't think that Google would ever get upset by my linking to THEM!
Do you see any disadvantage to this approach? It seems to be working quite well.
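For reference, the kind of link described above is just a Google web-search URL with the crafted phrase as the q parameter. A minimal Python sketch (the URL pattern and exact-match quoting are standard Google query syntax; the function name is illustrative):

```python
from urllib.parse import urlencode

def google_search_link(phrase: str, exact: bool = False) -> str:
    """Build a Google web-search URL for a hand-crafted phrase.

    If `exact` is True, the phrase is wrapped in quotation marks so
    Google matches it verbatim, as described in the post above.
    """
    query = f'"{phrase}"' if exact else phrase
    return "https://www.google.com/search?" + urlencode({"q": query})

# Example: a link that asks Google for the current best pages on a topic.
link = google_search_link("victorian greenhouse restoration techniques", exact=True)
```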
Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
Matt Cutts also discusses this in his blog...
Search results in search results
[mattcutts.com...]
Google already does similar things with our web search results, Froogle, etc. to try to prevent our web search results from causing problems for any other engines’ index. In general, we’ve seen that users usually don’t want to see search results (or copies of websites via proxies) in their search results.
Google considers how a site's content adds to the user experience. In the light of the above, it's unlikely that linking to Google would reflect well on your site.
. . it's unlikely that linking to Google would reflect well on your site.
Wow. It looks like I have to rethink my strategy. I thought I was doing the right thing, but maybe not. I certainly don't want to be penalized for using what I thought was a "good for the user" tactic.
It looks like I will be linking out far less frequently, because I am getting tired of chasing link rot. And I CERTAINLY don't want to do anything that Google disapproves of, knowingly OR UNKNOWINGLY!
Not sure how your users would react, but on some of my clients' sites I have set up a way to report broken links.
It is just a little icon that appears next to external links; clicking it brings up a box asking if they want to report the link as broken. If they click yes, it sends me an email.
It isn't perfect but it does help track down bad links and may save you some time. If links can be added via a CMS or user submissions then it is easy to build in the "report broken link" logic.
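The reporting flow described above needs only a tiny server-side piece. A minimal Python sketch, assuming the icon's click handler posts the current page and the flagged URL (a log file stands in for the email step here; all names are illustrative):

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Log file standing in for the "send me an email" step (illustrative).
REPORT_LOG = Path("broken_link_reports.jsonl")

def record_broken_link_report(page: str, target: str) -> dict:
    """Append a visitor's broken-link report to a log file.

    `page` is the page the visitor was on; `target` is the outbound
    link they flagged. A real deployment might email this instead
    (e.g. via smtplib); a log file keeps the sketch self-contained.
    """
    report = {
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "page": page,
        "target": target,
    }
    with REPORT_LOG.open("a") as fh:
        fh.write(json.dumps(report) + "\n")
    return report
```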
Tedster - It happened once to me - a link went pron. I dumped that link instantly. But, it looks like eternal vigilance is required.
phranque - Xenu is PC only, and I am Mac only. However, I have used deadlinksdotcom with good success, but it is very time-consuming. And it doesn't tell you when a link goes "bad", as Tedster mentioned.
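For the automated side, a basic dead-link sweep can be scripted with the Python standard library. This is a sketch, not a replacement for the tools mentioned above - and, as the "went pron" example shows, it only catches links that stop responding, not links whose content has gone bad:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

def is_dead(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL looks dead (HTTP error or no response).

    Caveat: a page that still loads but now shows the wrong content
    will pass this check; only a human can catch that kind of rot.
    """
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except (HTTPError, URLError, ValueError):
        return True

def dead_links(urls, checker=is_dead):
    """Filter a list of outbound URLs down to the ones that appear dead."""
    return [u for u in urls if checker(u)]
```

Injecting the checker keeps the sweep testable without network access; pass the default `is_dead` for a real run.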
The arguments for linking out -
1. Some visitors may find the link useful
2. ?
"Linking Out" has been weighed in the balance, and been found wanting.
I believe I am going to stop linking out, and I am going to start removing my outgoing links. They are simply too much of a hassle, they can get you in trouble, and they offer little in return.
Thoughts?
And, some of my best performing pages don't have ANY outgoing links. So I don't think there is any penalty for not having outgoing links. And I haven't noticed any "Boost" on my pages that have quite a few outgoing links.
I believe I am going to stop linking out, and I am going to start removing my outgoing links.
Generally users are not fond of having useful features removed.
Do you know how many of your visitors leave your site via your outbound links?
Have you tracked to see which of your outbound links your visitors use?
Which of your outbound links cause your visitors to recommend your site to their friends?
... stop linking out ... start removing ... Thoughts?
Your competitors will appreciate it!
Google does not like search results pages to appear in its SERPs
It was never my intent to have the search results appear in the SERPs. It was my intent to send my visitor to a web page that confirmed and illustrated some point I was trying to make, or to confirm specific data. Since pages change, I thought it was a BETTER OPTION to let Google select the CURRENT best page to make my point, by selecting a very specific search phrase.
When I check the results of the searches I specify, I am happy with the pages that Google chooses to present. If I am not happy, I change the search, so that it DOES reflect exactly what I want. Sometimes, I will use as many as 8 words in my search phrase. If that is not quite specific enough, I will use quotation marks to ensure that visitors will see exactly what I want.
My main purpose in doing this is to avoid sending the visitor to a dead link - a page which WAS excellent, but which is now gone. There is not much "good stuff" on the net that does not appear in multiple places. I am trusting Google to present LIVE links that are dead-on appropriate - IF I can create a very apt search phrase. That has been my theory, and goal.
They copy and paste and then find it is a dead link. Not convenient for them.
They are simply too much of a hassle, they can get you in trouble, and they offer little in return.
And precisely because of this, outbound links can help tell the good sites from the bad ones.
Any webmaster can do trivial work like stuffing meta tags or increasing font sizes; but finding and maintaining links to external pages which complement your own pages and enrich your users' experience requires knowledge and effort.
If I were Google, I would be trying to identify which sites face the hassles involved in providing useful external links to their users, and would reward those sites accordingly.
I have also been tempted at times to use the Google "site:" operator to link to websites that have a lot of useful content but a lousy or nonexistent internal search facility.
As long as you make it clear to your site visitors what they'll see when they click a given link, this seems totally fine to me.
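The site:-restricted link mentioned above can be built the same way as any other crafted search link. A minimal sketch (the site: operator is standard Google query syntax; the function name and example values are illustrative):

```python
from urllib.parse import urlencode

def site_search_link(domain: str, phrase: str) -> str:
    """Link into Google's index restricted to one site, standing in
    for that site's own poor or missing internal search facility."""
    return "https://www.google.com/search?" + urlencode(
        {"q": f"site:{domain} {phrase}"}
    )

link = site_search_link("example.com", "installation guide")
```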