Nope - in most cases a 301 redirect transfers almost all of the link equity that the original URL had. Those redirected links even get reported to you in Webmaster Tools.
What you can do is just let all the "www" URLs return a 404 Not Found header if you want. However, if you have GOOD backlinks that point to the "www" version, then you would want to request a change from that website.
Thanks tedster. So if I return 404 for the www. version, those links that point to the www. version of my site will not count?
I have a custom 404 error page. Must I use nofollow for the links on that page?
Especially if you are using a custom 404 error page, make sure that you are returning a 404 status in the HTTP header for the original request.
With custom error pages, it's not uncommon to see things like 301 status > 200 status and that will not do the job. In fact, it can get you into deeper trouble.
I don't understand your last comment:
|With custom error pages, it's not uncommon to see things like 301 status > 200 status and that will not do the job. In fact, it can get you into deeper trouble. |
You mean a custom error page that redirects to an existing page?
Using an HTTP viewer, the not-found page gives me the following:
Status: HTTP/1.1 302 Found
I don't think g### cares whether the linkers can actually reach your site or not. They just look at the link; they don't try to follow it. After all, they'll happily list links to pages that don't even exist, or pages that nobody but you can access, like the piwik linkoids I grumbled about recently.
|you mean a custom page error that redirects to an existing page? |
No. For starters, appreciate that default 404 handling ordinarily works like this:
1) A user agent requests a URL
2) The server replies with a 404 Not Found status, plus some bare-bones, human-readable text telling the visitor that the requested page was "not found."
When a custom 404 page is created to go beyond that bare-bones text, many times the actual 404 response in the header gets dropped completely. The server admin or CMS programmer is so focused on the user's experience that they forget about the HTTP header entirely. So then the original request gets this response:
1) A 301 or 302 redirect status, and the redirect points the user agent to
2) A 200 status page delivered with human readable custom information
Notice what happened here. At no time did the user agent (e.g. googlebot) ever get a 404 status - it was never told the original URL is not there! Especially with 302 > 200 status, the potential for total havoc is strong.
This technical error (a soft 404) is so common that Google today tries to adjust for it - and they do at least some of the time. But when they don't, a site's search traffic can go belly up a few weeks later and the site owner has no clue why.
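To make the soft-404 pattern concrete, here is a minimal sketch in Python. It spins up a throwaway local server (the paths `/missing` and `/error-page` are hypothetical, chosen just for this demo) that mimics the broken 302 > 200 chain described above, then checks the final status code a crawler would actually see. It is an illustration of the failure mode, not anyone's production setup.

```python
import http.server
import threading
import urllib.request
import urllib.error

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/missing":
            # Misconfigured custom error handling: redirect instead of 404
            self.send_response(302)
            self.send_header("Location", "/error-page")
            self.end_headers()
        elif self.path == "/error-page":
            # The friendly custom error page is served with a 200 status
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Sorry, that page was not found")
        else:
            # Correct behaviour: a real 404 status for the original request
            self.send_response(404)
            self.end_headers()
            self.wfile.write(b"Not Found")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def final_status(path):
    """Follow redirects (as a crawler would) and return the final status."""
    try:
        return urllib.request.urlopen(f"http://127.0.0.1:{port}{path}").status
    except urllib.error.HTTPError as e:
        return e.code

soft_404_status = final_status("/missing")       # 302 -> 200 chain
real_404_status = final_status("/anything-else") # proper 404
print("soft 404 chain ends at:", soft_404_status)
print("correct handling returns:", real_404_status)
server.shutdown()
```

Note that the user agent following `/missing` never sees a 404 at any point in the chain, which is exactly the problem tedster describes.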
Getting your custom error pages to use the correct HTTP status can be a challenge, depending on your type of server and in some cases, your CMS. When you don't have admin access to the server, you can be in a pickle. Sometimes, when a big web host does things wrong for hundreds of thousands of domains, Google catches that and compensates. Sometimes, but I wouldn't count on it.
|using an http viewer the notfound page give me the following: |
Status: HTTP/1.1 302 Found
Do check that your ErrorDocument is defined like this:
ErrorDocument 404 /pagename.html
and NOT like this:
ErrorDocument 404 http://www.example.com/pagename.html
The Apache manual specifically warns that the latter example will return an incorrect 302 response.
Thanks. I used the control panel configuration; I did not use .htaccess. I will deactivate the control panel setting and use .htaccess instead. Do you recommend that, g1smd?
How can I make all "www" requests return a 404 Not Found header?
Use a RewriteRule configured as an internal rewrite to rewrite all requests to a non-existent path. This action will trigger the server 404 response.
To ensure that this happens only for www requests, add a preceding RewriteCond testing HTTP_HOST for hostname beginning www.
This code will go near the beginning of the .htaccess file.
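A sketch of the rules just described, assuming Apache with mod_rewrite enabled; `/some-nonexistent-path` is a placeholder for any path that genuinely does not exist on the server:

```apache
RewriteEngine On

# Only act on requests whose hostname begins with "www."
RewriteCond %{HTTP_HOST} ^www\. [NC]

# Internal rewrite (no R flag) to a non-existent path,
# which triggers the server's normal 404 response
RewriteRule ^ /some-nonexistent-path [L]
```

On reasonably recent Apache versions you may instead be able to return the status directly with `RewriteRule ^ - [R=404,L]`, but the internal-rewrite form above is the technique described in this thread.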
I have a problem with all these requests to remove a link or to stop them from resolving to the site.
Example: I had a bunch of links built by seocompany. It worked well, but now I have an issue with the anchor text all the links are hyperlinked to. So I go about getting them removed and/or 301'd.
What signal do you think this is sending Google? I got caught and I am now fixing my bad-boy ways. I don't see this fixing anything, really, just telling Google, yes, I am guilty.
Can someone tell me what martin will gain from this? I just don't see this as a fix.
If there is a manual "action" (Google's new word for penalty) against the site, then a Reconsideration Request that lists all the questionable links that were handled can get that manual action lifted.
Ted, if what you say above works, why go through the trouble of calling to get them taken off the site, or setting it up so they point to a 404 page? Just send in the request with all the links the SEOcompany built up and be done with it.
The reason I say this: we have an article directory site, and jeez, the calls we are getting to have the links in the articles removed are crazy. I tell them our site isn't the issue; all the links you added are nofollow anyway, so they don't figure in the equation. They ask, can you remove them anyway? It's like all these website owners are looking for the needle in the haystack.
I won't sidetrack the discussion; I just feel it is a waste of time, IMO.
You certainly can return a 404 or 410 HTTP status on the URLs of your site that were linked to, but that won't stop Google from counting those bad links toward your transgressions, so to speak.
If there's any manual penalty against your "site" (all my experience with this is pre-Penguin; I don't know if algo demotions are quite the same), making the link target URLs disappear one way or the other will not get the penalty lifted. I emphasized "site" because I suspect that a link to a missing page still counts toward the site that page was on, and Google does not want that.
I started returning 410 on the target URLs, but no movement whatsoever occurred for more than two months (all the while RRs were dispatched and even emails were exchanged with G), until some of my requests to remove links became successful and a large abandoned autoblog, which alone accounted for almost 25% of the bad links to my site, had its domain expire and went offline.
In other words, I only saw improvement when the bad links started disappearing from the Web in statistically significant numbers; nothing I was doing on my own site made any difference.
They don't seem to want you to have an easy way out with these types of transgressions. Links are so dear to Larry Page that he's probably making it a company policy to punish any funny business related to links. They are still the core of their ranking algorithm, so I can't really blame him [too much].