| 1:51 am on Mar 25, 2013 (gmt 0)|
I personally think people stress too much over links. (I've recently put pages in the top 5 by adding them to an XML sitemap, without another mention anywhere on the Internet, which says to me links are dying (or other factors are increasing) as ranking signals.)
I've actually done the same thing you're talking about, not because of unnatural links but for other reasons, and the new version with reworked HTML is doing exactly what it should. I 410ed the old page rather than 404ing it, though, because I wanted to say "I removed this page intentionally".
In your situation, I'd probably do exactly what you're asking about, except I'd 410 rather than 404 the old page.
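For anyone wanting to try the same, here's a minimal sketch of returning a 410 for a retired URL, using Python's standard-library http.server; the path /old-page is a made-up placeholder, not anyone's real URL. (On Apache, the usual one-liner is `Redirect gone /old-page` in .htaccess.)

```python
import threading
import http.client
from http.server import BaseHTTPRequestHandler, HTTPServer

GONE_PATHS = {"/old-page"}  # placeholder path for this sketch

class GoneAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_PATHS:
            # 410 Gone: tells crawlers the removal is deliberate and permanent
            self.send_response(410)
            self.end_headers()
            self.wfile.write(b"This page was removed intentionally.")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"OK")

    def log_message(self, *args):
        pass  # silence per-request logging in this sketch

# Quick self-check on an ephemeral port
server = HTTPServer(("127.0.0.1", 0), GoneAwareHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/old-page")
status = conn.getresponse().status
print(status)  # 410
server.shutdown()
```

The difference from a 404 is only the status line, but as discussed above, crawlers can treat the two differently.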
| 2:20 am on Mar 25, 2013 (gmt 0)|
Thanks for the tip on 410ing it; that sounds better than a Not Found error. Maybe Googlebot will stop crawling this page expecting it to come back if it gets a "removed intentionally" message.
But I wonder if this action will do me any good in Google's eyes, because the unnatural links will still be out there; they'll just be pointing at a 410ed page.
| 2:26 am on Mar 25, 2013 (gmt 0)|
|... maybe googlebot will stop crawling this page expecting it to come back if it gets a "removed intentionally" message. |
In my experience they usually stop crawling it and remove it from the index sooner with a 410.
|But I wonder if this action will do me any good on google's eyes, because the unnatural links will still be out there, just now they are linking to a 410ed page. |
They have said repeatedly that broken links to your site don't hurt, so by "breaking the links" intentionally, I would guess it will be interpreted algorithmically the way you intend. The only other real options you have are to get all the links removed or to disavow them.
I'd personally go the route you're talking about first, as long as it doesn't impact visitors (in other words, if no one clicks on the links to the page), and see what happens. Then, if necessary, I'd try to have the links removed or disavow them as a "last resort". My personal opinion is that if I were writing an algo, and the links to a page were broken and never updated, it would seem silly to count them against the site, because doing so would likely open up loopholes for negative SEO along those lines.
The one thing I would definitely not do in your situation is redirect the page to a new location.
| 2:49 am on Mar 25, 2013 (gmt 0)|
If the links are completely loathsome you might even be better off with a 404, because then it just makes the linker look bad :) There are also the human visitors to consider: are the repeat visitors (bookmarks and so on) worth keeping? Humans don't know from URLs; they just know "this good stuff used to be here, and now I get a page saying it's been removed." Hit a 404 and you might think there's a mistake in your bookmark, so you stick around and search the site.
Google stops crawling 410s pretty soon, but Bing really doesn't seem to make a distinction. I've got whole directories where the only 410s are from the bingbot.
| 2:54 am on Mar 25, 2013 (gmt 0)|
|now they are linking to a 410ed page |
To be exact, their link points to a URL that gets a 410 response from the server. That's a bit of a difference, because their link no longer points to "a page."
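If you ever want to verify exactly what response a linked URL is handing back, a short stdlib sketch (the example URL is hypothetical):

```python
import urllib.request
import urllib.error

def status_of(url):
    """Return the HTTP status code a URL responds with."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urlopen raises on 4xx/5xx; the code is on the exception
        return err.code

# e.g. status_of("https://example.com/old-page") would report 410
# once the page has been removed with a Gone response
```

Handy for confirming the server really does say 410 and not a soft 404 or a redirect.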
| 4:01 am on Mar 25, 2013 (gmt 0)|
I think the links are not that bad; it's a link in a widget that several users put on their blogs. Most of the time the blogs are related to my site and the anchor text is my site's name; it's just that the link in the widget used to point to this specific page I want to remove, which isn't related to the widget. I corrected this long ago, and now the widget link points to the page where users can get the widget. I'm not sure if I should point the link to my domain's home page instead of the widget page.
Visitors to this page are mostly people browsing from other pages of the site; because the site is small, some users will eventually land on this page even if I change its URL.