| 10:46 pm on Dec 31, 2013 (gmt 0)|
#1 You gotta make up your mind. Either it's a bad link or it isn't. If there is a real possibility of human traffic, it isn't a bad link.
#2 Redirecting anything to google search-- or, for that matter, to any unsuspecting third-party site-- is never a good idea.
|and then it will link to a 404 page |
Not unless you've made a serious error in coding. The request will get a 404 response, which in turn will lead to the human visitor seeing the content of your 404 page.
| 10:57 pm on Dec 31, 2013 (gmt 0)|
|So I am wondering if it would hurt anything to redirect those bad pages to Google search that will show them the new pages.. This way it doesn't pass the bad juice to the page but might still send the visitor back to me? |
I read this the same way as Lucy24 initially, but the second time I think you're talking about cloaking [redirecting Googlebot requests to a different page, but not everyone] and not actually sending the visitors to Google when they request a page.
If that's the case, just don't do it -- disavow the bad links instead. I know "everyone" seems to think "it's cool to outsmart Google" or "use the latest trick", but there's really no point in doing anything tricky when a simple disavow file will likely do the job. In this case, redirecting Googlebot will backfire: the redirect carries the links along with it, leaving you in exactly the same situation except with 10% to 15% less inbound link weight than you have now.
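For anyone who hasn't used it, the disavow file is just a plain text file you upload through Google's Disavow Links tool. A minimal sketch of the format (the domain and URL here are placeholders):

```
# Disavow every link from an entire domain
domain:spammy-example.com
# Disavow a single linking page
http://another-example.com/bad-link-page.html
```

Lines starting with # are comments; everything else is either a `domain:` entry or a full URL.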
| 11:09 pm on Dec 31, 2013 (gmt 0)|
So cloaking is redirecting them away from my site? But to a Google search where they can find what they are looking for in the new page.. I've done the disavow, just can't get the links removed.. didn't really want the person, if ANY, going to a 404 page..
[edited by: Robert_Charlton at 11:17 pm (utc) on Dec 31, 2013]
[edit reason] fixed typo at poster request [/edit]
| 11:39 pm on Dec 31, 2013 (gmt 0)|
But if they're bad links, are you getting valuable traffic from them?
Cloaking is showing something different to Googlebot than you do to human types. Not sure if that's what you're going for or not; it's a bit confusing.
| 12:54 am on Jan 1, 2014 (gmt 0)|
|didnt really want the person if ANY going to a 404 page.. |
Why don't you make a really good custom 404 page that has links to these new URLs? This should solve your problem.
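On Apache, wiring up a custom 404 page is a one-line directive in .htaccess or the server config. A minimal sketch (the page path is a placeholder; point it at whatever page you build):

```apache
# Serve a custom error page while still returning the 404 status code
ErrorDocument 404 /custom-404.html
```

One caveat: use a local path, not a full URL. If you give ErrorDocument a full http:// URL, Apache issues a redirect instead of returning the 404 status, which defeats the purpose.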
| 10:40 pm on Jan 1, 2014 (gmt 0)|
I know it's now fashionable to think that 404 is treated the same as 410, and that Google (was it John Mueller?) even confirmed this, but we still get better results with 410 and would always choose it over 404 for killing bad links in their tracks. I haven't done any recent testing, but I noticed quicker dropping with 410.
| 11:44 pm on Jan 1, 2014 (gmt 0)|
|404 is treated the same as 410 and that Google (was it John Mueller?) even confirmed this |
This sounds like another case of google's left hand not knowing what google's right hand is doing.
404 = "I have no idea"
410 = "Yes, I know, stop bugging me"
Food for thought: Is it possible that 404 and 410 are only viewed differently on sites that are known to use both? 404 happens by default; 410 requires a deliberate action.
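Since 404 happens by default and 410 takes a deliberate step, here's what that step can look like on Apache (the URL paths are placeholders for your own removed pages):

```apache
# Return 410 Gone for a specific removed page (mod_alias)
Redirect gone /old-page.html

# Or return 410 for a whole removed directory (mod_rewrite)
RewriteEngine On
RewriteRule ^discontinued/ - [G]
```

Both `Redirect gone` and the `[G]` rewrite flag send a 410 Gone response instead of the default 404.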
| 3:09 am on Jan 2, 2014 (gmt 0)|
|This sounds like another case of google's left hand not knowing what google's right hand is doing. |
Actually, in the wording I remember hearing, they have said essentially "they treat them about the same way", but that does not mean exactly the same way at exactly the same time.
IME and AFAIK, over time a 404 ends up being treated the same way a 410 is treated initially. So what I've heard them say is true: "404 and 410 are treated/handled about the same way." In the long run it really doesn't matter which is used, but in the world of "instant gratification", a 410 has always worked better [faster] in situations where I've tested both.