| 11:21 am on May 1, 2012 (gmt 0)|
"Abandon ship," said the Russian captain.
| 9:19 pm on May 1, 2012 (gmt 0)|
I was hoping somebody would have some insight into or thoughts on this. I am just going to wait and see what happens for now but I am thinking longer term if I can't get things fixed it might be a worthwhile switch.
| 9:26 pm on May 1, 2012 (gmt 0)|
I like the idea, but why not 301 to a more substantial subdomain, like better.example.com (not non-www)? Google might see the www and non-www as quite similar...
| 11:07 pm on May 1, 2012 (gmt 0)|
Did you get a direct message from Google in December? Did you reply to them if so?
| 7:51 am on May 2, 2012 (gmt 0)|
Wouldn't a 301 pass on the bad link juice?
No, that is when the penalty was initiated. I received an email (one that could be replied to) in February stating that it was related to inorganic links, and I have been busy trying to remove them and sending reconsideration requests (or replying to the email). Again, www and non-www are treated as separate entities in WMT and each has different links pointing to it.
| 5:19 pm on May 2, 2012 (gmt 0)|
I suppose I will ask a few 'simple' questions to try to work this out.
I know it is a bit unconventional but this is an unconventional situation I suppose. Really what I want to do is 'start again' as far as linking is concerned with minimal direct impact to my visitors.
If I were to implement a 302 redirect from www to non-www would this block off 'link juice' flow (good and bad) whilst allowing the non-www to accumulate and build up authority of its own? Would rankings depend on the new non-www authority that is building up rather than the previous 'www' authority?
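For what it's worth, the 302 itself would be trivial to put in place. This is only a sketch, assuming an Apache server with mod_rewrite enabled and example.com standing in for the real domain:

```apache
# .htaccess sketch: temporary (302) redirect from www to non-www.
# Assumes mod_rewrite is available; example.com is a placeholder domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=302,L]
```

The `R=302` flag makes the redirect explicitly temporary; without a number Apache would still default to 302, but being explicit avoids any accidental upgrade to a 301 later.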
| 5:29 pm on May 2, 2012 (gmt 0)|
If you are mostly concerned about your visitors, then just ignore what G is telling you and concentrate on quality content and a good visitor experience. We webmasters have no control over who links to us.
If G thinks some of those backlinks are not up to their high standards, then those backlinks should be ignored. It is not our job to clean up their SERPs.
We already serve them our content at no charge; maybe if we started charging G for using our content, they would put more appreciation into this process. What do you think?
| 5:37 pm on May 2, 2012 (gmt 0)|
Whilst I subscribe to this logic myself the reality of the situation is that Google has penalised the website for external factors (links) which are 'outside the quality guidelines'. This new 'Penguin' update seems to have simply intensified this. There is always the option to just sit tight and see what happens but I can't help thinking it would be easier to start again. At least in a way that as far as I'm aware can be reverted if I can't get where I need to.
I certainly think I've got the content element under control and my website has not been a target of any of the recent Google Panda updates. I can also redirect and build upon the powerful links and know I will get plenty of natural backlinks in the future. At the moment I am getting a fair whack of my traffic from people discussing my website on various forums and in that sense I am not wholly dependent on Google. I just see a lot of wasted potential at the moment.
| 6:41 pm on May 2, 2012 (gmt 0)|
|Wouldn't a 301 pass on the bad link juice? |
it may or may not....
| 3:01 pm on May 3, 2012 (gmt 0)|
Any thoughts on my second-to-last post re 302 redirects? The other option is simply a custom manual redirection to the new home page.
| 5:34 pm on May 3, 2012 (gmt 0)|
A 301 WILL definitely pass bad links. You're going to be clean for a week or two and may see improvements, but eventually everything will come back to what you have now. If most of those spam links point to the main page, there is nothing you can do but remove those links or abandon the domain.
| 5:52 pm on May 3, 2012 (gmt 0)|
Well, that is my point and certainly how I thought the 301 would act. The links only point to the www version of the domain; the non-www has a completely separate link profile, and they are effectively treated as different domains (or subdomains). By switching off the www version the links would point to a dead end; they wouldn't transfer to the non-www version. The logistical challenge here is redirecting the users appropriately if they visit the www version, without the search engines passing on the link value from www to non-www.
| 6:45 pm on May 3, 2012 (gmt 0)|
How will you block www? Simply choosing non-www in your WMT will not help, and a 301 redirect is useless for this.
| 6:55 pm on May 3, 2012 (gmt 0)|
By removing any useful content from there. To give an extreme example, return a 404. Or a manual redirect. It's the same way you'd move to a new domain without a 301. Google gives clear guidance and indication that the 'www' and 'non-www' versions of the website are treated as separate entities and are ranked separately, just like different domains. A 301 is out of the question because it transfers everything (including link attributes and profile), but the other options aren't. And I wouldn't set anything as a preferred domain in GWT because I don't want link power to transfer.
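Something like this in .htaccess is the kind of 'dead end' I mean. A sketch only, assuming Apache with mod_rewrite and example.com as a placeholder; the `G` flag answers with 410 Gone:

```apache
# .htaccess sketch: answer every request on the www host with 410 Gone,
# while the non-www host keeps serving content normally.
# Assumes mod_rewrite; example.com is a placeholder domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^ - [G,L]
```

Because the rule matches on the Host header rather than the path, the two hostnames can share one docroot and still behave like separate sites.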
| 7:35 pm on May 3, 2012 (gmt 0)|
If you have another, unimportant site, test it out there; everything looks legit from what I see.
| 7:47 pm on May 3, 2012 (gmt 0)|
That's a good idea. I'll see if I can do that.
| 7:23 pm on May 4, 2012 (gmt 0)|
This might be a stupid question, but a 404 page would cut off all pagerank/trust metric flow, wouldn't it? I mean, if you link to various pages from a page that returns a 404 response code, those links are purely for the users, not the search engines?
| 8:08 pm on May 4, 2012 (gmt 0)|
You got that right. A 404 as an HTTP status in the header is exactly as you said. It is not actually a "page", and the text message/content that is returned means nothing. Google just sees that the URL is "not found".
| 1:06 pm on Sep 27, 2012 (gmt 0)|
Just an update on this. I have implemented this 410 response for the 'www' version of my website after requesting removal of the site on Google. I then put up my content on the 'non-www' and things seemed good... At first.
It seems Google is trying to crawl the www pages quite frequently. It should have been seeing these 410 errors for a few months now. Reading a little more into things, one of Google's own help articles mentions that you also need a robots.txt block in place if you want to remove an entire site or directory. I can't really create a robots.txt specific to the 'www' version of my site (can I?), so I am wondering what to do here. How long does it take before Google actually 'gives up' on the www version? Or do I really have to move to a new domain here?
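On reflection, it may be possible to serve a host-specific robots.txt with a rewrite rule. A sketch, assuming Apache with mod_rewrite; robots-www.txt is a made-up filename for a file that would contain `User-agent: *` / `Disallow: /`:

```apache
# .htaccess sketch: serve a blocking robots.txt only on the www host.
# robots-www.txt is a hypothetical file holding the Disallow rules;
# example.com is a placeholder domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-www.txt [L]
```

The catch is ordering: this rule would have to come before any blanket 410 rule for the www host, otherwise robots.txt itself would return 410 and never be fetchable.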
| 1:17 pm on Sep 27, 2012 (gmt 0)|
|It should have been seeing these 410 errors for a few months now. |
I can say from experience that Google will request those pages for years to come, so I think you should keep the 410 up. Are you seeing improved rankings with the redirect in place?
| 1:29 pm on Sep 27, 2012 (gmt 0)|
For some pages, yes. Well, I was at first. But I have been going through what I thought were my 'good' links and making sure they point to the non-www version... But now I have a 'manual action' on the site related to unnatural links. I can't work out whether it is due to links that I re-pointed, newly acquired links, or Google actually looking at some links from the www site. This has basically muddied the water completely.
I do see some examples where WMT lists a link "via this intermediate link" and the intermediate link is a 'www' version of the page. I haven't redirected in any way and this concerns me because it is as if they are redirecting automatically. I have no preferred domain set in WMT and I am serving a 410 on all of the www pages so I don't know what is going on.
Edit: I've had a good look through in WMT and actually a lot of instances of 'via this intermediate link' are there. I have a feeling something has gone wrong here.
| 7:03 pm on Sep 27, 2012 (gmt 0)|
|I haven't redirected in any way and this concerns me because it is as if they are redirecting automatically. |
Time to fire up an HTTP Header Checker and see for sure what's happening on your server. I use a Firefox extension, but there are others, including free online checkers.
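If you'd rather script it, a few lines of Python show the same thing a header checker reports. This sketch uses a throwaway local server answering 410 in place of the real www host, so nothing live is touched; swap in your own hostname and a plain GET/HEAD to check a real server:

```python
# Sketch: inspect an HTTP status the way a header checker does.
# A throwaway local server stands in for the www host; no real domain is touched.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class GoneHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(410)  # 410 Gone, as the blocked www host should return
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), GoneHandler)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("HEAD", "/any-old-page")
response = conn.getresponse()
print(response.status, response.reason)  # prints: 410 Gone
server.shutdown()
```

A HEAD request is enough here because, as discussed above, only the status line matters; any body on a 410 is ignored.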
| 10:08 am on Sep 28, 2012 (gmt 0)|
It definitely returns '410 Gone' according to any header check I run.
It could be that the original URLs weren't properly removed from the index and Google has sort of redirected automatically. The number of 'via this intermediate link' instances and just the rankings of certain pages seems to indicate this might be what has happened.
I am running a little test at the moment. I have the 410 returned on all 'www' pages. I have requested removal of a specific page from the www version (it isn't displaying in the search results of course - only non www pages are). Strangely this has caused the non-www version to drop from the index. What's up with that? It is like no matter what I do it is viewing them as one and the same... Although the list of URLs removed in this tool is unique for both the 'www' and 'non-www'.
| 5:56 pm on Sep 28, 2012 (gmt 0)|
Well this is very interesting. I posed a question to John Mueller about this via the Webmaster Central office-hours, September 28, 2012 (b) hangout. He addresses the question 16 minutes in and basically says that Google does equate them as one domain. Returning a 410 on the affected version can work 'to some extent' but isn't as clean as starting on a new domain. He specifically said that algorithmically it is treated as one website!