| 9:00 pm on Dec 15, 2010 (gmt 0)|
It has been widely debated (and I agree with the idea) that any link with a 301 between it and its destination is worth less than a direct link.
The 301s could have reduced the power of your backlink profile.
| 9:14 am on Dec 16, 2010 (gmt 0)|
There have been many experiments on this; it does reduce the value by about 10-15%, and the reduction grows over time.
As a rule, I don't redirect a high-ranking page if I can avoid it; you are asking for trouble doing that.
| 9:56 am on Dec 16, 2010 (gmt 0)|
|there have been many experiments on this and it does reduce by about 10-15% |
Welcome to WebmasterWorld headlesschicken.
Is the 10-15% your personal estimation based on your gut, or are there actual experiments out there that have each shown a reduction in that range?
I'd like to see these experiment reports, because I couldn't disagree with the numbers more.
|and reduced even more over time. |
Again, is this from experiments you've read about, or is it your own personal opinion/guess?
Again, I couldn't disagree with this more and have never actually heard of this theory before today.
| 12:11 pm on Dec 16, 2010 (gmt 0)|
Thanks for the input...
I tend to agree - as Google had indexed the non-www version we should have stuck to that and redirected www to non-www.
However, it's been done now and we need to find a way to minimise the damage.
We were getting around 3,500 uniques a day this time last month; yesterday we got our lowest yet of 1,006.
In GWT I've made sure that the www version of the domain is set up the same as the non-www version. The domain preference has also been set.
The big difference is that for the non-www version we have over 232,000 links, while the www version shows just under 83,000.
Yes, we have set up 301s for the non-www content, so eventually part of the link juice should carry over...
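For anyone following along, this is roughly what the non-www to www canonicalization rule tends to look like in .htaccess, assuming Apache with mod_rewrite and a made-up example.com domain (your CMS or host may do this differently):

```apache
# Hypothetical non-www -> www 301 in .htaccess (Apache, mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what makes it a permanent redirect rather than the default 302, which matters for how the engines treat it.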
| 12:41 pm on Dec 16, 2010 (gmt 0)|
Have also just seen there are loads of internal links using the non-www version of the URL.
I know... I would be using relative URLs, but the site is run off a CMS, and it uses the full URL.
Fixed those links now so they are all the www version.
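For anyone in the same boat, here's a rough sketch of how you could audit a page for leftover non-www internal links using only Python's stdlib HTML parser (the hostname and markup here are made up, and a real audit would crawl rendered pages):

```python
from html.parser import HTMLParser

class NonWwwLinkFinder(HTMLParser):
    """Collect href values that point at the bare (non-www) hostname."""
    def __init__(self, bare_host):
        super().__init__()
        self.bare_host = bare_host
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            # Flag links containing the bare host but not the www host.
            if (name == "href" and value
                    and self.bare_host in value
                    and "www." + self.bare_host not in value):
                self.hits.append(value)

# Hypothetical page fragment with one link still needing a fix.
page = ('<a href="http://example.com/page1">one</a> '
        '<a href="http://www.example.com/page2">two</a>')
finder = NonWwwLinkFinder("example.com")
finder.feed(page)
print(finder.hits)  # the non-www links left to clean up
```

Run against each template or cached page, this would surface any internal links your CMS is still emitting without the www.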
| 3:30 pm on Dec 16, 2010 (gmt 0)|
We took one of my clients the other way - from www to non-www - in September. There were a lot of other changes at the same time (complete platform change, URLs changed - everything). Google seemed to pick up the changes quite quickly (except that for some reason the home page still displays as www), but we did experience a drop in non-branded traffic. On the upside, it started climbing sharply again right after Thanksgiving, and right now it is close to where it was before we launched - and still climbing. We're liking what we see so far.
| 4:51 pm on Dec 16, 2010 (gmt 0)|
|Have also just seen there are loads of internal links using non-www version of the url. |
IMO, this is the problem, and changing these should help quite a bit (assuming, that is, that this is the only problem you had). Once upon a time, I used to think that the purpose of the 301 was to fix internal coding errors, but Jim Morgan straightened me out on that one. ;) Your internal coding needs to be right. The 301 is to canonicalize external requests.
|I know...I would be using relative urls but the site is run off a CMS. It uses the full url. |
Actually, I think you're better off using full absolute URLs rather than relative URLs, but you've got to make sure they're correct and consistent. I think there's more chance for errors with relative URLs, particularly when you deep-link in a way that skips some levels in your directory structure.
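To illustrate the kind of error I mean, here's how relative URLs resolve against a deep page, shown with Python's stdlib urljoin (the paths are hypothetical):

```python
from urllib.parse import urljoin

# A page two levels deep in a made-up directory structure.
base = "http://www.example.com/widgets/blue/index.html"

# Suppose the author wanted the site-root /images/logo.png:
# one "../" too few resolves inside /widgets/ instead.
print(urljoin(base, "../images/logo.png"))     # http://www.example.com/widgets/images/logo.png
print(urljoin(base, "../../images/logo.png"))  # http://www.example.com/images/logo.png

# An absolute URL leaves no room for that mistake.
print(urljoin(base, "http://www.example.com/images/logo.png"))
```

The deeper and more uneven the directory structure, the easier it is to miscount those "../" hops, which is why consistent absolute URLs are the safer bet.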
|Is the 10%-15% your personal estimation based on your gut or are there actually experiments out there that have each proven a percentage between 10%-15% to be the reduction? |
|I'd like to see these experiment reports, because I couldn't disagree with the numbers more.... |
A loss from 301 redirects has been confirmed and discussed at length in the thread below. The 15% figure is theoretical, and probably comes from the last post in the thread. I suggest reading the whole discussion, though, to see how that was arrived at, and where it might or might not apply....
301 Redirect Means "Some Loss of PageRank" - says Mr Cutts
| 5:30 pm on Dec 16, 2010 (gmt 0)|
I have many test domains that I conduct all sorts of experiments on, mostly in relation to site interlinking, but I also did a study on 301s. Pre-Mayday, I would agree that the loss from 301s was near zero; however, I have noticed significant reductions since Mayday, which prompted me to investigate it... and I concluded the above. These are not numbers I have plucked out of thin air.
Nothing in SEO is a confirmed science; that's why I conduct experiments myself rather than depend on others' opinions. This is my opinion, however, and as far as I am aware... it's correct.
Thanks for the welcome. I have been reading the forum for years; about time I signed up :)
| 7:16 pm on Dec 16, 2010 (gmt 0)|
I left 2 websites uncanonicalized to see the effects. Over the last 4 years the uncanonicalized sites have outperformed all canonicalized versions by far.
WMT gives us canonicalization tools to help us, sitemap tools, tracking tools, etc. Lots of tools that are supposed to help us, while many who have used Analytics, WMT, sitemaps, etc. have dropped like rocks in the SERPs after years of steady traffic without sitemaps, a robots.txt file, or WMT geo-targeting.
I think it's odd that they haven't implemented a couple of obvious ones.
For example, say you are popular enough that you have a lot of links from Yahoo Answers. At the same time, how do they know you didn't post those yourself trying to gain ranking? Do they pass PR or not? How do they know? It looks like you're gaming the SE, but it might be legit popularity, or it might be a black hat trying to make you look like a forum spammer.
They should give us tools like these. The reason they don't? I can't be sure, but I think it's got to do with money.
- disregard these BS links from buddies' obvious black-hat sabotage sites linking to nonexistent URLs.
- an htaccess tool to suggest you deny by IP, user agent, etc....
They know who the BS servers are, the bad IPs, all of it. They could do it, but they don't. Sure, it might be complicated to implement, but I think they've proved they can do about anything they decide to, except return consistent results - and of course that may be by choice.
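For reference, this is the sort of .htaccess rule such a tool might suggest, sketched in Apache 2.2-era syntax with a made-up IP and user agent string:

```apache
# Deny a hypothetical abusive IP (Apache 2.2 access control)
Order Allow,Deny
Allow from all
Deny from 192.0.2.15

# Refuse a hypothetical bad user agent via mod_rewrite
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule .* - [F,L]
```

The [F] flag returns 403 Forbidden instead of serving the page, so the scraper gets nothing while normal visitors are unaffected.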