Google SEO News and Discussion Forum

Limiting number of redirects - what is this about?
shaunm
msg:4646587
9:32 am on Feb 19, 2014 (gmt 0)

Hi All,

We often say we should limit the number of redirects, since too many redirects will have a negative impact on the site.

But what do we actually mean by "too many redirects"? Are we talking about external or internal redirects?

Link A redirects to Link B

1. Link A is internally linked from various pages - Is that the concern?
2. Link A is externally linked from various websites - Is that the concern?

If Google is the main concern, then it's going to automatically drop Link A and keep only the new redirect target, Link B, in its index.

So what are we actually talking about here? This is something that's been bugging me. Could you please help?

Thanks,

 

goodroi
msg:4646628
11:51 am on Feb 19, 2014 (gmt 0)

When people talk about limiting redirects, they are often talking about two different issues.

Total redirects for a website: you want to avoid having a huge amount of redirect code in your .htaccess, because it can impact site performance. This is not a common problem.

Total redirects for a single URL (URL A redirects to URL B, which redirects to URL C): this is not good because each redirect loses some link juice. It is not a big loss, but you ideally want to avoid it. This is the more common problem, as websites tend to implement new redirects each time they redesign, and the chains start to build up. I have seen old sites whose URLs go through a chain of 7 redirects.
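
For illustration, here is what such a chain looks like in .htaccess next to the single-hop version you want instead (a minimal sketch - the page paths are made up):

# a chain left over from two redesigns - a request for /old-page takes two hops
Redirect 301 /old-page /2009-page
Redirect 301 /2009-page /current-page

# better - point every legacy url straight at the final target, one hop each
Redirect 301 /old-page /current-page
Redirect 301 /2009-page /current-page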

shaunm
msg:4646629
12:05 pm on Feb 19, 2014 (gmt 0)

Thanks @goodroi

Total redirects for a website: you want to avoid having a huge amount of redirect code in your .htaccess, because it can impact site performance


Could you elaborate on how this impacts site performance?

Just as search engine bots look for /robots.txt before doing a site-wide crawl, does the server respond to each request by reading that 'huge amount of code in your htaccess' every time? Is that how it impacts site performance - by taking too much time and wasting server resources?

Total redirects for a single URL (URL A redirects to URL B, which redirects to URL C): this is not good because each redirect loses some link juice. It is not a big loss, but you ideally want to avoid it. This is the more common problem, as websites tend to implement new redirects each time they redesign, and the chains start to build up. I have seen old sites whose URLs go through a chain of 7 redirects


Again, this falls under chained redirects, not what is widely known as 'too many redirects' - right?

:(

lucy24
msg:4646643
12:31 pm on Feb 19, 2014 (gmt 0)

Yes, the server has to read through every last line of code before it can even start acting on a request.

And then, after plowing through all the mods, detouring to look for an .htaccess in every directory in its path, and finding that the request has been blocked six ways from Sunday ... it meets a <Files "robots.txt"> section and has to throw everything out the window :)
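
For reference, the kind of <Files> section meant above looks something like this (a sketch using the Apache 2.2-era access directives; the deny rules it overrides are assumed to sit elsewhere in the file):

# whatever deny rules appear earlier in .htaccess,
# this section re-opens robots.txt so any bot can fetch it
<Files "robots.txt">
    Order allow,deny
    Allow from all
</Files>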

phranque
msg:4646693
4:16 pm on Feb 19, 2014 (gmt 0)

it depends on what you mean by "too many redirects", but they are all bad.

if you have chained redirects, it's bad for the visitor, who has to wait for the extra request(s), and bad for your pagerank, because you're leaking a percentage with each redirect.

if you have too many directives handling redirects in an .htaccess file, it can take too long for the server to process those directives for every request that hits that part of the filesystem.
this is less of a problem if you put these directives in the server configuration file, which only gets parsed once when the server is started, instead of once per filesystem request.
in some cases it's faster to internally rewrite non-canonical requests to a redirect script.
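
for example, a RewriteMap in the server config can replace hundreds of individual redirect lines - a sketch, with a made-up map path and entries:

# httpd.conf only - RewriteMap is not allowed in .htaccess
RewriteEngine On
RewriteMap legacy txt:/etc/apache2/legacy-redirects.txt
# the map file holds one "old-path new-path" pair per line, e.g.
#   /old-page /current-page
#   /2009-page /current-page
RewriteCond ${legacy:%{REQUEST_URI}} !=""
RewriteRule ^ ${legacy:%{REQUEST_URI}} [R=301,L]

apache caches a txt: map in memory and only re-reads it when the file's mtime changes, so lookups stay fast even with thousands of entries.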

if you redirect requests for urls that are not internally linked then it shouldn't be a problem as long as each redirect goes to a url serving equivalent/relevant content.

Link A is internally linked from various pages - Is that the concern?

you don't want to send redirect responses for any internal links.
you should internally link to the canonical url.
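
for completeness, the usual canonical-host rule looks like this (a sketch with a made-up domain) - if your internal links already point at the canonical url, this only fires for outside links and typed-in addresses:

# send every non-canonical hostname to the canonical www host, in one hop
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301,L]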
