Limiting number of redirects - what is this about?

     

shaunm

9:32 am on Feb 19, 2014 (gmt 0)



Hi All,

So we say we should limit the number of redirects, since too many redirects will have a negative impact on the site.

What do we actually mean by "too many redirects"? Are we talking about external or internal redirects?

Link A redirects to Link B

1. Link A is internally linked from various pages - Is that the concern?
2. Link A is externally linked from various websites - Is that the concern?

If Google is the main concern, then it's going to drop Link A automatically and keep only the new redirect target, Link B, in its index.

Then what is it that we are actually talking about? This is something that's been bugging me. Could you please help?

Thanks,

goodroi

11:51 am on Feb 19, 2014 (gmt 0)




When people talk about limiting redirects, they are often talking about two different issues.

Total redirects for a website: you want to avoid having a huge amount of code in your .htaccess because it can impact site performance. This is not a common problem.

Total redirects for a single URL (URL A redirects to URL B, which redirects to URL C): this is not good because each redirect loses some link juice. It is not a big loss, but you ideally want to avoid it. This is a more common problem, as websites might implement redirects each time they redesign and the chains start to build up. I have seen old sites with their URLs going through a chain of 7 redirects.
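
For example (hypothetical .htaccess rules, assuming Apache's mod_alias), a chain like that can usually be collapsed so every legacy URL points straight at the current one:

# chained: a request for /old-page hops twice before reaching the live URL
Redirect 301 /old-page /2010-site/page
Redirect 301 /2010-site/page /2013-site/page

# collapsed: each legacy URL goes straight to the final destination
Redirect 301 /old-page /2013-site/page
Redirect 301 /2010-site/page /2013-site/page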

shaunm

12:05 pm on Feb 19, 2014 (gmt 0)



Thanks @goodroi

Total redirects for a website: you want to avoid having a huge amount of code in your .htaccess because it can impact site performance


Can you please elaborate on how this impacts site performance?

Just as search engine bots first look for /robots.txt before doing a site-wide crawl, does the server respond to each URL request by reading that 'huge amount of code in your htaccess' every time? Is that how it impacts site performance, by taking too much time and wasting server resources?

Total redirects for a single URL (URL A redirects to URL B, which redirects to URL C): this is not good because each redirect loses some link juice. It is not a big loss, but you ideally want to avoid it. This is a more common problem, as websites might implement redirects each time they redesign and the chains start to build up. I have seen old sites with their URLs going through a chain of 7 redirects


Again, this falls under chained redirects and issues not related to what is widely known as 'too many redirects', right?

:(

lucy24

12:31 pm on Feb 19, 2014 (gmt 0)




Yes, the server has to read through every last line of code before it can even start acting on a request.

And then, after plowing through all the mods and detouring to look for an htaccess in every directory in its path and finding that the request has been blocked six ways from Sunday ... it meets a <Files "robots.txt"> and has to throw everything out the window :)
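
Roughly what that looks like in a hypothetical .htaccess (assuming the old Apache 2.2-style access directives): every rule below gets read and evaluated on each request, and only at the end does the <Files> section re-open robots.txt.

# dozens of blocking rules, all parsed and evaluated on every single request ...
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
Deny from 198.51.100.0/24
# (and so on)

# ... and then this section lets robots.txt through no matter what was denied above
<Files "robots.txt">
Order Allow,Deny
Allow from all
</Files>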

phranque

4:16 pm on Feb 19, 2014 (gmt 0)




it depends on what you mean by "too many redirects", but they are all bad.

if you have chained redirects, it's bad for the visitor who has to wait for the extra request(s) and for your pagerank because you're leaking a percentage for each redirect.

if you have too many directives to handle redirects in an .htaccess file it can take too long for the server to process those directives for each request that hits that part of the filesystem.
this is less of a problem if you put these directives in the server configuration file, which only gets parsed once when the server is started instead of once per filesystem request.
in some cases it's faster to internally rewrite non-canonical requests to a redirect script.
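
a rough sketch of that server-config approach (file names are made up, and RewriteMap only works in the main server/vhost config, not in .htaccess):

# httpd.conf / vhost - parsed once at startup instead of on every request
RewriteEngine On
RewriteMap legacy txt:/etc/apache2/legacy-redirects.txt
# the map lookup returns an empty string for urls that aren't in the file,
# so only mapped urls get the 301
RewriteCond ${legacy:$1} !=""
RewriteRule ^/(.+)$ ${legacy:$1} [R=301,L]

# /etc/apache2/legacy-redirects.txt - one "old-path new-path" pair per line:
# old-page          /2013-site/page
# 2010-site/page    /2013-site/page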

if you redirect requests for urls that are not internally linked then it shouldn't be a problem as long as each redirect goes to a url serving equivalent/relevant content.

Link A is internally linked from various pages - Is that the concern?

you don't want to send redirect responses for any internal links.
you should internally link to the canonical url.
 
