JD_Toims - 8:21 am on Oct 17, 2013 (gmt 0)
what if anything should be done about this?
Nothing, since "bot 2" is getting the correct page and response.
If it were more widely reported there could probably be a more definite conclusion, but as a guess, to me it sounds almost like split processing -- e.g. "bot 1 on IP 1" starts a spidering run and requests the non-www [or https or whatever] URL. If it receives a 200 OK, it sends the page on for processing; if it receives a redirect, the new location is passed to "bot 2 on IP 2", and "bot 2" spiders the URLs "bot 1" was redirected to.
It actually makes a bit of sense to me that they might "pass redirects" to a different bot/IP, since they have said that if there are more than 3 or so redirects they may not be followed.
It also seems like they could speed up indexing and processing by "passing redirects" to different bots. One bot could spider "everything" and send the "good" (200 OK) requests to be processed. A second bot, working from a dynamic URL list, could request only the locations the first bot was redirected to and send its "good" requests to be processed. A third bot, again with a dynamic URL list, could request only the locations the second bot was redirected to and send its "good" requests to be processed. Then they could just dump anything left after N redirects, where N could be 3 or 4 or whatever they feel like today.
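Just to make the guess concrete, here's a rough sketch of the kind of hand-off I'm imagining. This is purely hypothetical -- the FAKE_RESPONSES table, the fetch()/process() helpers, and the MAX_HOPS value are all made up for illustration, not anything Google has confirmed:

import sys

MAX_HOPS = 3  # "3 or 4 or whatever they feel like today"

# Stand-in for real HTTP fetches: URL -> (status, body or redirect location)
FAKE_RESPONSES = {
    "http://example.com/":      (301, "http://www.example.com/"),
    "http://www.example.com/":  (301, "https://www.example.com/"),
    "https://www.example.com/": (200, "<html>final page</html>"),
}

def fetch(url):
    """Pretend to request a URL; anything unknown comes back 404."""
    return FAKE_RESPONSES.get(url, (404, ""))

def process(url, body):
    """Stand-in for whatever happens to a "good" (200 OK) page."""
    print("processing", url, repr(body[:30]), file=sys.stdout)

def crawl(seed_urls):
    queue = list(seed_urls)           # "bot 1" starts with the seed list
    for hop in range(MAX_HOPS + 1):   # hop 0 = bot 1, hop 1 = bot 2, ...
        next_queue = []               # dynamic URL list handed to the next bot
        for url in queue:
            status, payload = fetch(url)
            if status == 200:
                process(url, payload)        # "good" request -> processing
            elif status in (301, 302):
                next_queue.append(payload)   # pass the redirect along
            # anything else (404, 500, ...) is simply dropped here
        if not next_queue:
            break
        queue = next_queue
    # anything still queued after MAX_HOPS redirects just gets dumped

crawl(["http://example.com/"])

Run with the sample table above, "bot 1" gets redirected twice and only "bot 3" ends up processing the https://www URL -- which would line up with a different bot/IP showing up in the logs for the final location.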