How long for Google to respond to a 301 redirect


Teri

10:45 am on Jun 9, 2011 (gmt 0)

10+ Year Member



Hello everyone,

We've been trying to deal with some duplicate content issues by using a 301 redirect.

The duplicate pages now all 301 to their respective 'preferred' pages, each of which includes a canonical tag. These changes were made around 27 May ... but so far there is no evidence that Google has honoured the redirects - in WMT all these pages are still being flagged as having duplicate titles. The site is definitely being crawled.
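For reference, the redirect side of a setup like this can be sanity-checked by requesting each old URL without following redirects and confirming it answers 301 with the expected Location header. This is a minimal self-contained sketch - the tiny local server and the `/old-page` → `/preferred-page` mapping are hypothetical stand-ins for the real site:

```python
# Minimal sketch: verify a URL answers 301 with the expected Location
# header, without following the redirect. The local server below is a
# hypothetical stand-in for the real site, so the check is self-contained.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    # Hypothetical mapping of duplicate URL -> preferred URL
    REDIRECTS = {"/old-page": "/preferred-page"}

    def do_GET(self):
        if self.path in self.REDIRECTS:
            self.send_response(301)
            self.send_header("Location", self.REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

def check_redirect(host, port, path):
    """Return (status, Location) for one request, without following redirects."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("GET", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_redirect("127.0.0.1", port, "/old-page")
print(status, location)  # expect: 301 /preferred-page
server.shutdown()
```

If a URL comes back 302 (or 200) instead of 301 here, Google is seeing a different signal than intended, which would explain the pages still being reported.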

Is a delay like this usual? If not, is there any action we can take to move things along?

Thanks in advance for any advice.

Teri

tedster

3:56 pm on Jun 9, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A delay in Webmaster Tools reporting is very common. The actual SERPs often respond much faster.

lucy24

7:22 pm on Jun 9, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Adding new pages and deleting old ones are two separate actions. Right now I've got two almost identical lists of pages in the "crawl errors" area of gwt: one under "can't find", one under "blocked by robots.txt". (Short version: I moved about 30 pages to a new non-indexed directory, with 301 for the use of humans, leaving three pages to hold down the fort in the original location.) If the new directory weren't robots-blocked, I'd be getting the duplicate-name error instead.
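A blocked-directory arrangement like the one described can be checked offline with the standard library's robots.txt parser, to confirm the moved pages really are disallowed while the remaining pages stay crawlable. The directory and file names here are hypothetical stand-ins:

```python
# Minimal sketch: confirm which paths a robots.txt Disallow actually
# blocks for a given crawler. Directory names are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /archived/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages moved into the blocked directory should not be fetchable...
print(parser.can_fetch("Googlebot", "/archived/moved-page.html"))   # False
# ...while pages left in the original location remain crawlable.
print(parser.can_fetch("Googlebot", "/original/kept-page.html"))    # True
```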

Hm. Would it be legal (google-legal, that is) to make a 301 for humans and a 410 for google for the same page?

tedster

8:04 pm on Jun 9, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Would it be legal (google-legal, that is) to make a 301 for humans and a 410 for google for the same page?

That's a textbook definition of "cloaking" - and you're asking for trouble if you try it. Robots.txt disallow should be fine, though I would expect some kind of extended period before the actual SERPs settle down for you.
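To stay on the safe side of that line, the response simply must not branch on who is asking. A minimal sketch of that principle - a hypothetical handler, not any specific server framework - where the User-Agent is accepted but deliberately never consulted:

```python
# Minimal sketch: a redirect decision that ignores the User-Agent, so
# bots and humans get the identical 301 (the opposite of cloaking).
# The mapping and paths are hypothetical.
OLD_TO_NEW = {"/old-page": "/preferred-page"}

def respond(path, user_agent):
    """Return (status, location) for a request; user_agent is never consulted."""
    if path in OLD_TO_NEW:
        return 301, OLD_TO_NEW[path]
    return 404, None

# A browser and Googlebot must see exactly the same answer.
assert respond("/old-page", "Mozilla/5.0") == respond("/old-page", "Googlebot")
print(respond("/old-page", "Googlebot"))  # (301, '/preferred-page')
```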

If the new directory weren't robots-blocked, I'd be getting the duplicate-name error instead.

How so? The old URLs no longer resolve, correct?

Also, don't go too crazy over Webmaster Tools warnings about duplicate titles. If you know you've handled the issue effectively, then it's just an out-of-date report. These duplicate notations are just input shared with you on an FYI basis, not a literal "you have an error that could kill your rankings" slap.

MelissaLB

9:47 pm on Jun 9, 2011 (gmt 0)

10+ Year Member



Teri, we did the same thing to our site a few months back. We had multiple duplicate content pages for each of our pages because pages were listed in multiple categories on our site. We did the 301s and canonicals and started with about 10,000 duplicates reporting in WMT. We saw the first sign of change after about two weeks, when the total duplicates dropped to about 9,000. Update after update it kept going down. It took quite a while, though, and WMT is still reporting duplicates (about 100) - but when we click the pages listed, we're directed to the 'preferred' page.

In our experience, it does take a while. The bots are crawling away every day, but it still took time.

Good luck!

g1smd

11:03 pm on Jun 9, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



After a redirect is added, the new URL gets listed quite quickly but the old URL takes a very long time to be removed from the SERPs; sometimes more than a year. However, don't worry too much about "Duplicate Content" problems once Google has "seen" the old URL returning the 301 status.

Teri

7:50 am on Jun 10, 2011 (gmt 0)

10+ Year Member



Thank you all for your replies - I feel a little more reassured now!
I assume the delay in updating reports also applies to the reports showing (changes in) rankings for specific search terms?

Regards,

Teri

tangor

8:20 am on Jun 10, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Takes time. Google speaks of their speed... but it is only "speak"... in reality, the G is pretty dang slow at updating... and has a memory like an elephant: it NEVER FORGETS ANY URL IT HAS CRAWLED, since day ONE.