Forum Moderators: Robert Charlton & goodroi
I'm worried that Googlebot has become upset about seeing so many 301 redirects.

When Googlebot encounters a lot of redirects, its immediate reaction is to make sure you're not serving up soft 404s. It does this by requesting files with nonsense names that it can be confident don't really exist. Your logs should therefore show a steady stream of
66.249.79.xyz - - [19/Apr/2020:20:07:37 -0700] "GET /exftbbclrvcbdsu.html HTTP/1.1" 404 6636 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

where the request is any random string of 15 letters.

From your examples it appears you are redirecting to unrelated .html URLs.
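A quick way to spot these probe requests in your own access log is to match the combined-log format and flag long random page names. A minimal sketch in Python; the log line is the example above (with a made-up last octet), and the "15 lowercase letters" pattern is just the heuristic described here, not a Googlebot specification:

```python
import re

# Combined-log-format line, modeled on the example above
# (last octet of the IP is invented for the demo)
line = ('66.249.79.123 - - [19/Apr/2020:20:07:37 -0700] '
        '"GET /exftbbclrvcbdsu.html HTTP/1.1" 404 6636 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

# Pull the request path and the status code out of the log line
m = re.search(r'"GET (\S+) HTTP/[\d.]+" (\d{3})', line)
path, status = m.group(1), m.group(2)

# Heuristic for a soft-404 probe: a .html name of 15 random lowercase letters
is_probe = bool(re.fullmatch(r'/[a-z]{15}\.html', path))
print(path, status, is_probe)  # /exftbbclrvcbdsu.html 404 True
```

If these probes come back 404, as here, you're fine; a 200 on a nonsense name is what signals a soft-404 problem.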
Processing 'new' content slows Googlebot down for a while, in my experience.
RewriteRule ^oldpage\.html https://www.example.com/newpage.html#fragment [R=301,NE,L]
where the [NE] flag ("no escape") prevents mod_rewrite from percent-encoding the # character into %23, which the browser would not recognize as a fragment delimiter. You can also fold several old pages into one rule:

RewriteRule ^(page1|page2|page3)\.html https://www.example.com/newpage.html#$1 [R=301,NE,L]
where you're cleverly using the name of each former page as the exact name of the fragment, so it can all be lumped into one rule. This is something I have actually done, combining multiple thin pages into one slightly plumper one.
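The effect of that combined rule can be sketched in Python. The page names and target URL are the placeholders from the rule itself; the function just mirrors what mod_rewrite does with the $1 back-reference:

```python
import re

# Same pattern as: RewriteRule ^(page1|page2|page3)\.html ... [R=301,NE,L]
RULE = re.compile(r'^(page1|page2|page3)\.html$')

def rewrite(old_path):
    """Return the redirect target for old_path, or None if the rule doesn't apply."""
    m = RULE.match(old_path)
    if not m:
        return None  # request falls through to the next rule
    # $1 (the captured page name) becomes the fragment;
    # [NE] is what keeps the '#' from being escaped to %23 on the wire
    return f"https://www.example.com/newpage.html#{m.group(1)}"

print(rewrite("page2.html"))  # https://www.example.com/newpage.html#page2
```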
If you're consolidating pages, be aware that it is possible to redirect to a fragment, so that a human who requests a former page is sent directly to the part of the new page that holds the content they came for. In PHP, for example:
$newLink = $mybaseURL . "/" . $productLabel . "/main#photos";
header("HTTP/1.1 301 Moved Permanently");
header("Location: $newLink");
exit();