Forum Moderators: Robert Charlton & goodroi
Rewards and Risks of Changing to Hierarchical URL Structure
Sorry - didn't follow that part. I'm not sure what you meant there.
You have to make sure you do only one 301 redirect from your old URL to the new https URL (not old URL to new URL via one 301, then a second 301 to https).
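For instance, a single-hop rule in Apache might look like this (a sketch only; /old-page and /new-section/new-page are hypothetical placeholders):

```apache
RewriteEngine On
# One 301 straight to the final https URL, rather than
# http -> https followed by a second old -> new hop
RewriteRule ^old-page$ https://example.com/new-section/new-page [R=301,L]
```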
I do have some concerns about the admins getting that right.
Check the box to "Always Follow Redirects". Spider the site. Go to Reports > Redirect Chains.
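If you'd rather sanity-check a planned redirect map before the spider run, a quick offline sketch (the URLs and the map are hypothetical):

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a URL through a {source: target} redirect map.

    Returns the full hop list; stops early on a loop or when the
    hop limit is reached.
    """
    chain = [url]
    seen = {url}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        chain.append(nxt)
        if nxt in seen:  # redirect loop detected
            break
        seen.add(nxt)
    return chain

# Hypothetical map: http -> https -> new hierarchical URL
redirects = {
    "http://example.com/page": "https://example.com/page",
    "https://example.com/page": "https://example.com/sub/page",
}

chain = redirect_chain("http://example.com/page", redirects)
print(len(chain) - 1)  # 2 hops -- should be collapsed to a single 301
```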
Nonsense!
My htaccess file had taken about 20 hours to create and is about 200k!
Using .htaccess files slows down your Apache http server. Any directive that you can include in a .htaccess file is better set in a Directory block, as it will have the same effect with better performance.
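For example, the same hypothetical redirect moved out of .htaccess into the server config (the directory and paths are placeholders):

```apache
# httpd.conf -- parsed once at startup instead of on every request
<Directory "/var/www/html">
    Redirect permanent "/old-page" "https://example.com/new-section/new-page"
</Directory>
```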
[httpd.apache.org...]
Do it all at once
Can't you still create breadcrumbs with a flat url structure?
suggested list of pages on 404 pages until you've set up the 301s
the current plan is to launch the new site right after Christmas (ca Jan 15)

I thought I'd kick this thread up now, not because I'm expecting a reply in the near future, but simply because I don't want it to expire before mid-January.
From this wording...

"Make a site with a clear hierarchy"

...to this wording...

"Design your site to have a clear conceptual page hierarchy."
I can't say too much b/c I don't know IIS, but I know this would be at most an hour's work on Apache.
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <rewriteMap name="Redirects">
          <add key="/sub/" value="/sub/page"/>
        </rewriteMap>
      </rewriteMaps>
      <!-- Added: a rewriteMap does nothing on its own. This rule,
           the standard IIS URL Rewrite pattern, applies the map. -->
      <rules>
        <rule name="Redirects map rule" stopProcessing="true">
          <match url=".*"/>
          <conditions>
            <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)"/>
          </conditions>
          <action type="Redirect" url="{C:1}" redirectType="Permanent"/>
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

# Force no browsing to /index
RewriteRule ^(default|index)\.asp$ / [R=301,L]
RewriteRule (.+)/(default|index)\.asp$ /$1/ [R=301,L]

# Force everything to be lowercase
# (ISAPI_Rewrite-style syntax for IIS: the CL flag converts the
# redirect target to lowercase; Apache has no CL flag)
RewriteCond %{REQUEST_METHOD} (GET|HEAD) [NC]
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) $1 [R=301,CL]
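For comparison, the Apache-native way to force lowercase (a sketch; these rules must live in httpd.conf or a vhost block, since RewriteMap is not allowed in .htaccess):

```apache
# Define a map using Apache's built-in tolower function
RewriteMap lc int:tolower
# Redirect any URL containing an uppercase letter to its
# lowercase form with a single 301
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```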
It's hard for me to understand why these things are hard.
- Dev team is under-resourced and possibly under-qualified
Matt Cutts said that if you went over 3 hops on your redirects, there was a decent chance Google would give up on following them.
Avoid chaining redirects. While Googlebot and browsers can follow a "chain" of multiple redirects (e.g., Page 1 > Page 2 > Page 3), we advise redirecting to the final destination. If this is not possible, keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.
-- [support.google.com...]
(2) Number of redirects should be minimized
Additional HTTP redirects can add one or two extra network roundtrips (two if an extra DNS lookup is required), incurring hundreds of milliseconds of extra latency on 3G networks. For this reason, we strongly encourage webmasters to minimize the number, and ideally eliminate redirects entirely - this is especially important for the HTML document (avoid “m dot” redirects when possible).
--- [developers.google.com...]
My understanding is that the first hop preserves the transfer of equity and any subsequent hops fragment that equity.
The site is live?
dev team is stuck with website technology that is a massive pain in the neck
[edited by: ergophobe at 1:58 am (utc) on Apr 17, 2016]
Too many 301s shouldn't be a problem. In general, what happens is Googlebot will follow up to five 301s in a row, and then if we can't reach the destination page we will try again the next time.
be sure not to stack redirects. Doing so almost ensures we won’t pass the value through to the end.
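One way to avoid stacking is to collapse the redirect map itself so every old URL points straight at its final destination. A sketch with hypothetical paths:

```python
def collapse_redirects(redirects, max_hops=10):
    """Rewrite every entry in a {old: new} redirect map so each
    old URL 301s straight to its final destination (one hop)."""
    def final(url):
        hops = 0
        while url in redirects and hops < max_hops:
            url = redirects[url]
            hops += 1
        return url
    return {src: final(dst) for src, dst in redirects.items()}

# Hypothetical stacked map: /a -> /b -> /c
stacked = {"/a": "/b", "/b": "/c"}
print(collapse_redirects(stacked))  # {'/a': '/c', '/b': '/c'}
```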
If you have control over something of this nature, you NEVER let an indexing entity "handle it" via their error routines.
If Bingbot sees a 302 redirect, say 5 times in a row, we’ll assume you meant 301 and transfer value as if it’s actually a 301 redirect. No need to go back and clean things up for Bing