Google is now listing non-www and the site has gone down to around the 8th page after 6-7 years on the first page of the SERPs, a good part at #1.
The site cannot simply be redirected from non-www to www. We tried, and the MIVA shopping cart beast will not work that way: it won't save cookies and won't function to make sales. Anything related to MIVA needs to be non-www, YET we need the rest of the site to be www, as it always was until some "fancy footwork" and changes by the previous, original developer, at which point the rankings tanked.
Easy-peasy to do the whole site from one to the other, but how can we put in a directive to handle the two parts of the site the opposite way?
It's almost like a choice between having the cart work but losing Google traffic, or fixing the canonical issue for Google and not being able to make any sales.
Is there any way, in addition to coding all the links to and within the MIVA beast absolute without the www, to redirect all pages, excluding the cart /Merchant2/ from non-www to www, and also make sure any requests within /Merchant2/ go to non-www?
# Redirect everything EXCEPT /Merchant2/ from non-www to www
# (assumes RewriteEngine On is already in effect)
RewriteCond $1 !^Merchant2
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www\.example\.com
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
#
# Redirect any /Merchant2/ request on www back to non-www
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^example\.com
RewriteRule ^(Merchant2.*)$ http://example.com/$1 [R=301,L]
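Assuming those rules live in the site-root .htaccess (and with example.com standing in for the real domain), here is a rough trace of how two sample requests would flow through them -- a sketch for illustration, not a substitute for testing on the live server:

# Request: http://example.com/products/widget.html
#   Block 1: $1 = "products/widget.html" (does not start with Merchant2),
#            host is non-empty and is not www.example.com
#            -> 301 to http://www.example.com/products/widget.html
#
# Request: http://www.example.com/Merchant2/merchant.mvc
#   Block 1: skipped, because $1 starts with "Merchant2"
#   Block 2: host does not start with "example.com"
#            -> 301 to http://example.com/Merchant2/merchant.mvc

Note that in per-directory (.htaccess) context the pattern matches the path without its leading slash, which is why "Merchant2" is not written as "/Merchant2".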
Jim
[edit] Fixed typos cited below [/edit]
[edited by: jdMorgan at 8:38 pm (utc) on July 5, 2007]
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
But it isn't working now, when I try the other code with a folder excluded.
> Is there any way, in addition to coding all the links to and within the MIVA beast absolute without the www, to redirect all pages, excluding the cart /Merchant2/ from non-www to www, and also make sure any requests within /Merchant2/ go to non-www?
If there are further exceptions or requirements, they have to be coded; otherwise, it will do exactly what is quoted.
Or am I missing something?
Jim
[added] Stop the presses -- I had forgotten to start-anchor the domain names. See corrections above. That's it -- I'm not writing any more code today... Can't concentrate properly, obviously. [/added]
Jim
[edited by: jdMorgan at 8:40 pm (utc) on July 5, 2007]
>start-anchor the domain names
You know, this mod_rewrite is the hardest thing for me (I do some for every site, but it's copy-and-paste stuff); but I think I'm starting to get the hang of what all those little "squidgets" mean. I was looking at that exact spot and it looked "funny" -- I should have checked it against the ones that are working.
What with the fact that mod_rewrite directives and regular expressions are extremely compact, very potent, hard to "read," and utterly unforgiving of any tiny error or omission, the chances of success with that approach are close to zero. Even if the code works, it may have unintended (or un-stated) and undesired side-effects that will forever haunt and baffle the unsuspecting Webmaster.
Another problem is that no two sites are ever identical. So again, cut-n-paste code is unlikely to be correct, perfectly-applicable, and optimized all at the same time.
It's interesting that you should mention that it "looked funny" because that is exactly how I spotted the problem myself. Subconscious pattern-recognition -- Scary! :)
Jim
It's one thing to get the logic (I've got that down); what I see as having been an issue for a lot of people is that the syntax isn't explained clearly enough to grasp. Textbooks for programming classes always give ample examples to illustrate the syntax and what each and every thing means.
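By way of example, taking a rule from earlier in the thread apart piece by piece -- a rough gloss, not a substitute for the Apache documentation:

RewriteRule (.*) http://www.example.com/$1 [R=301,L]
# (.*)         - the pattern: matches the requested path; the parentheses
#                capture it so it can be reused as $1
# http://...$1 - the substitution: the redirect target, with $1
#                re-inserting the captured path
# [R=301,L]    - the flags: R=301 sends a permanent redirect, and L
#                stops processing any further rules for this request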
I looked for hours before posting (to no avail), but thank you SO MUCH. After making changes to internal navigation, and implementing this correct way to handle it, correct pages started to be crawled, and after specifying the right "domain" in Webmaster Central, pages are being re-crawled day by day and as of today --the site is back for the main keyword phrase!
There were other issues as well (and still are), such as the homepage title and meta description being used on hundreds of other pages; it was easier to change the original pages as an expedient solution for now.
Thanks so much - the site owner is calling it a miracle. :)
Personally, I found "reading" regular expressions to be the hardest hill to climb. Once you can read them, writing them is almost trivial. But you can't learn to write them until you can read them, and so many people get stuck.
The reason it's easier to write regex and RewriteRules is that you *know* what URLs you are trying to match and under what conditions, so you know the "goal." Whereas, when reading them, you also have to deduce the original goal, and that makes it much more difficult.
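As an exercise in "reading," the host-check condition from above can be taken apart token by token -- again, a sketch using the example.com placeholder:

RewriteCond %{HTTP_HOST} !^www\.example\.com
# %{HTTP_HOST}      - the test string: the Host header sent by the client
# !                 - negation: the condition succeeds only if the
#                     pattern does NOT match
# ^                 - start-anchor: the match must begin at the very
#                     start of the hostname
# www\.example\.com - literal text; the dots are backslash-escaped so
#                     they mean "." rather than the regex "any character"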
Jim