Forum Moderators: open
Having read some threads in this forum about .htaccess redirects and how Google follows them, I prepared the move from http://site.com to http://www.site.com.
Now Googlebot is just scratching the surface of my site, and every hit I get from it is a 301, except for robots.txt.
This is an average visit example:
64.68.82.54 - - [02/Jul/2003:23:24:54 -0400] "GET /robots.txt HTTP/1.0" 301 240 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.82.54 - - [02/Jul/2003:23:25:00 -0400] "GET /robots.txt HTTP/1.0" 200 146 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.82.54 - - [02/Jul/2003:23:25:07 -0400] "GET /anypage-at-anydepth.html HTTP/1.0" 301 242 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
and then it is gone.
My question is:
What can I expect from the next update?
- Will the site show the same data under www.?
- Or will it just be gone altogether?
Would I be better off submitting the site again under www.site.com?
This has been going on for about two weeks now, and Google still hasn't worked out where the site is.
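For reference, a typical .htaccess setup for this kind of move looks like the following (a sketch assuming Apache with mod_rewrite enabled; site.com stands in for the real domain):

```apache
# Redirect every request for site.com to www.site.com, keeping the path.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]
```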
Thanks in advance for your comments.
More info about the 301 issue and Google can be found in the following thread. Some of the people there reported that 301 is not working as intended.
301 not being supported? ¦ old, deleted pages being indexed? (forum3/6486.htm) [webmasterworld.com]
Hope this helps you a bit.
-hakre
I did it a couple of months ago for my main site and Googlebot handled it perfectly. It simply kept both URLs in the index for approximately 6 weeks.
That worried me a bit, because of the temporary duplicate content.
Simply make sure that you have no remaining links without the www in your site and everything should be OK in a few weeks from now. I don't think resubmitting your site would change anything.
Dan
Just takes time I guess.
It is reassuring to know I have not made the biggest mistake (at least not alone ;-))
It's been nearly a month now and Googlebot has visited almost the entire website.
I figure the process works like this:
1.- The bot has to visit each and every page in the index, looking for it at the last known location; it gets a 301 redirect and NOTES the new location (but doesn't crawl it yet).
2.- Once it has figured out the new location of all those pages (and I mean all of them), Googlebot is ready to crawl the new pages. It starts hitting pages asking for the www version of them. No more 301's.
3.- After it crawls all the pages in the new location the index gets updated properly.
I guess it is just a matter of waiting...
I will post the conclusion to this story. (if any)
We changed from a .com to a .co.uk with a 301 reflecting this; the idea was to get listed on google.co.uk as well as the .com site, since we were hosted in America.
Our site wasn't that great in the Google rankings, but when it followed the 301 it did crawl the redirected pages... but it also took them off the SERPs too.
OK, this was to be expected eventually, but I would have thought the old comes off and the new goes on straight away. Now the redirected pages are off Google and the index hasn't been updated, leaving us off Google for the recently crawled new site too.
I have read about people's experiences with a .htaccess file and I would like to make one. I have never done it before. I have tried several of the examples that I have found here in this forum: I typed them in Notepad and used CuteFTP to upload the file to my site, which is what my hosting company told me to do. I must not be doing something right, because it's not working. Here's an example of what I added:
RewriteCond %{HTTP_HOST}!^domain\.com
RewriteRule ^.*$ [domain.com%{REQUEST_URI}...] [R=301,L]
I would like to direct to [domain.com...] since I have more links and that's how they have always been indexed.
I tried that and other variations, with and without the www., and it still didn't work. Am I using the right thing, and am I supposed to save the file as plain text before uploading it to my site? I would greatly appreciate any help. Thank you.
Try this one, as it works for me:
RewriteCond %{HTTP_HOST} !^domain\.com
RewriteRule (.*) [domain.com...] [R=301,QSA,L]
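If it still doesn't kick in, check that the rewrite engine is actually switched on first; a complete minimal .htaccess would look something like this (a sketch; domain.com is a placeholder, and the redirect target must be written out as a full URL):

```apache
# Send any host other than www.domain.com to the www version,
# preserving the requested path and query string.
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,QSA,L]
```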
Dan
First of all, thank you for all the hints you give here at WebmasterWorld.
Would returning a "410 Gone" status code be appropriate if the requested resource is gone for good?
I know Googlebot sends HTTP/1.0 requests; still, I wonder how it would deal with a status code that is only defined in HTTP/1.1 (i.e. 410).
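What I have in mind is something like this (a sketch; /old-section/ is a placeholder path, and it assumes mod_rewrite is available):

```apache
# Answer requests for permanently removed pages with "410 Gone".
# The G flag forces the 410 status; "-" means no substitution is made.
RewriteEngine on
RewriteRule ^old-section/ - [G]
```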
regards
martin
- Ash
mod_rewrite is IN NO WAY related to FrontPage extensions.
If this is the answer you've got, maybe your question wasn't clear enough :(
When modifying the .htaccess file, be careful with all the stuff in there used by the FrontPage extensions. Don't erase it; just add the rewrite rules at the end of the file.
Good luck!
Dan
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.domain\.com
RewriteRule ^(.*)$ [domain.com...] [R=301,L]
They said I would need FrontPage extensions, and they recommended using Notepad and CuteFTP. Is there anything else I should ask or tell them?
Karen