I also moved one site to another host, and Googlebot started hitting the new host in about an hour.
Is it just lucky me, or is Google getting faster at everything?
<added> belated congratulations, Vitaplease</added>
I moved host 5 months ago.
I was sweating it out for fear of Googlebot missing the move.
Googlebot found it within a day.
We try to make it seamless and fast for people moving hosts. Glad it's worked well for folks. My rule of thumb is to bring both sites up on the two different IP addresses and then switch the DNS to the new IP address. After you see Googlebot visit the new IP address, you can probably take down the old site. Doesn't hurt to leave the old site up a bit longer, but you shouldn't need to worry about it.
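For anyone who wants to double-check a move the same way, here is a rough Python sketch of the two checks GoogleGuy describes: confirm public DNS now points at the new IP, then watch the new server's access log for Google's crawlers. The domain, IP address, and log path below are placeholders, and it assumes a standard Apache/Nginx-style log where the user agent appears on each line.

import socket

DOMAIN = "www.example.com"                   # placeholder: the moved site
NEW_IP = "203.0.113.10"                      # placeholder: the new server's IP
ACCESS_LOG = "/var/log/apache2/access.log"   # placeholder: log on the new host

# Step 1: confirm public DNS now resolves to the new IP.
resolved = socket.gethostbyname(DOMAIN)
state = "new host" if resolved == NEW_IP else "still the old host"
print(f"{DOMAIN} resolves to {resolved} ({state})")

# Step 2: count requests from Google's crawlers in the new host's log.
# Googlebot (search) and Mediapartners-Google (AdSense) are separate
# crawlers, so one may switch to the new IP before the other.
hits = {"Googlebot": 0, "Mediapartners-Google": 0}
with open(ACCESS_LOG) as log:
    for line in log:
        for name in hits:
            if name in line:
                hits[name] += 1

for name, count in hits.items():
    print(f"{name}: {count} request(s) seen on the new host")

Once Googlebot shows up steadily on the new IP, the old site can come down, per GoogleGuy's rule of thumb above.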
GoogleGuy, Googlebot switched to the new site asap, but Mediapartners-Google/2.1 (+http://www.googlebot.com/bot.html) is still pounding my old site.
That's great :) one thing less to worry about.
How about changing domain names? That's a bit more complicated, with backlinks pointing to the old domain and all that. I know the spider's pretty quick, but is it still around three to four weeks for everything to settle all the way through to the SERPs?
Any recent experiences - within the last month or so?
I created links to a brand new site on the 20th (which contains 1,500 subdomains that are not linked from the front page :-) ).
Googlebot looked at almost all of the subdomains on the 21st and did a deep crawl on the 22nd.
A really strange phenomenon (imo) is that it found the subdomains apparently without viewing the pages which link to them! How is this possible? Any ideas?
You can't really create an active subdomain or domain without telling the whole internet about it. Except for links(*), there's only one [google.com] way that googlebot can discover your new sub/domain ;)
If this is right, it's very cool. Only, it seems robots.txt should be the first thing up on a new domain now; I'm glad Googlebot respects it.
However, I think people are really missing the biggest point. If your new site/pages have PR0 (as all new sites initially do), you still have to wait about 5 weeks (or, according to some here, what feels like a yearly cycle, lol!) before a PageRank/backlinks update, and only then can you rank well on competitive keywords. I have some PR0 pages that are #2 and #6 for two non-competitive terms, while the site's main page is #65 for a much more competitive term. Therefore, until that PageRank/backlinks calculation is completed, you are a PR0 site competing against PR5s and above.
My sites have only been indexed after links have been placed (or via the subdomain phenomenon mentioned earlier).
I have developed many sites over a few weeks, and the indexing followed very shortly after inbound links were placed.
I have a site which covers 1,500 cities worldwide. I wanted a separate URL for each city, so I used subdomains.
I consider a subdomain to effectively be treated as a domain, but I may be told otherwise by the webmasters/SEOs here with far greater experience than I :-)
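In case anyone wonders how one site can serve 1,500 city subdomains without 1,500 separate installs: assuming wildcard DNS points *.example.com at a single server, one application can route on the Host header. Here is a minimal Python sketch of that idea; the domain, city list, and page content are made-up placeholders, not the poster's actual setup.

from wsgiref.simple_server import make_server

BASE_DOMAIN = "example.com"                  # placeholder base domain
CITIES = {"london", "paris", "newyork"}      # in practice, ~1,500 entries

def app(environ, start_response):
    # Wildcard DNS sends e.g. paris.example.com to this server; pull the
    # city out of the Host header and serve a city-specific page.
    host = environ.get("HTTP_HOST", "").split(":")[0]
    suffix = "." + BASE_DOMAIN
    city = host[:-len(suffix)] if host.endswith(suffix) else ""

    if city in CITIES:
        body = f"<h1>Guide to {city.title()}</h1>".encode()
        start_response("200 OK", [("Content-Type", "text/html")])
    else:
        body = b"<h1>City not found</h1>"
        start_response("404 Not Found", [("Content-Type", "text/html")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()

To Googlebot, each such hostname looks like its own site with its own URL space, which is presumably why they were crawled individually.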
Not as fast as others have mentioned, but I'm pleased with anything within a month, and as I said, that site hadn't been touched in years, so the client should feel lucky Googlebot even remembered it at all...
One site was indexed quickly, with several hundred pages added within a couple of weeks, which was excellent.
Another site of mine had a couple of hundred pages added, and then Google just hasn't touched it in about 6 weeks. There are hundreds more well-linked pages waiting to be picked up.
Another 10K-page directory site was partially crawled 3 months ago, and over a thousand pages were added to the index. Nothing since. It is very frustrating that Googlebot doesn't come back and do the job properly.
I get the impression that Google is doing an initial spidering very quickly (which is good) but not returning for a thorough deep crawl for a long time (which is bad).
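One rough way to see whether you're getting quick shallow visits or a proper deep crawl is to tally, per day, how many unique URLs Googlebot actually fetches from your access log. A small Python sketch, assuming a standard combined-format log at a placeholder path:

import re
from collections import defaultdict
from datetime import datetime

ACCESS_LOG = "/var/log/apache2/access.log"   # placeholder path, combined log format assumed

# A combined-format line looks roughly like:
# 66.249.64.1 - - [22/Nov/2003:10:15:32 +0000] "GET /page.html HTTP/1.0" 200 1234 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
line_re = re.compile(r'\[(\d{2}/\w{3}/\d{4})[^\]]*\] "GET ([^ "]+)')

daily_urls = defaultdict(set)
with open(ACCESS_LOG) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = line_re.search(line)
        if match:
            day, url = match.groups()
            daily_urls[day].add(url)

# A day with only a handful of unique URLs is a shallow visit; hundreds
# or thousands in one day looks like a deep crawl.
for day in sorted(daily_urls, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(f"{day}: {len(daily_urls[day])} unique URLs fetched by Googlebot")

It won't tell you what made it into the index, but it does show whether Googlebot is coming back and how deep it goes when it does.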
I am also finding it difficult to know just how many pages are actually in the Google index: each datacentre seems to return a different number (which is messy), and where I check from (UK or USA) also seems to influence the count a given datacentre reports (which is weird).