Forum Moderators: open
Sorry for all the questions, but as I'm sure you all can understand, I am new to Google and VERY excited to have seen her :)
Karen
216.239.46.23
216.239.46.86
216.239.46.197
216.239.46.85
216.239.46.116
216.239.46.102
just to name the first few. So I'd guess Brett is right about the entire block.
The thing I'm curious about is why there are so many IPs. Is this just the result of inbound links? What I mean is that when I go through my logs, I see a different IP for Googlebot on just about every line, whereas with the freshbot I usually see just one IP throughout the day.
You can probably find what you are looking for in the “tracking and logging” forum:
[webmasterworld.com...]
Just browse through the titles until you see some relevant ones.
However, I can never seem to get the free ones I use (Analog) to do exactly what I want, so I often download my log files and write little programs of my own to pull out the information I want.
To track the Googlebot, I just extract every log entry with the word “googlebot” in it.
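That kind of extraction is only a few lines of code. Here's a minimal sketch in Python; the log file name and the log format are assumptions about your own setup, and the match is just a case-insensitive substring test, exactly as described above:

```python
import sys

def googlebot_lines(path):
    """Yield every log line that mentions Googlebot (case-insensitive)."""
    with open(path) as log:
        for line in log:
            if "googlebot" in line.lower():
                yield line.rstrip("\n")

# Usage: python grep_googlebot.py access.log
if __name__ == "__main__" and len(sys.argv) > 1:
    for entry in googlebot_lines(sys.argv[1]):
        print(entry)
```

Matching on the user-agent string this way is simple but trusting; anyone can fake a "Googlebot" user agent, which is one more reason people cross-check against the 216.239.46.* addresses listed above.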
See [searchengineworld.com...]
You've got a ton of parameters in your URL.
"?section=fantasy&sub=byauthor&auth=Robin+Wayne+Bailey"
is apparently just too much for Googlebot to get through. IMHO, shorten it up if you can, or try one of the rewrite programs so that your parameters look like directories instead of a query string. I don't know much about these myself, but you need to do something to cut down the parameters.
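To make the idea concrete, here's a sketch of the mapping those rewrite setups produce, using the URL quoted above. The "/books" path prefix is my invention for illustration; on a real server something like Apache's mod_rewrite (or a script, as described in the next post) has to map the directory-style URL back to the parameters:

```python
from urllib.parse import urlsplit, parse_qsl, quote_plus

def to_directory_style(url):
    """Turn a query-string URL into a static-looking, directory-style one
    by promoting each parameter *value* to a path segment."""
    parts = urlsplit(url)
    segments = [quote_plus(value) for _, value in parse_qsl(parts.query)]
    return parts.path.rstrip("/") + "/" + "/".join(segments)

# to_directory_style("/books?section=fantasy&sub=byauthor&auth=Robin+Wayne+Bailey")
# → "/books/fantasy/byauthor/Robin+Wayne+Bailey"
```

The crawler then sees what looks like a plain three-level directory tree instead of a three-parameter query string.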
I just completed a redesign for a dynamic site that now uses an ALA-type method to deliver static-looking URLs.
Here's a really brief outline: each page in the database has a URL like "blue-widget.html" stored in a url field. All requests for product pages are redirected to a PHP script that parses the requested URL, looks it up in the database, and retrieves the product record and all the necessary data.
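The poster's script is PHP, but the lookup step is simple enough to sketch in a few lines of Python. The table and column names here (a products table with a url field) are assumptions matching the outline above, not the poster's actual schema:

```python
import sqlite3

def product_for_request(conn, request_path):
    """Match the static-looking filename at the end of the requested path
    (e.g. "blue-widget.html") against the url field in the database and
    return the full product record, or None if nothing matches."""
    slug = request_path.rstrip("/").rsplit("/", 1)[-1]
    return conn.execute(
        "SELECT * FROM products WHERE url = ?", (slug,)
    ).fetchone()
```

The nice property of this design is that the crawler only ever sees "blue-widget.html"; all the parameter handling happens server-side, after the lookup.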
Previously, this client encoded some of that data in the URL, like you do, and none of their pages were in Google. As of the October update, they have 400 product pages in Google, and the site is doing twice the business it did before.
You can optimize your main pages all you want, but you're going to get far more traffic by having hundreds or thousands of pages in Google than you ever will from just a dozen optimized HTML pages.
So, look up ways to make your URLs search engine friendly, and do it. Getting those pages into Google is definitely worth it.
You are right. I analyzed the logs, and it looks like Googlebot isn't taking any URL that has more than two variables in it. I'm working on eliminating the worst offenders. I'll have to wait until next month to see if it helps; I think it has all it wants from my site for this month.
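For anyone wanting to run the same check on their own logs, the test is just counting query-string variables. A minimal sketch; the two-variable cutoff is only what this thread's logs suggested, not a documented Google limit:

```python
from urllib.parse import urlsplit, parse_qsl

def too_many_variables(url, limit=2):
    """True if the URL's query string carries more than `limit` variables."""
    return len(parse_qsl(urlsplit(url).query)) > limit

# too_many_variables("/b?section=fantasy&sub=byauthor&auth=Bailey")  → True
# too_many_variables("/b?section=fantasy&sub=byauthor")              → False
```

Run every request URL from your log through this and compare which ones Googlebot actually fetched against which ones it skipped.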
Thanks everyone for the tips. It has helped a great deal!
216.239.46.100, 216.239.46.101, 216.239.46.105, 216.239.46.124, 216.239.46.133, 216.239.46.140, 216.239.46.164, 216.239.46.166, 216.239.46.171, 216.239.46.172, 216.239.46.204, 216.239.46.220, 216.239.46.222, 216.239.46.223, 216.239.46.23, 216.239.46.236, 216.239.46.66, 216.239.46.82, 216.239.46.88