Forum Moderators: mack
Google is now looking at my site, but I'm not sure if it's able to crawl my full site. How do I tell? I have AWStats, which is provided by my webhost, and a bunch of other tools, but they don't seem to tell me how many or which pages have been crawled. (It just says "1 Hit", 15.97 KB bandwidth... it's a small site.)
Also, although Google is now looking at my site, I still don't have a description for my site under my Google entry. The strange thing is, for one day I was on page 7 (with a description), and now I'm nowhere to be found!
Thanks in advance :)
It sounds like a log file problem. Maybe your host cannot provide you with an adequate log file. A good log file reports much more than "1 Hit" and 15.97 KB of bandwidth.
I too have a small web site hosted without access to a log file.
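If your host can give you even the raw access log, a few lines of script will show exactly which pages Googlebot has fetched and what result codes it got. Here is a minimal sketch in Python, assuming the common Apache "combined" log format (the sample lines and IP addresses are made up for illustration):

```python
import re
from collections import Counter

# Hypothetical Apache "combined" log lines; your host's format may differ.
sample_log = """\
66.249.66.1 - - [10/Mar/2004:12:00:00 +0000] "GET / HTTP/1.1" 200 16350 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [10/Mar/2004:12:00:05 +0000] "GET /widgets.html HTTP/1.1" 200 8200 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
192.0.2.7 - - [10/Mar/2004:12:01:00 +0000] "GET / HTTP/1.1" 200 16350 "-" "Mozilla/4.0"
"""

# Pull out the requested path, the HTTP status code, and the user agent.
line_re = re.compile(
    r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$'
)

def googlebot_pages(log_text):
    """Count which paths Googlebot requested, keyed by (path, status)."""
    pages = Counter()
    for line in log_text.splitlines():
        m = line_re.search(line)
        if m and "Googlebot" in m.group("agent"):
            pages[(m.group("path"), m.group("status"))] += 1
    return pages

for (path, status), hits in sorted(googlebot_pages(sample_log).items()):
    print(path, status, hits)
```

Run against a real log, this would list every page Googlebot has requested, so "1 Hit" summaries stop being the only view you have.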
As for your Google entries, the spider probably recognized some pages successfully on a first crawl and indexed them regularly with their titles/descriptions, while on a second crawl it wasn't able to.
This could mean your internal links are of poor quality.
Review them, making sure that at least the pages you want indexed are linked from the home page by a plain text link, as spiders recognize that more easily than other "complex" kinds of linking.
/ (result HTTP code 200)
This means that Google has found your home page and all is well.
Should I create a robots.txt file?
robots.txt is about EXCLUDING robots, not allowing them. If there is no robots.txt, any robot, including Googlebot, will assume it can spider your entire site. Most sites on the web don't have one; it is not needed. Should you decide to create one, there is a whole forum here at WebmasterWorld dedicated to it.
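For reference, a minimal robots.txt that excludes nothing at all looks like this (the /private/ path in the commented-out variant is purely hypothetical):

```
# Allow every robot to crawl the whole site
User-agent: *
Disallow:

# Or, to keep all robots out of a (hypothetical) /private/ directory:
# User-agent: *
# Disallow: /private/
```

An empty Disallow line means "disallow nothing", which is the same as having no robots.txt at all.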
site:http://www.yoursite.com -asdasdasdasd
The site: operator tells Google to search only your site. The -asdasdasdasd part says to show only pages that don't contain asdasdasdasd, which in practice is all of them, so you get every indexed page back.
You should have a pretty good idea of how many pages your site contains. This search will tell you how many pages Google is 'seeing'.
There will be some fallout from duplication on weak pages. For example, I have a page that links to about 50 different pages of widgets. Only about 10% of those pages are in the index.
What happens is that the page with all the links is a little stronger so google indexes that instead of the individual pages.
Chris
This can be very exasperating, especially when at other times the bot comes round in a couple of days and spiders the complete site.
A couple of things you might want to do :-
1. Build a site map that has a link to every page of your site and link to this from the home page.
2. Try and get links from other sites to lower level pages.
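For point 1, the site map can be a single page of plain text links, one per page. A minimal sketch of generating such a page in Python (the file names and titles here are hypothetical; substitute your own):

```python
# Hypothetical list of (url, title) pairs for the pages you want indexed.
pages = [
    ("index.html", "Home"),
    ("widgets.html", "Widgets"),
    ("about.html", "About Us"),
]

def build_sitemap(pages):
    """Return a simple HTML page with one plain text link per page."""
    items = "\n".join(
        '<li><a href="%s">%s</a></li>' % (url, title) for url, title in pages
    )
    return (
        "<html><body><h1>Site Map</h1>\n<ul>\n"
        + items
        + "\n</ul>\n</body></html>"
    )

print(build_sitemap(pages))
```

Plain text links like these are exactly the kind of "simple" linking that spiders follow most reliably, which is the point of the advice above.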
Hope this helps