Why is googlebot not deep crawling one of my sites?

Could it be a piece of javascript?


Steve Norris

4:02 pm on Nov 27, 2002 (gmt 0)

10+ Year Member



I own three sites, one of which is deep crawled by Google once every 4 or 5 weeks (150 to 500 page requests logged) and as a result has excellent traffic from Google.
The other two are crawled on the same date (all three are linked) but only superficially (2 to 10 page requests logged), with much lower traffic from Google.
My site designer swears that the code is identical on all three sites. As they were built 6 months apart, I'm not convinced. All three sites use an Access database to generate pages, and have includes, whatever that is.
Looking myself, which is not easy, the differences I can spot are:

The use of style sheets
The inclusion of a meta tag called "generator"
A JavaScript framebuster in the headers, which on the good site redirects to index.asp and on the poor sites redirects to their own URL.
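For reference, a framebuster of that era typically amounted to the following logic (a generic sketch only — Steve's actual code is not shown, and the function name here is hypothetical):

```javascript
// Generic frame-busting logic of the kind described above (a sketch,
// not the actual code from Steve's sites). Returns the URL the browser
// should jump to, or null if the page is not inside a foreign frameset.
function frameBusterTarget(topHref, selfHref, redirectTo) {
    // When the page is loaded inside another site's frameset,
    // the top window's URL differs from the page's own URL.
    return topHref !== selfHref ? redirectTo : null;
}

// In the page <head> it would be wired up roughly as:
//   if (top.location.href !== self.location.href) {
//       top.location.href = "index.asp";  // or the site's own URL
//   }
```

Whichever URL the script redirects to, a crawler that ignores JavaScript simply never runs it, which is why it is unlikely to explain the crawl difference.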

Thanks for any help

mattglet

4:24 pm on Nov 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just one little piece of knowledge I can drop: JavaScript has no bearing on search engines. As far as I know, no SEs parse JavaScript.

Steve Norris

5:20 pm on Nov 27, 2002 (gmt 0)

10+ Year Member



Thanks,

Since making my post I have come to the conclusion that the problem results from my sites being frames sites.
All three are frames-based, but the successful one has a <noframes> section with text and links.
I reckon that is the difference.

jdMorgan

6:26 pm on Nov 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Steve,

Bingo! Add a <noframes> section written for a visitor with a non-frames-enabled browser, containing at least an introductory paragraph and contact info. Do not just stuff it with a "This site requires frames, but your browser doesn't support them" message - unless that's what you want your description to read like in the SE results!
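A minimal frameset page with a useful <noframes> section might look like the sketch below (the page titles, file names, and contact address are all hypothetical; the point is that the <noframes> body carries real introductory text and crawlable links):

```html
<html>
<head>
  <title>Example Widgets - handmade widgets and spares</title>
</head>
<frameset cols="25%,75%">
  <frame src="nav.asp" name="nav">
  <frame src="index.asp" name="main">
  <!-- Shown to non-frames browsers, and read by crawlers -->
  <noframes>
    <body>
      <h1>Example Widgets</h1>
      <p>We make and sell handmade widgets and spare parts.
         Contact us at info@example.com.</p>
      <p>
        <a href="index.asp">Home</a>
        <a href="products.asp">Products</a>
        <a href="contact.asp">Contact</a>
      </p>
    </body>
  </noframes>
</frameset>
</html>
```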

If you add a <noframes> section before the next deep crawl, you'll find improved ranking in January. A quick-n-dirty cut-n-paste of the above-mentioned info from your main content frame would be a good idea, in case the Google update starts 5 minutes from now...

Just make sure that the <noframes> section contains useful-to-humans content - that is what it was intended for, and SEs may see other uses as "spammy".

The trouble that frames cause with search engines is a major reason you don't see them much anymore.

Jim

Steve Norris

6:37 pm on Nov 27, 2002 (gmt 0)

10+ Year Member



Jim,

I shall do that straight away and watch for the next time the spider passes. I have made a graph of the googlebot visits to my three sites and it looks very predictable: once every four or five weeks for the deep crawl and once every week for the light crawl. On that basis I'm due for a deep crawl in the first week of December.
I shall also put links to the other pages in each site within the <noframes> tag.

Thanks for your comments

Steve

BigDave

6:48 pm on Nov 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do not use frames, as I think they tend to be overused and annoying. But there are several sites that link to mine whose frames are crawled just fine by Google without a <noframes> section.

One problem that I notice on a lot of frame pages is that they are dead ends. The site depends on the navigation frame for the users to move around, and there are no links in the content window. These sites aren't very good at spreading their PR around between their pages.
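One simple fix for that dead-end pattern is to repeat the key links as plain HTML at the foot of each content page, so a crawler arriving at the page directly can move on without the navigation frame. A sketch, with hypothetical page names:

```html
<!-- At the foot of each page loaded into the content frame -->
<p>
  <a href="index.asp">Home</a> |
  <a href="products.asp">Products</a> |
  <a href="contact.asp">Contact</a>
</p>
```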

Actually, these pages are very good to get an outbound link from. You will get all the passed PR from that page, as none of it gets circulated back into the site through navigation links.