
Is it true that spiders only crawl the top two levels of pages?

Don't they drill deeper?

         

Lokutus

5:16 pm on Sep 20, 2004 (gmt 0)

10+ Year Member



Is it true that G's and Y's spiders only crawl the top two levels of pages on a site?

Brett_Tabke

10:35 pm on Sep 20, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



in a word - no.

ogletree

10:47 pm on Sep 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



They can spider as deep as you can make them; it just takes a while.

Lokutus

12:28 am on Sep 21, 2004 (gmt 0)

10+ Year Member



Damn, another thread here somewhere says they only go two levels deep.

trimmer80

1:14 am on Sep 21, 2004 (gmt 0)

10+ Year Member



The depth the bots will crawl depends quite a bit on the number of inbound links to your site.

If you only have a couple of links in and you have a site with 10,000 pages spread across five levels, then expect it to take a long time for Google to crawl them all.

However, if you have thousands of inbound links, expect to see all your pages in the SERPs in a short period of time.

Chicken Juggler

1:58 am on Sep 21, 2004 (gmt 0)



I would try to keep my pages no more than three levels deep. If you do the linking right, you can do that with no problem. You can get deeper pages indexed, but their PR would be so low that they would not rank anyhow.

BigDave

2:01 am on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It also helps to get deep links. The more entry points Google finds to your site, the more places it will work outward from.

This is also where site navigation is important.

kaled

9:20 am on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is it true that G's and Y's spiders only crawl the top two levels of pages on a site?

What do you mean by "top two levels"?
Directory structure need play no part in spider algorithms — spiders simply follow links.

Perhaps the question should be rephrased thus:
"Is it true that G and Y's spiders only follow links two clicks deep?"

In this case, the answer would still be NO.
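The distinction kaled draws — URL directory depth versus click depth — can be sketched with a small breadth-first search over a hypothetical internal link graph (all page paths here are made up for illustration):

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
# To a link-following crawler, only clicks from the home page matter;
# how many directories deep a URL sits is irrelevant.
links = {
    "/": ["/about.html", "/dir1/dir2/products.html"],
    "/about.html": ["/team.html"],
    "/dir1/dir2/products.html": ["/dir1/dir2/dir3/widget.html"],
    "/team.html": [],
    "/dir1/dir2/dir3/widget.html": [],
}

def click_depth(start="/"):
    """Breadth-first search: clicks needed to reach each page from the home page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

print(click_depth())
# Note that "/dir1/dir2/dir3/widget.html" sits three directories deep
# in the URL but is only two clicks from "/".
```

In this toy graph, a page buried three directories deep is still reachable in two clicks, which is the sense in which directory structure plays no part.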

Many have speculated that the PR algo used by Google somehow uses directory structure (if only for making initial guesses). A flat directory structure might be useful for PR, but I've never seen any real evidence for this.

GoogleGuy has often recommended site maps. I think that's good advice.

Kaled.

webnewton

11:34 am on Sep 21, 2004 (gmt 0)

10+ Year Member



This may not be a politically correct statement, but pages deeper inside a site have less chance of getting indexed. If this can't be avoided, a site map could solve the problem.

ogletree

2:12 pm on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It is easy to prove. I had a site with thousands of pages. The person who set it up did not know much about site structure. I moved everything up so that nothing was more than three levels deep, and as soon as the site was respidered my traffic went way up. I did not move the pages, just the links. www.domain.com/dir1/dir2/page.html is the same to G as www.domain.com/dir1/page.html if both are linked from the front page. To fix a site, don't move or rename pages; just move the links. Your most important pages should be linked from the front page.

If you have a lot of pages, have the front page point to a hundred or so second-level pages, and have each of those point to a hundred or so pages. This looks like a site map. Put the links at the bottom of each second-level page, and make the second-level pages content pages for your most important words. A site map does not have to be a bare page of links: take advantage of the outgoing/internal links and keyword density, and make a content page out of each of your sitemap pages. Just make sure your pages are real content and that you did not buy your site for PR. Google may frown upon that.
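The arithmetic behind the hub-and-spoke structure described above is worth making explicit: with a fan-out of about a hundred links per page, every page on a fairly large site sits within two clicks of the front page (the `fanout` value and function name here are illustrative, not from the post):

```python
def pages_within_two_clicks(fanout):
    """Pages reachable in at most two clicks from the front page,
    if every page links out to `fanout` pages below it:
    1 front page + fanout hub pages + fanout**2 leaf pages."""
    return 1 + fanout + fanout ** 2

# With ~100 links per page, over ten thousand pages are at most
# two clicks from the home page.
print(pages_within_two_clicks(100))  # 10101
```

This is why a site of the size ogletree describes (thousands of pages) never needs to be more than three levels deep.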

internetheaven

9:13 pm on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It is easy to prove. I had a site ...

That's like saying "the falling piano missed me therefore there must be a God".

What you said really did not prove anything as such. Your changes could have coincided with a big algorithm change, a backlinks update, a PageRank update (depending on the timeframe you're talking about), a 'sandbox' period expiring, etc., etc.

PLUS - just because something was true in the past does not make it sound advice for the future. Many, many sites do incredibly well using as many 'levels' as they want. Internal linking structure is a major part of good indexing and ranking and (in my opinion) always will be, but let's not make up "levels" ghost stories to scare the newbies.

ogletree

10:32 pm on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you have enough sites and know what you are doing, it is easy to prove. Everything I said is true, verifiable, and repeatable.

steveb

10:36 pm on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You two are agreeing.

BillyS

10:39 pm on Sep 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The way I read it, they are agreeing too!?! Can't we all just get along?

ogletree

5:03 pm on Sep 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I promise that if you have a page four levels deep and you put a link to it from your front page, it will do better.

BigDave

5:08 pm on Sep 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But the original question was whether or not it was crawled, not how well it ranks.

Anyway, you should get deep links instead of worrying about having that link off your home page.

Jon_King

5:33 pm on Sep 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Anyway, you should get deep links instead of worrying about having that link off your home page.

How true.

ogletree

6:42 pm on Sep 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I understand what the first post was about. I elaborated because it was necessary. What is the point of having pages in G if they don't rank for anything? You might get a little traffic for really obscure long phrases. Getting more deep links is good but takes a while. Even if you have more deep links, it still helps to get links from the highest-PR page possible, which is normally the front page. What we are both saying is useful. What I am saying can get a big site more traffic the quickest way possible. There is no sandbox for internal links.

steveb

10:57 pm on Sep 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The answer to the question is in post #2. Nothing else to it. Directory location is irrelevant to crawling. It's just links (and Google will crawl down more than two levels of those links).

prairie

6:20 pm on Sep 28, 2004 (gmt 0)

10+ Year Member



Our experience has been that Google will normally spider two levels deep at a time and return to go deeper on its next visit.

Meanwhile it will check on index.html more frequently.

So a shallower directory structure might give you quicker re-indexing of a site.

Either way, it will crawl beyond two levels deep.