
How deep?

Fourth level docs are not indexed.

3:07 pm on Apr 29, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 30, 2003
posts:1067
votes: 0


Can someone please point me to a discussion of how to get Google to index deeper content pages, like fourth-level documents?
Thanks, guys.
3:36 pm on Apr 29, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:June 23, 2003
posts:121
votes: 0


Create a site map linked to from the home page, and then from the site map link to every single deep page.

Or even better, get external links pointing directly to the deep pages you want indexed. Beyond helping Google find them, it's the best way to SEO your site.
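A rough sketch of that structure, with hypothetical page names; the home page links to the sitemap, and the sitemap links straight to the deep pages:

    <!-- on the home page: one link down to the site map -->
    <a href="/sitemap.html">Site map</a>

    <!-- sitemap.html: a direct link to every deep page you want indexed -->
    <ul>
      <li><a href="/widgets/blue/specs.html">Blue widget specifications</a></li>
      <li><a href="/widgets/red/specs.html">Red widget specifications</a></li>
      <!-- ...one entry per deep page -->
    </ul>

With this, every deep page sits two clicks from the home page, no matter how deep it lives in the directory structure.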

4:18 pm on Apr 29, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member googleguy (WebmasterWorld Top Contributor of All Time, 10+ Year Member)

joined:Oct 8, 2001
posts:2882
votes: 0


Yah, good answer. Site maps give you a chance to highlight parts of your site that would normally be buried several clicks down.
9:46 am on Apr 30, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 30, 2003
posts:1067
votes: 0


Thanks, but we are talking BIG here. How many sitemap pages will Google accept without considering them spam? I understand that follow,noindex can help. Is this the proper method to avoid spam concerns with all the major SEs?
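For reference, the follow,noindex mentioned here is a robots meta tag placed in the head of each sitemap page. A minimal sketch, assuming the goal is to have the sitemap's links crawled while keeping the sitemap page itself out of the index:

    <head>
      <!-- let the spider follow the links, but keep this page out of the index -->
      <meta name="robots" content="noindex,follow">
      <title>Site Map</title>
    </head>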
1:36 pm on Apr 30, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:June 23, 2003
posts:121
votes: 0


I wouldn't worry about it (the site map) being considered spam. That is unless you are concerned that the pages you are trying to get indexed are spam...

I might be concerned that if I had too many links on my sitemap page, at some point Google might not follow all of them. I seem to remember someone saying at one point that 100 links on a page was the max... maybe someone else can chime in with more info...

If you have too many links, just use multiple sitemap pages...
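A sketch of that multi-page arrangement, with hypothetical filenames; a short index page splits the links across child pages, each kept under the 100-link figure discussed above:

    <!-- sitemap.html: a small index page, linked from the home page -->
    <ul>
      <li><a href="/sitemap-1.html">Site map, part 1: products</a></li>
      <li><a href="/sitemap-2.html">Site map, part 2: articles</a></li>
      <li><a href="/sitemap-3.html">Site map, part 3: archives</a></li>
    </ul>

    <!-- sitemap-1.html, sitemap-2.html, ...: each lists its share of the deep links -->

This adds one extra click of depth, but keeps every individual page small.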

4:17 pm on Apr 30, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 16, 2003
posts:746
votes: 0


I have seen Google spider more than 100 links on a sitemap, and I think others have seen this as well. My impression is that as long as the links fall within the first 100k of the page, you are fine.

[webmasterworld.com...]

4:43 pm on Apr 30, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 5, 2002
posts:1562
votes: 0


Build sitemaps so that they're useful for your visitors, not just for a spider. That has worked very well for me over the last few years, and I have had LOTS of pages spidered this way.
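A rough idea of what a visitor-first sitemap looks like; the section names here are invented for illustration:

    <!-- grouped by topic with descriptive anchor text,
         rather than a flat dump of URLs -->
    <h2>Products</h2>
    <ul>
      <li><a href="/products/widgets.html">Widgets</a></li>
      <li><a href="/products/gadgets.html">Gadgets</a></li>
    </ul>

    <h2>Support</h2>
    <ul>
      <li><a href="/support/faq.html">Frequently asked questions</a></li>
      <li><a href="/support/manuals.html">Product manuals</a></li>
    </ul>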
4:51 pm on Apr 30, 2004 (gmt 0)

New User

10+ Year Member

joined:Apr 26, 2004
posts:2
votes: 0


In an interview with one of Google's development guys (about 6 months ago), he said they *intended* to flag pages with over 100 links as suspicious, and then check them more thoroughly to see whether they are spam or not. I don't know if that was implemented, or what the thorough test is.

Maayan

6:20 pm on Apr 30, 2004 (gmt 0)

Junior Member

5+ Year Member

joined:Feb 11, 2008
posts:66
votes: 0


There are an incredible number of pages out there with more than 100 links on them that work just fine. The rule to go by is 100kb, not 100 links.
6:35 pm on Apr 30, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 30, 2003
posts:1067
votes: 0


Thanks, folks, that's reassuring. I will build more sitemap pages now.
6:59 pm on Apr 30, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 5, 2002
posts:1562
votes: 0


Maayan, where was this interview? They'd have a lot of work to do if that counts as suspicious.
7:41 pm on Apr 30, 2004 (gmt 0)

New User

10+ Year Member

joined:Apr 26, 2004
posts:2
votes: 0


Can't find it, but I'll keep looking.

BUT, it is mentioned in Google's webmaster guidelines:
[google.com...]