
how deep does google go, where should i put my site map


leoo24

4:55 pm on Feb 6, 2003 (gmt 0)

10+ Year Member



Hey there guys, my first post on one of these forums!
Straight to the question: I think I'm quite adept at HTML design, but only recently have I become interested in designing for SE placement/results. So how many levels deep does Google spider into a website? I'm not all that clear on this topic, and have read quite a bit. Not all of my pages have a direct link to them from, say, the index, although each page has a link to it from another page. So the best thing to do is have a map, right? Do I then have a link on my index going to a map page, which then has a link to every single page, so it would look like this: index --> map --> links?
Is this too deep? Would the links get spidered?
Also, I saw something about placing all the files in the root. All my files are in the root and always have been, but how does this affect the depth a spider has to go? Or is the depth referring to how many folders it has to access?
I apologise if this is not entirely the correct forum for this question.
While I'm here: should pages with iframes on them also have noframes content?
cheers
leo

tedster

6:46 am on Feb 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Welcome to WebmasterWorld leoo24!

I decided your question was best in the Google News forum, since you focused on Googlebot's style of spidering. This is a busy place, and I'm sure you'll see lots of discussion.

leoo24

8:42 am on Feb 7, 2003 (gmt 0)

10+ Year Member



Thank you, tedster :)

glengara

9:59 am on Feb 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



*index --> map --> links*
I think you're adding an extra step, Leoo; I'd see it as index --> map.
Don't forget to use some optimised link text to describe those links.

leoo24

10:08 am on Feb 7, 2003 (gmt 0)

10+ Year Member



Now I just did a quick search on optimising link text and didn't find much. Could you explain what you mean?
At an educated guess, you mean a short descriptive link as opposed to single-word links?

glengara

10:37 am on Feb 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Link text goes within the <a> tags, as in <a href="http://www.domain.com/">Link Text</a>.
The idea is to use, say, the page title, or at least to include the main keyphrase in a short description of the link.
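To make that concrete (the URL and page title here are invented placeholders), a sitemap link using the target page's title as its link text might look like:

```html
<!-- Descriptive link text: the target page's title or main keyphrase -->
<a href="http://www.example.com/widget-repair-guide.html">Widget Repair Guide</a>

<!-- Rather than a generic single-word link like this: -->
<a href="http://www.example.com/widget-repair-guide.html">here</a>
```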

jlr1001

12:32 pm on Feb 7, 2003 (gmt 0)

10+ Year Member



Also, I saw something about placing all the files in the root. All my files are in the root and always have been, but how does this affect the depth a spider has to go? Or is the depth referring to how many folders it has to access?

Your last assumption is correct. Depth is a function of how many nested directories there are from the root directory.

If all of your files are in the root, then they all have the same depth and should be easily spiderable, as long as there is a working link to each of them. Your sitemap should work, but so should your previous setup, i.e. every page being accessible one way or another by an internal link.

Make sure that all your links are working and that you don't have any orphan pages. Another thing to consider is maximizing your internal linkage, which would help equalize your PageRank throughout your site.

This would also make your site more navigable, since your visitors would have less trouble guessing how you want them to navigate within your site, or within a themed section of your site.

I actually have my sitemap linked on every page. This makes sure the SE spiders have access to it, and it also ensures that my visitors can always find what they're looking for.
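A bare-bones sitemap page along those lines (all page names hypothetical) could be as simple as one descriptive link per page:

```html
<!-- sitemap.html: links to every page on the site, with descriptive link text -->
<h1>Site Map</h1>
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="widget-repair-guide.html">Widget Repair Guide</a></li>
  <li><a href="widget-reviews.html">Widget Reviews</a></li>
  <li><a href="contact.html">Contact Us</a></li>
</ul>
```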

-jlr1001

egomaniac

3:26 pm on Feb 7, 2003 (gmt 0)

10+ Year Member



Hi leoo24,

A lot of what you can read about SE optimization is outdated when it comes to Google.

A site map is a good idea, but not a necessary one to get your pages spidered. Google will follow all of your links from page to page. A site map is good insurance to make sure no page is missed by the spider. There is no Google optimization reason to link to your site map from every page, or even from the index page. I have three domains. One has the site map linked off of nearly every page within the domain. The other two have their site maps linked off of only a few pages within their respective domains. All three regularly get all of their pages spidered and included in Google.

Plan your site map based on what makes the most sense for either your users, or distribution of PageRank within your site, or both.

Don't worry about files being in the root versus files being 2, 3, ..., 6 folders deep. It doesn't matter to Google. What matters is the "quality" of the page(s) linking to the file 6 folders deep. Higher-PageRank pages with your desired keywords in the anchor text will make the linked-to target page rank well regardless of how many folders deep it is.

Use folders to organize your content around your target keywords. This will get keywords into your URLs, which *may* give you a boost in some engines. It may have no effect on your ranking, but then again it might. Either way, done judiciously, it can help the user.
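As a hypothetical sketch of that kind of keyword-based folder layout (all folder and file names invented), the keywords end up in every URL:

```html
<!-- Folder names carry the target keywords, so they appear in each page's URL -->
<a href="/widget-repair/cleaning-tips.html">Widget Cleaning Tips</a>
<a href="/widget-repair/replacement-parts.html">Widget Replacement Parts</a>
<a href="/widget-reviews/acme-widget.html">Acme Widget Review</a>
```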

leoo24

4:52 pm on Feb 7, 2003 (gmt 0)

10+ Year Member



cheers guys, that's great advice, just what i was looking for :)