Forum Moderators: open
Also, I saw something about placing all the files in the root. All my files are in the root and always have been, but how does this affect the depth a spider has to go? Or is the depth referring to how many folders it has to access?
Your last assumption is correct. Depth is a function of how many nested directories there are from the root directory.
If all of your files are in the root, then they all have the same depth and should be easily spiderable. That is, as long as there is a working link to each of them. Your sitemap should work, but so should your previous setup, i.e. every page is accessible one way or another by an internal link.
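To make "depth" concrete, here's a quick sketch (the URLs are hypothetical) that counts how many directories deep a page sits from the root:

```python
from urllib.parse import urlparse

def crawl_depth(url):
    """Number of directories between the root and the file itself."""
    path = urlparse(url).path
    # Split on "/", dropping empty segments.
    segments = [s for s in path.split("/") if s]
    # If the last segment looks like a file, it doesn't count as a folder.
    if segments and "." in segments[-1]:
        segments.pop()
    return len(segments)

print(crawl_depth("http://example.com/page.html"))        # 0: file in the root
print(crawl_depth("http://example.com/a/b/c/page.html"))  # 3: three folders deep
```

A file in the root has depth 0 no matter how many files sit beside it; only nesting adds depth.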
Make sure that all your links are working, and that you haven't any orphan pages. Another thing to consider is maximizing your internal linkage, which would help equalize your PageRank throughout your site.
This would also make your site more navigable, since your visitors would have less of a problem guessing how you want them to navigate within your site, or a themed section of your site.
I actually have my sitemap linked on every page. This makes sure the SE spiders have access to it, and also ensures that my visitors can always find what they're looking for.
-jlr1001
A lot of what you can read about SE optimization is outdated when it comes to Google.
A site map is a good idea, but not a necessary one to get your pages spidered. Google will follow all of your links from page to page. A site map is good insurance to make sure no page is missed by the spider. There is no Google optimization reason to link your site map to every page, or even the index page. I have three domains. One has the site map linked off of nearly every page within the domain. The other two have their site maps linked off of only a few pages within their respective domains. All three regularly get all of their pages spidered and included in Google.
Plan your site map based on what makes the most sense for either your users, or distribution of PageRank within your site, or both.
Don't worry about files being in the root versus files being 2, 3, ..., 6 folders deep. It doesn't matter to Google. What matters is the "quality" of the page(s) linking to the file 6 folders deep. Higher-PageRank pages with your desired keywords in the anchor text will make the linked-to target page rank well regardless of how many folders deep it is.
Use folders to organize your content around your target keywords. This will get keywords into your URL, which *may* give you a boost in some engines. It may have no effect on your ranking, but then again it might. Either way, done judiciously, it can help the user.
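As a rough illustration of that last point (the keyword phrases and site are made up), a small helper can turn target keyword phrases into clean, URL-safe folder names:

```python
import re

def slugify(keyword):
    """Turn a keyword phrase into a URL-safe folder name."""
    slug = keyword.lower().strip()
    # Collapse anything that isn't a letter or digit into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return slug

# Hypothetical keyword-themed folders for a widgets site:
for phrase in ["Blue Widgets", "Widget Repair Tips"]:
    print("/" + slugify(phrase) + "/")
# /blue-widgets/
# /widget-repair-tips/
```

Readable, keyword-themed folder names like these help the user orient themselves whether or not any engine rewards them.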