| 3:31 pm on Sep 26, 2005 (gmt 0)|
| 3:38 pm on Sep 26, 2005 (gmt 0)|
My opinion: Keep your URLs as short as your system will let you. It makes good sense to use one or two keywords so your URLs are informative to humans, but don't go overboard. Keywords in the URL have a small effect compared to other things you could do so there's little point in long, keyword-stuffed URLs.
| 5:02 pm on Sep 26, 2005 (gmt 0)|
I suggest you stay away from that structure. IMHO, Google rewards sites with natural ?x=bluh&y=bluhbluh URLs over those stuffed with keywords. Not to mention that Google might penalize your site for this structure even if it were otherwise search-engine friendly (over-optimisation penalty).
| 5:40 pm on Sep 26, 2005 (gmt 0)|
Yeah, that's what I was wondering, i.e. is using the keywords as folder names overdoing it?
I think we'll stick to something short.
This seems logical and natural. No overdoing it, as that's the last thing we want to do. I'm just trying to organize the files and support the theme of the site.
Or do you think this is still overdoing things?
| 6:33 pm on Sep 26, 2005 (gmt 0)|
I don't think you'll have any problem with that. I've just revamped a site and revised the directory structure, such that I end up with addresses similar to (in the most extreme case): keyword1keyword2.com/keyword2-keyword3/keyword4-keyword2-someword/someword.aspx?someparam=keyword5&someref=number. Google spidered the entire site and, as far as I can tell, I'm not being penalized for the keywords employed in these kinds of addresses.
I can't see how G can penalize you for using a logical structure, albeit keywordy. If that's what your site is about, how can it be wrong to use those words!? I would only use 1 repeat keyword maximum in any address though (excluding domain name)
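As a rough illustration of that "one repeat keyword maximum" rule of thumb (my own sketch, nothing Google publishes; the example URL reuses the placeholder keywords from earlier in the thread), you could sanity-check a URL's path for repeated words like this:

```python
from urllib.parse import urlparse
import re

def repeated_keywords(url):
    """Count how often each word appears in a URL's path
    (domain excluded), splitting on slashes, dashes and dots."""
    path = urlparse(url).path
    words = [w for w in re.split(r"[/\-._]+", path.lower()) if w]
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return {w: n for w, n in counts.items() if n > 1}

# "keyword2" appears twice in this path, so it would flag it:
print(repeated_keywords(
    "http://example.com/keyword2-keyword3/keyword4-keyword2-page/page.aspx"))
```

Anything this returns more than one entry for would break the "one repeat maximum" guideline above.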
| 9:52 pm on Sep 26, 2005 (gmt 0)|
If you look at scraper-site directory structures you tend to see a lot of keywords and a lot of repetition: two or three keywords appearing in paths over and over. Google is doing a pretty good job of making these pages "Supplemental".
I believe that, as of the Bourbon update, Google has been penalizing keyword-heavy directory structures with lots of repetition of multiple keywords, in order to weed out scrapers.
Of course this is a killer for product sites, and I have one where pages seem to be "URL only" for no apparent reason; but it's only the product pages that used the same multi-keyword directory path repeatedly. This is still just supposition at this point.
Also, in my opinion, Bourbon seemed to mandate absolute links to all pages. Links can get cumbersome with extensive keywords, but of course they are very descriptive, which can really help a visitor.
| 10:00 pm on Sep 26, 2005 (gmt 0)|
So now I'm curious: is it better to have all the pages in the root of the site, or to break them out into single-layer folders?
---- or is this better ----
The second is more logical in my case and I wouldn't change it, because it helps organization; I'm just wondering if it's hurting me in the long run.
| 11:03 pm on Sep 26, 2005 (gmt 0)|
modemmike, I asked myself the same question a while back, and after a bit of reading around I gathered that it's not the directory structure that matters but the way the pages are linked up. I.e., a link from the homepage to somepage.htm in the root should be regarded as equivalent to a link from the homepage to /some-directory/some-other-directory/somepage.htm. I'm still not totally convinced, though! It may only hold from a PR perspective. Maybe others can comment...
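One way to picture that idea (a hypothetical sketch; the page names and link graph are made up): what arguably matters is click depth from the homepage, which you can compute with a breadth-first search over the links, not how many directories deep the file sits on disk.

```python
from collections import deque

def click_depth(links, start="/"):
    """Breadth-first search over a site's link graph.
    Returns each page's minimum number of clicks from the
    homepage, regardless of how deep its path is on disk."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# A deeply nested file linked straight from the homepage
# is still only one click away:
links = {
    "/": ["/somepage.htm",
          "/some-directory/some-other-directory/somepage.htm"],
    "/somepage.htm": [],
}
print(click_depth(links))
```

On this toy graph both pages come out at depth 1, which is the equivalence the post above is describing.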
| 12:18 am on Sep 27, 2005 (gmt 0)|
I have one site that is a PR6 for no good reason (no BLs), but as I drill down into subfolders the pages get less PR, so I was curious... this PR6 site is a complete mystery to me: PR6 and no BLs? That's probably another topic, however.
| 9:28 am on Sep 27, 2005 (gmt 0)|
The thing is, the Google toolbar is just "guessing" the page PR, based on the homepage PR (I think). I've read that you shouldn't pay it too much attention. I don't think anyone really knows the true PR their pages have.
| 9:30 pm on Sep 27, 2005 (gmt 0)|
Would you mind sending a sticky with the URL of the mystery PR6 site? I just looked into another PR6 and it was interesting.
Certainly Google uses the linking structure of your site to distribute PageRank. I maintain a traditional sitemap.htm page with absolute links to most of my pages. Without this sitemap.htm, many of my pages seemed to go URL-only after Bourbon. I pulled and reinstated this sitemap twice; each time the absolute links were gone, more pages went URL-only. The user navigation linkage is "relative" and would have a much deeper structure without the sitemap.htm. The pages that were "deep" were the ones disappearing. The sitemap makes the structure almost flat.
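A minimal sketch of what that sitemap approach amounts to (the domain and page list here are hypothetical; in practice you'd walk your own file tree): one sitemap.htm containing an absolute link to every page puts the whole site one click from the sitemap, flattening the structure.

```python
def build_sitemap(base_url, pages):
    """Emit a plain sitemap.htm fragment with one absolute link
    per page, so every page is one click from the sitemap."""
    items = "\n".join(
        f'  <li><a href="{base_url}{p}">{p}</a></li>' for p in pages
    )
    return f"<ul>\n{items}\n</ul>\n"

# Hypothetical page list for a hypothetical domain:
pages = ["/index.htm",
         "/widgets/blue-widget.htm",
         "/widgets/red-widget.htm"]
print(build_sitemap("http://www.example.com", pages))
```

Regenerating and re-uploading this fragment whenever pages are added keeps those absolute links from silently disappearing, which is the failure mode described above.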