
Forum Moderators: goodroi


URLs with code

what the SERPs will and will not index



10:49 pm on Mar 19, 2009 (gmt 0)

5+ Year Member

I have a new client who has not been getting any traffic to her website, and what I noticed about the pages where she has all of her articles is that the designer used this kind of coding:


As a result, no keywords are ever in the URLs.

My understanding is that Google doesn't like this kind of data in the URLs, so will someone please let me know what the reality is?

I'm thinking it would be better to have something like:


[edited by: coopster at 1:48 pm (utc) on Mar. 20, 2009]
[edit reason] please use example.com, thanks! [/edit]
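The example URLs above were edited to example.com by the moderator, so as a purely hypothetical illustration (the paths, parameter names, and titles below are made up, not the client's real ones), the two URL styles being contrasted might look like this:

```python
# Hypothetical illustration: a parameter-based article URL versus a
# keyword ("slug") URL. All names here are invented for the example.
import re
from urllib.parse import urlencode

def param_url(article_id, cat):
    # Style the designer used: numeric parameters, no keywords in the URL,
    # e.g. http://www.example.com/article.php?id=123&cat=7
    return "http://www.example.com/article.php?" + urlencode({"id": article_id, "cat": cat})

def slug_url(title):
    # Keyword style: the article title becomes a hyphenated slug,
    # e.g. http://www.example.com/articles/widget-care-tips
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return "http://www.example.com/articles/" + slug

print(param_url(123, 7))
print(slug_url("Widget Care Tips"))
```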


10:52 pm on Mar 19, 2009 (gmt 0)

5+ Year Member

It would be better, but it doesn't matter so much that it would explain having no traffic at all. Perhaps there's too much competition in the niche and/or not enough backlinks.

Check whether the website appears in the search engines at all with a site:www.example.com search.

[edited by: coopster at 1:48 pm (utc) on Mar. 20, 2009]
[edit reason] please use example.com, thanks! [/edit]


11:22 pm on Mar 19, 2009 (gmt 0)

5+ Year Member

Thanks for responding. Her competition is heavy and she has so many damn keywords it's hard to keep up.

As for backlinks, she has 2840 sites pointing to her on Google. She has 548 pages indexed on Google.

I know her on-page optimization is not done and her meta tags currently have probably 100 keywords in them, so I can fix that.

Any other suggestions besides marketing tactics?


11:27 pm on Mar 19, 2009 (gmt 0)

5+ Year Member

Converting the URLs is a good thing, and the same goes for the markup/content ratio: header (h1, h2) tags, clean XHTML+CSS, optimized keywords and content, keeping track of outgoing links (don't send PR where you don't want to), and the same for the internal links. It's also a good idea to submit sitemaps if the website is big enough to speed up crawling (548 pages alone are not worth the hassle).
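The sitemap suggestion above can be sketched as follows; the URLs are hypothetical stand-ins for the client's real pages, and only the bare `<loc>` entries of the sitemaps.org format are shown:

```python
# Minimal XML sitemap sketch (sitemaps.org protocol, required fields only).
# The listed URLs are invented examples, not real pages.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = u  # the page's canonical URL
    return ET.tostring(root, encoding="unicode")

xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/articles/widget-care-tips",
])
print(xml)
```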

That should do the trick, but I don't think you'll notice it until the next Google dance.


11:44 pm on Mar 22, 2009 (gmt 0)

WebmasterWorld Senior Member g1smd — Top Contributor of All Time, 10+ Year Member, Top Contributor of the Month

Google does not generally have problems with URLs that have up to three parameters in them.

Your issues are more likely a myriad of duplicate-content problems and poor internal linking.

Check out [site:www.example.com] and [site:example.com -inurl:www] to see what you get.

Analyse site spidering using Xenu LinkSleuth, or similar, too.

You might also have problems with 'dodgy' linking schemes, as the number of incoming links does sound quite high.


If you change to all-new URLs and use rewriting to connect the new requests back to the old internal filepaths, be aware that you will also need a set of redirects to catch every request for an old URL and redirect it to the corresponding new URL. Those redirects will need to cater for every possible parameter permutation and order, and will need to force the correct domain at the same time.

If the new URLs involve keywords, this is non-trivial to implement: it will need a whole set of routines that read your database in order to associate each old article URL with the new URL for that article. This is a complex piece of work, and there are a lot of ways to get it wrong and damage the site still further.
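The redirect layer described above can be sketched like this, with a small in-memory lookup table standing in for the real article database (the ids, slugs, and hostnames are all assumptions for illustration). The point is that any parameter order and any domain variant must collapse to exactly one new canonical URL:

```python
# Sketch of an old-URL -> new-URL redirect resolver. In production this
# logic would sit behind a 301 redirect rule; here it just computes the
# target. ARTICLE_SLUGS stands in for a real database lookup.
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping: article id -> keyword slug for the new URL.
ARTICLE_SLUGS = {"123": "widget-care-tips", "124": "widget-history"}

CANONICAL_HOST = "www.example.com"  # force one domain in every redirect

def redirect_target(old_url):
    """Return the 301 target for an old parameter URL, or None if unknown."""
    parsed = urlparse(old_url)
    params = parse_qs(parsed.query)  # order-insensitive: ?id=1&cat=2 == ?cat=2&id=1
    article_id = params.get("id", [None])[0]
    slug = ARTICLE_SLUGS.get(article_id)
    if slug is None:
        return None  # unmapped request; let it 404 rather than misredirect
    return "http://%s/articles/%s" % (CANONICAL_HOST, slug)

# Different parameter orders and host variants resolve to the same new URL:
print(redirect_target("http://example.com/article.php?cat=7&id=123"))
print(redirect_target("http://www.example.com/article.php?id=123&cat=7"))
```

Note that both calls return the same canonical target, which is exactly the permutation-and-domain normalization the post warns is easy to get wrong.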

