
URL rewriting for SEO? How important for Google?

     
9:24 am on Dec 27, 2012 (gmt 0)



I have URLs with a lot of parameters, like ?x=bla&y=bla&... Does URL rewriting matter a lot for Google SEO, or should I not worry about it too much because Google's robots are intelligent enough to read complex URLs?
12:15 pm on Dec 27, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The problems with complicated URLs and parameters usually come on your side, not on Google's side. If you have long URLs with lots of parameters, it's hard to use them consistently. If you don't use them consistently, Googlebot will crawl extra pages, you will have problems with duplicate URLs, and you may not be indexed and ranked as well as you should be.

It sounds like you have multiple parameters on each URL. This can be problematic for several reasons:

1) What happens when the parameters are out of order? Your site should redirect to put them in the correct order, or make sure you have a canonical tag with them in the correct order; otherwise Googlebot will see two different URLs for the same page. (See the sketch at the end of this post.)

2) What happens when a parameter is missing? You should either show an error (with an error status code) or redirect to fill in the default value for that parameter. If your server silently assumes default values and responds exactly as if the parameter were present in the URL, you will have problems similar to the out-of-order case.

3) Are you using parameters that are customized per user? Things like tracking parameters, which page a user came from, the last search a user performed, and so on. If so, Googlebot shouldn't be crawling pages with these in the URL; it will just confuse it. There are settings under "URL parameters" in Google Webmaster Tools that can help in this case, but I would recommend not using parameters like these on crawlable URLs at all.

If you can use URLs with parameters without falling into these pitfalls, then there is no big SEO advantage to rewriting your URLs. I find that it is very hard to use parameters and keep the discipline needed; developers have a tendency to add parameters whenever they need to. URL rewriting is one way to make them aware that there are SEO considerations when doing so.
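
To make points 1 and 2 concrete, here is a minimal sketch of a front controller enforcing one canonical query string. It assumes a PHP setup where index.php handles these requests; the parameter names, canonical order and defaults are made up for illustration, and a real site would only run this on URLs that actually take these parameters.

<?php
// Sketch only: enforce one canonical query string per page.
// Parameter names, order and defaults below are illustrative assumptions.

$allowed  = ['category', 'name', 'page'];   // canonical parameter order
$defaults = ['page' => '1'];                // value assumed when the parameter is missing

// Keep only known parameters, in canonical order, with defaults filled in.
$canonical = [];
foreach ($allowed as $key) {
    if (isset($_GET[$key]) && $_GET[$key] !== '') {
        $canonical[$key] = (string) $_GET[$key];
    } elseif (isset($defaults[$key])) {
        $canonical[$key] = $defaults[$key];
    }
}

$canonicalQuery = http_build_query($canonical);
$requestedQuery = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';

// Wrong order, missing defaults or stray tracking parameters:
// answer with a single 301 to the canonical URL instead of serving a duplicate.
if ($requestedQuery !== $canonicalQuery) {
    header('Location: /index.php?' . $canonicalQuery, true, 301);
    exit;
}

// Otherwise render the page, and also emit the canonical link element, e.g.
// <link rel="canonical" href="https://example.com/index.php?category=20&name=widgets&page=5">

The string comparison is deliberately strict: anything that is not byte-for-byte the canonical form gets redirected, which is exactly the discipline that is hard to keep by hand.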
1:10 pm on Dec 27, 2012 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



URL rewriting often makes for a simpler URL, one that is easier to say, and "means something" when you look at it.

Parameters with duplicates:
example.com/index.php?category=20&name=widgets&page=5
example.com/index.php?category=20&page=5&name=widgets
example.com/index.php?name=widgets&category=20&page=5
example.com/index.php?name=widgets&page=5&category=20
example.com/index.php?page=5&category=20&name=widgets
example.com/index.php?page=5&name=widgets&category=20

Rewritten:
example.com/c20-p5-widgets

You don't need a .html or .php suffix, and if example.com/c20-p5-random-words is requested then the site should redirect to the correct URL.
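
For anyone wondering how a URL like that gets mapped back to the script, here is a rough sketch of one way to do it in a PHP front controller. The pattern mirrors the example above; everything else (file layout, parameter names) is an assumption for illustration, and the word check mentioned above is left out here.

<?php
// Sketch: translate /c20-p5-widgets back into the category/page/name
// parameters the existing script expects. Assumes the web server sends
// unmatched paths to this front controller.

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// /c{category}-p{page}-{words}   e.g. /c20-p5-widgets
if (preg_match('#^/c(\d+)-p(\d+)-([a-z0-9-]+)$#', $path, $m)) {
    // Hand off exactly as if ?category=20&name=widgets&page=5 had been requested.
    $_GET['category'] = (int) $m[1];
    $_GET['page']     = (int) $m[2];
    $_GET['name']     = $m[3];
    require __DIR__ . '/index.php';
    exit;
}

http_response_code(404);   // no other patterns: unknown paths do not get a 200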
2:07 pm on Dec 27, 2012 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month



Seconding both what deadsea and g1smd said. Nicer URLs can also help with click-through, but the biggest benefit I have found is exactly what deadsea said: minimising the endless duplicate pages created by inconsistent use of parameters.
4:38 pm on Dec 27, 2012 (gmt 0)



So, from what I understand, there is no significant benefit to URL rewriting for Google SEO if I follow the rules mentioned by deadsea and g1smd, and I shouldn't worry about complex URLs having too many parameters?
5:02 pm on Dec 27, 2012 (gmt 0)

WebmasterWorld Senior Member



In their Webmaster Guidelines, Google actually specifies that you should make your URLs just a string of relevant words rather than something like g1smd's examples there. If they go to the trouble of putting that in the guidelines, I assume it matters to them.
5:04 pm on Dec 27, 2012 (gmt 0)

WebmasterWorld Administrator brotherhood_of_lan is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Traditionally the issue was a session parameter in the query string that would normally also be served in a cookie.

Because the spiders did not send cookies with their requests, they would be issued a new session for every page load. Essentially this meant an almost infinite number of unique URLs, trapping the spider and giving it lots of duplicate content under those URLs.

There was possibly also a time when people thought URL rewriting was good for giving the impression your site was 'static' and not served by a scripting language, but I think both of these cases are now obsolete.
9:56 pm on Dec 27, 2012 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



The problem with URL rewriting done badly is that

example.com/c20-p5-random-words-here

will return the same content for any combination of words. You should make sure the words are checked, and that such incorrect URL requests either return a 404 or redirect to the correct URL.
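
A minimal sketch of that check, continuing the /c20-p5-words example. The lookup function is hypothetical and stands in for however the site stores the real category name:

<?php
// Sketch: verify the words in /c20-p5-anything and either 301 to the
// canonical URL or return 404. lookup_category_slug() is a hypothetical
// stand-in for a database lookup; assume it returns null for unknown IDs.

function serve_category(int $category, int $page, string $requestedWords): void
{
    $canonicalWords = lookup_category_slug($category);   // hypothetical helper

    if ($canonicalWords === null) {
        http_response_code(404);                          // unknown category
        exit;
    }

    if ($requestedWords !== $canonicalWords) {
        // Made-up or outdated words: one permanent redirect to the real URL.
        header("Location: /c{$category}-p{$page}-{$canonicalWords}", true, 301);
        exit;
    }

    // Words match: render the page as normal.
}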
10:00 am on Dec 28, 2012 (gmt 0)



By diberry
In their Webmaster Guidelines, Google actually specifies that you should make your URLs just a string of relevant words


I believe a "string of relevant words" means a string of keywords. Wouldn't that be keyword stuffing, which Google penalizes? Please explain a bit more.

the biggest benefit I found is exactly what deadsea said - minimising endless duplicate pages created because of inconsistant use of parameters


You should make sure the words are checked and such incorrect URL requests either return 404 or redirect to the correct URL


Does that mean URL rewriting has no significant effect on SEO by itself and only helps to control URL duplication?
11:41 am on Dec 28, 2012 (gmt 0)

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Does that mean URL rewriting has no significant effect on SEO by itself and only helps to control URL duplication?


in google's search engine optimization starter guide [static.googleusercontent.com] the dilutive effects of serving non-canonical urls are discussed.
6:36 am on Jan 2, 2013 (gmt 0)



I understand now that URL rewriting is necessary here. Which is the better way to structure the URL:

example.com/c20-p5-random-words-here

OR

example.com/c20/p5/random-words-here
8:05 am on Jan 2, 2013 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I wouldn't use the pseudo-folder structure.

That URL is category 20, page 5 with the category name.

For products: /p543234-product-name

For reviews, paged: /r543234-p3-product-name

For tech specs: /ts543234-product-name

and so on.
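
As an illustration of how those patterns might be dispatched in one place, here is a small routing sketch. The regexes follow the examples above; the handler bodies are placeholders, not real code.

<?php
// Sketch: route the /p, /r and /ts URL patterns above to handlers.
// Handler bodies are placeholders; the regexes mirror the examples.

$routes = [
    '#^/p(\d+)-([a-z0-9-]+)$#'        => function ($id, $words)        { /* product page  */ },
    '#^/r(\d+)-p(\d+)-([a-z0-9-]+)$#' => function ($id, $page, $words) { /* paged reviews */ },
    '#^/ts(\d+)-([a-z0-9-]+)$#'       => function ($id, $words)        { /* tech specs    */ },
];

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

foreach ($routes as $pattern => $handler) {
    if (preg_match($pattern, $path, $m)) {
        $handler(...array_slice($m, 1));   // e.g. ('543234', 'product-name')
        exit;
    }
}

http_response_code(404);   // nothing matched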
11:49 am on Jan 2, 2013 (gmt 0)

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Let's not forget the other benefits of rewriting URLs; there are many that don't involve Google.

Example: a phpBB forum
One of the parameters used in a phpBB forum is t=, which is short for topic. If you do not rewrite phpBB URLs, you can't easily pass topic URLs to your Facebook share button, because Facebook also uses t= for the topic line. The result is that Facebook takes the end part of your URL as the topic title.

Example: forums in general
When spammers want to find a known type of forum, they can search for known URL patterns. Not to pick on phpBB (this issue is shared by many forum platforms), but you could find a phpBB forum by searching for inurl:viewtopic keyword and get a long list of phpBB forum pages about your keyword. Rewritten URLs = less spam.

If you're on the fence despite knowing how to implement them, get off the fence :)
5:39 pm on Jan 2, 2013 (gmt 0)

WebmasterWorld Senior Member



Sam222, here's the link to what I referred to: [support.google.com...] It does sound like keyword stuffing, but their take seems to be that it's more useful to visitors.

A site's URL structure should be as simple as possible. Consider organizing your content so that URLs are constructed logically and in a manner that is most intelligible to humans (when possible, readable words rather than long ID numbers). For example, if you're searching for information about aviation, a URL like [en.wikipedia.org...] will help you decide whether to click that link. A URL like http://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f41daa6f849f730f1, is much less appealing to users.

Consider using punctuation in your URLs. The URL http://www.example.com/green-dress.html is much more useful to us than http://www.example.com/greendress.html. We recommend that you use hyphens (-) instead of underscores (_) in your URLs.


Also this page, which references the above page: [support.google.com...]

Use informative URLs. The URL (web address) of a page appears below the title, with words from the user's query in bold. Your URLs should be simple and human readable. Which do you find more informative: http://example.com/products/shoes/high_heels/pumps.html or http://example.com/product_id=123458?


I'm assuming that they give you some leeway on URLs in terms of keywords, since having the keywords there is actually considered useful for visitors. I'm sure there's a limit, though; I'm guessing that "stuffing" would be something like including synonyms, plurals and singulars, like:

http://example.com/green-dress-dresses-jade-emerald-forest

But as long as you keep it simple, the quoted examples above are what they advise.
 
