| 7:42 pm on Jun 20, 2003 (gmt 0)|
The short, commonly accepted answer is that 2 is the limit and 3 is not good.
Best is none.
| 7:48 pm on Jun 23, 2003 (gmt 0)|
after the last update I think 2 might be 1 too many, imho
| 8:09 pm on Jun 23, 2003 (gmt 0)|
Try to have none. Use URL rewriting to make all your URIs look static.
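As a minimal sketch of that idea in an Apache .htaccess file (the path pattern and script name here are made-up assumptions, not anyone's actual site):

```apache
# Requires mod_rewrite to be loaded and .htaccess overrides allowed
RewriteEngine On

# Map a static-looking URL like /widgets/red-widget.html
# to the real dynamic script behind it. Both "widgets" and
# product.php are hypothetical names for illustration.
RewriteRule ^widgets/([a-z0-9-]+)\.html$ /product.php?item=$1 [L]
```

The visitor and the spider only ever see the clean `.html` URL; the query string exists only internally.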
| 12:34 am on Jun 24, 2003 (gmt 0)|
You guys trying to create a new urban myth? ;)
Just for the record, I have several dynamic URLs recently deepfreshbotted(?) Jun. 21 just like so...
including new URLs that were created just a few days before June 21.
Go figure ;)
| 12:45 am on Jun 24, 2003 (gmt 0)|
>>what would Google consider to be a variable
anything after a question mark in a url, the formula being something like this:
fullurl?variable1=value1&variable2=value2& etc ....
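As a hedged sketch of that formula, Python's standard library can split a URL on the question mark and count the variables (the URL below is made up for illustration):

```python
from urllib.parse import urlparse, parse_qs

# A made-up dynamic URL matching the "fullurl?variable1=value1&..." pattern
url = "http://www.example.com/page.asp?variable1=value1&variable2=value2"

query = urlparse(url).query   # everything after the "?"
variables = parse_qs(query)   # {'variable1': ['value1'], 'variable2': ['value2']}

print(len(variables))         # 2 variables in this URL
```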
we are not just talking about getting spidered, you can get spidered all you want and won't necessarily rank for anything.
2 variables max, but none will rank better every time. The more competitive the term, the more important it becomes to have no vars in the url.
Tried, tested and true on every size of site you can imagine, and in as many languages. Every site that has been redone without variable strings in the url has improved in ranking, no exceptions.
If you are starting a site, always stay away from query strings. It is much easier to do at the beginning than it may be to get rid of them later.
| 2:54 am on Jun 24, 2003 (gmt 0)|
Just to share my experience:
I had a site with a static directory (similar structure to DMOZ) that got very good rankings in google - probably a combination of good structure, static pages, and page topics in the URL. I then switched to a dynamic directory because the program runs much better.
Pages with 2 variables get spidered and indexed and seem to show up okay in the SERPS. I don't think the Google toolbar shows the correct pagerank for many of these pages though, as the entire directory shows the same pagerank. However, in the Google Directory, the pagerank is different from what the toolbar shows. The most frustrating part is that, despite getting almost all external links to change to my new site structure, the old pages often rank better in the Google SERPs.
On a side note, there was a several day period about 2 months ago where Google was spidering many more of my dynamic pages with up to 5 variables. None of those pages ever got indexed though.
| 3:03 am on Jun 24, 2003 (gmt 0)|
I'll mirror what jatar_k has stated about clean urls vs. dirty urls. Eliminate all variables, all of them. Clean and shorten the url strings as much as you can. If you use "/" to separate parts of the string, make sure to bring the most important part of the string forward, as close to the tld as possible.
You can get rather creative with your url rewriting and the url paths become very user friendly and much easier to bookmark and share. ;)
| 2:43 pm on Jun 24, 2003 (gmt 0)|
The original question here is:
|How many variables in a url would cause trouble for Google? Also, what would Google consider to be a variable? |
The original poster did not ask how a dynamic URL performs in regards to Google ranking.
By all means, I agree that static URLs perform better than dynamic URLs. However, there's always an exception to the rule. I can pull a few competitive key phrases with a dynamic URL at page 1.
If the site is new then of course you have to use whatever means to convert your dynamic URLs to seemingly static URLs. Let's not forget that the main reason why Google is not really crazy about dynamic URLs, aside from session IDs and spider traps, is that their crawler could cause a heavy load on the target server. Relevancy and ranking still apply.
Since this thread has moved in the direction of the value of having a dynamic URL, let me ask this...
Suppose for whatever reason it's not practical to convert the dynamic URLs to static URLs; does that mean the site is practically useless?
To humor the argument that dynamic URLs do not rank well, what then is the impact of dynamic URLs on the overall site's internal link structure?
Otherwise, if we have to believe that dynamic URLs are totally useless, then we wouldn't be using them, right?
| 3:02 pm on Jun 24, 2003 (gmt 0)|
Net_Wizard, me too, I have dynamic urls ranking high in very competitive areas, but when you lose as many pages as I did over the last two updates you start to look at rewriting the urls.
one site lost everything after id=1 (prod.asp?id=1&code=9999)
when you lose 100's of pages you take a quick peep at your sites which use a rewrite mod and they are still standing.
g is littered with my pages that say :
| 4:29 pm on Jun 24, 2003 (gmt 0)|
Yes, I too had some dynamic URL strings that were doing well. But, after the URL rewriting, not only did those previous dirty URLs improve, now the rest of the content in the database came to the top.
We all know that most SEs have one problem or another with dynamic URLs. Google and AlltheWeb seem to have a handle on indexing dynamic URLs that contain no more than 2 variables. I'm not 100% sure how the other majors handle them and don't really care, as they don't justify the time and research to fully test the effects of dirty vs. clean URLs.
Think about your visitors. Dirty URLs are very difficult to remember. They give no indication to the visitor what the resource is other than a bunch of numbers, characters and variables that make no sense to them. There are other factors involved such as security, abstraction and maintainability.
What happens when you need to change the underlying technology on a site that has over 30,000 pages indexed? With dirty URLs, you'll have a very large task ahead of you. With clean URLs, the switch is transparent.
I strongly advise anyone thinking of building a dynamic site to start with clean URLs from the very start. You need to spend as much time planning those URL strings as you do the structure of the site. In fact, it is mandatory, as you don't ever want those strings to change. The W3C has a neat little tip, shown when validating your site, that refers to "Cool URIs don't change". In that tip, they discuss why your URL strings should not change for 5, 10, even 30 years.
I'm not too certain on where URL rewriting won't work. From the research and testing I've done, there has not been an instance where URL rewriting would have caused any issues other than the learning curve required to implement the feature.
You make the choice. Would you rather have...
Think about the long term effects of having dynamic URLs as opposed to clean static URLs. A new buzzword I've picked up lately is content negotiation. It's been around for a long time, but we've never really discussed it in detail. I think we'll see much more interest centered around the topic of dirty URLs vs. clean URLs.
[edited by: pageoneresults at 4:49 pm (utc) on June 24, 2003]
| 4:40 pm on Jun 24, 2003 (gmt 0)|
Well since I agree with Dave and pageone I will only add another small tidbit.
I think that dynamic urls developed from an innate laziness in programmers and the weakness of scripting languages as they were maturing.
I can slap together a site with query strings in no time, but it takes work, and has to be considered throughout the design process, to keep them out of the url and make a site dynamically static. I also know that most programmers don't know or care about search engines, present company excluded of course.
That's why all of these "great" programmers build these "great" products for use online that are completely unspiderable. Design Firms in their "great" wisdom build sites that effectively remove all of the traffic the client used to have.
Query strings are easy and passing params is commonplace, but it is the method in which it is done that should be assessed and considered.
Best Practice? no query strings, period.
sorry, it's a topic that makes me a little upset, no offence intended, I am a programmer so mostly I insulted myself ;)
| 4:45 pm on Jun 24, 2003 (gmt 0)|
|sorry, its a topic that makes me a little upset, no offense intended, I am a programmer so mostly I insulted myself ;) |
don't you start pinching my job ;)
| 5:16 pm on Jun 24, 2003 (gmt 0)|
Thanks for the feedback, all.
|Design Firms in their "great" wisdom build sites that effectively remove all of the traffic the client used to have. |
jatar_k, a discussion with a design company that builds cookie cutter sites is what prompted my post in the first place.
| 5:19 pm on Jun 24, 2003 (gmt 0)|
I have worked with a bunch and have seen some really deplorable things.
Sell a new design
destroy all rankings and traffic
sell them seo
repeat as necessary
I was the seo and they resold it for 3 times what I charged, bad, bad, bad. I only did what I was told because I had a boss, but it was ugly either way.
<OT>sorry Dave, I meant REAL programmers ;)</OT>
| 7:49 pm on Jun 24, 2003 (gmt 0)|
Generally what will help your dynamic URLs (not counting session IDs) get indexed better is having STATIC URL pages linking to them. This in general tells Google that while the URL looks dynamic, the content is likely fairly static. PR can also help in getting your dynamic URLs indexed.
Still though, like everyone said, you're best off using search engine friendly URLs. Both because they are SE friendly and people friendly. They also provide other ancillary benefits like hiding what server side technology you are using -- which would allow you to easily change technologies in the future without having to change URLs. So if you converted from PHP to JSP or something you could keep your same URL structure and thus maintain your link popularity.
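A hedged sketch of that technology-hiding idea with Apache mod_rewrite (all names below are made-up assumptions): publish extensionless URLs and map them internally, so the public URL never reveals or depends on the backend:

```apache
RewriteEngine On

# The public URL /articles/some-topic never changes;
# only this internal mapping knows about PHP.
RewriteRule ^articles/([a-z0-9-]+)$ /articles.php?slug=$1 [L]

# After a hypothetical move from PHP to JSP, only the
# target changes and every inbound link keeps working:
# RewriteRule ^articles/([a-z0-9-]+)$ /articles.jsp?slug=$1 [L]
```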
For solutions, just search on Google for "search engine friendly urls" and you'll find a bunch of articles. There are a variety of ways to accomplish it: mod_rewrite, using error pages, other Apache hacks.
| 11:24 pm on Jun 24, 2003 (gmt 0)|
>> ... I've picked up lately is content negotiation. <<
Can we link to articles at evolt from here? There is a recent one covering this topic....
| 7:45 am on Jun 25, 2003 (gmt 0)|
Inspired by this thread I rewrote all my dynamic links yesterday into some kind of static links.
Using php's urlencode() those links look like this:
Is that kind of encoding ok for Google, or would it be better to use underscores instead of +?
And what about special chars in the url, like %28 for a "("? Does this affect my ranking? Would it be better to filter out all ( and : and generate some really plain url?
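To make the question concrete, here is what those encodings look like, sketched in Python (shown for illustration; for the characters involved here, PHP's urlencode() produces the same "+" and "%28" output as Python's quote_plus):

```python
from urllib.parse import quote_plus

# Spaces become "+" and parentheses become %28 / %29:
print(quote_plus("red widgets"))       # red+widgets
print(quote_plus("widgets (large)"))   # widgets+%28large%29

# Swapping spaces for underscores before building the url
# avoids both the "+" and the percent-escapes entirely:
print("red widgets".replace(" ", "_")) # red_widgets
```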
Thanks for any idea
| 9:49 am on Jun 25, 2003 (gmt 0)|
i've always had my asp pages + query string indexed by google fine,
in a: www.mydomain.com/default.asp?pageid=1 kind of way
but recently i made a site using a java servlet and google just won't take the pages:
any ideas why google likes asp but not java?
| 9:55 am on Jun 25, 2003 (gmt 0)|
"id" is a real bad one google thinks it's a session ID and google hates java
| 10:01 am on Jun 25, 2003 (gmt 0)|
How do you mean google hates java?
it's not just my pages with id that aren't crawled, also some with query=
| 10:02 am on Jun 25, 2003 (gmt 0)|
the applet won't help
| 10:12 am on Jun 25, 2003 (gmt 0)|
what if i change the query= to q=
do you think google is more likely to crawl the pages then?
+ you didn't say why google hates java
btw, thanks for this
| 10:56 am on Jun 25, 2003 (gmt 0)|
Why have a query string?
I mean, people just keep asking how to use query strings best. It's like asking how to stab yourself with the least pain; the simple answer: don't.
The only place a query is admissible is in a form submit. And even there the W3C has quite strict recommendations on when to use GET and when to use POST. POST is for when your submission changes server state (like modifying stateful databases) and GET is for when it doesn't. Furthermore, a query string should only be for a page whose contents are not defined in advance. So a query string should return a list of items, like search results, which don't make any other sense together.
If on the other hand you have an advanced search that lets you, say, choose a category, and that category exists as a list of widgets, you should link to:
While if somebody types red widgets in your searchbox, the form should bring up:
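A hedged HTML sketch of that GET-vs-POST distinction (the action URLs and field names are made up): the search box uses GET, so the result page gets a query-string URL, while a state-changing action uses POST and leaves no query string behind:

```html
<!-- Search: GET, because it does not change server state;
     the results page is identified by its query string -->
<form action="/search" method="get">
  <input type="text" name="q">
  <input type="submit" value="Search">
</form>

<!-- State change (e.g. writing to a database): POST,
     so nothing ends up in the URL for spiders to choke on -->
<form action="/save-widget" method="post">
  <input type="text" name="widget_name">
  <input type="submit" value="Save">
</form>
```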
The sooner webmasters/designers/SEOs/programmers learn what the tools mean, and what they were designed for, the better for all of us.