Forum Moderators: open
So, on to my question. To get better indexing by Google, I recently restructured the whole site from a multi-parameter query string (default.asp?a=1&b=2&c=3) into a simpler, one-parameter query string (default.asp?url=blah). Googlebot visited the site this weekend; however, it only crawled the top-level pages (domain.com/mainsection/) and nothing deeper than that.
The bulk of our keywords and content are in press releases and other pages that are 2-3 clicks away from the homepage, and I am concerned that Googlebot will not dig deep enough to reach these pages, and that they therefore will not get indexed.
Could someone please shed some light on this matter, and let me know whether I have to rewrite the whole site to be static, or whether it just needs some time to get to the pages we want indexed?
TIA for the responses.
The number of pages indexed and the speed of the crawl depend on several factors, such as PageRank, the number of inbound links, URL structure, and so on.
Generally I'd say you should wait a few more days and see whether Googlebot comes back to your site and indexes more pages. If that doesn't happen, I'd try to get more links to your site, as that normally encourages Google to crawl more pages.
So, I guess my question is whether it is OK to keep the default.asp?url=blah structure, or whether it makes sense to use an IIS plugin like LinkFreeze to convert all the dynamic URLs into static-looking .html URLs.
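For what it's worth, the kind of mapping a tool like LinkFreeze performs can be sketched roughly like this. This is a minimal Python illustration of the idea, not LinkFreeze's actual implementation; the "url" parameter name comes from the post above, and the .html slug scheme is an assumption:

```python
from urllib.parse import urlparse, parse_qs

def to_static(dynamic_url):
    """Map a dynamic URL like /default.asp?url=blah to a
    static-looking path like /blah.html (what the crawler sees)."""
    query = parse_qs(urlparse(dynamic_url).query)
    slug = query["url"][0]
    return f"/{slug}.html"

def to_dynamic(static_path):
    """Reverse mapping the rewrite filter applies internally,
    so the original ASP page still handles the request."""
    slug = static_path.lstrip("/").removesuffix(".html")
    return f"/default.asp?url={slug}"
```

The point is that the rewrite is purely cosmetic: the server still serves the same dynamic page, but the crawler only ever sees extensionless or .html-style URLs with no query string.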
The previous version of the site used a similar navigation setup and was also built from default.asp?url=blah type links, yet most of the site (80+ pages) was indexed in Google. Does Google need a significant amount of time to index everything, or am I reacting too early?
There are numerous threads on here that cover this. You might like to look at this one:
[webmasterworld.com...]
but as I say, there are many many threads on this. Just look up 'dynamic URLs'
I have only been working on this "indexing" topic for a few months, doing my best to build a good architecture on the sites I work on so the bots can crawl as deep as possible. But I just heard from a friend (and read right here) about this issue with the number and length of parameters in the URL, and a lot of doubts come to mind.
The most obvious and top-priority question is: what is the limit on the number (and length) of parameters in the URL before the bot stops indexing it? I've heard the limit is two, "maybe" three parameters depending on their length, but that's somewhat vague. Can someone say something more specific about this?
Also, there is the solution of joining all (or several) parameters into one, like this:
default.aspx?param1=A&param2=B&param3=C&param4=D
would be transformed to
default.aspx?params=A,B,C,D
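To make the transformation concrete, here is a minimal Python sketch of packing several parameters into one comma-separated value and splitting them back out server-side. The parameter names and the comma-separated packing scheme are just the illustrative ones from the example above, not any particular product's convention:

```python
from urllib.parse import urlencode, parse_qs, urlparse

def pack_params(**params):
    """Join several query parameters into a single comma-separated
    'params' value, e.g. param1=A&param2=B -> params=A,B."""
    packed = ",".join(str(v) for v in params.values())
    # safe="," keeps the commas readable instead of encoding them as %2C
    return "default.aspx?" + urlencode({"params": packed}, safe=",")

def unpack_params(url, names):
    """Split the single packed parameter back into named values,
    pairing positions with the expected parameter names."""
    packed = parse_qs(urlparse(url).query)["params"][0]
    return dict(zip(names, packed.split(",")))
```

Note the trade-off: the packed form relies on position, so the server must know the expected order of values, and an empty value shifts everything after it.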
Does this really help the Googlebot crawl more efficiently? (Or, better said, does this help get the page indexed?)
And lastly (for now :)), if the above is actually a workaround to get the page indexed, is there a limit on the length and/or number of values packed into a single parameter? (e.g. no more than 10 comma-separated values in one parameter)
Well, I hope I haven't bored you guys with this long post (sorry about that ^^) and that my humble questions make sense.
Thanks! and great site btw!