Forum Moderators: phranque
Firstly, there is in principle no way that anyone outside your website (including an indexing spider) can tell whether a page was dynamically generated or is static.
I presume, therefore, that you are worried about the appearance of your URL, which might end up looking very messy - something like:
www.example.com/index.html?cat=something&foo=bar&x=y
This might (and I believe does) put indexing spiders off, so you might ask your developer to encode the variables that make up your query string into a more pleasant-looking URL.
If it is just a straightforward single page you're talking about, then you have nothing to worry about - unless it changes with the wind, in which case a search engine can't index you very easily anyway.
These threads will help:
theory on dynamic pages [webmasterworld.com]
problematic urls [webmasterworld.com]
PR and dynamic URLS [webmasterworld.com]
HTH
I've been working with very, VERY long URLs (see the example in my previous messages) and have managed to get deep crawled and fresh crawled, and to increase the pages' rankings. The key is to use URL aliases as much as possible. I use this technique:
1. Build a few server-side redirect pages whose only job is to redirect to other pages (the ones with long URLs), using something like JSP's response.sendRedirect() (or the ASP equivalent).
2. For every URL I want crawled, I link within the site to those redirect pages and pass one parameter, e.g. /dir/filename?id=4. Googlebot will follow this kind of link and be redirected to the long URL.
3. For very important pages, I created normal-looking 'root-level' entry points such as /myproduct; these pages redirect using the same response.sendRedirect(). Googlebot follows these too, no problem.
4. Build the META tags dynamically so the bots see them as normal tags, and optimize the content on the pages accordingly.
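A minimal sketch of steps 1-4 above, in plain Java so it can run outside a servlet container. The id-to-URL map, the long URL it returns, and the method names are illustrative assumptions, not taken from the original post; in a real JSP/servlet the resolved target would be passed to response.sendRedirect().

```java
import java.util.Map;

public class UrlAlias {
    // Steps 1-2: map the short ?id= parameter to the real long URL.
    // The entries here are hypothetical examples.
    static final Map<Integer, String> PRODUCT_URLS = Map.of(
        4, "/catalog/view?cat=widgets&sub=blue&sort=price&page=1"
    );

    // In a JSP this would end with response.sendRedirect(target);
    // here we just return the target so the logic is testable.
    static String resolveRedirect(int id) {
        return PRODUCT_URLS.getOrDefault(id, "/");
    }

    // Step 4: build META tags from the page's own data at request time,
    // so the emitted HTML looks like any static page to a crawler.
    static String buildMetaTags(String title, String description) {
        return "<title>" + title + "</title>\n"
             + "<meta name=\"description\" content=\"" + description + "\">";
    }

    public static void main(String[] args) {
        System.out.println(resolveRedirect(4));
        System.out.println(buildMetaTags("Blue Widgets", "Hand-made blue widgets"));
    }
}
```

The crawler-visible link stays short (/dir/filename?id=4 or /myproduct), while the messy parameterised URL is only ever produced server-side at redirect time.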
I was amazed at the results - I added hundreds of pages to G within one update. The next update, all of the pages' PR increased from 4 to 5.
4. Build the META tags dynamically so the bots see them as normal tags, and optimize the content on the pages accordingly.
How do you build meta tags dynamically?
Also, when you say "optimize the content on the pages accordingly", are you saying to build a regular, static page that could "toggle itself" between dynamic and static mode?
I'm not certain I follow you on that number 4 statement.
Thanks in advance for your response.