Sites shouldn't use "&id=" as a parameter if they want maximal Googlebot crawl coverage, for example. So many sites use "&id=" for session IDs that Googlebot usually avoids URLs with that parameter.
If this is your problem, change it. Otherwise, ask how Googlebot is finding your static links, and why it's missing your dynamic ones. My first guess would be that you have links pointing specifically to each of your static pages, but not to your individual dynamic pages. To correct this, create a text link to one of your dynamic pages, exactly as you want it indexed,
such as: <a href="http://site.com/product.cgi?profit=high">hot deals</a>
Then watch to see whether Googlebot investigates it after a while. Simple dynamic URLs like this shouldn't pose a problem for Googlebot, but don't expect the bot to guess at the parameters.
I have noticed that in some cases dynamic pages can take longer to get indexed than static ones. Possibly if the website starts out purely dynamic, there's a better chance of Google indexing it over time.
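The "&id=" advice above can be sketched as a small helper that strips session-style parameters before links are written to the page. This is only an illustration; the blocklist of parameter names is an assumption, not an official Google list:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

# Session-style parameters that crawlers have historically been wary of.
# This particular set is an assumption for illustration purposes.
AVOID = {"id", "sid", "sessionid", "phpsessid"}

def crawl_friendly(url):
    """Strip session-style parameters so the URL looks identical on every visit."""
    parts = urlparse(url)
    params = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in AVOID]
    return urlunparse(parts._replace(query=urlencode(params)))

print(crawl_friendly("http://site.com/product.cgi?profit=high&sid=abc123"))
# -> http://site.com/product.cgi?profit=high
```

The point is that a URL which changes on every visit (because of a session ID) looks like infinite duplicate content to a bot, while a stable query string like `?profit=high` is perfectly crawlable.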
p.s. nice name
What do you think: during one update, how many times can Googlebot crawl the same page?
For example, crawling started yesterday, it checked some pages, and today it visited them again (2-3 times).
I put a small script in place that records which pages Googlebot visits and when, but unfortunately I have 18,000 articles, so I can't tell which ones it visits a second time.
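With 18,000 articles you don't need to eyeball the log; a short script can do the counting. Here is a hedged sketch that scans an Apache-style combined log, keeps only Googlebot requests, and reports the URLs fetched more than once. The regex assumes the combined log format; adjust it if your server logs differently:

```python
import re
from collections import Counter

# Minimal combined-log-format matcher: request line, status, size,
# referer, user-agent. Field layout is an assumption about your server.
LINE = re.compile(r'"(?:GET|POST) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

def repeat_googlebot_hits(log_lines):
    """Count Googlebot requests per URL; return only URLs fetched more than once."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group(2):
            hits[m.group(1)] += 1
    return {url: n for url, n in hits.items() if n > 1}

# Typical usage: with open("access.log") as f: print(repeat_googlebot_hits(f))
```

Matching on the user-agent string is only a rough filter (anyone can fake "Googlebot"), but for counting revisits during an update it's good enough.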
I'm a little paranoid about looking like spam, but it's the same content, just in a different format on our pages.
We did a test a few months ago, and the HTML pages got significantly more search engine traffic than the dynamic pages.
Anyone have any experience with this?
site:www.domain.com page or article title
Make sure you haven't been penalised by the last update, because that's hitting those deep pages hard.