Forum Moderators: open
Here is my list:
--I have a sitemap link on the first page;
--There are 40 links on the first page and all of them are accessible
--All the links on the first page work fine, with no errors or connection delays
--All of the links are dynamic URLs, such as products.aspx?id=123 (only one parameter)
--I used a text browser and a spider simulator to test my web pages
--There is no ViewState text in the page source
--The page size is about 40k (including some images)
--All of the content is generated dynamically by an ASP.NET program
--All content is visitor-oriented
--The domain name is keyword1-keyword2.net
What have I missed? How can I make the pages more attractive to Googlebot? Should I submit some sub-level links? Any suggestion is appreciated!
the links are dynamic url such as products.aspx?id=123
That might be a problem: using "id" as the parameter name could cause Google to treat it as a session ID. Perhaps give it another name.
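To make the suggestion concrete, here is the kind of rename being proposed. The name "pid" below is just a hypothetical example; any name that doesn't look like a session identifier should do:

```
Before: products.aspx?id=123     (a crawler may mistake "id" for a session ID)
After:  products.aspx?pid=123    ("pid" is a made-up product-ID parameter name)
```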
Also, just because you submitted your site to Google doesn't mean it's going to crawl every page or list you. I'd recommend working on getting a lot of incoming links, and waiting a few more weeks before getting concerned about Google.
Try to have a good 10-15 links on every other page too... I've found some neat new ways to get Google to gobble up pages even faster than before...
I just tested a few over the last few days and they worked great... drop-down menus whose source is plain UL/LI markup, 90% CSS and only 10% JavaScript, with no JS in the body... working very nicely indeed.
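For anyone curious, a menu along those lines might look something like this minimal sketch. The class names, URLs, and structure here are my own illustration, not a specific library:

```html
<!-- Navigation rendered as plain UL/LI so spiders see real links -->
<ul class="nav">
  <li><a href="products.aspx?pid=1">Widgets</a>
    <ul class="sub">
      <li><a href="products.aspx?pid=2">Small widgets</a></li>
      <li><a href="products.aspx?pid=3">Large widgets</a></li>
    </ul>
  </li>
</ul>
<style>
  /* Hide submenus by default, show on hover: the "90% CSS" part */
  .nav .sub { display: none; }
  .nav li:hover .sub { display: block; }
</style>
```

The small remaining JavaScript would typically just patch hover behavior for browsers that don't support :hover on list items, and since it stays out of the body, the crawlable markup is pure HTML.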