Forum Moderators: goodroi
Since then, new pages that I added to my web site simply don't show up in Google Search (even when I force site:<mydomain>.com).
Old pages (prior to May 24th) show up fine, just as they used to.
What could I have done wrong?
Is it possible that an XML sitemap submission to Google actually hurts?
Thanks!
BTW, my robots.txt contains only the following:
User-Agent: *
Allow: /
Sitemap: [<mydomain>.com...]
Do your sitemap and its file location comply with [sitemaps.org...] ?
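For reference, a minimal file that complies with the sitemaps.org protocol looks like the following; the domain and date here are placeholders, not the poster's actual values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Each URL must be fully qualified and entity-escaped
         (e.g. "&" in a query string becomes "&amp;") -->
    <loc>http://example.com/shop/index.php?products_id=73</loc>
    <lastmod>2009-06-04</lastmod>
  </url>
</urlset>
```

Also note that a sitemap can only list URLs at or below the directory it resides in, so keeping it at the domain root is the safe choice.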
Did I miss anything?
Thanks!
Here is a Googlebot visit from my raw visitor log:

IP: 66.249.71.38
Host: crawl-66-249-71-38.googlebot.com
Time: 16:00:54 (00:06:48 ago)
User agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Requested: /shop/index.php?main_page=product_free_shipping_info&products_id=34&language=en
The problem seems to be that it somehow doesn't reach products_id=73 ...
Any idea what could have gone wrong?
Be aware that the Allow: syntax is not universal; I wouldn't use it with User-agent: * here. I would use Disallow: with a blank value, or else not specify anything at all there.
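To sanity-check the suggestion above, here is a small sketch using Python's standard-library robots.txt parser; the domain and path are placeholders, and the robots.txt body is the blank-Disallow form being recommended:

```python
# Sketch: confirm that a robots.txt using only the universally
# supported "Disallow:" directive (with a blank value) still
# permits Googlebot to fetch an arbitrary URL.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A blank Disallow value disallows nothing, so everything is crawlable.
allowed = rp.can_fetch(
    "Googlebot",
    "http://example.com/shop/index.php?products_id=73",
)
print(allowed)  # True
```

This only verifies how one parser reads the file, of course; it says nothing about whether Google chooses to index the page.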
Thanks for the tip. I wasn't aware of this. I just changed my robots.txt to include only the following line:
Sitemap: [<mydomain>.com...]
I will track this and see if it helps.
Interestingly enough, a product that I added on June 4, 2009, 06:58 is found by Google search.
Is there a minimum 2-week delay for Google to add website pages to its results?
I did specify <changefreq> for product pages as daily, but I have read that this parameter is treated as a suggestion only. I have no explanation for this weird behavior.
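For what it's worth, the sitemaps.org protocol itself says <changefreq> is a hint, not a command, and crawlers may ignore it. A <url> entry carrying it looks like this (URL and values are illustrative placeholders):

```xml
<url>
  <loc>http://example.com/shop/index.php?products_id=73</loc>
  <!-- A hint only: valid values are always, hourly, daily, weekly,
       monthly, yearly, never. Crawlers may crawl more or less often. -->
  <changefreq>daily</changefreq>
  <priority>0.8</priority>
</url>
```

So a daily <changefreq> would not, on its own, guarantee faster discovery of new pages.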
Additional insights would be appreciated.