Welcome to WebmasterWorld
Forum Moderators: goodroi
Since then, new pages that I've added to my web site simply don't show up in Google Search (even when I search with site:<mydomain>.com).
Old pages (prior to May 24th) show up fine, just as they used to.
What could I have done wrong?
Is it possible that an XML sitemap submission to Google actually hurts?
BTW, my robots.txt contains only the following:
Does your sitemap and file location comply with: [sitemaps.org...] ?
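For reference, a minimal file that follows the sitemaps.org protocol looks like this (the domain, date, and URL here are placeholders, not the poster's actual values):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/product.php?products_id=1</loc>
    <lastmod>2009-06-04</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

The file should sit at the root of the site (or above every URL it lists), since a sitemap may only reference URLs at or below its own location.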
Did I miss anything?
A tracker log entry for Googlebot's visit:

Spider IP: 18.104.22.168
Visit time: 16:00:54
Time since clicked: 00:06:48 ago
Session ID: (blank)
User agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
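As an aside, a user-agent string alone can be spoofed. Google's documented way to confirm a hit really came from Googlebot is a reverse-then-forward DNS check. A minimal sketch in Python (the helper name is mine, not from any library):

```python
import socket

def is_real_googlebot(ip):
    """Verify a claimed Googlebot hit the way Google recommends:
    the reverse-DNS (PTR) hostname must end in googlebot.com or
    google.com, and the forward lookup of that hostname must
    resolve back to the same IP address."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False  # no PTR record at all
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname must map back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False

# Example: a loopback address is never a genuine Googlebot.
print(is_real_googlebot("127.0.0.1"))
```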
The problem seems to be that it somehow doesn't reach products_id=73 ...
Any idea what could have gone wrong?
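One quick sanity check (a sketch, assuming a standard sitemaps.org file; the sample URLs below are made up) is to parse the sitemap and confirm the missing product URL is actually listed in it:

```python
import xml.etree.ElementTree as ET

# Sample sitemap content; in practice, read your real sitemap.xml instead.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/product.php?products_id=72</loc></url>
  <url><loc>http://www.example.com/product.php?products_id=73</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return all <loc> values from a sitemaps.org <urlset>."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
print("products_id=73 listed:", any("products_id=73" in u for u in urls))
```

If the URL is absent from the sitemap, the problem is in whatever generates the file rather than on Google's side.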
Be aware that the Allow: directive is not universally supported; I wouldn't use it with User-agent: * here. I would use a blank Disallow: instead, or not specify anything at all there.
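A maximally permissive robots.txt along those lines would be (just an illustration; the sitemap URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

A blank Disallow: means "disallow nothing", which every crawler understands, whereas Allow: is an extension that some older crawlers ignore or mishandle.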
Thanks for the tip. I wasn't aware of this. I just changed my robots.txt to include only the following line:
I will track this and see if this helps.
Interestingly enough, a product that I added on June 4, 2009 at 06:58 is found by Google search.
Is there a minimum 2-week delay for Google to add website pages to its results?
I did set <changefreq> for product pages to daily, but I've read that this parameter is treated as a suggestion only. I have no explanation for this weird behavior.
Additional insights would be appreciated.