| 5:18 pm on Jun 16, 2009 (gmt 0)|
Do your sitemap and file location comply with [sitemaps.org]?
[edited by: Propools at 5:19 pm (utc) on June 16, 2009]
| 5:46 pm on Jun 16, 2009 (gmt 0)|
|Do your sitemap and file location comply with: [sitemaps.org...] ? |
Absolutely. I know that the path [<mydomain>.com...] looks problematic, but I (think that I) made sure that everything complies, as follows:
- In ~/public_html's .htaccess there is a 301 redirect to [<mydomain>.com...]
- Thus everything on my website is currently located under [<mydomain>.com...], including the newly added pages that Google ignores for some reason.
- In addition, I made sure that robots.txt (located in ~/public_html, i.e. [<mydomain>.com...]) is world-readable and contains a line that tells where the site's XML sitemap file is located (see the message that started this thread).
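For the record, here is a minimal sketch of what the pieces described above look like. example.com and the paths are placeholders, not my actual configuration:

```apache
# ~/public_html/.htaccess -- 301 (permanent) redirect to the canonical
# location (placeholder paths; my real redirect target is redacted above)
Redirect 301 / http://example.com/subdir/

# ~/public_html/robots.txt -- world-readable, and it points crawlers
# at the XML sitemap with a single Sitemap: line
Sitemap: http://example.com/subdir/sitemap.xml
```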
Did I miss anything?
| 9:13 pm on Jun 16, 2009 (gmt 0)|
Also, it seems that the problem is not that Google has stopped crawling my site, but rather that it isn't "completing" the crawl. In fact, I just "caught" it crawling:
|Spider 220.127.116.11 at 16:00:54|
Time Since Clicked: 00:06:48 ago
Session ID:
User Agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
The problem seems to be that it somehow doesn't reach products_id=73 ...
Any idea what could have gone wrong?
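One way to confirm whether Googlebot ever requested that page is to filter the raw access log for it. The snippet below simulates a single combined-log line and counts matching hits; in practice you would grep the real log file (e.g. /var/log/apache2/access.log, which is an assumed path, as is the products_id URL parameter style):

```shell
# A sample combined-log line standing in for a real access-log entry
# (IP, timestamp, and sizes are made up for illustration)
log='66.249.66.1 - - [16/Jun/2009:16:00:54 +0000] "GET /index.php?products_id=73 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'

# Count Googlebot requests that hit the page in question
printf '%s\n' "$log" | grep 'Googlebot' | grep -c 'products_id=73'   # prints 1
```

A count of zero against the real log would mean Googlebot never even requested the page, which is a different problem from crawling it and then declining to index it.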
| 9:31 pm on Jun 16, 2009 (gmt 0)|
I have seen intermittent reports like this ever since XML sitemaps first came into use.
Be aware that the Allow: syntax is not universal; I wouldn't use it with User-agent: * here.
I would use Disallow: <blank> or else not specify anything at all there.
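Spelled out, a maximally compatible robots.txt for this case would look like the following (the sitemap URL is a placeholder for the real one):

```
User-agent: *
Disallow:

Sitemap: http://example.com/sitemap.xml
```

An empty Disallow: value means "nothing is disallowed," which every robots.txt parser understands, whereas Allow: is an extension that some crawlers ignore.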
| 12:00 am on Jun 17, 2009 (gmt 0)|
|Be aware that the Allow: syntax is not Universal; I wouldn't use it with User-agent: * here. I would use Disallow: <blank> or else not specify anything at all there. |
Thanks for the tip. I wasn't aware of this. I just changed my robots.txt to include only the following line:
I will track this and see if this helps.
| 5:00 pm on Jun 19, 2009 (gmt 0)|
Hmmm... even after implementing the suggested change in robots.txt, Google seems to insist on ignoring the last product description page that I added to my website on June 8, 2009 at 21:05.
Interestingly enough, a product that I added on June 4, 2009, 06:58 is found by Google search.
Is there a minimum two-week delay before Google adds website pages to its results?
I did set <changefreq> for product pages to "daily", but I have read somewhere that this parameter is treated as a suggestion only. I have no explanation for this weird behavior.
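For reference, the relevant sitemap entry looks roughly like this (the URL and date are placeholders, not my real ones). The sitemaps.org protocol does describe <changefreq> as a hint rather than a command, so crawlers are free to ignore it; <lastmod> is generally the more trusted field:

```xml
<url>
  <loc>http://example.com/index.php?products_id=73</loc>
  <lastmod>2009-06-08</lastmod>
  <changefreq>daily</changefreq>
</url>
```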
Additional insights would be appreciated.