Forum Moderators: Robert Charlton & goodroi


Google Sitemaps driving me crazy with "403 Forbidden" report

JamaicanFood

4:01 pm on Jul 22, 2006 (gmt 0)

10+ Year Member



Ok guys, I took your advice and waited, and the Gbot came back. When I checked yesterday it said the Gbot last accessed my site on the 13th; today it says it last accessed my site on Jul 20, 2006.

Yet it still says 403 Forbidden. I understood it saying that on the 13th, as I was uploading then and missed the whole foray, PR and everything. But why would it access my site again and still say it was forbidden?

What the DICKENS is going on? As we say here in Jamaica, WHAT A GWAAN!

This is driving me crazy. I've got tons of content and I'm bursting at the seams for G to come spider it. Yahoo Slurp grabbed 84 of my pages, and I am waiting until mid-August to pay for the Yahoo Directory; MSN got me too, though just 30 pages, and I am going to pay the 49 bucks for their submit directory as well. But Jesus, can someone explain what the dickens is happening with

Mr. G? Too much money, I guess.

g1smd

5:29 pm on Jul 22, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Use WebBug set to HTTP/1.1 and check what status codes come back in the HTTP headers when you request the site with it.
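If WebBug isn't handy, the same check can be scripted. Here is a rough Python sketch of what WebBug does; www.example.com is just a placeholder for the real hostname:

import http.client

# Send an HTTP/1.1 GET and print the status line and response
# headers, much like WebBug does.
conn = http.client.HTTPConnection("www.example.com", 80)
conn.request("GET", "/", headers={
    "Accept": "*/*",
    "Connection": "close",
    "User-Agent": "WebBug/5.0",
})
resp = conn.getresponse()
print(resp.status, resp.reason)        # e.g. 200 OK, or 403 Forbidden
for name, value in resp.getheaders():
    print(name + ": " + value)
conn.close()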

JamaicanFood

9:25 pm on Jul 22, 2006 (gmt 0)

10+ Year Member



Ok, I did it. What exactly am I looking for?
This is what came back; I replaced my site data with widgets.com.

SENT DATA
GET / HTTP/1.1
Host: www.widgets.com
Connection: close
Accept: */*
User-Agent: WebBug/5.0

RECEIVED DATA
HTTP/1.1 200 OK
Connection: close
Date: Sat, 22 Jul 2006 21:23:25 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
MicrosoftOfficeWebServer: 5.0_Pub
Content-Length: 14743
Content-Type: text/html
Set-Cookie: ASPSESSIONIDAQBADRCC=MGEDCADBLCNABMKOHAECDFGF; path=/
Cache-control: private
<HTML>

It then gave me all the HTML on the page, so is there a problem here?
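One caveat about that test: the request identified itself as WebBug/5.0 and got a 200, but a server or hosting firewall that filters by user-agent could still be handing Googlebot a 403. A quick variation on the same sketch compares the two answers; the Googlebot string shown is the commonly published one, and note that a block keyed to Google's IP addresses would not show up in this test, since the request isn't coming from Google's address range:

import http.client

# Repeat the same request under two User-Agent strings to see
# whether the server answers differently for Googlebot.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

for ua in ("WebBug/5.0", GOOGLEBOT_UA):
    conn = http.client.HTTPConnection("www.widgets.com", 80)
    conn.request("GET", "/", headers={
        "Accept": "*/*",
        "Connection": "close",
        "User-Agent": ua,
    })
    resp = conn.getresponse()
    print(ua, "->", resp.status, resp.reason)
    conn.close()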

mattg3

4:44 am on Jul 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sitemaps constantly flags random pages as supposedly inaccessible. I just ignore it now.

JamaicanFood

5:23 pm on Jul 23, 2006 (gmt 0)

10+ Year Member



This is crazy. I am really wondering if Google Sitemaps does your site more harm than good. The googlebot came on Jul 21, or so it said, and then reported the home page as inaccessible again: 403.

Listen, guys, I am going to delete this account, because it is a waste of time if the info is inaccurate and G won't spider my site because of it. A waste of time, truly. Can I delete the account and re-open a new one and hope for better results?

Or just say to hell with G Sitemaps, period.

Your thoughts anyone...

abates

1:15 am on Jul 24, 2006 (gmt 0)

10+ Year Member



Are your site logs showing that googlebot is being served a status of 403 when it hits your index page?

If so, whether you have Sitemaps or not won't make a bit of difference.
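The "Server: Microsoft-IIS/6.0" header earlier in the thread suggests the raw logs are in IIS's W3C extended format. As a rough sketch (the log filename is hypothetical, and the field names assume the default IIS logging configuration), something like this pulls out every Googlebot hit and the status it was served:

# Scan an IIS W3C extended log for Googlebot requests and print
# the status code each one received. Note that IIS replaces
# spaces in the logged User-Agent with '+' characters.
fields = []
with open("ex060722.log") as log:          # hypothetical filename
    for line in log:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]      # column names for the data rows
            continue
        if line.startswith("#") or not fields:
            continue
        row = dict(zip(fields, line.split()))
        if "Googlebot" in row.get("cs(User-Agent)", ""):
            print(row.get("sc-status"), row.get("cs-uri-stem"))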

JamaicanFood

6:48 am on Jul 24, 2006 (gmt 0)

10+ Year Member



I'm not sure. I actually don't have a robots.txt file, so that should not be an issue.
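One thing worth verifying on that point: even with no robots.txt file on the server, every crawler still requests /robots.txt and gets some status code back. A 404 there is harmless, but if the server answers that URL with a 403 as well, it points at the same block that may be hitting Googlebot. A quick check along the lines of the earlier sketches, with www.widgets.com again standing in for the real hostname:

import http.client

# Check what status the server returns for /robots.txt.
# A 404 is fine (treated as "no restrictions"); a 403 here
# would be worth investigating.
conn = http.client.HTTPConnection("www.widgets.com", 80)
conn.request("GET", "/robots.txt", headers={"User-Agent": "WebBug/5.0"})
resp = conn.getresponse()
print(resp.status, resp.reason)
conn.close()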

I have checked with this lame webstats software I got from the hosting company, and it did tell me that the Gbot requested the page x times. However, here's the funny thing: I have been indexed since then. How do I know? Because the cache is newer; I changed some info in the title and it's reflected, but just for my home page.

Also, in the G Sitemaps Crawl stats section it says Last Calculated July 10, 2006. What the dickens does that mean? Does it mean that the little Gbot came only to spider one page and that the big Gbot came on July 10, 2006?

Could this be telling the GBOT not to index my site in error?
If yes, what is the prudent thing to do?