
Forum Moderators: incrediBILL & martinibuster

webmastertools no access

     
10:12 am on Jun 13, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 10, 2005
posts: 497
votes: 0


For several days now, WMT has been reporting "access denied" on more and more domains and pages: plenty of 522s and even more 403s.

The web host says they aren't blocking Google IPs and that all pages perform well.

The same .htaccess and robots.txt have been running unchanged for ages on all sites, and only since June 7 has the crawling stopped.

Inside WMT, robots.txt is reported as missing, but clicking "see live version" brings it up right away.

The web host recommends contacting Google, but who would expect a reply?
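One way to test whether crawler traffic specifically is being blocked is to fetch robots.txt with a Googlebot-style User-Agent and with a plain browser User-Agent and compare the responses. A rough sketch (example.com is a placeholder for the affected domain, and the `diagnose` helper is an illustrative interpretation, not an official check):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url, user_agent):
    """Return the HTTP status code seen when url is requested with user_agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as e:
        return e.code      # 403 and friends arrive as an exception
    except URLError:
        return None        # DNS failure or connection timeout

def diagnose(bot_status, browser_status):
    """Pure helper: compare the two results without any network access."""
    if bot_status == browser_status:
        return "no UA-based difference"
    if browser_status == 200 and bot_status in (403, 522):
        return "crawler requests are being blocked or timing out"
    return "inconsistent responses; check server/CDN logs"

# Usage (network-dependent; substitute the real domain):
#   url = "https://example.com/robots.txt"
#   bot = fetch_status(url, "Mozilla/5.0 (compatible; Googlebot/2.1; "
#                           "+http://www.google.com/bot.html)")
#   browser = fetch_status(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
#   print(diagnose(bot, browser))
```

If the browser UA gets a 200 while the Googlebot UA gets a 403, the block sits in front of the origin (firewall, CDN rule, or .htaccess UA matching) rather than in the page itself.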
10:34 am on June 13, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Apr 20, 2017
posts: 214
votes: 34


Code 403 = Forbidden.

Code 522 is a Cloudflare-specific code for "Connection Timed Out".

So I assume you use Cloudflare, and the problem may be in the connection between Cloudflare and your web host.

Also, in Google Search Console you can try "Fetch as Google" to see what happens.
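For reference, 522 is one of several non-standard 52x codes Cloudflare returns when it cannot get a good answer from the origin server. A small lookup sketch (the descriptions paraphrase Cloudflare's support docs, which are the authoritative source):

```python
# Cloudflare-specific 52x codes: returned by the edge, not by your server.
CLOUDFLARE_52X = {
    520: "Unknown error from the origin",
    521: "Origin web server is down",
    522: "Connection to the origin timed out",
    523: "Origin is unreachable",
    524: "A timeout occurred waiting for the origin",
    525: "SSL handshake with the origin failed",
    526: "Invalid SSL certificate on the origin",
}

def explain(status):
    """Map an HTTP status code to a short note on who produced it."""
    if status == 403:
        return "403 Forbidden: the server (or CDN rule) refused the request"
    if status in CLOUDFLARE_52X:
        return f"{status} (Cloudflare): {CLOUDFLARE_52X[status]}"
    return f"{status}: standard HTTP status"
```

The practical takeaway: a 522 means Cloudflare itself could not reach the web host in time, so the origin logs may show nothing at all for those requests.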
11:10 am on June 13, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 10, 2005
posts: 497
votes: 0


Yes, Cloudflare is used.

Besides that, speed tests in Google Analytics work fine, while Search Console's "Fetch as Google" reports an error.
2:09 pm on June 13, 2017 (gmt 0)

Junior Member

joined:Aug 14, 2014
posts:104
votes: 22


That's interesting. On Friday I happened to move three of my sites to Cloudflare to take advantage of their free SSL. Two of the three properties are showing lots of errors (warnings) in GSC due to the robots.txt file ("URL blocked by robots.txt"), yet Fetch as Google works every time on both. The two are on shared hosting at HostGator; the third is on a dedicated cloud server elsewhere and has had no issues whatsoever in GSC.

I've deleted the robots.txt on the two in question to see if the issues clear up. Right now I feel like it's a Cloudflare issue as I've never had sitemap issues with any properties before the switch.
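Before deleting robots.txt outright, it can be worth confirming what the file actually allows for Googlebot. The stdlib parser can be fed the file's contents directly, offline. The robots.txt body below is a made-up example, not the poster's file:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt body for illustration.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check specific URLs against the rules as Googlebot would read them.
allowed = rp.can_fetch("Googlebot", "https://example.com/page.html")
blocked = rp.can_fetch("Googlebot", "https://example.com/private/x")
```

If the parser says a URL is allowed but GSC still reports "URL blocked by robots.txt", the file Google is seeing differs from the one on the origin, which points at a caching or proxy layer such as Cloudflare.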
7:57 am on June 19, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 10, 2005
posts: 497
votes: 0


Meanwhile it has become clear that Cloudflare has a bug that prevents Google from crawling websites.
Their service was dropped on two domains, and right away Google was able to access all pages.

Too bad that Cloudflare does not respond to tickets unless you are a VIP client :-(
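To verify that a domain really stopped routing through Cloudflare after the service was dropped, one can resolve it and test the address against Cloudflare's published IP ranges. A sketch, assuming the two sample ranges below are still on the list at cloudflare.com/ips (check the page for the full, current list; the domain in the usage note is a placeholder):

```python
import ipaddress
import socket   # used in the network-dependent usage sketch below

# Sample of Cloudflare's published IPv4 ranges (assumed current; verify
# against cloudflare.com/ips before relying on this).
CLOUDFLARE_SAMPLE_RANGES = [
    ipaddress.ip_network("104.16.0.0/13"),
    ipaddress.ip_network("172.64.0.0/13"),
]

def looks_like_cloudflare(ip):
    """True if ip falls inside one of the sampled Cloudflare ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in CLOUDFLARE_SAMPLE_RANGES)

# Usage (network-dependent; substitute the real domain):
#   ip = socket.gethostbyname("example.com")
#   print(ip, looks_like_cloudflare(ip))
```

If the domain still resolves to a Cloudflare address after DNS changes, the old records may simply not have propagated yet, which would also explain Google regaining access only "right away" on some resolvers and later on others.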