| 4:46 am on May 19, 2011 (gmt 0)|
Same here. Lots of fetch failures, and yet if you hover over those that failed, it pulls them right up. Looks like the tool just doesn't work properly, or doesn't wait long enough before deciding it's a failure.
| 4:53 am on May 19, 2011 (gmt 0)|
Same here as well. Searching my site on Google shows a preview that looks absolutely fine. But the one on GWT shows a lot of errors and fetching issues besides displaying a weird looking preview.
ETA: I tested this feature on my GWT account for "example.com/widget", but the preview being shown is for example.com instead.
| 6:20 am on May 19, 2011 (gmt 0)|
Most of the errors reported have "roboted" under Details. What does that mean?
| 6:26 am on May 19, 2011 (gmt 0)|
I presume that "Roboted" means it's restricted by robots.txt, but ... none of our images are protected this way.
Just another Google goof.
| 7:59 am on May 19, 2011 (gmt 0)|
Very clever! Google at last has found a way of finding cloaked pages!
| 8:18 am on May 19, 2011 (gmt 0)|
I don't know how, with everyone reportedly getting errors, Google still found it fit to push this out for use by webmasters!
Looks to be a QUALITY tool to me, just like their Panda. They definitely test their code thoroughly before pushing it into production. Don't they?
| 9:55 am on May 19, 2011 (gmt 0)|
Um, isn't the "Labs" area specifically for Not Ready For Prime Time features? Good way to cover themselves.
Now, is anyone the teeniest bit uneasy about the fact that when something is "roboted", the preview shows nothing, as you'd expect ... but the "site" side shows exactly what a human would see, meaning that the googlebot blithely stepped in and got it?
Matter of fact, it's not really "site" on the left and "preview" on the right: it's the Google Web Preview robot on the left and the ordinary Googlebot on the right. I tried a page from a directory that's not only roboted, but has GWP locked out at the htaccess level. The left side showed the 403 page, which only the GWP robot would get; the right side showed a blank. With a page that's roboted but not htaccess-blocked, the left ("site") side shows the actual page, while the right ("preview") side is blank.
And honestly, GWP, it wouldn't kill you to include non-ASCII characters in your displayed text. Kinda blows the point of some of my pages :P
| 10:11 am on May 19, 2011 (gmt 0)|
The preview link is on; it's just giving the following error message while generating instant previews:
"Preview Unavailable. Our render server failed to render your page. Please try again in a couple of minutes."
| 10:20 am on May 19, 2011 (gmt 0)|
It's not working for me. It keeps refusing to fetch my CSS files, saying they are 'roboted' (robots.txt, I guess) - even though they aren't. The image previews show up fine (with the CSS files and styles) in the SERPs, so it's probably just a bug.
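For anyone who wants to double-check whether a path really is "roboted", Python's standard `urllib.robotparser` applies the same robots.txt rules the crawlers claim to follow. A minimal sketch, with made-up rules and paths rather than any real site's robots.txt:

```python
from urllib.robotparser import RobotFileParser

# parse() accepts the robots.txt body as a list of lines;
# the rules here are a hypothetical example.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A CSS file outside any disallowed directory should be fetchable.
print(rp.can_fetch("Googlebot", "/styles/main.css"))    # True
print(rp.can_fetch("Googlebot", "/private/page.html"))  # False
```

If this says your CSS files are fetchable but GWT still reports them as "roboted", that points at a bug (or a stale cached robots.txt) on Google's side rather than at your rules.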
| 10:52 am on May 19, 2011 (gmt 0)|
|Um, isn't the "Labs" area specifically for Not Ready For Prime Time features? Good way to cover themselves. |
Whether you put it in the Labs area or call it a beta, anything pushed out for users' consumption should have undergone some amount of testing and not suck so badly.
| 10:58 am on May 19, 2011 (gmt 0)|
Why are ads clearly visible in the GWP of pages on some sites while they aren't on many others? On those other sites I normally see the ads section as a blank area.
ps: I am talking about google ads.
| 5:24 pm on May 19, 2011 (gmt 0)|
Follow-up which I should have thought of in the first place:
This is the ordinary Web Preview IP-- the one I've htaccess'd out of one directory.
UA: Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.597 Safari/534.14
This is identical to the Web Preview UA but note the omission of "Google Web Preview" after "like Gecko".
This is a typical address for Googlebot (Preview uses others in the same range).
UA: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
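If you want to tell these fetchers apart in your own logs, a substring check on the UA covers the two labeled cases. The catch is the stripped-down Chrome-like string quoted above: by UA alone it is indistinguishable from a real browser, so it can only be pinned down by IP range. A rough sketch (the function name is my own):

```python
def classify_google_fetcher(ua: str) -> str:
    # The Web Preview bot normally appends "Google Web Preview" after
    # "like Gecko"; the crawler identifies itself as Googlebot. The
    # bare Chrome UA observed above matches neither.
    if "Google Web Preview" in ua:
        return "web-preview"
    if "Googlebot" in ua:
        return "googlebot"
    return "unknown"

print(classify_google_fetcher(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
# googlebot
```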
The Googlebot we all know and love. It did everything by itself with no help from the imagebot.
For each page, the left and right files were collected separately, first by the googlebot (right or "preview" side) and then by the generic UA (left or "site" side). The googlebot must have pulled the robots info out of its database, because it didn't check right then and there. (The last robots.txt check was about 6 hours earlier. I think they check every 24 hours or so, but it takes several days to process the information.)
Each page was visited from scratch. The supporting files show up with the html page as referrer, the way they do when a human visits.
Coincidentally I'd only just made some visible changes to one rarely-touched page* so it was easy to confirm that they were using fresh data even before I looked at logs. Three days ago a page in the same directory got a 304, so they had no reason to expect a change.
The "roboted" pages weren't visited by the googlebot at all. The "site" UA went everywhere. Or tried to, in the case of the htaccess'd page.
* I happened to look at pages in this directory-- mainly dating from 2004 with minor edits in 2007-- and said "Eeuw! That's ugly!" But nobody ever goes there, so it isn't worth spending a lot of time.
| 9:00 pm on May 21, 2011 (gmt 0)|
I have the same problem on many of my sites. I get a bunch of errors, and the image on the right side is often missing a lot of background/CSS images.
I just found out something very interesting on my most important site though. I use Smarty, and...
...when I set $force_compile = true; I get 12 errors.
...when I set $force_compile = false; I get 80 errors.
No difference in source code, so does it have to be the headers? What else could it be? The only difference I've found is...
...when I set $force_compile = true; I have "Content-Type: text/html; charset=UTF-8"
...when I set $force_compile = false; I have "Content-Type: text/html;"
Hmm, could that really be the difference? I'll check it out and will post the result here. Maybe also you should check your headers?
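A quick way to see what charset a client would actually pick up from those two header values. This is a sketch, not any particular client's logic; the ISO-8859-1 fallback is the old HTTP/1.1 default for text/* types, which a strict-protocol robot might well apply when the charset parameter is missing:

```python
def charset_from_content_type(header: str, default: str = "ISO-8859-1") -> str:
    # Pull the charset parameter out of a Content-Type header value.
    for part in header.split(";")[1:]:
        name, _, value = part.strip().partition("=")
        if name.lower() == "charset" and value:
            return value.strip('"')
    return default

print(charset_from_content_type("text/html; charset=UTF-8"))  # UTF-8
print(charset_from_content_type("text/html;"))                # ISO-8859-1
```

So the two Smarty configurations really do tell the fetcher to decode the same bytes two different ways.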
| 10:59 pm on May 21, 2011 (gmt 0)|
Do you have any characters outside of the printable ASCII set (decimal 32-126)? E.g. umlauts, acutes, graves etc.
If so you should probably declare UTF-8, otherwise the characters will not resolve correctly IF the "browser" is following the charset header, which may happen with a strict-protocol robot.
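The failure mode looks like this: UTF-8 bytes decoded under the wrong (or defaulted) charset turn every non-ASCII character into junk. A tiny illustration:

```python
# Each non-ASCII character in UTF-8 is two or more bytes; read those
# bytes as Latin-1 and each byte becomes its own wrong character.
text = "café"
mangled = text.encode("utf-8").decode("latin-1")
print(mangled)  # cafÃ©

# ASCII-only text survives either way, which is why the breakage only
# shows on pages with umlauts, acutes, dashes and the like.
print("cafe".encode("utf-8").decode("latin-1"))  # cafe
```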
| 11:57 pm on May 21, 2011 (gmt 0)|
Yes I do, and now you probably solved a huge ranking problem for me. I "Fetched as Googlebot" in Webmaster Tools and you were right, Googlebot is following the charset header. Thanks a lot!
| 2:57 am on May 22, 2011 (gmt 0)|
|Do you have any characters outside of the printable ASCII set (decimal 32-126)? E.g. umlauts, acutes, graves etc. |
Were you asking me or brandmaker? All my pages are UTF-8 encoded with the real characters. No entities except &nbsp; and —. Web Preview simply doesn't display them, either in GWT or the "real" version.
:: time out to investigate ::
OK, got it. The overall page image-- the part that shows your text in teeny sans-serif type-- won't display non-ASCII characters in HTML. (They will if it's a pdf.) But the snippet-- the close-up text within the preview-- shows everything. So does the little orange box showing where the snippet was taken from.
| 3:53 am on May 22, 2011 (gmt 0)|
It's buggy for me. I get fetch errors for most of my pages, but then I click 'compare' again and get none, or fewer.
| 6:20 am on May 22, 2011 (gmt 0)|
Plain & simple, the preview tool doesn't work.
| 4:41 pm on May 22, 2011 (gmt 0)|
Piss poor integration, so far.
| 8:54 pm on May 22, 2011 (gmt 0)|
My analytics also comes up as roboted.
The Google Webmaster Central forum has people as puzzled as we are!
| 2:33 am on May 25, 2011 (gmt 0)|
I have about 25 sites in GWT. None have updated site speed since 5/15. Anyone know why? Prior to the 15th the data was being updated regularly. As it turns out, I deployed a new server and a performance-tuned relaunch on the 16th. Not knowing if I got a performance gain is killing me. The Firebug performance score that G pushes is not really validation.