
Google SEO News and Discussion Forum

    
GWT Showing Instant Previews
jekko
msg:4314489
4:34 am on May 19, 2011 (gmt 0)

I just noticed that the Labs section of GWT is showing Instant Previews. When I compare my site to the instant preview, Google reports 35 errors, yet the instant preview itself renders fine.

These are the errors shown:

.css, .js, .gif and .jpg files are 'Roboted' (whatever that means)

other .jpg files have a Fetch Failure

But as I said, the Instant Preview is working fine in GWT and in the SERPs. Very odd.

 

dazzlindonna
msg:4314490
4:46 am on May 19, 2011 (gmt 0)

Same here. Lots of Fetch Failures, and yet, if you hover over those that failed, it pulls them right up. Looks like the tool just doesn't work properly, or it doesn't wait long enough before deciding on failure.

anand84
msg:4314491
4:53 am on May 19, 2011 (gmt 0)

Same here as well. Searching my site on Google shows a preview that looks absolutely fine, but the one in GWT shows a lot of errors and fetching issues, besides displaying a weird-looking preview.

ETA: I tested this feature on my GWT account for "example.com/widget", but the preview being shown is for example.com instead.

indyank
msg:4314511
6:20 am on May 19, 2011 (gmt 0)

Most of the errors reported have "Roboted" against Detail. What does that mean?

jekko
msg:4314512
6:26 am on May 19, 2011 (gmt 0)

I presume that "Roboted" means the file is restricted by robots.txt, but ... none of our images are blocked that way.

Just another Google goof.
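
(For anyone who wants to double-check, you can feed your robots.txt to a parser and test the exact URLs GWT complains about. A minimal sketch using Python's standard library; the domain and image path are placeholders, not a real site:)

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")  # placeholder domain
    rp.read()

    # Instant Previews are fetched by "Google Web Preview"; normal
    # crawling is "Googlebot". Test one of the flagged files with each.
    for agent in ("Googlebot", "Google Web Preview"):
        print(agent, rp.can_fetch(agent, "http://www.example.com/images/photo.jpg"))

If both lines print True, nothing is actually roboted and the tool is misreporting.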

Andylew
msg:4314541
7:59 am on May 19, 2011 (gmt 0)

Very clever! Google has at last found a way of finding cloaked pages!

indyank
msg:4314543
8:18 am on May 19, 2011 (gmt 0)

I don't know how, with everyone reportedly getting errors, Google still saw fit to push this out for webmasters to use!

Looks to be a QUALITY tool to me, just like their Panda. They definitely test their code hard before pushing it into production. Don't they?

lucy24
msg:4314572
9:55 am on May 19, 2011 (gmt 0)

Um, isn't the "Labs" area specifically for Not Ready For Prime Time features? Good way to cover themselves.

Now, is anyone the teeniest bit uneasy about the fact that when something is "roboted", the preview shows nothing, as you'd expect ... but the "site" side shows exactly what a human would see, meaning that the googlebot blithely stepped in and got it?

Matter of fact, it's not really "site" on the left and "preview" on the right: it's the Google Web Preview robot on the left and the ordinary Googlebot on the right. I tried a page from a directory that's not only roboted, but has GWP locked out at the htaccess level. The left side showed the 403 page, which only the GWP robot would get; the right side showed a blank. With a page that's roboted but not htaccess-blocked, the left ("site") side shows the actual page, while the right ("preview") side is blank.
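
A rough way to repeat this comparison yourself: request the same page with each bot's user-agent string and compare the status codes. This is only a sketch in Python; the UA strings are the ones quoted later in this thread, and the URL is a placeholder.

    import urllib.error
    import urllib.request

    AGENTS = {
        "web preview": "Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.14 "
                       "(KHTML, like Gecko; Google Web Preview) Chrome/9.0.597 Safari/534.14",
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    }

    for name, ua in AGENTS.items():
        req = urllib.request.Request("http://www.example.com/roboted/page.html",  # placeholder URL
                                     headers={"User-Agent": ua})
        try:
            print(name, urllib.request.urlopen(req).getcode())
        except urllib.error.HTTPError as err:
            print(name, err.code)  # a 403 here means the htaccess block fired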

And honestly, GWP, it wouldn't kill you to include non-ASCII characters in your displayed text. Kinda blows the point of some of my pages :P

AnkitMaheshwari
msg:4314577
10:11 am on May 19, 2011 (gmt 0)

The preview link is on; it's just giving the following error message while generating instant previews:

"Preview Unavailable. Our render server failed to render your page. Please try again in a couple of minutes."

tristanperry
msg:4314579
10:20 am on May 19, 2011 (gmt 0)

It's not working for me. It keeps refusing to fetch my CSS files, saying they are 'roboted' (robots.txt, I guess) - even though they aren't. The image previews show up fine (with the CSS files and styles) in the SERPs, so it's probably just a bug.

indyank
msg:4314581
10:52 am on May 19, 2011 (gmt 0)

Um, isn't the "Labs" area specifically for Not Ready For Prime Time features? Good way to cover themselves.


Whether you put it in the Labs area or call it a beta, anything that is pushed out for users' consumption should have undergone some amount of testing, and shouldn't suck this badly.

indyank
msg:4314586
10:58 am on May 19, 2011 (gmt 0)

Why are ads clearly visible in the GWP of pages on some sites while they aren't on many other sites? I would normally see the ads section as a blank area on those sites.

PS: I am talking about Google ads.

lucy24
msg:4314811
5:24 pm on May 19, 2011 (gmt 0)

Follow-up which I should have thought of in the first place:

Left column:
IP: 72.14.194.33
This is the ordinary Web Preview IP-- the one I've htaccess'd out of one directory.
UA: Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.597 Safari/534.14
This is identical to the Web Preview UA but note the omission of "Google Web Preview" after "like Gecko".

Right column:
IP: 66.249.68.162
This is a typical address for Googlebot (Preview uses others in the same range).
UA: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
The Googlebot we all know and love. It did everything by itself with no help from the imagebot.

For each page, the left and right files were collected separately, first by the googlebot (right or "preview" side) and then by the generic UA (left or "site" side). The googlebot must have pulled the robots info out of its database, because it didn't check right then and there. (The last robots.txt check was about 6 hours earlier. I think they check every 24 hours or so, but it takes several days to process the information.)

Each page was visited from scratch. The supporting files show up with the html page as referrer, the way they do when a human visits.

Coincidentally I'd only just made some visible changes to one rarely-touched page* so it was easy to confirm that they were using fresh data even before I looked at logs. Three days ago a page in the same directory got a 304, so they had no reason to expect a change.

The "roboted" pages weren't visited by the googlebot at all. The "site" UA went everywhere. Or tried to, in the case of the htaccess'd page.


* I happened to look at pages in this directory-- mainly dating from 2004 with minor edits in 2007-- and said "Eeuw! That's ugly!" But nobody ever goes there, so it isn't worth spending a lot of time.
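
(If you want to pull the same comparison out of your own access logs, here's a minimal sketch, assuming Apache combined log format. The IP prefix is just the Web Preview address observed above, not a complete list of Google's ranges, and the log path is a placeholder.)

    GWP_IP_PREFIX = "72.14.194."   # Web Preview address seen above; not exhaustive
    GOOGLEBOT_UA = "Googlebot/2.1"

    with open("access.log") as log:           # placeholder path
        for line in log:
            ip = line.split(" ", 1)[0]
            request = line.split('"')[1]       # e.g. 'GET /page.html HTTP/1.1'
            if GOOGLEBOT_UA in line:
                print("googlebot  ", ip, request)
            elif ip.startswith(GWP_IP_PREFIX):
                print("web preview", ip, request)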

brandmaker
msg:4315784
9:00 pm on May 21, 2011 (gmt 0)

I have the same problem on many of my sites. I get a bunch of errors, and the image on the right side often misses a lot of background/CSS images.

I just found out something very interesting on my most important site though. I use Smarty, and...

...when I set $force_compile = true; I get 12 errors.
...when I set $force_compile = false; I get 80 errors.

No difference in source code, so does it have to be the headers? What else could it be? The only difference I've found is...

...when I set $force_compile = true; I have "Content-Type: text/html; charset=UTF-8"
...when I set $force_compile = false; I have "Content-Type: text/html;"

Hmm, could that really be the difference? I'll check it out and post the result here. Maybe you should check your headers too?
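
(A quick way to see which Content-Type header your server actually sends, so you can compare the two Smarty settings; Python standard library, placeholder URL:)

    import urllib.request

    resp = urllib.request.urlopen("http://www.example.com/")  # placeholder
    # prints e.g. "text/html; charset=UTF-8" or just "text/html"
    print(resp.headers.get("Content-Type"))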

dstiles
msg:4315799
10:59 pm on May 21, 2011 (gmt 0)

Do you have any characters outside of the ASCII set (decimal 32-127)? E.g. umlauts, acutes, graves, etc.

If so, you should probably declare UTF-8; otherwise the characters will not resolve correctly IF the "browser" is following the charset header, which may happen with a strict-protocol robot.
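
(To see why the declared charset matters: the same UTF-8 bytes read back under the wrong charset turn into mojibake. A minimal sketch; many clients fall back to windows-1252 when no charset is declared.)

    # "Björk café" holds an umlaut and an acute, per the examples above
    data = "Björk café".encode("utf-8")
    print(data.decode("utf-8"))    # Björk café
    print(data.decode("cp1252"))   # BjÃ¶rk cafÃ© -- what a windows-1252
                                   # fallback shows without charset=UTF-8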

brandmaker
msg:4315812
11:57 pm on May 21, 2011 (gmt 0)

Yes I do, and you've probably just solved a huge ranking problem for me. I used "Fetch as Googlebot" in Webmaster Tools and you were right: Googlebot follows the charset header. Thanks a lot!

lucy24
msg:4315851
2:57 am on May 22, 2011 (gmt 0)

Do you have any characters outside of the ASCII set (decimal 32-127)? E.g. umlauts, acutes, graves, etc.

Were you asking me or brandmaker? All my pages are utf-8 encoded with the real characters. No entities except &nbsp; and &mdash;. Web Preview simply doesn't display them, either in GWT or the "real" version.

:: time out to investigate ::

OK, got it. The overall page image-- the part that shows your text in teeny sans-serif type-- won't display non-ASCII characters in HTML. (They will if it's a pdf.) But the snippet-- the close-up text within the preview-- shows everything. So does the little orange box showing where the snippet was taken from.

robster124
msg:4315859
3:53 am on May 22, 2011 (gmt 0)

It's buggy for me. I get fetch errors for most of my pages, but then I click 'Compare' again and get none, or fewer.

keyplyr
msg:4315872
6:20 am on May 22, 2011 (gmt 0)

Plain & simple, the preview tool doesn't work.

Sgt_Kickaxe
msg:4315954
4:41 pm on May 22, 2011 (gmt 0)

Piss-poor integration, so far.

henry0
msg:4316020
8:54 pm on May 22, 2011 (gmt 0)

My analytics script also comes up as roboted.
The Google Webmaster Central forum has people as puzzled as we are!

DirigoDev
msg:4317254
2:33 am on May 25, 2011 (gmt 0)

I have about 25 sites in GWT. None have updated site speed data since 5/15. Anyone know why? Prior to the 15th the data was being updated regularly. As it turns out, I deployed a new server and a performance-tuned relaunch on the 16th, and not knowing whether I got a performance gain is killing me. The Firebug performance score that G pushes is not really validation.
