It does show when I use Camino browser (Moz engine), so the bug must be due to integration with the FF extensions "Adblock Plus" and "Noscript".
I have them both set to allow everything on your Sitemaps pages, but the + Tools menu still doesn't work.
Is it something to do with your Thawte cert?
1. Option to include deleted and redirected pages, so we can inform Google about deleted pages that are still in the Google index.
2. Option to view all indexed pages with PageRank. We could use this instead of the site: command, which doesn't always return correct results.
3. The new preferred domain feature is a good one, but we are already giving the full URL in the sitemap file. Why can't Google take it from there? Anyway, it is a nice option.
4. An announcement section for system maintenance, updates, data refreshes, algorithm changes, etc.
5. Number of pages crawled per day.
6. Default page. Many HTML editors link to index.htm or index.html when you link to the top directory, but when submitting to other sites people submit only the domain name. This creates duplicate pages in Google's index, so like the preferred domain setting, Sitemaps should have an option to specify a preferred index page.
7. Average page loading time (while crawling the site). With this we could find any crawling problems, fine-tune our scripts, and judge server performance.
8. Whether our if-modified-since handling, headers, robots.txt, .htaccess, etc. are up to Googlebot's expectations.
9. An internal ranking for each page (not PageRank): how much value Google gives to each page, like the priority (0 to 1) we give in the sitemap.
10. An overall rating of the site (based on crawling efficiency, uptime, quality of pages, etc.).
I think all of us should come out with our suggestions, so that Google can pick up some good ones and include them in the next updates. In the end we all benefit, right?
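The per-page priority mentioned in item 9 is the standard 0.0 to 1.0 priority field from the sitemap protocol. A minimal sketch of generating such a file in Python follows; the example.com URLs and priority values are invented for illustration:

```python
# Minimal sitemap generator illustrating the 0.0-1.0 per-page
# priority field from the sitemap protocol. The URLs and the
# priority values are made-up examples, not real site data.
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """pages: list of (url, priority) tuples; priority in [0.0, 1.0]."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, priority in pages:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(url))
        lines.append('    <priority>%.1f</priority>' % priority)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

sitemap = build_sitemap([
    ('http://www.example.com/', 1.0),            # preferred index page
    ('http://www.example.com/about.html', 0.5),  # lower-value page
])
print(sitemap)
```

Note that this priority only expresses the relative importance you assign to your own pages; item 9 above is asking for the reverse, i.e. the value Google itself assigns.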
An obvious check would be that the web page has to contain a link or img tag pointing at the image.
I guess this would be best done in a sitemap file of some sort?
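For what it's worth, a sitemap-style file for images could look something like the sketch below. The namespace and tags follow Google's image sitemap extension (which post-dates this thread), and the URLs are invented:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- Hypothetical page; the sanity check suggested above is that
         this page actually links to or embeds the image below. -->
    <loc>http://www.example.com/widgets.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/widget-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```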
"Okay, I believe most/all U.S. users should see radically fresher supplemental results now. The earliest page I saw was from Feb 2006, and most of the ones that I looked at averaged in the ~2 month old range.
As data gets copied to more places, the fresher supplemental results should eventually be visible everywhere, not just the U.S."
On Aug 8, 2006 I was tempted to post:
"As you said, my supplementals are now APR-MAY 2006. I am grateful, but please forgive me if I wait a week or so to sip that vintage. I've experienced 4 to 5 reversions to mid-2005 after similar progress over the last 4 months. Haply, Adam assured me that there was no penalty in question.
Current supplementals still remain about 80% of pages reported for my site -- which themselves constitute only about half of my total pages."
Fortunately I didn't post that, because today my supplementals (on dc 18.104.22.168) are back up in number, %, and age -- all the way back to Jul 20, 2005 19:47:35 GMT again.
I, however, remain hopeful that the little search engine that used-to-could will make it over the crest of the hill and down into the glorious valley of the future that awaits us.
Perhaps I've been out West too long but, in defense of the Sitemaps team, I'll end with the venerable acronym:
Well, that would suck if Google again has things in place (like the evil reinclusion requests) that benefit spammers but offer nothing to non-spamming webmasters whose pages or sites suffer for technical reasons and who have no recourse.
They show your site's distribution of PR and file type/encoding.
Interesting note: my new site shows 0 PR on all pages with the Google toolbar, but on the Crawl stats page, although most of my pages have "Low" PR, a few pages have "Medium" PR.
It looks to me like these PR measurements are more up to date than the Google toolbar's.
However, Matt says in his video that "honest webmasters" are likely to be using "Webmaster Central Sitemaps", and he quoted a notional example of a site that had innocently placed hidden text and received a penalty; the feature would assist those webmasters in getting their sites back on track.
So I think he is talking about "penalties" and not so much about just "banning".
Maybe GG, Vanessa, Adam or Matt can further clarify this important feature.
Angonasec, thanks for the information. We're looking into it.
I also want to thank everyone for the feedback and suggestions. Send more my way anytime.
Could you clarify what you mean when you say that you don't show the breach of webmaster guidelines "in all cases"?
Do you mean that for a given "breach" you will not show it to all webmasters [i.e. you will be selective about who you disclose this to], or that only some categories of "breach types" are shown [e.g. hidden text, per Matt's notional example in the video]?
If it is the latter, is it just a matter of time before you increase the capacity of GWC [Google Webmaster Central] to show more breach types?
Thus only illegitimate, spamming webmasters benefit from yet another poorly conceived policy. You just have to wonder about the idea vetting process down there at the plex that allows such policies to see the light of day.
If the site has violated the webmaster guidelines, in some cases we provide this information at the top of the Summary page. We don't show this information in all cases.
While I can understand not wanting to provide this all the time, for legitimate webmasters it would be very helpful, as we could be penalized for completely innocent reasons.
For instance my site has been around for over ten years (since 1999 on its current domain) and it has traditionally done very well in search results and attracts a healthy level of traffic. My focus has always been on trying to create really useful resources for users and I try to always focus on the user experience ahead of search engine considerations.
Unfortunately, I was knocked completely out of Google's serps for every single search phrase on July 27th. This in turn caused me to lose 80% of my traffic and revenues overnight.
Since I don't know why I fell or what might be wrong with my site, I'm forced to try to cover all my bases to put my site back in good standing with Google. This means I have been working 12-15 hours a day, every day since July 27th, investigating possible causes and updating my site. Updates and additions I had planned for this coming winter I have moved up, and I am working feverishly to get them rolled out as soon as possible.
I don't see that I have any other option. I have to put food on my table and pay my bills. Since I don't know why my site has been penalized, I have to exhaust every avenue, and do it as quickly as possible before I drain my bank account dry.
Good, hard-working web publishers shouldn't be punished because of some spammers and black hats. If a webmaster takes the time to use GWC, and really wants to do the right thing, then Google should make it easier for them to know why they are being penalized -- especially if it is an older, well-established site that is less likely to be run by a fly-by-night spammer. It also seems very unreasonable for reinclusion requests to take three to six weeks to get reviewed, especially given what is known to be a high level of false positives on June 27th and July 27th.
If we aren't being intentionally penalized, then Google should do everything in their power to quickly resolve the issues caused by those two updates as a lot of innocent websites are being seriously harmed financially.
KenB, the fact is that there are malicious folks out there, so alerting every site with a penalty would tell some people that they'd been detected. We may expand the amount of information we can show over time. Certainly after the test of emailing webmasters, it was successful enough that we started emailing more webmasters to alert them.
[edited by: GoogleGuy at 4:15 am (utc) on Aug. 12, 2006]
I just found a bunch of supplemental pages from July of 2005...
On data center 22.214.171.124
trinorthlighting, the refresh of supplemental results went out to a data center that serves North American and Asian traffic. If you went looking at specific data centers, you could probably find older supplemental pages--however, as the newer supplemental results roll out at more data centers in the next few weeks, they will completely replace the older supplemental results.
KenB, the fact is that there are malicious folks out there, so alerting every site with a penalty would tell some people that they'd been detected.
I fully appreciate this. Lord knows, I'm fighting the same people who keep trying to scrape my site so that they can feed it to you. If I had a penny for every page request by a malicious bot, I wouldn't worry so much about AdSense.
My point is that the last couple of rounds of updates (June 27th/July 27th) caught up a lot of innocent sites that don't try to play games but are getting hit with site-wide penalties -- which can be economically devastating.
We may expand the amount of information we can show over time. Certainly after the test of emailing webmasters, it was successful enough that we started emailing more webmasters to alert them.
You recently mentioned that by the end of the summer you'd tell us more about datacenters. It's raining here where I live, and the summer is approaching its end.
Looking forward to hearing a few words from you about 126.96.36.199 and 188.8.131.52 ;-)
Alerting only some sites is a deliberate choice not to serve the interests of the mass of honest webmasters, but rather to cater to the interests of some (but not all) spammers. It's extremely bad policy.
An honest webmaster should KNOW that if no ban notice is shown for their site, then they do not have a ban. Honestly, I have no clue why Google bends over backwards for some spammers, while being deliberately confusing to honest webmasters... (don't get me started on how Webmaster Central links to the deliberately inaccurate link search).
alerting every site with a penalty would tell some people that they'd been detected
GoogleGuy / Vanessa - I'm sure that you've given a lot of thought to this issue, and that you understand that many good sites are run by folks who do not fully comprehend the guidelines [with all their interpretations], as well as by spammers. So I guess it's a juggling act.
But isn't there some sort of validation process which could make it possible for you to sort out the spammers from the well intentioned site owner/ web master?
What about a series of security checks and email validation, coupled with the age and ownership of the site, a recurring subscription fee, and an established pattern like that of some authority sites? These are just my thoughts to provoke some thinking, even though they need stronger inputs and better ideas than I can offer.
If G's aim is to eliminate spam, then working closely with validated webmasters could strengthen the process of identification and monitoring.
[edited by: Whitey at 1:04 pm (utc) on Aug. 12, 2006]