If that's the case, then this is just a cosmetic fix and does not deal with the real issue. GoogleGuy, can you clarify this?
Vanessa: More feedback on the + Tool menu not showing on my FF set-up.
It does show when I use the Camino browser (Mozilla engine), so the bug must be due to interaction with the FF extensions "Adblock Plus" and "NoScript".
I have them both set to allow everything on your Sitemaps pages, but the + Tools menu still doesn't work.
Is it something to do with your Thawte cert?
System: The following message was spliced on to this thread from: http://www.webmasterworld.com/google/3039624.htm [webmasterworld.com] by tedster - 9:58 am on Aug. 9, 2006 (EDT -4)
My suggestions to the sitemap team...
1. Option to include deleted and redirected pages, so we can inform Google about deleted pages that are still in the Google index.
2. Option to view all indexed pages with PageRank. We could use this instead of the site: command, because site: doesn't always return correct results.
3. The new preferred-domain feature is a good one, but we are already giving the full URL in the sitemap file. Why can't Google take it from there? Anyway, it is a nice option.
4. An announcement section for system maintenance, updates, data refreshes, algorithm changes, etc.
5. Number of pages crawled per day.
6. Default page. Many HTML editors link to index.htm or index.html when you link to the top directory, but when submitting to other sites people give only the domain name. This creates duplicate pages in Google's index, so, like the preferred domain, Sitemaps should have an option to specify a preferred index page.
7. Average page loading time (while crawling the site). With this we could spot crawling problems, fine-tune our scripts, and judge server performance.
8. Are our If-Modified-Since handling, headers, robots.txt, .htaccess, etc. up to Googlebot's expectations?
9. An internal ranking for each page (not PageRank): how much value Google gives to each page, like the priority (0 to 1) we give in the sitemap.
10. An overall rating of the site (based on crawling efficiency, uptime, quality of pages, etc.).
I think all of us should post our suggestions so that Google can pick out the good ones and include them in the next updates. In the end, we all benefit, right?
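For reference, the per-page priority hint mentioned in suggestion 9 is already part of the standard sitemap protocol. A minimal sitemap entry might look like this (the URL and dates are placeholder examples, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-08-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The suggestion, as I read it, is for Google to report its own internal per-page valuation back to us in the same 0-to-1 style, not just accept ours.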
I'd like a way to control which web page an image is associated with in the image index. E.g., I have a blog with some pictures in it. Some of the pictures in the index are associated with the entry they're on, some with the dated archive, and some with the category archive.
An obvious check would be that the web page would have to contain a link or img tag pointing at the image.
I guess this would be best done in a sitemap file of some sort?
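The "obvious check" described above is easy to script. Here is a rough sketch in Python (the HTML sample, function names, and paths are made-up illustrations, not anything from Google's tooling), using the standard-library HTML parser to test whether a page actually references a given image via an img tag or a link:

```python
from html.parser import HTMLParser

class ImageRefFinder(HTMLParser):
    """Collects the targets of <img src> and <a href> tags on a page."""
    def __init__(self):
        super().__init__()
        self.refs = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.refs.add(attrs["src"])
        elif tag == "a" and "href" in attrs:
            self.refs.add(attrs["href"])

def page_references_image(html, image_url):
    """True if the page contains an img tag or link pointing at image_url."""
    finder = ImageRefFinder()
    finder.feed(html)
    return image_url in finder.refs
```

A search engine doing this kind of validation would presumably resolve relative URLs against the page's base URL first; this sketch only does exact string matching.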
Just did the Webmaster Central thing, and despite what Matt Cutts says, there is no place to see whether there are any penalties.
I keep looking for places to click on to find information, but there is little here. I don't get it.
I am aware of someone who has told me that when they go there, there is a prominent "you are banned" notice; however, Google does not tell everyone it bans that they are banned.
On Aug 7, 2006 GoogleGuy posted:
"Okay, I believe most/all U.S. users should see radically fresher supplemental results now. The earliest page I saw was from Feb 2006, and most of the ones that I looked at averaged in the ~2 month old range.
As data gets copied to more places, the fresher supplemental results should eventually be visible everywhere, not just the U.S."
On Aug 8, 2006 I was tempted to post:
"As you said, my supplementals are now Apr-May 2006. I am grateful, but please forgive me if I wait a week or so to sip that vintage. I've experienced 4 to 5 reversions to mid-2005 after similar progress over the last 4 months. Happily, Adam assured me that there was no penalty in question. Current supplementals still remain about 80% of pages reported for my site -- which themselves constitute only about half of my total pages."
Fortunately I didn't post that, because today my supplementals (on dc 126.96.36.199) have increased again in number, percentage, and age -- all the way back to Jul 20, 2005 19:47:35 GMT.
I, however, remain hopeful that the little search engine that used-to-could will make it over the crest of the hill and down into the glorious valley of the future that awaits us.
Perhaps I've been out West too long but, in defense of the Sitemaps team, I'll end with the venerable acronym:
I just found a bunch of supplemental pages from July of 2005...
On data center 188.8.131.52
trinorthlighting, they're like roaches.
My supplementals go back to 13 Mar 2006.
Man, I wish there was a way to find a list of which pages are "supplemental results" -- and it would be even nicer if we knew why.
|Man, I wish there was a way to find a list of which pages are "supplemental results" |
"I am aware of someone that has told me that when they go there, there is a prominent "you are banned" notice"
Well, that would suck if Google again has things in place (like the evil reinclusion requests) that benefit spammers but offer nothing to non-spamming webmasters whose pages or sites are suffering for technical reasons and who have no recourse.
I am hoping Google will add some features down the road; it's a start, though...
System: The following message was spliced on to this thread from: http://www.webmasterworld.com/google/3041914.htm [webmasterworld.com] by tedster - 7:06 pm on Aug. 10, 2006 (EDT -4)
Did you guys notice the new green bars under Crawl stats and Page analysis in the Statistics tab?
They show your site's distribution of PR and file type/encoding.
Interesting note: my new site shows PR 0 on all pages in the Google toolbar, but on the Crawl stats page, although most of my pages have "Low" PR, a few pages have "Medium" PR.
It looks to me like these PR measurements are more up to date than the Google toolbar's.
Those green bars aren't new. They've been there for ages.
My bad. I didn't see them before. Probably because my site is new, and the features weren't available for me until today or something.
g1smd & steveb - Here's my casting vote for seeing what penalties are in place for sites. I can't see anything either.
However, Matt says in his video that "honest webmasters" are likely to be using Webmaster Central/Sitemaps, and he quoted a notional example of a site that had innocently placed hidden text and received a penalty; the feature would assist those webmasters in getting their sites back on track.
So I think he is talking about "penalties" and not so much about outright "banning".
Maybe GG, Vanessa, Adam or Matt can further clarify this important feature.
If the site has violated the webmaster guidelines, in some cases we provide this information at the top of the Summary page. We don't show this information in all cases.
Angonasec, thanks for the information. We're looking into it.
I also want to thank everyone for the feedback and suggestions. Send more my way anytime.
Could you clarify what you mean when you say that you don't show the breach of webmaster guidelines "in all cases"?
Do you mean that for a given "breach" you will not show it to all webmasters [i.e., you will be selective about whom you disclose it to], or that only some categories of "breach types" are shown [e.g., hidden text, per Matt's notional example on video]?
If it is the latter, is it just a matter of time before you expand the capacity of GWC [Google Webmaster Central] to show more breach types?
"We don't show this information in all cases."
Thus only illegitimate, spamming webmasters benefit from yet another poorly conceived policy. You just have to wonder about the idea vetting process down there at the plex that allows such policies to see the light of day.
>> Send more my way anytime. <<
I missed the recommended route for doing that...
It would be nice if there were some way to know when to expect updates to query and other stats reported on sitemaps. Is there a regular schedule?
|If the site has violated the webmaster guidelines, in some cases we provide this information at the top of the Summary page. We don't show this information in all cases. |
While I can understand not wanting to provide this all the time, for legitimate webmasters it would be very helpful, as we could be penalized for completely innocent reasons.
For instance my site has been around for over ten years (since 1999 on its current domain) and it has traditionally done very well in search results and attracts a healthy level of traffic. My focus has always been on trying to create really useful resources for users and I try to always focus on the user experience ahead of search engine considerations.
Unfortunately, I was knocked completely out of Google's serps for every single search phrase on July 27th. This in turn caused me to lose 80% of my traffic and revenues overnight.
Since I don't know why I fell or what might be wrong with my site, I'm forced to try to cover all my bases to put my site back in good stead with Google. This means that I have been working 12-15 hours a day, every day, since July 27th, investigating possible causes and updating my site. Updates and additions I had planned for this coming winter I have moved up, and I am working feverishly to get them rolled out as soon as possible.
I don't see that I have any alternative. I have to put food on my table and pay my bills. Since I don't know why my site has been penalized, I have to exhaust every avenue, and do it as quickly as possible, before I drain my bank account dry.
Good, hard-working web publishers shouldn't be punished because of some spammers and black hats. If a webmaster takes the time to use GWC and really wants to do the right thing, then Google should make it easier for them to know why they are being penalized -- especially for an older, well-established site that is less likely to be run by a fly-by-night spammer. It also seems very unreasonable for reinclusion requests to take three to six weeks to get reviewed, especially after what is known to be a high level of false positives on June 27th and July 27th.
If we aren't being intentionally penalized, then Google should do everything in their power to quickly resolve the issues caused by those two updates as a lot of innocent websites are being seriously harmed financially.
trinorthlighting, you should still do a 301 from the non-preferred host to the preferred host. That will make sure that every search engine can combine the two hosts. When you give us a www vs. non-www hint, eventually we should take that preference and choose it as the canonical host. Then I expect that links and PageRank will be combined into the preferred host.
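On an Apache server, the host-level 301 GoogleGuy describes is commonly done with mod_rewrite in an .htaccess file. A typical sketch, assuming mod_rewrite is enabled and that www.example.com (a placeholder) is the preferred host:

```apache
RewriteEngine On
# Send requests for the non-preferred bare domain to the preferred www host,
# preserving the requested path, with a permanent (301) redirect.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status is what tells search engines the move is permanent, so they can consolidate the two hosts as GoogleGuy describes.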
KenB, the fact is that there are malicious folks out there, so alerting every site with a penalty would tell some people that they'd been detected. We may expand the amount of information we can show over time. Certainly after the test of emailing webmasters, it was successful enough that we started emailing more webmasters to alert them.
[edited by: GoogleGuy at 4:15 am (utc) on Aug. 12, 2006]
|I just found a bunch of supplemental pages from July of 2005... |
On data center 184.108.40.206
trinorthlighting, the refresh of supplemental results went out to a data center that serves North American and Asian traffic. If you went looking at specific data centers, you could probably find older supplemental pages--however, as the newer supplemental results roll out at more data centers in the next few weeks, they will completely replace the older supplemental results.
|KenB, the fact is that there are malicious folks out there, so alerting every site with a penalty would tell some people that they'd been detected. |
I fully appreciate this. Lord knows, I'm fighting the same people who keep trying to scrape my site so that they can feed it to you. If I had a penny for every page request by a malicious bot, I wouldn't worry so much about AdSense.
My point is that the last couple of rounds of updates (June 27th/July 27th) caught a lot of innocent sites that don't try to play games but are getting hit with site-wide penalties -- which can be economically devastating.
|We may expand the amount of information we can show over time. Certainly after the test of emailing webmasters, it was successful enough that we started emailing more webmasters to alert them. |
The emails are a great idea. It would also be nice to be able to resolve things faster to get off of bans sooner.
Good morning g1smd
">> Send more my way anytime. <<
I missed the recommended route for doing that..."
I guess this thread is also very suitable for the purpose. Vanessa visits this thread very often, you know ;-)
Good morning GoogleGuy
You mentioned recently that you would tell us more about datacenters at the end of the summer. It's raining where I live, and the summer is approaching its end.
Looking forward to hearing a few words from you about 220.127.116.11 and 18.104.22.168 ;-)
"alerting every site with a penalty would tell some people that they'd been detected."
Alerting some sites is a deliberate choice not to serve the interests of the mass of honest webmasters, but rather to cater to the interests of some (but not all) spammers. It's extremely bad policy.
An honest webmaster should KNOW that if no banned notice appears for their site, then they do not have a ban. Honestly, I have no clue why Google bends over backwards for some spammers while being deliberately confusing to honest webmasters... (don't get me started on how Webmaster Central links to the deliberately inaccurate link: search).
|alerting every site with a penalty would tell some people that they'd been detected |
GoogleGuy/Vanessa - I'm sure that you've given a lot of thought to this issue, and that you understand that many good sites are run by folks who do not fully comprehend the guidelines [with all their interpretations], as well as by spammers. So I guess it's a juggling act.
But isn't there some sort of validation process which could make it possible for you to sort out the spammers from the well intentioned site owner/ web master?
What about a series of security checks -- email validation, coupled with age and ownership of the site, a recurring subscription fee, and an established pattern like that of some authority sites? These are just my thoughts to provoke some thinking, even though they need stronger input and better ideas than I can offer.
If G's aim is to eliminate SPAM, then working closely with validated webmasters could strengthen the process of identification and monitoring.
[edited by: Whitey at 1:04 pm (utc) on Aug. 12, 2006]
|If G's aim is to eliminate SPAM, then working closely with validated webmasters could strengthen the process of identification and monitoring. |
Indeed! In my efforts to defend my site against plagiarism, I routinely come across faux blogs, faux directories and sub-domain spamming sites that have scraped and mashed the content of my pages simply to feed it to Google. If I thought it would do any good, I would be willing to report these sites on a regular basis. Reporting them, however, would take a great deal of my time, and I don't want to spend that time on something that will accomplish nothing.
Legitimate websites and Google have the same vested interest in getting rid of the SERP spammers; it would be wonderful if there were a way we could really work together to defeat this problem (or at least keep it in check).