This 205 message thread spans 7 pages.
Possible Shift in Google-Webmasters Communication Policy!
Have Googlers stopped announcing updates and posting weather reports?
I have noticed in recent months a remarkable shift in the way Google employees communicate with the webmaster community.
For one reason or another, the Googlers have stopped posting weather reports about the new infrastructure. No more posts explaining critical changes in the SERPs. No more talk about specific DCs as they used to do. No more chat about Google SERPs and possible changes that are clear and obvious even to novice webmasters. The only thing we have been hearing is the famous "Data Refresh".
Of course, one of the reasons for the shift in Google's communication policy might be the continuous trouble the Googlers have been in since the deployment of BigDaddy: "if you have no good news, keep quiet".
If it's true that the Googlers have decided not to talk to us anymore about the SERPs, DCs, possible filters, algo changes, and upcoming updates, do those same Googlers still deserve to be invited to webmaster conferences and meetings, for example?
What will be the consequences of this shift in Google-Webmasters communication policy?
Many thanks in advance for your contributions to the thread.
I just think all of the search engine reps are now communicating more through official channels rather than posting in webmaster forums.
I see more communication in terms of the other services Google is providing. For example, Google Sitemaps is a great help, and so are Matt's postings on his blog. At times Matt has invited suggestions and even replied to personal comments, which we should appreciate.
Yes Google has started talking less to webmasters, but I don't exactly remember them talking too much before either (except GG at times)
It seems Google is now more into educating webmasters on how to get a site listed in Google. In fact, the webmaster section of Google is totally different and more elaborate than it used to be.
I still don't see that there's been any substantive change at all. Matt's blog was very helpful and very specific up to a couple of months ago. Then he went on vacation for 6 weeks. Since he's been back I haven't seen any substantive changes in the SERPs whatsoever -- not even the normal rotating datacenters -- hence not much to say.
And what about those emails to webmasters, giving them a heads-up about penalties and filters they were at risk of triggering? That was a welcome change to their communication policy that started after the IPO, IIRC.
"2. Complaining and whining webmasters. Take a look at the forums, Matt's blog, and other forums. I see a lot of people who whine and complain about Google yet their own sites are not in order. I saw that even Brett made a post last week about it.
Maybe if WebmasterWorld moderators would start cracking down on these "Google sucks" and "Google is spam" type of posts, maybe GoogleGuy and Adam would post more. I am more than sure that they read and hear enough complaints during the day and really do not want to come to WebmasterWorld and read more.
After all, reading all of these posts about how Google sucks and spam is king is not what we old timers come here for, right?"
1. Webmasters wouldn't complain as much if there was actual communication instead of the usual cryptic PR responses. Google deals with problems more like a political party than a company now: ignore the problem, and eventually the commotion will die down until the next problem. Repeat cycle. They understand that webmasters complaining will never make serious news or affect their bottom line, therefore they do not care.
2. Having 5 billion pages indexed by obvious spammers doesn't help the situation and/or reputation.
The bottom line is that yes, there is a lot of complaining, and there are many webmasters who don't have their sites in order. But the problem is that webmasters who might have minor errors are getting seriously penalized, while obvious spammers get away with murder, dramatically affect the SERPs, and hurt those who are trying to follow the guidelines. When you penalize the good guy while the bad guy gets away with murder, you are going to have one pissed off population. You have a better chance of ranking high in the SERPs with black hat techniques than with a completely white hat site that has some very minor mistakes.
On top of that, the appointed liaison to the webmaster community is completely silent, and Matt Cutts goes on a 6-week vacation coincidentally when the **** hits the fan.
Google is so big and ingrained in the minds and dictionary of the general public that they couldn't care less about the webmaster community. Whatever we say or do won't affect their bottom line anymore, and they know that.
|Google is so big and ingrained in the minds and dictionary of the general public that they couldn't care less about the webmaster community. Whatever we say or do won't affect their bottom line anymore, and they know that.|
I don't think that's quite true. They send engineers to search conferences, for example.
But even if you're right, is lack of communication with the "webmaster community" necessarily a bad thing? Google's guidelines basically say "Create a site for users, and we'll do the rest." From a user's point of view, why should Google give a leg up to sites whose owners or employees hang out on forums like WebmasterWorld and SEW and attend search conferences?
I think I would prefer less communication to Google representatives saying anything at all just to make it appear there is a dialogue. When this happens, some of us feel as if we're being talked down to in a condescending manner. Even the fact that Google sends representatives to conferences isn't important in itself. It's not evidence of Google trying to please webmasters; it's all about PR and giving Google a presence at these conferences. At least that's how I see it. I can't blame Google for this, and they are certainly no different in this regard than other search engine companies. I do think that Google has realized that not everyone is fooled, and it can't be helpful to their cause when it's pointed out publicly.
Let me put it this way: If someone from Google says it's raining outside I go to a window and look for myself.
And how does that really help us? It would be more embarrassing if they didn't send people.
"Create a site for users, and we'll do the rest."
What a guide. How specific. I can't imagine why on earth people would ask questions with such a thorough guide on how to create a website.
Lack of communication wouldn't be bad if people actually got answers. Google is the one who said they wanted to "communicate" with webmasters. The problem is their communication is completely vague, doesn't answer the questions people are asking (or the questions get deleted from MC's blog), or it gets ignored. Nobody is saying we should get a leg up at all. People just want them to follow through with what they said they wanted to do. In the meantime, people don't know what to do; they change their sites, become afraid to change their sites, speculate, whine, create threads asking "Can a site be too clean?", and become extremely frustrated. Stringing people along is a sure way to piss people off.
In the last 4 months, what answers have we received about the supplemental issue? June 27th? Pages being deindexed? Major drops in the SERPs, 4-5 pages deep, for sites that have been around for years?
None. All speculation or "bad data push". Their hand picked liaison to the webmaster community really explains it well.
If Webmasterworld was selling stock I know I would buy it.
reseller, I think there are a lot of good things we've been doing for communication with webmasters lately.
Sitemaps has really been beefed up to report more errors, let people test robots.txt, and let people find out about penalties for many sites. We're still writing to many sites when they are removed from Google's index as well.
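The kind of robots.txt checking described above can also be approximated locally. Here is a minimal sketch using Python's standard `urllib.robotparser`; the rules and URLs are hypothetical, and a real check would fetch the live robots.txt rather than parse an inline string:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, parsed from a list of lines rather
# than fetched over the network.
rules = """
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /tmp/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Googlebot is blocked from /private/ but allowed elsewhere.
print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))         # True
```

For a live site you would instead call `parser.set_url("http://example.com/robots.txt")` followed by `parser.read()`.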
At the same time, google.com/support has been going through a revamp to refresh the information there and give more answers.
Matt Cutts has blogged about the googlebot crawl cache plus the indexing timeline. He also did a post confirming that the server params from a Google error page were real, and talked about the real story of what happened to the Catawba school and how student information got crawled if the section of the site was password protected. If you monitor his comments, you've also seen his comment about 188.8.131.52 and how it has more accurate estimates for site: queries.
You asked "Do those same Googlers deserve to be invited anymore to the webmasters conferences and meetings, for example?" I think it benefits everyone when webmasters can ask questions and Googlers can get feedback and respond to it. And folks have been listening and responding too. Vanessa Fox just wrote about the META NOODP tag, which gives webmasters more control over their snippets. I think she's also been digging into a topic to write a post about accented characters for non-English sites too. The Sitemaps team gets fired up when they talk to outside webmasters, and they prioritize based on the requests that they hear. And other groups are trying to give more information to site owners and advertisers (for example, this week Google started showing counts of invalid clicks that Google pre-filtered and didn't charge you for).
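The META NOODP tag mentioned above is a one-line addition to a page's head that asks engines not to substitute the ODP/DMOZ directory description for the page's snippet. A minimal sketch (the title is a placeholder):

```html
<head>
  <title>Example Page</title>
  <!-- Ask search engines not to use the ODP/DMOZ description as the snippet -->
  <meta name="robots" content="noodp">
</head>
```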
I'd like us to do more on communication, and I'm glad that we hired Adam Lasnik to help with getting feedback--he's been able to reply to a lot more emails than we would otherwise. Don't misunderstand me: I'm not claiming that we're perfect by a long shot. I'm not even claiming we're where I'd like to be. Sometimes there are quiet periods when there's less to talk about; I'm expecting that as the new infrastructure at 184.108.40.206 rolls out, I'll be answering questions about it, but that will be closer to the end of the summer.
Let me turn it around: how could we do better? It sounds like you'd like to hear a lot more about specific data centers, for example.
Tell us more about the data refreshes as I have been affected negatively and positively by each and every one since Jagger :)
Also, looking at that IP, I am not looking as good as I am now, but I can live with it if that's what it's going to be :). It might not be where I am now, but it is better than dropping out of the index. Again, not a complaint, as I can see that when that rolls out I will need to work, but it will not be a losing battle if what I see is accurate.
Also, I notice that for the first time I see a popup on the right-hand side asking me to make Google my default search engine in IE7. Is this a sign of things to come in the engine battle?
"In the last 4 months what answers have we received about the supplemental issue? June 27th? Pages being deindexed? Major drop in SERPS 4-5 pages deep for sites that been around for years?"
It seems the threads that dominate most of the forums still don't get answered.
gcc_llc, I think that's because many of those questions are ones that are evergreen; people are always asking about them. There was a data refresh on June 27th that lots of people ask about, but there was also a data refresh in the last 1-2 days that refreshes the same data. Going forward, I'd expect that the cycle time would go down even more, possibly down to once a week for that particular algorithm. But people also asked about data refreshes back in September of last year. Sometimes we take feedback here (e.g. when people were asking about pages dropping), but since WebmasterWorld doesn't allow specifics, we also take feedback from emails and other places around the web too.
Would I be correct to assume that, for each refresh, if results stick for that particular algo, the results we see are at the very least correct within that algo over a trended period of time?
If you are not able to answer that, could you at least let me know so I don't have to think about it? :)
|Let me turn it around: how could we do better? |
I think a simple answer could be gleaned from the recent (and ongoing) June 27th ranking problems. From today's events it seems pretty clear that some bugs were introduced on the 27th June, one of which appears to have now been fixed, while others remain. Both Google and the web community desperately need for Google to provide some mechanism by which problems can be reported. Perhaps the Sitemaps interface, with its inbuilt site ownership verification, would be ideal for this?
Try to imagine how frustrating it is to lose 90% of your organic Google traffic overnight, be fairly certain that it is due to a problem at Google's end, and have no way of reporting the problem. Has my site been incorrectly penalised by an over-zealous spam filter? Has Google's HTML parser been tweaked so that it now trips up on a "/>"? Or is it any one of around a trillion other possibilities? How long will it take to fix? 1 month? 2 months? 6 months? Do Google even know about the problem yet? And so on...
I'd bet that several thousand re-inclusion requests have been filed as a direct result of the June 27th problems. I wonder how many have been answered? I know that neither of mine has been.
Given that bugs are inevitable, it must surely be in everyone's interest, including Google's, for Google to have a reliable means of hearing about them and being provided with examples. I've had a think, and haven't been able to come up with any other examples of companies trying to function without some form of "customer-facing" problem reporting and tracking process.
People are always asking about them because we don't really get answers about them. Just saying when they happen doesn't really help the people that are affected. We don't know how they affect our sites. We don't know why we are affected. We don't know what to do to fix the problem if we are affected. We basically don't know anything but WHEN, because our sites drop off the face of the map. That doesn't help anyone. You just get more threads asking "what happened". When are you actually going to get specific for once?
What email address do we use that we can actually get an answer other than the automated one from the so-called "support"? What other places can we go to find out without our comments being deleted?
If it's nothing but data refreshes, why do pages come back from 3 years ago? Why do sites all of a sudden go supplemental after being fine for 6 months? Why do people all of a sudden drop 4 pages in the rankings for terms they were top 5 in for a long time? Why do pages just disappear after being heavily crawled?
Is this all just a data refresh? How can a refresh produce OLD results?
People are following your guidelines but for some reason are getting penalized. At this point people can do nothing but sit here and wish upon a star because we have no clue what to do.
I guess the headline of these last hours has been the "2 KW term filter":
Blue Widget >>> Disappeared from results
"Blue Widget" >>> Same top position as usual.
And it affected so many respectable webmasters that I can't believe this is a punishment or similar measure for a black hat technique.
Maybe a name for this update would be "collateral damage".
I think a positive move on behalf of the management of WebmasterWorld would be to publish the IP addresses of its members ... let's see who we're talking to.
All the Best
Many people today said their sites came back from the June 27th changes. Mine didn't, but it does have decent results on 220.127.116.11 - you said that won't be finished until the end of summer. Why did some people come back on all datacenters and some of us are only on 18.104.22.168?
|hear a lot more about specific data centers |
Are the numbered datacenters even relevant anymore? From what I can see, the so-called "micro-filters" do not display any of the "traditional" datacenter results anyways.
Are you saying that 22.214.171.124 will be the "base" infrastructure in which the micro-filtered results will be applied to or even that these micro-filtered results are simply a stop-gap for the time being?
I read in the tea leaves that "everything will be alright and that as long as we adhere to Google rules, we will reap rewards". Unfortunately, I still smell spam in the index ...
Keep the faith
|I read in the tea leaves that "everything will be alright and that as long as we adhere to Google rules, we will reap rewards". Unfortunately, I still smell spam in the index ... |
Tell me about it. Yesterday I was doing some research for a school business project and was looking for market share information on a well known company and almost every result was an MFA or link directory.
Good evening GoogleGuy
Thanks for taking the time to visit the thread and post your comments (just like the good old days ;-)). Much appreciated.
"Let me turn it around: how could we do better? It sounds like you'd like to hear a lot more about specific data centers, for example."
Because we have had excellent Google-Webmasters communications in the past, when you used to communicate with us (weather reports, answering questions, Q&A sessions, etc.) on our webmaster platforms such as the WebmasterWorld forums (especially Forum 30), I wish to ask you kindly to continue on that successful path. Of course, it would also be of great value and highly appreciated to see the other Googler friends, such as Adam and Vanessa, visiting us here at the WebmasterWorld forums and contributing as often as they can.
Google's blogs and Matt's blog are great additional channels of communication with webmasters. But the inherent weakness of blogs is that they are designed mostly for one-way communication, which doesn't allow much discussion.
Regarding the datacenters: you used to mention specific DCs for us to keep an eye on during updates, in order to allow us to follow developments. That has been missing lately. Of course, new-infrastructure weather reports have been long missing too.
I see the other kind fellow members have already "bombarded" you with questions that I would have asked too ;-)
Once again.. thanks a bunch GG for visiting us and look forward to the pleasure of seeing you among us again.
Wow, you all have to remember that for every site that fell, there were sites that gained as well. If your site fell, check it. When it's perfect, email it to GoogleGuy and ask him why...
But in saying that, your site had better be in order, because I guarantee that he will find the issue almost immediately, whether it be bad coding, questionable links, content that is not unique, bad metas, etc....
For the newer members, do not feel like help is not out there for you. If your site fell, sticky some of us old timers who are still on the top some URLs. There are quite a few of us out here whose sites have ranked well and have not been affected by data refreshes. Feel free to ask us for advice.... What you do with the advice is up to you, but never be afraid to ask.
Point taken, reseller. It's tough to find the time to comment/respond everywhere I'd like to. I think part of the solution has to be ramping up more people who can answer some questions.
I do think that we'll be discussing data centers more toward the end of the summer, if I had to guess.
Problem is, you don't know the answers either.
Stop telling people to "check their sites" like they don't know what they are doing. People are sick and tired of hearing it when the site in question has ranked high for years and it's not "bad coding".
For every site that fell, others have gained? Yeah, the domains with 5 billion spam pages indexed.
|Wow, you all have to remember that for every site that fell, there were sites that gained as well. If your site fell, check it. When it's perfect, email it to googleguy and ask him why... But in saying that, your site had better be in order, because I guarantee that he will find the issue almost immediately, whether it be bad coding, questionable links, content that is not unique, bad metas, etc....|
You're a funny guy. Do you actually believe any of the stuff you're saying? Oh... and before I forget... since you seem to be so knowledgeable... what's GoogleGuy's email address?
The more Googlers answering our questions the better.
Looking forward to more discussion of DCs ;-)
Good night and God bless.
Actually, GoogleGuy, I don't suppose you could set up one of your special email addresses for sites affected by the June 27th issues could you? That would be very much appreciated.
We have been told, for example, that our site has not triggered any kind of Spam filter and yet, since the 27th June, most of our formerly high-ranking pages now rank below foreign-language pages, foreign-country pages, and even supplemental pages.
|Are you saying that 126.96.36.199 will be the "base" infrastructure in which the micro-filtered results will be applied to or even that these micro-filtered results are simply a stop-gap for the time being? |
whitenight, 188.8.131.52 has some newer infrastructure that makes site: queries more accurate, and in general that infrastructure also improves results for other queries too. But the infrastructure at 184.108.40.206 is orthogonal/independent of many other changes.
Part of the trouble is that what some of this infrastructure does is hard to describe without getting into a really detailed understanding of our search architecture. But I'll try to answer questions how I can.
reseller, I'm coming back to this notion of how to get problem reports. WebmasterWorld isn't ideal because we can't discuss specifics. A blog isn't ideal either because one person can't field all the requests, and some people don't want to talk specifics in front of the rest of the web. Someone mentioned reporting problems in Sitemaps, which might be interesting.
Then set an up an email address.
Maybe instead of Sitemaps, have an individual separate entity cover those requests. I do not use Sitemaps and do not have any intention to do that :(
Yes, I do believe it. Listen, I employ 10 people who do nothing but build websites. In our portfolio of websites, I would say 80-90% of them rank top ten in the SERPs for their particular keywords in their niches. We have been building sites since the 90's, so trust me; there is a lot of wisdom behind these words.
Now, if you take a sudden dip in the SERPs, do not blame Google. Check your site..... When one of our sites takes a dip in the SERPs, 99.99% of the time it's our own fault and we catch it.
A bit of info: we had a few sites that recently had a hack and a worm eat through them. Guess what it did: it wiped out the title tags for every page on the site. Another site we had drop recently was because a user who processes orders for the site was playing around in some of the PHP coding late one night and erased the default meta tag coding. Another time, a disgruntled employee did some damage.
Those are just a few examples, but I have been in this game a very long time, and trust me, when you drop in the SERPs, 99.99% of the time it's something that you should be catching anyway.
Always check your sites if you have multiple users who can access the internal workings of the site.