| This 205 message thread spans 7 pages: < < 205 ( 1 2 3 4 5 6  ) || |
|Possible Shift in Google-Webmasters Communication Policy!|
Have Googlers stopped announcing updates and posting weather reports?
I have noticed in recent months a remarkable shift in the way Google employees communicate with the webmaster community.
For one reason or another, the Googlers have stopped posting weather reports about the new infrastructure. No more posts explaining critical changes in the SERPs. No more talk about specific DCs as they used to. No more chat about Google SERPs and possible changes that are clear and obvious to even novice webmasters. The only thing we have been hearing is the famous "Data Refresh".
Of course, one reason for the shift in Google's communication policy might be the continuous trouble the Googlers have been in since the deployment of BigDaddy: "if you have no good news... keep quiet".
If it's true that the Googlers have decided not to talk to us anymore about the SERPs, DCs, possible filters and algo changes, or to announce possible updates, do those same Googlers still deserve to be invited to webmaster conferences and meetings, for example?
What will be the consequences of this shift in Google-webmaster communication policy?
Many thanks in advance for your contributions to the thread.
> transcript of the video
Yes. On holiday with a UMTS modem falling back to 57k GPRS. No video. Blind. #*$!.
Good evening, folks.
It's certainly a step in the right direction that Matt Cutts took the time today to answer some of the 150+ questions posted by webmasters under Grabbag Friday [mattcutts.com].
However, very important questions dealing with critical matters were left unanswered, unfortunately. For example:
Over at WebmasterWorld, there were quite a few people claiming their sites came back to normal on July 27th from the June 27th fallout. Unfortunately for me, I found all my listings totally gone on July 27th. From #1 to #200, many pages not even showing up anymore. I did notice that all the pages that lost rank fell to the very end of the site: results.
When things like this happen, and people lose nearly all their listings, do you advise going through and looking for problems, re-writing content etc., or is this something that is more of a waiting game, and we should just keep working as usual? I’ve signed up for sitemaps, enquired about any penalties, but I don’t anticipate getting much out of it. My problem seems limited to all the article pages, regardless of topic or linkage. Just an unexplainable loss. All doing fine one day, all gone the next.
I’m curious to hear about the 27th shakeup and the weird things that happened with the link operator for reporting backlinks.
I’d like to know more about the supplemental index. It seems while you were on vacation many sites got put in there. I have one site where this happened. It has a PR of 6, many links from dmoz, yahoo, wikipedia, .edu sites, etc. Online since 2001. In late May though it got stuck in the supplemental index, and from what I read on forums I’m not the only one.
My question is regarding the total number of pages indexed for a given site in Google…
We see this number bounce around a LOT for our site (via the [site:] command). Besides hitting a different datacenter, what could be the reasons for this? Some days all datacenters will show >2 million pages from our site; other days they will all show 200,000. The numbers seem to increase/decrease randomly each day. Overall, shouldn't the trend be up?
Hi Matt, I’ve mentioned before that I’d love to see you do a define: type post, where you define terms that you Googlers use, that we non-Googlers may be confused about. Terms such as data refresh, orthogonal, etc. You may have defined them in various places, but one cheat sheet type of list would be great. Thanks.
Matt, I would like to know why some sites now no longer show the homepage at the top of the results for a “site:www.example.com” search. Sites that I’ve seen this happen to have taken a big drop in SERPs. Sometimes during the next data refresh things go back to normal, but not always.
So what triggers a site’s homepage no longer showing at the top for a “site:www.example.com” query and what can be done for sites seemingly trapped in that condition?
Supplemental is the topic!
Trying to avoid the question, “why did my site go supplemental?” I thought I’d rephrase from a searchers point of view first.
When I do a search using Google, results often show up with the word "Supplemental" by them. These results are not on page 38 of the results but right on top. Sometimes when I follow the result the page is exactly as shown in the snippet; other times it has changed totally or doesn't exist.
What does Google mean by supplemental? If it means a page that may or may not still be what it was when crawled some long time ago, then why does it show up on the first page of results when there are hundreds of thousands of others available (according to the "Results 1 of..." count)? And if that is true, why does the resulting page sometimes have exactly what I want, yet the cache date is from 2005?
Now, as a webmaster: I have sites that have gone 95% supplemental. My concern is that there is no pattern to it. Specific product detail pages with tons of content are supplemental, while a worthless (in the eyes of a searcher) contact page or sitemap is not. The supplemental pages show up in the search results just fine and still receive search traffic, but are not crawled or updated at all. To combat this I've had to start designing the site for GOOGLE, not the user, which is against all logic when it comes to building a good site. Why bother updating a page if it's gone supplemental? Just do the update and rename the page (to take it one step further and greyer, I could make it a subdomain page!). That page will be indexed soon enough and the supplemental can go away. This practice seems to chew up valuable processor time, crawler bandwidth, and storage space for both Google and myself.
So the real questions are: What differentiates a page in the supplemental index from one in the normal index? Is it a site-specific phenomenon, or page-specific? And is there anything I can do about it, generally speaking?
Let's hope Matt will answer the above questions in his next video.
Yeah, I was really hoping to hear something connecting BigDaddy to the current issues.
Cute presentations, and a nice effort by Matt for doing this, for sure.
Although I pretty much learned everything he said back in the pre-Google days, let alone the Google-dancing days.
*yawn* As always, he never talks about what everyone is interested in, like the billions of pages of subdomain SPAM, or fixing the screwed-up site: search that works great on the datacenters but is messed up on google.com.
>>>184.108.40.206 has some newer infrastructure that makes site: queries more accurate, and in general that infrastructure also improves results for other queries too. But the infrastructure at 220.127.116.11 is orthogonal/independent of many other changes.
If it's so accurate, then it sure would be nice if google.com got the site: results from there. (Yes, the results are good there.)
|When things like this happen, and people lose nearly all their listings, do you advise going through and looking for problems, re-writing content etc., or is this something that is more of a waiting game, and we should just keep working as usual? I’ve signed up for sitemaps, enquired about any penalties, but I don’t anticipate getting much out of it. My problem seems limited to all the article pages, regardless of topic or linkage. Just an unexplainable loss. All doing fine one day, all gone the next.... |
Unfortunately, reseller, if you can't identify exactly what's happening... and you start to make changes based on a specific set of filters running on one of G's datacenters... filters that are being rotated in and out of the main index for testing purposes... you may end up nuking your site entirely.
Can you see your missing pages on any of the datacenters where you would expect them in your sector(s)?
That certainly tells you that, for that set of filters, you are listed where you would expect.
My recommendation: keep working as usual. If your site(s) have been performing well in the index and you have been seeing good solid traffic from G, then you may just have to wait until they sort out these poorly designed automated filters that are causing all the problems. And remember, even G doesn't have all the answers as to what is actually going on when things go wacky... such a complex system they are running now.
Google wants to make sure that they will always be the Cat when they play Cat and Mouse games with us. Matt sorta hints at that I think.
Build and grow your site and (we'll get Google figured out soon enough... nah) explore all of your advertising opportunities.
I think the web, in Google's eyes since going public, is more like a zoo.
Even if we are a tiger, Google will (or wants to) put us in any cage that they can.
Matt's added three more videos.
(Matt, dude, the supplemental results are not "fresher"... URLs that haven't existed for 18 months still being in the index are not "fresher"... Googlebot never in its history obeying 301s involving supplemental results is not "getting better". On the bright side, Mary Ann is the right choice...)
Another positive thing about Matt's videos...
It seems that Matt has lost some weight; he looks great and much fresher than before that famous six-week summer vacation.
Congrats, Inigo. You just keep those videos coming :-)
Coming (again) to the topic here...
I think Google needs to open a communications channel with those who are really harmed by what is happening. It is crucial for the webmasters, but it is also crucial for G themselves and I will explain why.
Most webmasters who notice a drop in Google rankings will start questioning it, as, let's face it, Google is the biggest contributor of traffic to our sites (that is, if they still rank high).
After researching, people can find out whether they've been hit by "sneaky redirects", "referral spam", etc.: things that Google advises webmasters not to use, but which spammers use and get away with.
It becomes obvious when you search for your company name in Google and find all those spammy sites that have copied a paragraph of your text ranking higher than your company's site. I am not talking about one specific black-hat tactic here.
So Google can really benefit from such a communications channel in terms of providing a better search experience to their users.
Instead of spending their gazillions on things unrelated to traffic, they could spend a mere few million to improve their search, which has been hammered by spammers on quite a few occasions.
There is a REAL need to open a communications channel and particularly with those who run genuine sites and get severely penalised by the actions of spammers. Those are the people who are more likely to find out things that even Matt Cutts doesn't...
Also, what Google needs to understand is: hey, I am trying to make a living here. I have far more important things to do for my users than spending four hours of each day of my life going through this excellent site.
Isn't this situation having a severe effect on people's work? What if all the time we spend here could be spent making our sites better for our visitors?
Also, talking about communication with Google: they have the spam reporting feature. Do they EVER look into spam reports? I can tell you all about a single IP address hosting over half a million sites, with the same redirect script on all of them and a dictionary and a half of subdomains and subfolders, which (guess what) are ALL indexed by Google.
I have submitted several spam reports, and nothing happened. Four months on, those sites are still in Google. I am sorry, but it is getting ridiculous to spend even 10 minutes of your precious time submitting a detailed spam report, only to have it sent directly to Matt Cutts' trash bin.
What attitude on earth should that produce in webmasters? Am I supposed to help them fight spam and get neglected?
On the other hand, Google is spending to increase brand awareness and extend its frontiers to new business models and advertising opportunities; but hey, they need to spend more on their search technology.
Theoretically one can argue that this is the best way for Google to fight spammers, as sometimes only the victims can tell what has happened. In my case it took several months to find out.
I am definitely not talking about a channel available to the masses, as no one can expect Google to be able to handle that. Just one backdoor to the "Matt Cutts & co" team, since those guys are responsible for improving search.
Or, for them to start using their existing infrastructure, as the spam reporting feature they have. Wonder why it is there...
[edited by: mcskoufis at 3:29 pm (utc) on Aug. 1, 2006]
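On the "sneaky redirects" point raised above: one of the crudest patterns spam reports describe is a doorway page whose only job is a meta-refresh bounce to another URL. A toy detector in Python follows; this is only a sketch for the naive case (real cloaking and redirect spam detection is far more involved), and the sample HTML strings are hypothetical:

```python
import re

# Matches <meta http-equiv="refresh" ...> tags, the naive form of a
# "sneaky redirect" (an instant client-side bounce to another URL).
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv\s*=\s*["\']?refresh["\']?[^>]*>',
    re.IGNORECASE,
)

def looks_like_sneaky_redirect(html):
    """Flag pages whose markup contains an immediate meta-refresh."""
    return bool(META_REFRESH.search(html))

doorway = ('<html><head><meta http-equiv="refresh" '
           'content="0;url=http://example.com/"></head></html>')
print(looks_like_sneaky_redirect(doorway))                                # True
print(looks_like_sneaky_redirect("<html><body>real content</body></html>"))  # False
```

A regex like this would only catch the laziest spammers, of course, which is part of why victims end up finding these networks before the search engine does.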
|I do think that we'll be discussing data centers more toward the end of the summer, if I had to guess. |
Reminds me of comments made pre-Florida ;-)
So, we're all pretty much agreed on at least two easily implemented requests:
1. Google SiteMap accounts should have a "report a problem" form.
Reinclusion requests don't seem to cut it for all situations. Too often, regardless of the nature of the problem being reported, these just come back saying "you have not been banned because your pages are included in our index". It seems that the Google Help desk often takes reinclusion requests literally, which isn't really surprising, since we are all forced to confess to imaginary sins just to be able to send one.
2. In the meantime, we would very much like an email address (a la bostonpubcon2006)
This would allow those of us with June/July 27th ranking problems to report the problem effectively.
Any progress on either of these?
I see how this can be a good idea, but honestly, why do I need to use Sitemaps for that sole purpose? They would then be showing preference to webmasters who use Sitemaps.
I have said in an earlier post that I have no intention of using Sitemaps. It's not because I think I need to install something; it's because I really think it is not necessary. Google knows I exist, probably more than I want them to. They have ample information about my websites via 10 months of Analytics. My point is that we should have an automated source to go to, but Sitemaps is not the answer, because there is going to be preference.
Back to Sitemaps: I have used it on a site before for A/B testing, and it holds no value for me. And if its primary intention is "something else" other than "Google Sitemaps is an easy way to tell Google about all the pages on your site, which pages are most important to you, and when those pages change, for a smarter crawl and fresher search results", why do I need it to report a problem? AFAIK no other search engine requires this, so why should I be doing it?
I just feel that I give Google more than enough stats on my site via Analytics and really do not think Sitemaps is of benefit to me.
Why not just create another service, independent of Sitemaps, that anyone can submit to?
If it were up to me: lose the automation process, drop the standard replies, hire a staff that can handle the workload, and take it from there. Google certainly makes enough money to do this, and it would absolutely give them a better image with webmasters. Sure, it's gonna take a long time to clear the backlog of issues already submitted, but after that is clear, it will become an easily maintainable thing.
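For context, the Sitemaps format everyone is debating is just a small XML file listing URLs, per the public sitemaps.org protocol. A minimal generator sketch using Python's standard library (the example.com URLs and dates are placeholders):

```python
from xml.etree import ElementTree as ET

# Namespace defined by the public sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Render (loc, lastmod) pairs as a minimal sitemap XML document.

    lastmod may be None to omit the optional <lastmod> element.
    """
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        if lastmod is not None:
            ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs, purely for illustration.
print(build_sitemap([
    ("http://www.example.com/", "2006-08-01"),
    ("http://www.example.com/articles/widget.html", None),
]))
```

Which underlines the complaint above: the file itself is trivial to produce, so the objection is not effort, it is being funneled through one particular service to get support.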
I shake my head in disbelief that Matt's single answer to site problems seems to be "join Google Sitemaps"; and then I come to WebmasterWorld and see the long list of people reporting that their sites were dropped as soon as they did that, or that Sitemaps is reporting crazy stuff about their sites.
"I have said in an earlier post that I have no intention of using sitemaps. Its not because I think i need to install something. Its because I really think that it is not necessary."
I understand. But let's look at it this way:
We should aim for Sitemaps to offer "confirmed" webmasters all the communication channels needed to the folks at the 'plex, in addition to super duper tools which Google might make available to us.
Btw, have any of you heard from Vanessa Fox yet? Maybe we should ask Matt to invite her to join WebmasterWorld to answer a few questions. And in return we promise that we shall give her a very kind welcome ;-)
Reseller, I agree to a point. If they offered that option with super duper tools :) then I might consider it. However, maybe the better answer would be a report submission form in any of the webmaster services that they offer (i.e. Sitemaps, Analytics, etc.). Something tells me there are more webmasters out there who do not use Sitemaps than those who do. I would even say the webmasters who do not use Sitemaps are the dominant group.
|I see how this can be a good idea, but honestly why do I need to use sitemaps for that sole purpose? They would be showing preference to webmasters that use sitemaps then. |
As I've pointed out already on this thread, you don't need to register a sitemap in order to have and use a sitemap account. Take a look for yourself.
That is not the point. I understand, but then you are basically saying to thousands of people that they get no help unless they sign up for it. Does that seem fair?
|That is not the point. I understand, but then you are basically saying to thousands of people that they get no help unless they sign up for it. Does that seem fair? |
Are you serious?
Are you saying that it is unfair to require registration in order to open a support ticket? That's how every other company on the planet does things.
"Are you serious?
Are you saying that it is unfair to require registration in order to open a support ticket? That's how every other company on the planet does things."
I am serious. One thing I have noticed is that when people are required to submit a support ticket, they are usually paying for something first. Not every company, but most. I am not saying it is unfair to register, but I do not see why it has to be Sitemaps. Why can't they just have a help-center service that people can sign up for, separate from other services? No offense, but if so many people are having issues after using Sitemaps, then regardless of whether you have to register a sitemap to submit a report, you would still be submitting your domain(s) to a help center that works through the Sitemaps service. Why should I have to take a risk like that?
|Why should I have to take a risk like that? |
This is getting surreal. A risk like what? How are you going to report a problem without mentioning your website?
I first found WebmasterWorld during the infamous Florida update. I was cruising along, a happy little webmaster, and wham! Santa brought coal.
GoogleGuy's "walk around the lake" and "seo will be happy" posts in the aftermath kept me sane, and also kept me from making any radical changes. Sure enough, everything began to come back.
But I remember even then how many semi-flaming / whining / angry posts there were, including the ubiquitous "my mom/sister/brother is an average user, and she says she can't find anything on Google anymore, and has moved to 'X' search engine" posts.
For me, MC’s blog has replaced googleguy’s posts here as my first stop in search of google info. Although, the rise in MC’s blog and the fall of googleguy’s participation here is a rather, how shall we say, ‘interesting’ coincidence?
OK, I can see that you do not see where I am coming from, and that's OK :). The risk is submitting my URL(s) through a system that scores of people have said they are having issues with since its rollout. I do not see the need to submit my URL through something that multiple people have said is a possible reason their sites dropped.
Yes, I am aware that I do not need to register a sitemap to do things in Sitemaps. However, I do not necessarily want to submit my URL(s) through something that has been having the kinds of problems it does. Why take a chance and submit a URL that you think may or may not be having a problem into a system with multiple known issues? It's not worth it to me. However, I would have NO issue submitting URL(s) to another service independent of Sitemaps. There would still be a risk, but at least I would not have to worry about it running through a service many have stated is buggy.
Join google sitemaps? Join google sitemaps? Join google sitemaps!?
I did! And it did no good. Things got worse! Google not only can't index a 32-page site; even with the Sitemaps account, it does not index a specific 10 pages, yet still indexes old versions of them that have 404'd for a year now. And the two sites that I have sitemaps on (different accounts) are now all supplemental except for the index pages.
It's pathetic. I went to some trouble to create and upload sitemaps with the intimation from google employees that it would help my sites.
As we say in Texas...BULL! It's only worse!
G1smd... you may shake your head in disbelief, but I threw my trashcan across the room. My blood pressure is way up.
Before Matt tells people to join Google sitemaps, Google needs to make sure it works. Not just empty promises.
Sorry for the rant...
And that's why promoting Sitemaps as an answer to site issues has been one of my big beefs. It just adds to webmaster confusion, and a lot of the less experienced guys panic and wonder whether they should use it or, if they are already using it, whether they should discontinue use. It's downright cruel is what it is. When you know that your site has no spidering issues you could laugh, but you want to cry, because the only answer you are given to address problems is ridiculous. And by the tenth time you see Sitemaps offered as a solution, you have already passed anger and feel insulted.
While I certainly appreciate the work Matt has done to release this presentation, I still do not see any information that isn't already discussed here daily, or anything that can significantly help webmasters with issues at Google.
It is as if they are not recognizing any of the issues that clearly affect business listings on a day to day basis.
Most real, functional businesses understand the merging of user-end presentation and SEO, but are wondering why their website, which was apparently the right mix before the dreaded D dates, was all of a sudden deemed crap.
Google Sitemaps has been effective for some of my sites, and a nightmare on others, grabbing incorrect URLs.
No mention of the ever present canonical issues, although I am sure that question should have been presented.
No mention of a real way to report issues such as spam; only a mention that "there are really no tools available to webmasters for this", which might amount to "we rarely act on reports of spam".
Sounds a little like the elephant in the living room syndrome.
< continued here: [webmasterworld.com...]
with a post by Google's Vanessa Fox >
[edited by: tedster at 5:38 am (utc) on Aug. 2, 2006]