Forum Moderators: Robert Charlton & goodroi
We need to keep this thread focused on the following:
- Changes in your own site's ranking on the serps (lost & gained positions, or disappearance of the site).
- Changes you have noticed in the new serps (both google.com and your local google site), especially in regard to the nature of the top 10 or 20 ranking sites.
- Stability of the serps, i.e. do you get the same serps when you run the same query within the same day or 2-3 successive days (both google.com and your local google site)?
- Effective ethical measures to deal with the above-mentioned changes.
Thanks.
You have sites that have been subjected to reasonable shifts in rank. That's one branch of the discussion, where algo changes may be speculated upon. Webmasters may adjust their sites accordingly, and improve their position.
You have sites that have gone from the heights of Olympus to the depths of Hades in five seconds flat, no longer coming up for their own company name. Perhaps these sites should be excluded from this particular discussion. They reveal nothing about the nature of the new algo, only that any old update will stir new ranking hijacks to the surface. No amount of page fixing will help these webmasters.
Not that there is a damn thing we can do about most of these theories.
I for one am working with Adwords support to find out why sites such as about.com and ostg.pricegrabber are having their adwords ads/links indexed as pages (of our sites) in the serps.
So who said this update is almost over? Last I heard it was about 2 more weeks.
Our site used to be #8 about 15 months ago, then #5, then #3, then #4 and now it must be #3 again (I am not sure, as the SERPs have not stabilized yet, so sometimes we are still #4).
I keep reading about all the effects of Google updates but - thankfully! - nothing has affected our rankings yet.
Please note that we did a backlink campaign about 18 months ago (if I remember correctly!) and we have done nothing since then...
All we kept doing was adding quality content, and content, and content. We intend to start a new link campaign very soon, in order to target the #1 position; the right time has come indeed :)
BTW, in case any newcomers to this thread are confused about what's going on, see "Bourbon Update Part 4" here; this is the LAST page of it.
[webmasterworld.com...]
[edited by: Clint at 12:33 pm (utc) on June 11, 2005]
Clint,
EUREKA.. I found the answer to your woes.*There should be a binary push this week to improve a corner-case of CJK-related search, and that new binary should have the hooks to turn on the third set of data.*
Get it? If not, then you are not a webmaster worthy of ranking in google. Get another job.
While you are at it, get me one too, because looking back on a few of the posts I made, my sites are going to go into oblivion with yours. Into the infested nether regions of google's cordon sanitaire of unwanted websites.
Hell, I'd rather rank last in alltheweb than rank first in google knowing that I am only ranking in the absence of good sites. I am not gloating that I am a winner; I feel sorry for webmasters who tanked for no apparent reason.
Google cannot expect every webmaster to be an expert and play by its ever-changing secret rules.
That so called "binary push" was apparently BS. Didn't do cr@p for me.
I know what you mean, but when you've already been to college for your trade, 45 with some health disabilities, with a family + retired parents, have your own business.......(HAD your own business I should say), that is much much easier said than done. Perhaps I should practice filling out bankruptcy papers and start practicing saying: "Would you like fries with your order?" I'd have to get SEVERAL of those jobs to make a @!#$%!@# living. :( :( :( I couldn't do them anyway due to my medical problems. Thanks again Google. (Praying for a miracle because that's what it seems like it's going to take).
I'm not sure this update is over yet, so it may be premature to be dealing with an update that is not over. Why analyze something that's not finished?
That being said, I am seeing some very strange activity from Googlebot right now on my site. I used to have links to .pdf files for each page of the site, and you could also email the page to a friend. With these links the number of pages of the site was essentially tripled. I took off the links to these pages months ago.
No bot (including Googlebot) has asked for those pages in about four months now. Between yesterday and this morning this is the only thing Googlebot is requesting.
That being said, the fat lady may be in the room, but she's not singing yet.
I'm seeing something strange though. Pages that I HAVE CHANGED, the bot is returning an "HTTP code: 304", and doesn't that mean "page not changed"? I'm also seeing a lot of HTTP code 301, which is evidently from it accessing my non-www pages and getting sent to the www version of the pages. So, is that good or bad?
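For anyone puzzled by the 301s: that pattern is what you would expect if the server canonicalizes non-www requests onto the www hostname, so the bot (and the index) converges on a single version of each page. A minimal sketch of that rule, with hypothetical hostnames:

```python
def canonicalize(host, path):
    """301 non-www requests over to the www hostname.

    Hostnames here are hypothetical examples; the point is that every
    non-www URL answers with a permanent redirect, so only one version
    of each page is left for the bot to index.
    """
    if host == "example.com":
        return 301, "http://www.example.com" + path
    return 200, None  # already canonical; serve the page normally
```

Seeing those 301s in your logs is generally a good sign: it means the duplicate non-www URLs are being consolidated rather than indexed twice.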
I am going through my morning checks and I am seeing an update to the Google directory. Sorry if this is a repeat. Google.com for me is 64.233.161.99 right now.
I KNOW this is an update because about two months ago DMOZ removed several bad players from my category. Google is now reflecting those changes. Although it seems like the directory update might not be complete.
Clint - 304 = not modified, you're right.
I am going through my morning checks and I am seeing an update to the Google directory. Sorry if this is a repeat. Google.com for me is 64.233.161.99 right now. I KNOW this is an update because about two months ago DMOZ removed several bad players from my category. Google is now reflecting those changes. Although it seems like the directory update might not be complete.
Ok, so then the GoogleBot is wrong, it is indeed backwards! What's going on with that?
I can only say what happened in a couple areas I watch (10-70 million results) and they are:
Lots of cloaking. Some of these have been gone from Google for years and they are back. Some have suddenly appeared out of nowhere.
More mirror sites are showing up also – not like Yahoo, but Google had always been pretty good at finding complete duplicate sites and not showing them.
Many large sites are gone, replaced by sites that offer a single page or just a few pages on the topic. I'm also seeing more pages show up that are using free hosting services.
I am seeing more error/blank pages like "This page doesn't exist" , "page moved", "account does not exist" etc. Some of these "page moved/does not exist" have been in the index for literally years on a few .gov sites – they have gained positioning in this update.
Very "corporate" sites. What I mean by that are the big money spenders with huge affiliate programs and multiple sites that end up taking the user to the same place in the end. Often these cloak for Googlebot - not even good cloaking; change your UA to Googlebot/2.1 and you see all the keyword-stuffed pages that the user cannot see without filling out multiple forms. 99% of these spiderfood/info pages will never be seen by users/visitors.
So, what do I see? I see the age "signal" being cranked way up – way too far in most cases. Just because a page has been indexed since the late 90's doesn't mean it's good or relevant. I think this is the single most damaging "signal" that has been cranked up and could be the single largest cause of some very good sites dropping from the serps. It is the common denominator I can really look at and see clearly for the reason behind abandoned pages/sites showing up and the sites/pages that have risen.
I also see that "bad" cloaking still works wonders, but the sites that I see doing it are also "old" sites so I am not going to jump to the conclusion that the cloaking is working – just that it's not hurting them.
The same goes for mirror sites that have reappeared after 3-4 years. I don't think creating mirrors is a smart thing, but the ones showing up have the same common denominator – they are old (7-8 years).
Google has thrown out a lot of "babies" with the bathwater - which were actually some of the most useful sites (for visitors anyways).
My own sites? I only have a couple. The one that has been hit has always been my pet project and has been around for about 5 years. It has "never" been affected negatively by an update (though it was once lost by Google for a month, a couple years ago). This is the first update that has affected that site in a negative way.
Will I change anything? Nope, built it for visitors and I will continue to do so. The only sites that moved up which pushed it back are sites "older" than it. I cannot do anything about that, so I will continue doing what I have for the past 5 years.
I'm not an expert in this area, but according to the W3C:
If the client has performed a conditional GET request and access is allowed, but the document has not been modified, the server SHOULD respond with this status code. The 304 response MUST NOT contain a message-body, and thus is always terminated by the first empty line after the header fields.
Googlebot would be the client (Getting), your server is returning a 304 - not Googlebot.
If you built your site "for visitors", then you won't be getting any from G, which is the same boat we find ourselves in.
Can you please explain what you mean by this:
"Often these cloak for Googlebot - not even good cloaking; change your UA to Googlebot/2.1 and you see all the keyword-stuffed pages that the user cannot see without filling out multiple forms. 99% of these spiderfood/info pages will never be seen by users/visitors."
Thanks.
Contractor, my site has….
Take it for what you will, but never judge anything by using the criteria of my site. The problem is everyone that looks at their own sites lets their emotions get in the way of what is really going on. I don't know your site, but there could be numerous things affecting one site's positioning. To judge the serps by looking at your own positioning will get you nowhere. To judge the serps requires that you know the industry inside/out and the top 100 sites which have been around for years and/or the newcomers (which hardly happens nowadays) that have gained positioning.
Concentrate your judgment on the other 99 sites in the top 100 and then you can see what's working/not, what "signals" are being adjusted etc.
I have spent 1000's of hours on my site so if I judge serps by my site I cannot do so clearly. By looking at what makes the others rise and fall you can more accurately judge what is being affected in "your" niche.
Can you please explain what you mean by this:
"Often these cloak for Googlebot - not even good cloaking; change your UA to Googlebot/2.1 and you see all the keyword-stuffed pages that the user cannot see without filling out multiple forms. 99% of these spiderfood/info pages will never be seen by users/visitors."
Although I do not cloak anything myself, there are many "good" reasons for cloaking to make your content more accessible and many sites do this. There are also good ways to cloak – using only the user agent string to feed 99% of your content that is not accessible to humans would be considered stupid IMHO.
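For anyone wondering what the "stupid" variety actually looks like, it is usually nothing more than a substring match on the User-Agent header; a hypothetical sketch (the filenames are invented):

```python
def pick_page(user_agent):
    """Crude UA-string cloaking: spiders get one page, humans another.

    Because the test is only a substring match on the User-Agent header,
    anyone who sets their browser's UA to "Googlebot/2.1" sees the
    keyword-stuffed spider food too -- which is exactly how it gets spotted.
    """
    ua = (user_agent or "").lower()
    if "googlebot" in ua:
        return "spider_food.html"  # served only to (apparent) bots
    return "normal_page.html"      # what real visitors get
```

This is why the simple "change your UA" test described above works: the check never looks at the requester's IP, only at a header any visitor can set themselves.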
If you built your site "for visitors", then you won't be getting any from G, which is the same boat we find ourselves in.
I have never found that getting an attitude gets you anywhere - think clearly. If you built your site for visitors - keep building. It's when times get tough that you see what a site is really worth to its owner. Many sites are abandoned and never get updated again when they are hit by an algo change - what signal does that send the SEs?
Will I chase my tail on an established site to try to keep up with Google's algo changes - never; it's a losing battle and a game you won't catch me playing.
I keep reading about all the effects of Google updates but - thankfully! - nothing has affected our rankings yet.
I wonder if it has something to do with size and the strength of the sites ranking.
Both of my sites were built the same way (handcoding) and both were vulnerable to hijacking in terms of how I did my linking.
Somehow the Bourbon update plunged the smaller site while the larger one is still #1 for several related 2 word searches and even #7 for a general one word description. It's been there for a couple of years at least.
One possible factor is that the site that was spared is about 9 years old while the smaller one is about 5 years old.
I'd be interested in theories on what saved the larger site while the smaller one dropped significantly.
I'm going to wait a bit to see if the damaged site comes back. If it doesn't, I'll close it down and put some of the best articles on the other site, mainly those written by experts in the field.
I'm also concerned about protecting the larger site that survived Bourbon. This update has made me realize how vulnerable my sites are.
Sites got slammed that had no business getting slammed (i.e. many are now saying that according to G, there are NO MANUAL PENALTIES), which means it's an algo screwup; and ~90% of the sites in the G SERPs now for my search phrases (and most others' search phrases) have no business being there. I can't repeat all this over again (I've been doing it for weeks now), so just read that URL in my post, [webmasterworld.com...] 1st post on the page. That will sum up most of what is happening to most of us. I'd browse over the last several pages to get a feel for what's happening.
There are STILL no commonalities among the sites removed, nor among the sites not removed.... except for NON-relevant content, I should say, for the sites not removed.
I wonder if it has something to do with size and the strength of the sites ranking.
Nope. ;) I see sites with a PR of ZERO all over the first page of results for search phrases I'm monitoring. I also see one page sites 1st, and huge sites 1st.
If I may make a helpful suggestion to all newcomers on this thread, or those that haven't been reading "Google update Bourbon Part 4": please read several of its pages and most of your questions can be answered there. This starts at page 40, as good a place as any: [webmasterworld.com...] A lot of "conjecture" or "speculation" can also be answered there.
Reseller, can you put that last paragraph of mine above somewhere in your post, or maybe a "Continued from [webmasterworld.com...]"? I think that would be a logical thing to do.
What I want to know is why, all of a sudden, did GoogleGuy go MIA? Must have gotten in trouble.
NOTHING he said here about .05 whatever rolling out to datacenters BLAH became remotely evident to me... unless that evidence is SERPs that are even further in the toilet. What a joke.
Someone field this question please: Why is it that a company that does it first (in G's case, first in simplistic search and relevant results) ALWAYS thinks they are entitled to stay first? I think they forget that this is a FREE enterprise society. You know, what the USA was founded on.
If I were Google I would be pretty concerned over it. Unless they are all delusional.. HA, I think I just answered my own question :)
There's my rant.
One possible factor is that the site that was spared is about 9 years old while the smaller one is about 5 years old.
How old are the sites dominating that niche?
I have found that to be the factor in what I am seeing, as compared to "size". Large sites, in my opinion, have always fared better with Google in the past - not this time. I'm seeing several single-page sites, or single pages on unrelated sites/domains, that are 7-9 years old doing very well - even though they haven't been updated since then. Like I say, I have seen pages like "page does not exist" rise a full 20 or so positions, and those pages are several years old - why else would a blank page rise in positioning (I know it doesn't have any more backlinks)?
Reseller, can you put that last paragraph of mine above somewhere in your post, or maybe a "Continued from http:// www.webmasterworld.com/forum30/29782-82-10.htm"? I think that would be a logical thing to do.
I don't think this was meant to be an update thread in the sense of my site this or my site that (I hope not). It's more of the aftermath. Talking about one's own site and/or complaining about Google and their business model will do nothing to change the facts - so let's all stop complaining and evaluate it.....
What I want to know is why, all of a sudden, did GoogleGuy go MIA? Must have gotten in trouble.
GoogleGuy doesn't spend a whole lot of time here, as a rule--he mainly shows up for a few days during updates.
Have you considered the possibility that he might be working on solving problems with the search results instead of merely talking about them here?
I don't think this was meant to be an update thread in the sense of my site this or my site that (I hope not). It's more of the aftermath. Talking about one's own site and/or complaining about Google and their business model will do nothing to change the facts - so let's all stop complaining and evaluate it.....
The update is not complete (I've seen nowhere that it's completed), so it would be my guess that this thread started by Reseller is going to end up as a continuation of Part 4. If not, then I guess I need to start another topic. ;) The topics Reseller outlined in his (or her) post are indeed what was being discussed in Part 4:
- Changes in your own site's ranking on the serps (lost & gained positions, or disappearance of the site).
- Changes you have noticed in the new serps (both google.com and your local google site), especially in regard to the nature of the top 10 or 20 ranking sites.
- Stability of the serps, i.e. do you get the same serps when you run the same query within the same day or 2-3 successive days (both google.com and your local google site)?
- Effective ethical measures to deal with the above-mentioned changes.
"Complaining" IS part of any "evaluation" process here. Any "evaluation" process cannot be complete without those affected comparing notes, and comparing what has happened to each of us, in order to find the ever-elusive solutions to said event, which neither GG nor Google will discuss. It apparently is all up to us (please say it ain't so! :( ). "Complaining" IS going to be part of that when someone with a trashed site starts describing their situation. :)
[edited by: Clint at 2:53 pm (utc) on June 11, 2005]
>Reseller, can you put that last paragraph of mine above somewhere in your post, or maybe a "Continued from [webmasterworld.com...]"? I think that would be a logical thing to do.<
Honestly.. it wasn't my intention when I started this thread to start a "Bourbon Update 5", but to discuss how to deal with the consequences of Bourbon.
[edited by: reseller at 2:53 pm (utc) on June 11, 2005]
>>>I'm not sure this update is over yet, so it may be premature to be dealing with an update that is not over. Why analyze something that's not finished?
I just don't know where we are regarding the 3.5 timescale - is that done?
One thing I noticed for a site that had gone URL-only on both non-www and www - the site is now showing a description again on DC [72.14.207.104...] with a cache date of 1st March.
I hope this is a move in the right direction for that site.