| This 120 message thread spans 4 pages |
|Dec 2006 Google Changes - Data Refresh or Penalty? (part 2)|
| 1:13 am on Dec 25, 2006 (gmt 0)|
< continued from [webmasterworld.com...] >
Perhaps, based on Matt's recent post [mattcutts.com] where he says:
|....expect those (roughly monthly) updates to become more of a daily thing. That data refresh became more frequent (roughly daily instead of every 3-4 weeks or so).... |
it is time to coin a new term, EVERFRESH. Google, it's full of Everfreshness. :)
Seriously though, let's take Matt at his word (if you find that difficult, then at least for a moment) when he says:
|I know for a fact that there haven’t been any major algorithm updates to our scoring in the last few days, and I believe the only data refreshes have been normal (index updates). |
Instead of saying Google is broken, why don't we ask ourselves what causes could produce SERP fluctuations, or the reports thereof?
- SERP results for queries that produce smaller result sets are more subject to large changes, because those queries enjoy less competition, documents are more likely to not be optimized, and changes to documents (or the addition or deletion of documents) may have a deeper impact on the SERPs.
- SERP results for deeply contested search queries are more likely to change because active optimization is widespread. That means there are lots of link changes, content changes, and content additions afoot, changes that will change the SERP results.
- People on these forums are actively optimizing documents, hence they are more likely to be a part of group 2.
- People who are unhappy with the SERP results, or who are affected negatively by SERP changes, are more likely to post about it on SEO forums than those who are perfectly happy with how the universe is treating them.
Now that we are in the EverFresh era with daily updates I expect that SERP changes have become more obvious for the exact opposite reason that movement in film becomes more pronounced when you slow down the rate at which frames advance. We have been watching the length of time between SERP changes shorten with each Google upgrade. We have also witnessed, over the years, communication between the data centers improve and accelerate. Now, these have combined and moved so close to real time (when compared to the Google Dance days) that data center watching has become virtually meaningless over the last six months. (The Google Dance has become a GoogleRave.)
I’m not saying that Google never sticks a hot iron poker into their works (and I definitely think they have made some silly choices like, ahem, their no-follow policy). It’s just that there are far more, a near infinite percentage more, changes happening to the content and population of the documents on the Internet. I expect to see lots of change to the SERP results.
So, assuming that Matt is as honest as he is nice, what do you see as non-Google causes that can be responsible for noticeable SERP fluctuations?
[edited by: tedster at 3:23 pm (utc) on Dec. 25, 2006]
| 12:09 am on Dec 29, 2006 (gmt 0)|
I was hit on Dec 20 like most here, and have seen all the same problems. I did a site: command, and the cache date for all pages listed is between Dec 14 and 19, with a couple at Dec 23. However, Google WM Tools shows a cache date for all pages of Dec 23 to Dec 26.
Googlebot is working overtime, but SERPs for my sector seem to be from a different planet at this point. Search terms with or without quotes give far less relevant results than before Dec 20. Also, the "short" URL issue seems to be a factor. Listings start with a short URL and then get longer as I work through the results. (I never paid any attention to it before, but it seems to be a factor now.)
As far as I can tell google has changed something (whether they know it or not), and it's not anything on our end of the WWW.
| 4:55 am on Dec 29, 2006 (gmt 0)|
My site was hit on December 7 with supplemental results and bad rankings since then.
What gives me a little hope is that the "Query stats" in Google's "Webmaster Tools" still show me good rankings as before.
I know that this tool is normally delayed, but not by more than 7 days!
So what's going on there?
| 9:29 am on Dec 29, 2006 (gmt 0)|
Google Recent Shift in “Algo Updates Policy”
An effort to understand Google contemporary updates
First off, the following is by no means established fact. It's rather an honest attempt to understand the recent shift in Google's Updates Policy.
December 2006 hasn't been the happiest month of the year for several fellow WebmasterWorld members. For obvious reasons we don't have an exact number of affected sites. However, there is no doubt by now that those Google "savage" Data Refreshes have affected the ranking, indexing, backlinks, traffic, and ultimately the business of several members (as expressed through many related threads).
Though Google assured us (through our kind fellow member Matt Cutts) that there hasn't been a "major" algo update or "major changes", there is no doubt whatsoever that the impact of those Data Refreshes resembles the impact of classical updates such as Allegra and Bourbon.
Because some of us don't wish to consider those Data Refreshes as major algo updates, many find it very hard to understand why their sites have been affected (sometimes very badly).
Reading the feedback of affected fellow members on the different threads, it seems that Google has once again hit very hard this December on duplicates, or what it regards as duplicates or Unworthy Content.
It seems that the contemporary algo updates function in stages, through successive Data Refreshes:
- Identify the Unworthy Contents
- Deprive the Unworthy Content of its rankings either by “converting” it into supplemental or just de-index it.
In that connection, I respectfully disagree with Matt Cutts that “supplemental results by themselves don’t indicate badness/penalties/problems”. IMO, Google treats supplementals as Unworthy Content, which is kept isolated in its own Unworthy Index.
- Depending on the volume of Unworthy Content, the entire site might lose part (or a big part) of its Worthiness and decline in rankings in general.
IMO, one of the symptoms of sites suffering a decline in Worthiness might be illustrated through the site: operator. While the site: operator displays what we call a “normal” listing (most important pages at the top of the list) for a “Worthy” site, the same site: operator might display supplementals at the top of the listing, or display no “logical” order at all, for affected sites (less Worthy or Unworthy).
I'm looking forward to hearing more details (case histories if possible), especially from kind fellow members who have been affected by what seems to be a “Duplicates Penalty” during the so-called “Data Refreshes” of December 2006.
| 11:43 am on Dec 29, 2006 (gmt 0)|
I see that "Unworthy Content" gets crawled heavily.
[edited by: SEOPTI at 11:44 am (utc) on Dec. 29, 2006]
| 1:01 pm on Dec 29, 2006 (gmt 0)|
|I see that "Unworthy Content" gets crawled heavily. |
Unfortunately, getting crawled heavily does not necessarily mean getting out of supplementals, nor does it guarantee a listing in the index that searchers can find for any of the keywords or phrases. So, for all practical purposes, the page isn't indexed.
I don't know of many searchers who will dig to #700+ in an attempt to find what they're looking for. Perhaps a few will, but not enough for a successful website.
| 1:03 pm on Dec 29, 2006 (gmt 0)|
No duplicate content here: just unique, long standing and very popular information. This has been replaced by a Wikipedia stub, which is so shallow that it is worthless - in fact duplicate as well as worthless.
They just seem to have so many filters and penalties that they are tripping over themselves and catching everything in sight, except of course for the big guys like wiki, who have a 'get out of jail' card, however useless their content on a topic.
One silver lining, though, is that this forced me to use live.com for the first time. And hey, it isn't at all bad. It showed me that mine isn't the only worthy site missing from Google.
| 1:14 pm on Dec 29, 2006 (gmt 0)|
Almost all of my fresh content is still indexed, and is non-supplemental. Those pages that are supplemental have been supplemental for quite some time, and this never affected my rankings. My rankings started tanking in early December, and have been obliterated since December 20th. I haven't been banned, just decimated in rankings, like never before.
The site:www.domain.com operator yields the same # of pages that it always did, but my Supplementals are now listed first.
In my specific case, there are a few pages that rank high for certain keyword terms, but most of my major terms are gone for my main pages. My pages contain unique content, unless they have been duped by others. Specifically, I have used COPYSCAPE against a number of my pages, that return NO results (hence, not duplicated). These pages won't even come up in the SERPs when I enter the page title or description verbatim in my searches (they do, however, when they are placed in quotes - and even then, show up on Page 2 mostly). Yet for several other pages, placing my title in search verbatim, puts me at the top of the heap. This inconsistency allows me to cling to the hope of a "bad data push", as it is not fully consistent with the negativity mentioned above.
One additional observation that I noted today was an error listed on my Sitemap diagnostic page. To reiterate from my prior posts, I have 9 websites that all fall under my single domain (which has been in existence for 10 years). Each of the 9 websites is contained in a subdirectory. I have been ranking well with each for many years. The format of my subdirectories is:
Of the 9 websites, I consider one to be MAJOR, since it receives the most traffic, due to its searchable content.
SITEMAP OBSERVATION: On December 15th, I received a Sitemap Diagnostic error stating that:
www.domain.com/subdir1 yielded an HTTP error (403 - Forbidden). Note that this isn't a "network unavailable", but a 403 on a specific directory (not a page). I do have (and always have had) directory listings disabled on my webserver, but I found it odd that my entire directory (for this heavy-hitting website) came up 403 by Googlebot. For fun, I enabled directory listings, just to see if this had anything to do with my ranking tankings.
Additional sitemap observations that seem a tad weird:
- I have 21 Not Found pages listed in the diagnostics of late, all pointing to subdirectory/page combinations that don't exist. To verify this, I spidered my site using a tool that I have, and none of these 21 alleged unfound pages turned up as being referenced anywhere on my site.
- I have 6 unreachable pages noted from Dec 14 to Dec 26th. My network (I host my own site) has not been down at all during that time, but I guess it is possible that my provider wasn't sending things through during these times. Of note, 2 of these 6 warnings mention that my robots.txt file is missing. I don't have a robots.txt file for any of my sites, but this is the first time I've ever seen this error logged.
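The verification step described above (spidering the site to confirm that the reported Not Found URLs aren't actually linked anywhere) can be sketched in a few lines. This is a minimal illustration: the pages, link graph, and reported URLs below are made-up assumptions, not the poster's actual site or tool.

```python
# Sketch: check which URLs flagged as "Not Found" by a sitemap
# diagnostic actually appear as link targets in crawled pages.
# All page content and URLs here are illustrative.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_reported_links(pages, reported_urls):
    """Return the subset of reported URLs that really appear as link
    targets somewhere in `pages` (a dict of url -> html source)."""
    found = set()
    for html in pages.values():
        parser = LinkExtractor()
        parser.feed(html)
        found.update(parser.links)
    return sorted(u for u in reported_urls if u in found)

# Two crawled pages, three URLs flagged by the diagnostic.
pages = {
    "/index.html": '<a href="/subdir1/page.html">ok</a>',
    "/subdir1/page.html": '<a href="/index.html">home</a>',
}
reported = ["/subdir1/ghost.html", "/old/gone.html", "/subdir1/page.html"]
print(find_reported_links(pages, reported))  # only the genuinely linked URL
```

If the flagged URLs never show up in the crawl, the 404 reports came from somewhere outside the site (stale index data, external links, or a glitch on the crawler's side), which is exactly the distinction the poster was trying to draw.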
QUESTION # 1: For those who have been unaffected by this Dec. 2006 phenomenon, do your Supplementals show up first?
QUESTION # 2: For those who have been affected by this December tornado, have you seen anything similar in your Sitemap diagnostics?
[edited by: doughayman at 1:52 pm (utc) on Dec. 29, 2006]
| 1:36 pm on Dec 29, 2006 (gmt 0)|
doughayman - This is interesting. We notice almost the identical order of the most popular pages across our site compared to before the crash, and yet some terms rank number 1 and others 950+.
It seems very strange that some search terms are unaffected while most have proportionally sunk, leaving the overall popularity of pages in the same order and almost the same keyword popularity order... go figure?! This seems like a data-push error rather than any deliberate change. In the past you could see a pattern... perhaps longtail searches lost, or specific keywords lost. There is only one pattern for us, and that is that one major page has lost all its rankings; the rest have all just proportionally dropped.
| 1:47 pm on Dec 29, 2006 (gmt 0)|
Yes, I agree. The inconsistency suggests an error of some sort. If things were consistent across the board, I could accept an algo change, but they are not.
MHES (and all), please re-read my post above (which I just edited) to read about my quirky Sitemap error findings.
| 1:55 pm on Dec 29, 2006 (gmt 0)|
|The site:www.domain.com operator yields the same # of pages that it always did, but my Supplementals are now listed first. |
Thanks for feedback. That looks like the symptom I talked about in my previous post:
|IMO, one of the symptoms of sites suffering a decline in Worthiness might be illustrated through the site: operator. While the site: operator displays what we call a “normal” listing (most important pages at the top of the list) for a “Worthy” site, the same site: operator might display supplementals at the top of the listing, or display no “logical” order at all, for affected sites (less Worthy or Unworthy). |
[edited by: reseller at 2:00 pm (utc) on Dec. 29, 2006]
| 1:56 pm on Dec 29, 2006 (gmt 0)|
Mine is almost word for word the same as doughayman's. Have 9 sites up and running, one being my bread and butter, i.e. the one that was hit hard. A handful of supplementals come up first, then the index page, followed by the remaining internal pages. PageRank and backlinks still the same. Hit on 19/20 Dec. Come up #1 for only a couple or three keyword phrases. We've been in the top 4 for several main keyword phrases for over 2 years. Don't use Google Sitemaps on my sites. Main site has been around since 1999. Traffic now down to 1/3 of what it was. One note: over the last couple of days, I have been coming up for a couple of additional keyword phrases in Google UK and NZ.
| 1:58 pm on Dec 29, 2006 (gmt 0)|
Thanks reseller. Can I assume then (assuming that you have been unaffected) that your "site:www.domain.com" command lists your supplementals last, as opposed to first?
| 2:13 pm on Dec 29, 2006 (gmt 0)|
doughayman, i see the same symptoms as you with regards to site:, supplementals, rankings, and webmaster center.
| 2:16 pm on Dec 29, 2006 (gmt 0)|
Thanks for your feedback. Specifically, are you seeing a 403 (forbidden) logged in your Sitemap Diagnostics, being applied to a root directory as opposed to a page?
| 2:35 pm on Dec 29, 2006 (gmt 0)|
Sorry, no, I'm not seeing that. I do have some Unreachable pages (not root) from Dec. 15, and one 404 page (my most important inner page which also has the highest PR) but that 404 page includes a colon at the end. So, mypage.html: is showing as 404 and of course there is no such page, and no such link exists with a colon at the end as far as I can tell.
| 2:36 pm on Dec 29, 2006 (gmt 0)|
As an aside, I was wondering why this problem is not affecting more sites; we keep hearing comments like "...business as usual" or "what problem? I'm fine", etc.
Well, there could be a good reason. Google has suffered from bad publicity in the past with big updates. Often it does not matter whether the update is good or bad; there have been enough complaints to generate ill feeling, which has been picked up by the press. By doing these changes in small doses, there is never enough impact to affect their share price, especially if it goes wrong. Members here report the same observations going back months; if I were Google, I would do the same and roll out changes in small pockets below the press radar.
Anybody else seeing the same order for busiest pages before and after the drop? This could be a clue to where we should be looking.
[edited by: MHes at 2:40 pm (utc) on Dec. 29, 2006]
| 2:37 pm on Dec 29, 2006 (gmt 0)|
OK, here's another observation of mine (using comments from others that have been made on this subject):
- For my major keyword terms, the top results seem to be mostly root page domains, as opposed to specific pages within the domain.
- For one of my sites' subject domains, which was previously infested with spam, most of the spam seems to be gone. The specific "spammer team" responsible for this spam (it was obvious that it was all being generated by the same guy) had planted black-hat deposits on forum sites that had obvious holes for allowing such garbage. The URLs for these pages typically were many subdirectories deep.
Is it possible, and I'm just throwing this one on the table, that this Google update may have been a major spam filter that somehow went bad?
Are those that are being affected missing results from the SERPs that were pages buried several directory levels deep under their root domain? I fit this profile. Just a weak theory that I'm throwing out on the table.
| 2:39 pm on Dec 29, 2006 (gmt 0)|
I'm waiting for Google.com to show supplemental pages first with the site:google.com command. I really hope they will dump their own domain one day.
| 2:50 pm on Dec 29, 2006 (gmt 0)|
doughayman - All our pages are at the same level. BUT... the pages linked to from the home page have suffered a 70% drop, the next set (2 links from home) have dropped 80%, and so on as you get further from the home page.
Observation = the further from the home page, the bigger the drop in traffic compared to last month.
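MHes's observation can be checked mechanically: record each page's click depth (the minimum number of clicks from the home page) with a breadth-first search over the internal link graph, then compare traffic drops per depth. A minimal sketch, where the link graph is a hand-built illustrative assumption rather than any real site:

```python
# Sketch: compute click depth from the home page via BFS.
# The link graph below is made up for illustration.
from collections import deque

def click_depths(link_graph, home):
    """Return {url: depth}, where depth is the minimum number of
    clicks from `home`, following internal links only."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Home links to two section pages; one section leads deeper.
graph = {
    "/": ["/section-a.html", "/section-b.html"],
    "/section-a.html": ["/a/deep1.html"],
    "/a/deep1.html": ["/a/deeper.html"],
}
print(click_depths(graph, "/"))
```

Joining the resulting depths against before/after traffic figures from your logs would show whether the drop really scales with distance from the home page, as reported above.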
When did they stop saying "your search contains the words 'in' and 'from'...." at the top of the serps? Is this new?
[edited by: MHes at 2:54 pm (utc) on Dec. 29, 2006]
| 2:55 pm on Dec 29, 2006 (gmt 0)|
Sorry for the flurry of posts today....I think the caffeine is working overtime:
I just answered one of my own questions above. For a subject domain that I normally ranked high for (and now don't), the sites that come up first on the list also have their supplemental pages come up first (if they have supplementals), when I do a:
Hence, having your supplementals come up first does not appear to be a symptom of being affected by the December obliteration.
| 3:02 pm on Dec 29, 2006 (gmt 0)|
Further to my post above, and to yours, the terms that I am still ranking high for are very "short" page names like:
Keep in mind that this may be a ludicrous assertion. Also, note that all my pages (even my root website pages) start at Level 2.
Could there be some sort of punishment based on level? That is, pages that are further down in the tree are being deemed as less important?
This would be a ludicrous imposition by Google, unless it is an attempt to rid their indices of spam, and this filter went awry.
NOT SURE -> "When did they stop saying "your search contains the words 'in' and 'from'...." at the top of the serps? Is this new?"
| 3:58 pm on Dec 29, 2006 (gmt 0)|
|Thanks reseller. Can I assume then (assuming that you have been unaffected) that your "site:www.domain.com" command lists your supplementals last, as opposed to first? |
Of course we can't issue a general judgement. However, I would assume that a site with entirely "Worthy Content" would, in most cases, show a "normal" listing with no supplementals for a site: search.
In the case of my kind WebmasterWorld fellow member Graywolf's website [mattcutts.com], the site: search shows supplementals at the end of the listings. Graywolf has posted on this thread [webmasterworld.com] a comment about that:
|I'll say that the order the pages are listed for the [site:] command is still a bit screwy. True the supplemental results aren't first, but the pages listed in the top 10 aren't pages that get the most traffic or have the most inbound links. If you are super curious the 'top posts' section gives you a much better idea of what's linked to and getting the most traffic. I do plan to do a little testing on the "short url" factor. |
[edited by: reseller at 4:04 pm (utc) on Dec. 29, 2006]
| 4:02 pm on Dec 29, 2006 (gmt 0)|
>>Hence, having your supplementals come up first does not appear to be a symptom of being affected by the December obliteration.
I agree. I don't think Google spends as much time worrying about the site: command as webmasters do. When I look at this command I do see one supplemental on page 1. But that particular page is nearly an "orphaned" page. In this case it's a spreadsheet that can be downloaded and it's used as a supplementary tool for an article. So I'm not surprised it's supplemental because it only has one incoming link that I know of and it serves a very specific purpose.
Now if my home page came up supplemental, then that would worry me. Other than that, I'm taking Matt Cutts at his word and not worrying too much about supplementals.
| 4:08 pm on Dec 29, 2006 (gmt 0)|
|Now if my home page came up supplemental, then that would worry me. |
But I assume that it would worry you if other "important" pages of your site, which you wish to rank, were turned supplemental, right?
| 4:30 pm on Dec 29, 2006 (gmt 0)|
Not sure if I was clear on this or not, but my pages that are supplemental are either orphaned pages, or pages that are 6-8 years old and haven't been updated in ages. In addition to my "business" websites under my root domain, I also have some personal websites and some old sites that have gone supplemental. I have made no attempt (nor do I desire) to get these pages out of supplemental purgatory.
ALL of my key pages are non-supplemental, and that hasn't changed through this December debacle. Other than the supplemental pages listing first on my "site:" command, I don't see any relevance associated to this right now, especially since I've seen some very high-ranking sites (currently) list their supplementals first.
| 4:46 pm on Dec 29, 2006 (gmt 0)|
>Hence, having your supplementals come up first does not appear to be a symptom of being affected by the December obliteration.
Hmmmm... not convinced. None of my pages have suddenly become supplemental, but they now rank below the supplementals in a site: search. However, one of my pages could have been one of the sites you saw rank number 1... and I have been affected.
| 5:35 pm on Dec 29, 2006 (gmt 0)|
doughayman has nailed it:
Is it possible, and I'm just throwing this one on the table, that this Google update may have been a major spam filter, that somehow went bad?
I think it's a long tail problem that had its first phase in the October massacre:
The problem is that the long tail is thinly populated, and so when attempting to automate the removal of spam (which mostly resides in the long tail) it's easy to do a lot of collateral damage. We are the victims of that collateral damage.
| 5:50 pm on Dec 29, 2006 (gmt 0)|
anax - We are still getting the longtail searches. This is what is different from last December's fallout. In fact, there is no pattern to the types of searches we are missing... they vary from competitive to noncompetitive, long or short. A quick glance at the recent visitor queries would make you think nothing is wrong... except before we had 5 times the traffic for this time of year.
However, there may be two, three or four different issues all running in this thread....sigh!
| 5:54 pm on Dec 29, 2006 (gmt 0)|
>>But I assume that it would worry you if other "important" pages of your site, which you wish to rank, were turned supplemental, right?
Not necessarily. Going back to my example. I've got this supplemental on page 1 of my site: command results - and that's a spreadsheet. If that was my home page I'd be worried because my home page is what my website is all about...
But if I use the "spreadsheet site: www.mydomain.tld" command, I see that supplemental page, and it's no longer listed as supplemental. My conclusion is that this particular page is not a good result for the simple site: command, but it's a good result for the term "spreadsheet." In fact, I've got other pages that show as supplemental that rank fairly well for the term they're targeting.
| 6:19 pm on Dec 29, 2006 (gmt 0)|
BillyS, are the cache dates different on those two listings, the one tagged as Supplemental and the one that's not tagged?
< This discussion continues here: [webmasterworld.com...] >