| 7:16 pm on Jun 11, 2005 (gmt 0)|
Well - looks like a new sentence about the algo change.
econman - based on your definition I think that he had nothing to lose by saying those dc had the future.
He he - GG posts get analysed like a politician's :)
Mind you a lot of us have a lot of skin in the game.
(I have no idea if that makes sense)
[edited by: Dayo_UK at 7:29 pm (utc) on June 11, 2005]
| 7:29 pm on Jun 11, 2005 (gmt 0)|
There should be Googleguy UK on the UK & Ireland forum so us "limeys" don't get confused!
| 7:39 pm on Jun 11, 2005 (gmt 0)|
>I avoided most of the damage done to my one site which was slaughtered in this "update" by moving the content to another domain.<
I see this as one of the effective ways to deal with the consequences of Bourbon and any other update.
This means we should at any given time have several "emergency domains" indexed in Google with a little content (a few pages), ready to accommodate our main content (from our current domain(s)) if we get hit by an update.
Another effective way might be to split our content across several domains instead of keeping it all on one.
| 7:55 pm on Jun 11, 2005 (gmt 0)|
doesn't seem to be over here; plenty of DC movements. Don't think Google is happy yet (thank god I might add)
| 7:58 pm on Jun 11, 2005 (gmt 0)|
I think that GG gave us some pretty big hints. He said that the reinclusion request should go out sometime this weekend, possibly on Monday (June 6th). Some people thought they saw improvements on Tuesday (only 1 day off).
He also mentioned a binary push (update to programs) sometime this week with the hooks for the third set of data.
I think a lot of people noticed two sets of data changes, no one's reporting a third set of data. GG thought it might go out this week. My guess is that it didn't. He also mentioned he would give us an email address when it's all over. He's got a good track record - why would he fail us now?
When GoogleGuy is back with an email address - that is the Fat Lady Singing.
| 8:08 pm on Jun 11, 2005 (gmt 0)|
The problem with moving to a new domain is that you'd have to write to possibly hundreds of sites who have linked to you and ask them to change their link. Sounds overwhelming right now. And some will never get around to changing it.
If in a couple of months there hasn't been any progress maybe that will be the only solution.
| 8:17 pm on Jun 11, 2005 (gmt 0)|
I'm seeing a lot of churn in the results in my areas in the past hour. In fact I can tell there will be more, and it could last till about Tuesday. The caveat, though, is that Bourbon didn't hit my areas heavily until Monday. I may just be seeing the tail end of what people saw a week ago.
| 8:19 pm on Jun 11, 2005 (gmt 0)|
I've got so much skin in the game my voice is starting to sound like a 10 year old's
| 8:20 pm on Jun 11, 2005 (gmt 0)|
>If in a couple of months there hasn't been any progress maybe that will be the only solution.<
The emergency domains should be created in good time so they can start their aging process. This will give you another option when and if needed. We are talking about a small investment here, maybe $10 for each emergency domain.
| 8:50 pm on Jun 11, 2005 (gmt 0)|
The Bourbon process has a lot of time to run yet, so the lashing around in the dark isn't going to do much good. Then, it seems we have even greater changes to expect in the next couple months ("this summer"). Hopefully Google will somehow remember to remove the cloaks and redirects and massive blog spammers. They did this far better six months ago (even a month ago).
"There are STILL no commonalities to the sites removed"
Every single one mentioned here (that I've seen) has canonical issues. Not a single problem site has been mentioned that does all the "canonical protection"(tm) tactics Google Guy outlined. The folks believing this is a coincidence ought to give that up by now, and figure out that "how to deal with them" means making your sites as friendly to Googlebot as possible.
<of course, sites that don't protect themselves might not be affected; or may have been affected by other things rather than duplicates, 302s, and other canonical stuff; and people who protect themselves from the canonical stuff could drop because they lose good links or a million other reasons>
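GoogleGuy's actual checklist isn't reproduced in this thread, but the core of the canonical idea is that every page should resolve to exactly one URL. A minimal Python sketch of the kind of normalization involved (the specific rules here - forcing a www prefix and stripping default ports - are illustrative assumptions, not anything Google has published):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Map the common URL variants of one page onto a single form.

    Illustrative rules only: lowercase the host, force a www prefix,
    drop default ports, and treat an empty path as the root document.
    """
    scheme, netloc, path, query, _fragment = urlsplit(url)
    host = netloc.lower()
    if ":" in host:
        hostname, port = host.rsplit(":", 1)
        if (scheme, port) in (("http", "80"), ("https", "443")):
            host = hostname
    if not host.startswith("www."):
        host = "www." + host
    if not path:
        path = "/"
    return urlunsplit((scheme, host, path, query, ""))

# All of these variants collapse to the same canonical URL:
variants = [
    "http://example.com",
    "http://EXAMPLE.com:80/",
    "http://www.example.com/",
]
assert len({canonicalize(v) for v in variants}) == 1
```

On the server side, the same collapsing is usually done with a site-wide 301 redirect from the non-preferred host to the preferred one, so Googlebot only ever sees one version of each page.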
| 8:54 pm on Jun 11, 2005 (gmt 0)|
|Another effective way might be by splitting our contents on several domain addresses instead of keeping it on one domain. |
After losing huge G traffic we think this multi-domain structure may be getting us incorrectly filtered for "link farming" and we are about to revert to a single domain.
| 9:02 pm on Jun 11, 2005 (gmt 0)|
>After losing huge G traffic we think this multi-domain structure may be getting us incorrectly filtered for "link farming" and we are about to revert to a single domain.<
I guess that your domains have been crosslinked?
Is it possible to split your content across several non-crosslinked domains? For ex.
| 9:11 pm on Jun 11, 2005 (gmt 0)|
I thought the top players in my niche had survived Bourbon.
But No... the SERPs have been turned upside down today with nearly irrelevant stuff at the top.
A search for Microsoft's well-known desktop database plus a consultancy company returns a #1 result that does not have the words "microsoft access" together on the page, and mentions "access" only once, but not in reference to a database.
Similar searches produce similarly bizarre results.
| 9:18 pm on Jun 11, 2005 (gmt 0)|
"Google has totally given it away to the multi thousand page sites with usually only one page devoted to the keyword."
This is why I say the PageRank system has been a total failure on Google's part. Case in point (and these are not actual search results):
A 70,000 page site on, say, the Beatles, with 2000 backlinks, lots of content, and considered an authority site by Google, should obviously rank high for the Beatles. What it shouldn't do is rank #1 for, say, Stephen King. However it does! Why? It has one affiliate page on Stephen King with absolutely no original content and about 700 backlinks with anchor text pointing to it. I haven't checked every link, but I'm pretty sure they weren't all obtained naturally. Even a bot should be able to figure that out.
But still, Google just lets the Authority sites take over the serps no matter what the phrase. I see this everywhere. In Google's eyes there's only need for about three hundred websites. Everybody else is just taking up space.
| 9:36 pm on Jun 11, 2005 (gmt 0)|
Ehum. I thought none of my sites had been touched by Bourbon, but today one of them has sunk from #9 to #55 and #10 to #270 for the two most important phrases.
I didn't change anything at all during this update, and Google-bot has been re-spidering the site like crazy these past two days. In fact the G-bot is responsible for 95% of the page requests...
The site is CSS coded with pages around 10K. It has one medium-grayish SEO thing on it: a bit of text offset about a mile to the left of the screen, substituted for a logo and some navigational images. It's kind of heavy on keyword density, but always in context with the text.
If nothing changes in a week, I'll go through the code, removing this 'invisible' text.
However, the site is only 3 months old, and has done very well in the SERPs for being so new. Maybe too well, and now it has been pushed back.
| 9:56 pm on Jun 11, 2005 (gmt 0)|
|Every single one mentioned here (that I've seen) has canonical issues. Not a single problem site has been mentioned that does all the "canonical protection"(tm) tactics Google Guy outlined. |
I agree that I have a very vulnerable site in terms of how I did my links. It's very likely that is what caused the plunge of my smaller site. My problem is that GoogleGuy is assuming we are all technical experts in carrying out the steps he mentioned. I've only just figured out that there even is such a thing as "canonical protection". Many people with websites won't have any idea what happened. If Google wants informational sites in their SERPs, they need to consider that the people who research and write about their topics won't necessarily be technically savvy beyond basic HTML.
| 10:21 pm on Jun 11, 2005 (gmt 0)|
There should be 2 threads - one for people who have had their sites nearly wiped out of Google and one for others who are concerned about the niceties of positioning for sites that still exist. I'm one of the former.
First, remember that Bourbon is just a continuation of what has been happening over the past 10 months. Aug 10, Aug 26, September 20(?), December 26, Allegra, and now Bourbon - in each of these updates sites have been wiped/restored. It's been like a rollercoaster for me and others. It just seems to have spread wider.
We know that Google has been very concerned about spam over the past year and the ability of people to automatically generate 75,000 page sites with practically no effort (Yahoo, Google, Gigablast, and Amazon all provide the feeds to do this). But just generating a big site doesn't get you anywhere in Google. Every spammer knows Google is about links so you have to generate multiple sites - get spam links (guestbooks, blog comments, forums) for the satellite sites and your other legit sites and feed the PR to target sites.
Google has tried penalties before for cross-linking (the PR0 penalty in 2002/3). With PR0, if you were targeting one site then only the satellites were hit (more complicated structures could hit all the sites in the network). Now Google is hitting the target site as well as the satellites. There was a paper on the analysis of spam networks, and it seems to me that Google has been applying some of its ideas. They have ways of identifying 'spam networks' based on 'it looks like spam so it must be spam'. It seems to me that a lot of good sites have triggered this penalty (but it is an algo-driven thing rather than a manual penalty, so Google can answer 'no penalty on your site').
I've applied the 'spam network filter' theory to my own sites, dannys and EFV (even though he's back) and I can justify it for all of them. It would be rude to talk about other people's sites without their permission, so I shall just talk about mine. I have what I regard as a good site with original reviews of submitted CDs by unknown bands, but it has had a band index of Indie bands that point towards band pages on a big music directory that I run. So these 27 pages had lots of links to individual pages on my big site. Plus for a while I had a link to the big site on all my other pages in the left-hand navigation.
Ways this filter may get triggered: a number of pages (on one site) with many links to pages on the same site. A site where every page links to at least one page on another site (ignoring affiliate links). Domain ownership details are almost certainly taken into account. The scary bit is that it is the link history that is taken into account - not the current site's pages. Deleting the suspicious pages does no good.
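The trigger conditions described above can be stated as a toy heuristic over a link graph. To be clear, this is a sketch of the poster's theory and nothing more - the graph format and the 0.5 threshold are made-up illustrations, not anything known about Google's filter:

```python
from collections import defaultdict

def crosslink_suspects(links, threshold=0.5):
    """Flag domain pairs where a large share of one domain's
    inter-domain links point at a single other domain.

    `links` is a list of (source_domain, target_domain) pairs;
    same-domain links are ignored. The 0.5 threshold is arbitrary.
    """
    outbound = defaultdict(lambda: defaultdict(int))
    for src, dst in links:
        if src != dst:
            outbound[src][dst] += 1
    suspects = set()
    for src, targets in outbound.items():
        total = sum(targets.values())
        for dst, count in targets.items():
            if count / total >= threshold:
                suspects.add((src, dst))
    return suspects

# A small site that funnels most of its links to one "target" domain:
links = [
    ("reviews.example", "bigdirectory.example"),
    ("reviews.example", "bigdirectory.example"),
    ("reviews.example", "bigdirectory.example"),
    ("reviews.example", "unrelated.example"),
]
assert ("reviews.example", "bigdirectory.example") in crosslink_suspects(links)
```

A heuristic this crude would flag plenty of innocent sites - which is exactly the point being made in this thread about good sites getting caught.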
So I can hold my hand up and say I have used my other sites pages to promote my main commercial site (but I have not used guestbooks etc). I have seen other sites that have been set up purely to give good anchor text in links to another site. I don't believe I should be classified with real spam networks for a bit of minor tweaking but I have to accept it.
There is one other thing I believe - the www non-www confusion can lead to interpreting a site as a spam network e.g. all example.com pages link to www.example.com pages. I imagine that a major 302 scraper site could also affect your site in the same way (but it would have to be extreme).
Anyway, it's all just theory. But we all know that Google was built on analysing links. Anything as severe and targeted towards whole sites as the last 10 months (repeat, the last 10 months, not just Bourbon) has to be based on link patterns and the fight against spam.
| 10:34 pm on Jun 11, 2005 (gmt 0)|
"But still, Google just lets the Authority sites take over the serps no matter what the phrase."
Must be a different Google. The Google I see has not just twisted the authority knob clean off, it has now buried it under ten pounds of tar. Inner pages of authority sites have dropped very significantly with Bourbon. All this redirect garbage reflects the extreme devaluation of authority in Bourbon, continuing the trend of valuing trivial links much closer to how it values quality links.
| 10:54 pm on Jun 11, 2005 (gmt 0)|
Very informative post. Thanks.
>Ways this filter may get triggered: a number of pages (on one site) with many links to pages on the same site. A site where every page links to at least one page on another site (ignoring affiliate links).<
My site meets your above mentioned "conditions". The majority of site pages are linked to each other. And it lost around 75% of its Google referrals on 3rd Feb 2005 (allegra). Since then it is gradually recovering. I can say that Bourbon has very little impact on my site, if there is any.
| 11:06 pm on Jun 11, 2005 (gmt 0)|
The conditions I mention are kind of wide - most people on this board probably meet them if they have tried to increase their link power and have more than one domain. Sadly, serious spammers (rather than naive, tinkering webmasters like me) will avoid the conditions, because they have always known about registering domains under different names and spreading the links among many sites.
It was when I read the paper about spam networks that I suddenly realised that my links COULD 'look like spam, smell like spam...'. But I still think all my sites are worthwhile (and so do the ODP/Musicmoz).
| 11:30 pm on Jun 11, 2005 (gmt 0)|
Following your thoughts, one can say:
Among other measures to deal with the consequences of Bourbon (and maybe Allegra) and future updates:
- Sites with the same owner shouldn't cross-link
| 11:51 pm on Jun 11, 2005 (gmt 0)|
Um, would that be restricted to certain "money" keywords or something? Because every educational site I know of fits those criteria. Our sites are all interlinked with each other. We all have at least one external link on every page. And as far as I know, none of my colleagues have dropped out of the SERPs the way people here have. That can't be the whole story.
| 11:58 pm on Jun 11, 2005 (gmt 0)|
For what it's worth: I'm getting a few hits from the updated DCs. Very few, but I got one even from AOL, for a keyword that is its "domain name" and that I always ranked high for till I got hit in Feb.
8.01 PM est:
out of all the DCs, I'm out on only 4
number of zeros: 4
average of nonzeros: 1.0
| 12:07 am on Jun 12, 2005 (gmt 0)|
|Inner pages of authority sites have dropped very significantly with Bourbon |
Some of this may be related to how two-and-more-word search phrases are being treated differently. There is less emphasis on the words being together on a page; in fact, some top 10 results don't even include all the words of the phrase anywhere on the resulting page. Since most inner pages require a search phrase to bring them up, even top information sites may have pages buried in scattered results.
There is a discussion going on this at [webmasterworld.com...]
| 12:41 am on Jun 12, 2005 (gmt 0)|
I do think the apparent demerit for exact match (in title, in text and particularly in URL right of the tld) is hurting some inner page authority site rankings, but I think that is just one of a few factors.
| 12:57 am on Jun 12, 2005 (gmt 0)|
Today has been one of my worst for turning up in Google searches (7 google searches for 23 Yahoo! searches). And probably one of my worst Saturdays, stats-wise ever. :-( :-( :-(
Um, this sucks.
| 2:07 am on Jun 12, 2005 (gmt 0)|
I know how you feel, Janiss. I'm at a total loss.
I wish we could get something definitive from GG on the status of Bourbon (finished or unfinished), and that I would get a response from Google's support about my queries to them.
| 2:23 am on Jun 12, 2005 (gmt 0)|
|Google has totally given it away to the multi thousand page sites with usually only one page devoted to the keyword. |
I was seeing a lot of this in the travel sector during the two months before Bourbon, but it's no longer true for the keywords and keyphrases that I watch.
| 2:59 am on Jun 12, 2005 (gmt 0)|
Major changes just occurred for my lost site.
Went from 55 indexed pages to 135 (of 287 total) fully indexed pages and no url-only pages listed in site: command.
Haven't found the site in any search results though... still severely penalized.
| 3:23 am on Jun 12, 2005 (gmt 0)|
Website has 1700 actual pages.
Jan. 05 site: showed about 4500 pages.
After Allegra site: dropped to about 100 pages.
This week site: increased to 800/900/1000 pages.
Today site: shows over 2000 pages
Remember, only 1700 pages really exist.
Most listings not url only, but are supplemental. Some pages have cache date Sept. '03.
Still no SERP traffic.
| 3:39 am on Jun 12, 2005 (gmt 0)|
Just checked my site.
All main pages (100) are fully indexed.
The only missing pages are my stupid screen-shot popups with one paragraph of text, and a few programming examples with too little text.
Today, I had added the NOINDEX meta tag to all those popup pages since they weren't designed to be entry pages anyway.
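For anyone wanting to do the same across many popup pages, here's a minimal Python sketch that injects a robots noindex tag into a page's head. It assumes well-formed, lowercase `<head>` markup, which real pages may not have - naive string handling for illustration only:

```python
NOINDEX_TAG = '<meta name="robots" content="noindex">'

def add_noindex(html: str) -> str:
    """Insert a robots noindex meta tag right after <head>, if the
    tag is not already present. Skips pages with no <head> at all.
    """
    if NOINDEX_TAG in html or "<head>" not in html:
        return html
    return html.replace("<head>", "<head>\n" + NOINDEX_TAG, 1)

page = "<html><head><title>Popup</title></head><body>shot</body></html>"
assert 'content="noindex"' in add_noindex(page)
```

Calling it a second time on the same page is a no-op, so it's safe to rerun over a whole directory of files.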
I'm not changing anything more... maybe these new cached pages will return some serp results in a few days when this Bourbon bottle is empty.