We need to keep this thread focused on the following:
- Changes in your own site's ranking in the SERPs (positions lost or gained, or disappearance of the site).
- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially regarding the nature of the top 10 or 20 ranking sites.
- Stability of the SERPs, i.e. do you get the same SERPs when you run the same query within the same day or over 2-3 successive days (both google.com and your local Google site)?
- Effective, ethical measures to deal with the above-mentioned changes.
econman - based on your definition I think that he had nothing to lose by saying those dc had the future.
He he - GG posts get analysed like a politician's :)
Mind you a lot of us have a lot of skin in the game.
(I have no idea if that makes sense)
[edited by: Dayo_UK at 7:29 pm (utc) on June 11, 2005]
>I avoided most of the damage done to my one site which was slaughtered in this "update" by moving the content to another domain.<
I see this as one of the effective ways to deal with the consequences of Bourbon and any other update.
And this means that we should at any given time have several "emergency domains" which are indexed on Google with little content (a few pages) and are ready to accommodate our main content (from current domain(s)) in case of being hit by an update.
Another effective way might be to split our content across several domains instead of keeping it on one domain.
He also mentioned a binary push (update to programs) sometime this week with the hooks for the third set of data.
I think a lot of people noticed two sets of data changes, no one's reporting a third set of data. GG thought it might go out this week. My guess is that it didn't. He also mentioned he would give us an email address when it's all over. He's got a good track record - why would he fail us now?
When GoogleGuy is back with an email address - that is the Fat Lady Singing.
If in a couple of months there hasn't been any progress maybe that will be the only solution.
>If in a couple of months there hasn't been any progress maybe that will be the only solution.<
The emergency domains should be created in good time to start their aging process. This will give you another option when and if needed. We are talking about a small investment here, maybe $10 per emergency domain.
"There are STILL no commonalities to the sites removed"
Every single one mentioned here (that I've seen) has canonical issues. Not a single problem site has been mentioned that does all the "canonical protection"(tm) tactics Google Guy outlined. The folks believing this is a coincidence ought to give that up by now, and figure out that "how to deal with them" means making your sites as friendly to Googlebot as possible.
<of course, sites that don't protect themselves might not be affected; or may have been affected by other things rather than duplicates, 302s, and other canonical stuff; and people who protect themselves from the canonical stuff could drop because they lose good links or a million other reasons>
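For anyone unsure what those "canonical protection" tactics amount to in practice, the core of it is picking one hostname (with or without the www) and 301-redirecting every other variant to it, so Googlebot only ever sees one version of each page. A minimal sketch of the decision logic, with `www.example.com` as a stand-in canonical host:

```python
def canonical_url(requested_host, path, canonical_host="www.example.com"):
    """Decide whether a request needs a 301 to the canonical hostname.

    Returns None when the request already uses the canonical host,
    otherwise the absolute URL the server should redirect to.
    """
    if requested_host.lower() == canonical_host:
        return None
    return "http://%s%s" % (canonical_host, path)

# The non-www variant gets a redirect target; the canonical host gets None.
print(canonical_url("example.com", "/page.html"))
print(canonical_url("www.example.com", "/page.html"))
```

On Apache this is usually done with mod_rewrite rules rather than application code, but the logic is the same: one host wins, and everything else gets a permanent redirect to it.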
>Another effective way might be to split our content across several domains instead of keeping it on one domain.<
After losing huge G traffic we think this multi-domain structure may be getting us incorrectly filtered for "link farming" and we are about to revert to a single domain.
>After losing huge G traffic we think this multi-domain structure may be getting us incorrectly filtered for "link farming" and we are about to revert to a single domain.<
I guess your domains have been crosslinked?
Is it possible to split your content across several domains that are not crosslinked? For ex.
But No... the SERPs have been turned upside down today with nearly irrelevant stuff at the top.
A search for Microsoft's well-known desktop database plus "consultancy company" produces a #1 result that does not have the words "microsoft access" together on the page, and has "access" mentioned once, but not in reference to a database.
Similar searches produce similarly bizarre results.
This is why I say the PageRank system has been a total failure on Google's part. Case in point (and these are not actual search results):
A 70,000-page site on, say, the Beatles, with 2,000 backlinks, lots of content, and considered an authority site by Google should obviously rank high for the Beatles. What it shouldn't do is rank #1 for, say, Stephen King. However it does! Why? It has one affiliate page on Stephen King with absolutely no original content and about 700 backlinks with anchor text pointing to it. I haven't checked every link, but I'm pretty sure they weren't all obtained naturally. Even a bot should be able to figure that out.
But still, Google just lets the Authority sites take over the serps no matter what the phrase. I see this everywhere. In Google's eyes there's only need for about three hundred websites. Everybody else is just taking up space.
I didn't change anything at all during this update, and Google-bot has been re-spidering the site like crazy these past two days. In fact the G-bot is responsible for 95% of the page requests...
The site is CSS coded with pages around 10K. It has one medium-grayish SEO thing on it, with a bit of text being offset about a mile to the left of the screen, substituted for a logo and some navigational images. It's kind of heavy in the keyword density, but always in context with the text.
If nothing changes in a week, I'll go through the code, removing this 'invisible' text.
However, the site is only 3 months old, and has done very well in the SERPs for being so new. Maybe too well, and now it has been pushed back.
Every single one mentioned here (that I've seen) has canonical issues. Not a single problem site has been mentioned that does all the "canonical protection"(tm) tactics Google Guy outlined.
I agree that I have a very vulnerable site in terms of how I did my links. It's very likely that is what caused the plunge of my smaller site. My problem is that Google Guy is assuming we are all technical experts capable of carrying out the steps he mentioned. I've only just figured out there even is such a thing as "canonical protection". Many people with websites won't have any idea what happened. If Google wants informational sites in their SERPs, they need to accept that the people who research and write on their topics won't necessarily be technically savvy beyond basic HTML.
First, remember that Bourbon is just a continuation of what has been happening over the past 10 months. Aug 10, Aug 26, September 20(?), December 26, Allegra, and now Bourbon - in each of these updates sites have been wiped/restored. It's been like a rollercoaster for me and others. It just seems to have spread wider.
We know that Google has been very concerned about spam over the past year and the ability of people to automatically generate 75,000 page sites with practically no effort (Yahoo, Google, Gigablast, and Amazon all provide the feeds to do this). But just generating a big site doesn't get you anywhere in Google. Every spammer knows Google is about links so you have to generate multiple sites - get spam links (guestbooks, blog comments, forums) for the satellite sites and your other legit sites and feed the PR to target sites.
Google has tried penalties before for cross-linking (the PR0 penalty in 2002/3). With PR0, if you were targeting one site then only the satellites were hit (more complicated structures could hit all the sites in the network). Now Google is hitting the target site as well as the satellites. There was a paper on analysis of spam networks, and it seems to me that Google has been applying some of its ideas. They have ways of identifying "spam networks" based on "it looks like spam so it must be spam". It seems to me that a lot of good sites have triggered this penalty (but it is an algo-driven thing rather than a manual penalty, so Google can answer "no penalty on your site").
I've applied the 'spam network filter' theory to my own sites, dannys and EFV (even though he's back) and I can justify it for all of them. It would be rude to talk about other people's sites without their permission, so I shall just talk about mine. I have what I regard as a good site with original reviews of submitted CDs by unknown bands, but it has had a band index of Indie bands that point towards band pages on a big music directory that I run. So these 27 pages had lots of links to individual pages on my big site. Plus for a while I had a link to the big site on all my other pages in the left-hand navigation.
Ways this filter may get triggered: a number of pages (all on one site) each with many links to pages on the same second site. A site where every page links to at least one page on another site (ignoring affiliate links). Domain ownership details are almost certainly taken into account. The scary bit is that it is the link history that is taken into account - not the current site's pages. Deleting the suspicious pages does no good.
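As a rough illustration of that first trigger, here is how one might measure how concentrated a site's outbound links are on a single external host. The 50% threshold is entirely arbitrary and illustrative - nobody outside Google knows the real conditions:

```python
from collections import defaultdict

def crosslink_report(outbound_links, total_pages, threshold=0.5):
    """Summarise outbound linking from one site.

    outbound_links: list of (source_page, target_host) tuples.
    Returns the external host linked from the most pages, the fraction of
    the site's pages linking to it, and a 'suspicious' flag when that
    fraction reaches the (made-up) threshold. Returns None for no links.
    """
    pages_linking = defaultdict(set)
    for page, host in outbound_links:
        pages_linking[host].add(page)
    if not pages_linking:
        return None
    host, pages = max(pages_linking.items(), key=lambda kv: len(kv[1]))
    fraction = len(pages) / float(total_pages)
    return {"host": host, "fraction": fraction,
            "suspicious": fraction >= threshold}

# Two of four pages link to the same directory site -> flagged.
links = [("a.html", "big-directory.example"),
         ("b.html", "big-directory.example"),
         ("c.html", "other.example")]
print(crosslink_report(links, total_pages=4))
```

The point of the sketch is only that this kind of pattern is trivially detectable from link data, which is why a band index pointing heavily at one's own directory could plausibly "look like spam".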
So I can hold my hand up and say I have used my other sites pages to promote my main commercial site (but I have not used guestbooks etc). I have seen other sites that have been set up purely to give good anchor text in links to another site. I don't believe I should be classified with real spam networks for a bit of minor tweaking but I have to accept it.
There is one other thing I believe - the www non-www confusion can lead to interpreting a site as a spam network e.g. all example.com pages link to www.example.com pages. I imagine that a major 302 scraper site could also affect your site in the same way (but it would have to be extreme).
Anyway, it's all just theory. But we all know that Google was built on analysing links. Anything as severe and as targeted towards whole sites as the last 10 months (repeat, the last 10 months, not just Bourbon) has to be based on link patterns and the fight against spam.
Must be a different Google. The Google I see has not just twisted the authority knob clean off, it's now buried it under ten pounds of tar. Inner pages of authority sites have dropped very significantly with Bourbon. All this redirect garbage reflects the extreme devaluation of authority in Bourbon, continuing the trend of valuing trivial links much closer to how it values quality links.
Very informative post. Thanks.
>Ways this filter may get triggered: a number of pages (on one site) with many links to pages on the same site. A site where every page links to at least one page on another site (ignoring affiliate links).<
My site meets your above-mentioned "conditions". The majority of the site's pages are linked to each other. And it lost around 75% of its Google referrals on 3rd Feb 2005 (Allegra). Since then it has been gradually recovering. I can say that Bourbon has had very little impact on my site, if any.
The conditions I mention are kind of wide - most people on this board probably meet them if they have tried to increase their link power and have more than one domain. Sadly, serious spammers (rather than naive, tinkering webmasters like me) will avoid the conditions, because they have always known about registering domains under different names and how to spread the links amongst many sites.
It was when I read the paper about spam networks that I suddenly realised that my links COULD 'look like spam, smell like spam...'. But I still think all my sites are worthwhile (and so do the ODP/Musicmoz).
8:01 PM EST:
Out of all the DCs, I'm out on only 4.
Number of zeros: 4
Average of nonzeros: 1.0
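For anyone posting similar datacenter tallies, the two numbers above are easy to compute from a list of per-DC positions, using 0 to mean the site is absent on that datacenter. A quick sketch:

```python
def dc_summary(positions):
    """positions: the site's ranking on each datacenter; 0 = not listed.

    Returns (number of DCs where the site is absent,
             average position on the DCs where it still appears).
    """
    zeros = positions.count(0)
    nonzeros = [p for p in positions if p != 0]
    avg = sum(nonzeros) / float(len(nonzeros)) if nonzeros else 0.0
    return zeros, avg

# Absent on 4 DCs, ranked #1 on the other 4 -> (4, 1.0)
print(dc_summary([1, 0, 1, 0, 1, 0, 1, 0]))
```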
>Inner pages of authority sites have dropped very significantly with Bourbon<
Some of this may be related to how searches of two or more words are being treated differently. There is less emphasis on the words being together on a page; in fact, some top-10 results don't even include all the words of the phrase anywhere on the resulting page. Since most inner pages require a search phrase to bring them up, even top information sites may have pages buried in scattered results.
There is a discussion going on this at [webmasterworld.com...]
Google has totally given it away to the multi thousand page sites with usually only one page devoted to the keyword.
I was seeing a lot of this in the travel sector during the two months before Bourbon, but it's no longer true for the keywords and keyphrases that I watch.
Jan. 05 site: showed about 4500 pages.
After Allegra site: dropped to about 100 pages.
This week site: increased to 800/900/1000 pages.
Today site: shows over 2000 pages
Remember, only 1,700 pages really exist.
Most listings are not URL-only, but are supplemental. Some pages have a cache date of Sept. '03.
Still no SERP traffic.