These were all dates when Google applied their Filter Updates (Allegra was also an algo update). Some sites dropped out of sight, some reappeared.
I'm very tempted to say "where have you all been for the last year?" There have been threads discussing this phenomenon after each filter update. I don't think anyone has figured out exactly what the filter is (we all have our pet theories), but every thread has talked about the devastation to traffic of dropping 100+ places, even on a search for your site's own name.
If this is the first time you have been hit then you have been very lucky up until now. Some of us have been buried for a long time and have come back into Google over the past few filter updates. Whether our re-appearance is related to whatever actions we have taken is uncertain.
Google has had multiple technical failures over the past few years, some catastrophic and some small. In some cases, Google employees were clueless about the problems until webmasters more or less forced them to see them, with GoogleGuy's unfortunate comments about 302s six months ago as the best example. They didn't know how screwed up they were, despite it being obvious.
The Supplemental index was a stupid idea that has some merits, similar to getting out of dusting the living room by setting fire to the house. However, unlike other things, the Supplemental index is seldom seen by the public, so the humiliation it brings Google is limited to people in the search-conscious community. Eric doesn't tell CNET how proud they are to have listings for pages deleted two years ago; pages a webmaster has told Google more than once to delete from its index because they don't exist; pages Google has crawled and seen return a 404 literally 100+ times; etc.
Google's problems include mistaken notions, but it's mostly ineptness. No one at Google sat down before the recent mess and said "hey, let's look incompetent by making stupid decisions that anyone can see by adding &filter=0 after our search page URLs."
This isn't an easy job they have, and sometimes they do a terrible job at it, even if they do better than their competition.
There will always be tin hat conspiracy posts here, but the fact of the matter remains that dropping a glass on the floor is not "updating" it.
[edited by: steveb at 6:47 pm (utc) on Sep. 28, 2005]
Of course everyone affected has been searching for the answer. 302 redirects (everyone has a few pages with 302s pointing at them), www/non-www issues and/or duplicate content, adding large numbers of pages, Adsense, affiliate links.
I did closely examine three very good, non-spammy sites that were hit, as well as my own. In all cases (including my own sites) I could see quite a large proportion of links coming from a few domains. So, as a veteran of the PR0 cross-linking penalty, I tend to favour linking patterns as the trigger.
There are a few other threads that have discussed this. Even one with a note from GG.
Anyone ever recovered? Any hints at what could be causing the penalty? (E.g., adding "&filter=0" to a search on any term you used to rank for puts you back in the right place.)
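For anyone who wants to run the comparison quickly, here's a minimal sketch of building the two URLs by hand (just Python string/query assembly - the query term is a placeholder for your own):

```python
# Build a normal Google search URL and the &filter=0 version discussed
# in this thread, so the two result sets can be compared side by side.
from urllib.parse import urlencode

def serp_url(query, unfiltered=False):
    """Return a Google web-search URL; filter=0 asks Google not to
    suppress results its duplicate/quality filter would normally hide."""
    params = {"q": query}
    if unfiltered:
        params["filter"] = "0"  # the switch people are using to diagnose the drop
    return "http://www.google.com/search?" + urlencode(params)

print(serp_url("big red widget"))
print(serp_url("big red widget", unfiltered=True))
```

If your site reappears at its old position only on the second URL, you're almost certainly caught in the filter rather than simply outranked.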
My bet is on scrapers causing this issue.
so you think this filter isn't related to all those other filter updates? I've certainly had one site come crashing back into the SERPs with this update after being missing since February. I just think that this regular pruning of search results has to be deliberate and stems from a filter (probably with multiple factors, applied differently each time), and that this update is just the latest in the line. Having the &filter=0 trick to spot the filter is useful though.
I don't know - as you know though I think it is another attempt at a fix for canonical URLs - they still have not got it right though :(
Seriously - this should be the main priority at the plex at the moment (IMO - of course)
Listen Google - I will say this only another 96 times. The canonical URL for my site is the homepage with the www - I have done the 301 - this is the page with the most backlinks - it is the page that should rank for the company-name search. Etc.
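For reference, the 301 in question is usually done like this (a sketch, assuming Apache with mod_rewrite enabled; example.com is a placeholder for the real domain):

```apache
RewriteEngine On
# Send any request for the bare domain to the www hostname with a 301,
# so only one version of each URL exists for Google to index.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The point is that Google shouldn't need 96 repetitions when the server is already answering with a permanent redirect.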
if so, how do you think these filters work?
Scrapers are rampant. They copy titles and meta descriptions. If I look at one of our articles, there are at least 100 different scraper sites with the same title and meta description as the article. On our site, the meta description is the first few lines of the article. We also have index pages that link the articles together. These index pages are made up of the titles and meta descriptions of the articles.
I imagine that Google uses some type of Bayes classifier for the filters. In order for these filters to work, they must be manually fed a list of bad sites, or sites that they would like to penalize or rid the SERPs of.
Since the scrapers carry the same title and meta description tags, it's not a stretch to see that our site would get caught up in the filter.
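To illustrate why that's plausible, here's a toy sketch - not Google's actual filter, just the general idea - of how exact-duplicate titles and descriptions make pages trivially easy to group together. All the page data below is invented:

```python
# Group pages by their (title, meta description) pair and flag any pair
# that appears on more than one URL - the collision a scraped page creates.
from collections import defaultdict

pages = [
    ("original-site.com/article", "Big Red Widgets", "Widgets come in many..."),
    ("scraper-one.net/copy1",     "Big Red Widgets", "Widgets come in many..."),
    ("scraper-two.org/copy2",     "Big Red Widgets", "Widgets come in many..."),
    ("original-site.com/other",   "Small Blue Widgets", "A different intro..."),
]

groups = defaultdict(list)
for url, title, desc in pages:
    groups[(title, desc)].append(url)

# Every group larger than 1 is a duplicate cluster; nothing here tells
# the algorithm which member is the original.
duplicated = {key: urls for key, urls in groups.items() if len(urls) > 1}
for (title, _), urls in duplicated.items():
    print(f"{len(urls)} pages share the title {title!r}: {urls}")
```

Note what the sketch can't do: it has no way of knowing which URL in the cluster is the original, which is exactly how a legitimate site could end up on the wrong side of the filter.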
I don't know - as you know though I think it is another attempt at a fix for canonical URLs - they still have not got it right though :( <<
You might be right!
However, I see the problem in a simpler way:
Google keeps listing 1000's of duplicate pages while at the same time removing the "original pages" as part of its deduplication process.
And it's hurting...
This is not an update thread.
This is not a "let's define what an update is" thread.
The thing is long and cluttered enough without this extra crap.
Start a new thread, or contribute to one of the other similar threads, if you want to discuss what an update is.
In my post msg #:86 on this thread, I wrote:
msg #:86 3:18 pm on Sept 23, 2005 (utc 0)
Just like any other previous update, it's gonna be very tough as the update proceeds. Google updates aren't something for the weak souls. During Allegra and Bourbon some of our fellow members couldn't take it any more and did a very wise thing. They took a break ;-)
And I have nothing to add ;-)
Well, I got a reply from Google Search via AdSense, who forwarded my enquiry. They said that we were not being penalized for anything on the site and that, as they add new sites and content, positioning moves about...
So I am not sure what to make of the huge drop, though I guess they are always cagey about any updates. At least it looks like it's not due to anything on our site, which is good.
I guess more wait and see then.
After drying up the puddle of tears below my desk and waiting for my keyboard to dry out, I made a list of alterations I would set about making.
I first listed anything different I'd done to my site within the past two months. Only one major thing, so that was easy.
I gave up checking what the #1-10 ranking sites for my keywords were doing that I was or wasn't, because I could find no sense or consistency there.
I used the removal tool to take down two folders of pages that had recently been added. I also added noindex, nofollow meta tags to those pages and added the folders to my robots.txt file.
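For anyone doing the same, the belt-and-braces version looks like this: a `<meta name="robots" content="noindex, nofollow">` tag in the head of each affected page, plus robots.txt entries along these lines (the folder names here are placeholders for the two I actually removed):

```
# robots.txt - keep all crawlers out of the two recently added folders
User-agent: *
Disallow: /new-folder-one/
Disallow: /new-folder-two/
```

Worth remembering that robots.txt only stops crawling, while the meta tag is what tells an engine that has already fetched a page to drop it from the index.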
The following had always existed on the site, but I decided it was now bad practice, even though I'd previously found it worked very well.
Cross-linking to another site I own on a similar theme, and likewise back from it. These links were plentiful.
site 1 big widgets interlinked with
site 2 small widgets and reverse.
Site 2, funnily enough, has made a comeback. It hasn't been updated since May, and I'd meant to rework it into something else and never found the time. I still intend to at some point.
One other thing I'm gonna do is use site 1 under a different domain, juggle it around a bit, ban googlebot from the entire site and use it over at Yahoo, where the original does very badly.
After a little work on a separate site, Yahoo seems a lot easier to rank in. Older techniques still seem to work there.
Should I ban yahoo from the original version?
EDIT: I just checked to see whether my removed-pages request was pending or complete. Answer: complete.
Try it on G. Three-word search.
Results 1 - 1 of about 44
shows only 1 result in the SERPs though.
I add the &filter=0
Results 1 - 10 of about 18
shows all 18.
There were only 16 related pages in the removed folder that relate to this search.
18+16=44 - it does when you use the calculator sponsored by Big G - lol!
[edited by: djmick200 at 10:27 pm (utc) on Sep. 28, 2005]
I should say we use subdomains for all our sections, so this maybe further complicates things.
Also, they mentioned getting more high-quality links, yet we probably have hundreds of them added every month, from everything from major news sites to blogs. In fact, I doubt many sites get as many so quickly. Yet we seem to have been degraded despite these, which are all organic links to articles.
So will see what they say.
The 18+16=44 was a slight jibe at Google's inflated pages-per-site count.
I used the removal tool to take away 16 pages that contained 'big red widget'.
There remain 18 pages on my site that contain the text 'big red widget'.
The two together = 34.
When I did the initial search it said 1-10 of 44.
10 extra. Nowhere to be found. Nonexistent.
site.com -word1, where word1 is on 14 of the site's 1000+ pages... it returned 9000+ results
site.com -word2, where word2 is on 19 pages... it returned 997 results
After a few dozen checks: if a word appears on 15 or fewer pages on the site, the result comes back as over 9000 results; if the word appears on 16 or more pages, the result comes back at 1000 or less. This could be because I have about 1015 pages on the site (I don't really have an accurate count), or it may be because 15 is somehow mystical to Google, but it is interesting that I am able to get an accurate count by adding -someword.
A suggestion to the engineers at Google working on this problem: back out of this trainwreck, whether it be an update, a new filter, or whatever. You can't save it by trying to tweak it a little bit more. Back out and start over again once you actually have a solution that's been tested!
On the plus side, my Yahoo traffic is rising, as it appears that people are getting as fed up with Google's results as I am. My Yahoo referrals have doubled in the past two days, and it's not because my rankings have changed. It's because more people are using Yahoo rather than Google. If this drags on much longer I think Google is risking large-scale defections to the competition, which, come to think of it, is a good thing! ;-)