Is your keyword phrase at the beginning of your title or at the end?
I had kw at the beginning and I was number 6. I just changed the kw to the end of the title and I jumped to number 2.
Everyone is telling me that it is a coincidence, but it doesn't hurt to try it.
|I tweaked my site title and description and bam filtered. |
Not a filter. Happens all the time.
The whole concept of a 'filter' has always been in dispute - it now looks increasingly unlikely. Probably a stemming issue.
[edited by: WebGuerrilla at 6:21 pm (utc) on Mar. 10, 2004]
Sorry if I wasn't very clear. I went from #8 for the keyword to not in the top 400 results, unless I use +a.
The keyword was in the middle of the title, I moved it to the beginning of the title.
|Not a filter. Happens all the time. |
Happens all the time, but yeah, sounds like a filter to me. Actually I think that it's a penalty - a big one. Not a penalty for being naughty, just one cos G felt like it. That's the way it is these days.
I don't have a solution, but my first advice would be to "untweak" whatever you changed.
I am glad someone brought this topic up again. I still see the filter wreaking havoc on the serps. I noticed this for city terms, except a few large cities which are back to pre-Florida. If anyone needs proof of the filter I can easily provide dozens of examples.
I have beaten the filter in the past and could do so again, but it is time consuming and interferes with the ability to build my business. In order to beat the filter it is best you have control over 3+ sites on different servers. The idea is to create a directory/authority, where only one of the 3+ sites will really see the benefit.
Why would Google need a filter?
There are over 100 factors involved in weighting a site. Suppose sites were judged like paintings. "Sergei's Landscape Gallery" picks out likely landscapes by their color: the average landscape has about 50mg of blue pigment, 20 of red, and 40 of yellow; the correlation between placement of blue and yellow is high, and the correlation between red and yellow is low; etc.
Now, Sammy the Spamming Spraypainter starts creating thousands of colorwheels, tweaking the proportions to figure out just how large the blotches of blue, green, yellow, red, etc. should be. As a result, Sergei's "Landscapes" are dominated by hideous blue-green blotches.
Sergei, no simpleton, seeing that his gallery is being degraded by these spammers, introduces some new factors: orthogonal edge filters, perhaps. In fact, he hires a whole roomful of mathematicians to invent and test various weighting parameters. The blotches that used to rank so high suddenly don't. Is this a "filter" -- or is it simply the universal fact that no picture has the right to be judged forever by the same criterion, and that other equally valid criteria judge it much less favorably?
There's a major difference in scope here, resulting in mutually incomprehensible attitudes. The ants are complaining about the elephant dancing on its food trails. But the elephant is just busy carrying logs to the other side of the clearing.
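The gallery analogy can be made concrete. Here is a minimal sketch (the factor names, weights, and values are all invented for illustration, not Google's actual signals) of how simply re-weighting a multi-factor scoring function reranks pages, with no explicit "filter" anywhere:

```python
# Toy illustration: a site tuned to the old weights drops when the
# weights change, even though no page was explicitly filtered out.

def score(page, weights):
    """Combine per-page factor values into a single relevance score."""
    return sum(weights[f] * page.get(f, 0.0) for f in weights)

pages = {
    "tuned-to-old-algo": {"keyword_density": 0.9, "anchor_text": 0.9, "link_quality": 0.1},
    "ordinary-site":     {"keyword_density": 0.4, "anchor_text": 0.3, "link_quality": 0.8},
}

old_weights = {"keyword_density": 0.6, "anchor_text": 0.4, "link_quality": 0.0}
new_weights = {"keyword_density": 0.2, "anchor_text": 0.2, "link_quality": 0.6}

for label, weights in (("old", old_weights), ("new", new_weights)):
    ranking = sorted(pages, key=lambda p: score(pages[p], weights), reverse=True)
    print(label, ranking)
# old ['tuned-to-old-algo', 'ordinary-site']
# new ['ordinary-site', 'tuned-to-old-algo']
```

From the tuned site's point of view this looks exactly like a penalty or filter; from the scorer's point of view it is just different arithmetic.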
I used the term correctly. Google is filtering sites. I understand how algo's work; this is not what is happening. Google is deliberately removing sites. If you can trigger the filter to turn off, suddenly the sites are back. I am not talking about ranking well or not ranking well, I am referring to appearing or disappearing. The problem is not spam. The problem is Google. Google is removing many legitimate sites which are rich in content and leaving only annoying directories in their place. Brandy partly restored these sites for select terms; however, the vast majority of terms are still experiencing the filter.
hutcheson, nice theory, but you're wrong. There is a filter, although as I remarked I suspect that it's not a filter, just a massive penalty. The effect is the same. And it doesn't just filter out spam, although it does remove some spam. Plenty of good quality sites have been removed as well.
From Google's perspective, there can only be ten sites in the top ten, so what does it matter if some quality sites have been removed? As long as the spam has been removed, then the ten results that survive must be relevant. That's what G is aiming for (it hasn't got there yet). Like your metaphor, the elephants are doing their job, but it's tough on the ants that got squashed.
The word "filter" is a very good layman's description of what Google has been doing since November. The reason some Google pundits object to the word "filter" is because it demeans their own punditry. The word makes it sound like they have less control over Google than they like to pretend. In many cases, the loudest of those protesting against the word "filter" are precisely the ones who make money from convincing clients that they can improve the client's ranking in Google. An all-or-nothing "filter" throws a wrench in their capacity, as SEO professionals, to make such claims.
The fact is, these SEOs do have less control than they did before November. And another fact is that this "filter" can still be turned on and off with stupid filter tricks that invoke the advanced search options. These stupid tricks should logically have minimal or zero effect, yet they consistently throw sites that are off the map back into the top ten.
It's a filter, and the effect it has on SEO, whether it was initially intended or not, is by now considered desirable by Google.
Define 'punditry' and we might take you seriously.
This is the silliest word I have come across in 39 years of life - but I'm not dead yet and there is always the possibility you might come up with something more silly.
|Define 'punditry' and we might take you seriously. |
Look it up. It's in Merriam-Webster's. And 41,000 hits in Google too. Just because you think the word is silly doesn't mean that the word is silly. It might mean that you are uneducated.
pundit - one who gives opinions in an authoritative manner - pun·dit·ry noun
I'm becoming less and less convinced of the filter theory - One very competitive site bounced back for its main 2 word key term along with all my long term competitors (bless their cotton socks).
I personally believe that G is experiencing technical issues, patched where possible but resulting in illogical & inconsistent but not irrelevant results.
Try beating the "filter" whilst G sorts itself by submitting your site here [submit.search.yahoo.com] and crossing your fingers...
I'm not sure if this is a filter or not, but if you search for one of my main keywords and put them in quotations..."blue widgets", then my site can be found on page two of the serps.
But, if you search for blue widgets without the quotations, you will find my site on page 40 of the serps.
Anyone have any idea what this is about?
I have two sites that this has happened to - been filtered where they used to be top. They've been hit for virtually all their pages/search terms. I emailed Google and was told that the sites "had no penalty" (yeah right), and was told to look at the Google guidelines.
All are on the same server, so looks like I'll be transferring to a different server, with links from the mother site (existing server) to give the PR and "Authority".
Any more hints would be appreciated.
Call it what you want. We refer to it as a thing-y
For the same kw (or kw pair) in some categories, simply using different geo terms causes entirely different *kinds* of SERP's to be displayed. Not just different rankings. Entirely different *kinds* of SERP's. It's the thing-y at work.
Without getting specific, one kind of SERP profile resembles pre-Florida, but with less spam. The other kind resembles Florida/Austin, with sophisticated spam, more .edu's, .gov's, and directories. Both seem to contain a larger number of news items than pre-Brandy.
Unless you don't think there is a difference between pre-Florida and Florida/Austin, it's impossible to *not see* the difference, if you look in the right places, and exchange the right geo terms.
I believe that the two sets of SERP's are different algo's in place, triggered by, um, different thing-y's.
At least this is our observation. None of us at my place really know much about all this algo / filter / index stuff.
>I believe that the two sets of SERP's are different algo's in place, triggered by, um, different thing-y's.
This would fit in with the theory elsewhere expressed, that Hilltop or something like it is used for certain frequent queries.
IMO, these are among the more plausible theories in common circulation.
If I were to speculate, I'd also consider the possibility of applying an alternate algo for "competitive" searches -- that is, for searches where there were a great deal more hits than would be expected based on some heuristic.
Remember Google's attitude is algorithm-based. The chances of a Google employee ever actually looking at the results for, say, "Cincinnati Hotels" is rather low. They will, at most, have picked several "representative" cities [worldwide] to check the "Hotels" results.
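To make that speculation concrete, here is a toy sketch of routing "competitive" queries to an alternate algorithm. Everything here is invented for illustration (the heuristic, the threshold factor, the hit counts); it only shows the shape of the idea, not anything Google is known to do:

```python
# Hypothetical router: queries with far more hits than a crude heuristic
# expects are treated as "competitive" and sent to an alternate algorithm
# (e.g. something Hilltop-like); everything else gets the default algo.

def expected_hits(query, index_size=1_000_000):
    # Toy heuristic: each extra query word cuts expected matches tenfold.
    return index_size // (10 ** len(query.split()))

def choose_algo(query, actual_hits, factor=50):
    if actual_hits > factor * expected_hits(query):
        return "alternate"
    return "default"

print(choose_algo("cincinnati hotels", actual_hits=900_000))   # alternate
print(choose_algo("obscure widget phrase", actual_hits=40))    # default
```

A scheme like this would also explain why only a handful of frequent or popular terms show the strange behaviour while most of the index looks normal.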
>>I believe that the two sets of SERP's are different
>>algo's in place, triggered by, um, different thing-y's.
>This would fit in with the theory elsewhere
>expressed, that Hilltop or something like it is
>used for certain frequent queries.
Perhaps the algo that is applied to the frequent queries is more resource-intensive and Google cannot YET apply it to the entire index.
>>The reason some Google pundits object to the word "filter" is because it demeans their own punditry.
The reason some webmasters try to promote the idea of a "filter" is because simple conspiracy theories are far easier to accept than complex changes in search algorithms which don't favour their own websites.
|The reason some webmasters try to promote the idea of a "filter" is because simple conspiracy theories are far easier to accept than complex changes in search algorithms which don't favour their own websites. |
That is a ridiculous statement. My site is extremely relevant to my #1 search term but not to be found in the top 400+ results. When I add +A to my search, my site is in the top ten. That would suggest a filter even though Google will not admit it via canned e-mail. The strange part is that the filter only seems to apply to that term, which is also my domain name. My site shows up for other relevant keywords.
An algo itself is a form of filter. Your website may be gold one month. The next, it sinks to the bottom of Google's pan with the rest of the silt ;)
madman21: find some posts here at WebmasterWorld about Local PR and how it's applied only to a handful of the most popular keywords. That theory seems more convincing to me than a filter.
The title of the thread is "Beating the filter" not "Do you believe that the filter exists?".
Okay then, let me rephrase that. IMHO, in order to beat the "filter" you need to get links from sites ranking high for the particular keyword you're being "filtered" on.
> The strange part is that the filter only seems to apply to that term which is also my domain name.
Google have stated that the reason for some changes is that "some sites were ranking higher than they should".
Domain name seems to be less relevant than it was.
Keyword proximity seems to have been altered. Searching for Blue Widgets is different to searching for "Blue Widgets". In the second example the words must be together; in the first, it seems as if G likes the words to be separated more than it did before. Typing in a five-word exact phrase which is in one of your titles is not likely to bring you into the top results, unless you use quotes around the search.
I don't think it is a filter, but something has changed very drastically. Such weight on the domain name was getting silly - just buy 10 domain names, and you could get to the number one position. In fact, companies were selling this "service" under their page-one Google guarantee. No wonder G changed it.
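The quoted-versus-unquoted difference described above can be illustrated with a toy matcher. This is only a sketch of the two matching semantics (the title and query are made up); real search engines do far more than this:

```python
# Toy contrast between a quoted phrase match (words adjacent, in order)
# and an unquoted all-words match (words anywhere in the text).

def matches_phrase(text, phrase):
    """Quoted search: the words must appear together, in order."""
    return phrase.lower() in text.lower()

def matches_all_words(text, query):
    """Unquoted search: each word may appear anywhere in the text."""
    words = text.lower().split()
    return all(w in words for w in query.lower().split())

title = "blue ceramic widgets for the garden"
print(matches_phrase(title, "blue widgets"))     # False: words not adjacent
print(matches_all_words(title, "blue widgets"))  # True: both words present
```

This is why a page can sit on page two for "blue widgets" in quotes yet be buried on page 40 for the same words unquoted: the two queries are not the same question.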
The way I beat the filter was to make my commercial site look like a directory. The filter looks at the site and filters on the commercial element. It is easy to determine which sites are commercial and which are directories. Before Scr**gle ceased to work, it demonstrated the filter at work. Brandy has removed the filter for certain city terms, but not most. The best way to find out how to beat the filter is to look at the sites listed.
I found the way by noticing a commercial site which managed to reappear. It simply developed a directory structure. This means linking out to relevant sites and receiving links from relevant sites, making sure these are not the same sites. It is best achieved by having multiple hosts and domains. I did it using 3 domains which were on two different servers and all had the same topic (but very different content). Two of the sites are filtered out; the other sits in the #1 spot for a competitive keyword. I know the filter is not based on over-optimizing, because the 3 sites share an equal level of optimization. It is only the linking structure that seems to trigger the filter. As I said, look at how your site links out to other sites in the same theme; this is the answer.
"Before Scr**gle ceased to work it demonstrated the filter at work"
Actually it showed plainly that there was no filter.