Google News Archive Forum

Beating the filter
Does anyone know how?
madman21




msg:87967
 9:57 am on Mar 2, 2004 (gmt 0)

Google is filtering my site out of the SERPs for my main keyword. It is extremely relevant and I have plenty of on-topic content. This happened a few weeks before Brandy. I ranked about #9 for the keyword previously. I tweaked my site title and description and bam, filtered. If I run a search for keyword +a, I rank #1.
Any ideas on how to fix this?
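
For anyone trying to document the effect, a small sketch of how one might compare where a domain ranks under the query variants discussed in this thread. The result lists below are invented placeholders; how you actually collect the SERPs is up to you.

# Toy comparison of a domain's rank under different query variants.
# The result lists are made-up placeholders, not real SERPs.

def position(domain, results):
    """Return the 1-based rank of `domain` in an ordered result list, or None."""
    for rank, url in enumerate(results, start=1):
        if domain in url:
            return rank
    return None

serps = {
    'keyword':    ['competitor1.example', 'competitor2.example', 'competitor3.example'],
    'keyword +a': ['mysite.example', 'competitor1.example', 'competitor2.example'],
    '"keyword"':  ['competitor1.example', 'mysite.example', 'competitor2.example'],
}

for query, results in serps.items():
    print(query, '->', position('mysite.example', results))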

 

yowza




msg:87968
 7:32 pm on Mar 3, 2004 (gmt 0)

Is your keyword phrase at the beginning of your title or at the end?

I had kw at the beginning and I was number 6. I just changed the kw to the end of the title and I jumped to number 2.

Everyone is telling me that it is a coincidence, but it doesn't hurt to try it.

martinibuster




msg:87969
 7:37 pm on Mar 3, 2004 (gmt 0)

I tweaked my site title and description and bam filtered.

Not a filter. Happens all the time.

SyntheticUpper




msg:87970
 7:38 pm on Mar 3, 2004 (gmt 0)

The whole concept of a 'filter' has always been in dispute - it now looks increasingly unlikely. Probably a stemming issue.

[edited by: WebGuerrilla at 6:21 pm (utc) on Mar. 10, 2004]

madman21




msg:87971
 8:19 pm on Mar 3, 2004 (gmt 0)

Sorry if I wasn't very clear. I went from #8 for the keyword to not being in the top 400 results, unless I use +a.
The keyword was in the middle of the title; I moved it to the beginning of the title.

landmark




msg:87972
 2:30 pm on Mar 4, 2004 (gmt 0)

Not a filter. Happens all the time.

Happens all the time, but yeah, sounds like a filter to me. Actually I think that it's a penalty - a big one. Not a penalty for being naughty, just one cos G felt like it. That's the way it is these days.

I don't have a solution, but my first advice would be to "untweak" whatever you changed.

allanp73




msg:87973
 2:52 pm on Mar 4, 2004 (gmt 0)

I am glad someone brought this topic up again. I still see the filter wreaking havoc on the SERPs. I noticed this for city terms, except a few large cities which are back to pre-Florida. If anyone needs proof of the filter I can easily provide dozens of examples.
I have beaten the filter in the past and could do so again, but it is time-consuming and interferes with my ability to build my business. In order to beat the filter it is best to have control over 3+ sites on different servers. The idea is to create a directory/authority, where only one of the 3+ sites will really see the benefit.

hutcheson




msg:87974
 2:59 pm on Mar 4, 2004 (gmt 0)

Why would Google need a filter?

There are over 100 factors involved in weighting a site. Suppose sites were weighted like paintings. "Sergei's Landscape Gallery" picks out likely landscapes by their color: the average landscape has about 50mg of blue pigment, 20 of red, and 40 of yellow; the correlation between placement of blue and yellow is high, and the correlation between red and yellow is low; etc.

Now, Sammy the Spamming Spraypainter starts creating thousands of colorwheels, tweaking the proportions to figure out just how large the blotches of blue, green, yellow, red, etc. should be. As a result, Sergei's "Landscapes" are dominated by hideous blue-green blotches.

Sergei, no simpleton, seeing that his gallery is being devalued by these spammers, introduces some new factors: orthogonal edge filters, perhaps. In fact, he hires a whole roomful of mathematicians to invent and test various weighting parameters. The blotches that used to rank so high suddenly don't. Is this a "filter" -- or is it simply the universal fact that no picture has the right to be judged forever by the same criterion, and that other equally valid criteria judge it much less favorably?
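
To make the painting analogy concrete, here is a toy version of re-weighting in code; the feature names and weights are invented for illustration and are nothing like Google's real factors.

# Toy re-weighting: the same pages, two sets of weights, no per-site filter.

def score(page, weights):
    """Weighted sum of a page's feature values."""
    return sum(weight * page.get(feature, 0.0) for feature, weight in weights.items())

pages = {
    'colorwheel-spam.example': {'keyword_density': 0.9, 'inbound_links': 0.2, 'content_depth': 0.1},
    'real-landscape.example':  {'keyword_density': 0.3, 'inbound_links': 0.7, 'content_depth': 0.8},
}

old_weights = {'keyword_density': 0.7, 'inbound_links': 0.2, 'content_depth': 0.1}
new_weights = {'keyword_density': 0.1, 'inbound_links': 0.4, 'content_depth': 0.5}

for label, weights in (('old', old_weights), ('new', new_weights)):
    ranking = sorted(pages, key=lambda url: score(pages[url], weights), reverse=True)
    print(label, ranking)

# Under the old weights the colorwheel page scores 0.68 vs 0.43 and "ranks";
# under the new weights the order flips (0.22 vs 0.71) with no site-specific
# filter anywhere in the code.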

There's a major difference in scope here, resulting in mutually incomprehensible attitudes. The ants are complaining about the elephant dancing on its food trails. But the elephant is just busy carrying logs to the other side of the clearing.

allanp73




msg:87975
 3:19 pm on Mar 4, 2004 (gmt 0)

hutcheson,

I used the term correctly. Google is filtering sites. I understand how algos work; this is not what is happening. Google is deliberately removing sites. If you can trigger the filter to turn off, suddenly the sites are back. I am not talking about ranking well or not ranking well, I am referring to appearing or disappearing. The problem is not spam. The problem is Google. Google is removing many legitimate sites which are rich in content and leaving only annoying directories in their place. Brandy partly restored these sites for select terms; however, the vast majority of terms are still experiencing the filter.

landmark




msg:87976
 3:22 pm on Mar 4, 2004 (gmt 0)

hutcheson, nice theory, but you're wrong. There is a filter, although as I remarked I suspect that it's not a filter, just a massive penalty. The effect is the same. And it doesn't just filter out spam, although it does remove some spam. Plenty of good quality sites have been removed as well.

From Google's perspective, there can only be ten sites in the top ten, so what does it matter if some quality sites have been removed? As long as the spam has been removed, then the ten results that survive must be relevant. That's what G is aiming for (it hasn't got there yet). As in your metaphor, the elephants are doing their job, but it's tough on the ants that got squashed.

Scarecrow




msg:87977
 5:32 pm on Mar 4, 2004 (gmt 0)

The word "filter" is a very good layman's description of what Google has been doing since November. The reason some Google pundits object to the word "filter" is because it demeans their own punditry. The word makes it sound like they have less control over Google than they like to pretend. In many cases, the loudest of those protesting against the word "filter" are precisely the ones who make money from convicing clients that they can improve the client's ranking in Google. An all-or-nothing "filter" throws a wrench in their capacity, as SEO professionals, to make such claims.

The fact is, these SEOs do have less control than they used to before November. And another fact is that this "filter" can still be turned on and off with stupid filter tricks that invoke the advanced search options. These stupid tricks should have minimial or zero effect logically, yet they consistently throw sites that are off the map back into the top ten.

It's a filter, and the effect it has on SEO, whether it was initially intented or not, is by now considered desirable by Google.

SyntheticUpper




msg:87978
 6:46 pm on Mar 4, 2004 (gmt 0)

Scarecrow:

Define 'punditry' and we might take you seriously.

This is the silliest word I have come across in 39 years of life - but I'm not dead yet and there is always the possibility you might come up with something sillier.

agerhart




msg:87979
 6:49 pm on Mar 4, 2004 (gmt 0)

SyntheticUpper:

[dictionary.reference.com...]

Scarecrow




msg:87980
 6:58 pm on Mar 4, 2004 (gmt 0)

Define 'punditry' and we might take you seriously.

Look it up. It's in Merriam-Webster's. And 41,000 hits in Google too. Just because you think the word is silly doesn't mean that the word is silly. It might mean that you are uneducated.

pundit - one who gives opinions in an authoritative manner - pun·dit·ry noun

SyntheticUpper




msg:87981
 7:03 pm on Mar 4, 2004 (gmt 0)

Oh

subway




msg:87982
 7:10 pm on Mar 4, 2004 (gmt 0)

I'm becoming less and less convinced of the filter theory - one very competitive site bounced back for its main 2-word key term along with all my long-term competitors (bless their cotton socks).

I personally believe that G is experiencing technical issues, patched where possible but resulting in illogical & inconsistent but not irrelevant results.

Try beating the "filter" whilst G sorts itself by submitting your site here [submit.search.yahoo.com] and crossing your fingers...

Becky




msg:87983
 10:28 pm on Mar 4, 2004 (gmt 0)

I'm not sure if this is a filter or not, but if you search for one of my main keywords and put it in quotes..."blue widgets"...then my site can be found on page two of the SERPs.

But if you search for blue widgets without the quotes, you will find my site on page 40 of the SERPs.

Anyone have any idea what this is about?

t2dman




msg:87984
 10:50 pm on Mar 4, 2004 (gmt 0)

I have two sites that this has happened to - they've been filtered where they used to be top, and hit for virtually all their pages/search terms. I emailed Google and was told that the sites "had no penalty" (yeah, right) and to look at the Google guidelines.

Both are on the same server, so it looks like I'll be transferring to a different server, with links from the mother site (existing server) to give the PR and "Authority".

Any more hints would be appreciated.

caveman




msg:87985
 11:39 pm on Mar 4, 2004 (gmt 0)

Call it what you want. We refer to it as a thing-y

For the same kw (or kw pair) in some categories, simply using different geo terms causes entirely different *kinds* of SERPs to be displayed. Not just different rankings. Entirely different *kinds* of SERPs. It's the thing-y at work.

Without getting specific, one kind of SERP profile resembles pre-Florida, but with less spam. The other kind resembles Florida/Austin, with sophisticated spam and more .edus, .govs, and directories. Both seem to contain more news items than pre-Brandy.

Unless you don't think there is a difference between pre-Florida and Florida/Austin, it's impossible *not* to see the difference if you look in the right places and swap the right geo terms.

I believe that the two sets of SERP's are different algo's in place, triggered by, um, different thing-y's.

At least this is our observation. None of us at my place really know much about all this algo / filter / index stuff.

:-)

hutcheson




msg:87986
 2:43 am on Mar 5, 2004 (gmt 0)

>I believe that the two sets of SERP's are different algo's in place, triggered by, um, different thing-y's.

This would fit in with the theory elsewhere expressed, that Hilltop or something like it is used for certain frequent queries.

IMO, these are among the more plausible theories in common circulation.

If I were to speculate, I'd also consider the possibility of applying an alternate algo for "competitive" searches -- that is, for searches where there were a great deal more hits than would be expected based on some heuristic.
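
If that speculation held, the switch could be as crude as the sketch below; the threshold and both ranking passes are entirely hypothetical.

# Hypothetical sketch: pick a ranking pass based on how "competitive" a query
# looks. Nothing here is known to be what Google actually does.

COMPETITIVE_THRESHOLD = 1_000_000  # assumed cutoff on estimated hit count

def choose_ranking(estimated_hits, standard_pass, authority_pass):
    """Use the more expensive authority-style pass only for crowded queries."""
    if estimated_hits > COMPETITIVE_THRESHOLD:
        return authority_pass
    return standard_pass

# Stand-in ranking functions for the demo:
standard = lambda query: 'standard ranking for ' + repr(query)
authority = lambda query: 'authority-weighted ranking for ' + repr(query)

print(choose_ranking(5_000_000, standard, authority)('cincinnati hotels'))
print(choose_ranking(12_000, standard, authority)('obscure widget hinges'))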

Remember, Google's attitude is algorithm-based. The chances of a Google employee ever actually looking at the results for, say, "Cincinnati Hotels" are rather low. They will, at most, have picked several "representative" cities [worldwide] to check the "Hotels" results.

chrisk2012




msg:87987
 5:40 am on Mar 5, 2004 (gmt 0)

>>I believe that the two sets of SERP's are different
>>algo's in place, triggered by, um, different thing-y's.
>This would fit in with the theory elsewhere
>expressed, that Hilltop or something like it is
>used for certain frequent queries.

Perhaps the algo that is applied to the frequent queries is more resource-intensive and Google cannot YET apply it to the entire index.

Krapulator




msg:87988
 5:51 am on Mar 5, 2004 (gmt 0)

>>The reason some Google pundits object to the word "filter" is because it demeans their own punditry.

The reason some webmasters try to promote the idea of a "filter" is because simple conspiracy theories are far easier to accept than complex changes in search algorithms which don't favour their own websites.

madman21




msg:87989
 8:34 am on Mar 5, 2004 (gmt 0)

The reason some webmasters try to promote the idea of a "filter" is because simple conspiracy theories are far easier to accept than complex changes in search algorithms which don't favour their own websites.

That is a ridiculous statement. My site is extremely relevant to my #1 search term but is nowhere to be found in the top 400+ results. When I add +a to my search, my site is in the top ten. That would suggest a filter, even though Google will not admit it via canned e-mail. The strange part is that the filter only seems to apply to that term, which is also my domain name. My site shows up for other relevant keywords.

1milehgh80210




msg:87990
 8:50 am on Mar 5, 2004 (gmt 0)

An algo itself is a form of filter. Your website may be gold one month. The next, it sinks to the bottom of Google's pan with the rest of the silt..)

mykel79




msg:87991
 9:25 am on Mar 5, 2004 (gmt 0)

madman21: find some posts here at WebmasterWorld about Local PR and how it's applied only to a handful of the most popular keywords. That theory seems more convincing to me than a filter.

landmark




msg:87992
 9:34 am on Mar 5, 2004 (gmt 0)

The title of the thread is "Beating the filter" not "Do you believe that the filter exists?".

mykel79




msg:87993
 9:49 am on Mar 5, 2004 (gmt 0)

Okay then, let me rephrase. IMHO, in order to beat the "filter" you need to get links from sites ranking high for the particular keyword you're being "filtered" on.

PCInk




msg:87994
 10:07 am on Mar 5, 2004 (gmt 0)

> The strange part is that the filter only seems to apply to that term which is also my domain name.

Google have stated that the reason for some changes is that "some sites were ranking higher than they should".

Domain name seems to be less relevant than it was.

Keyword proximity seems to have been altered. Searching for Blue Widgets is different from searching for "Blue Widgets". In the second example the words must be together; in the first, it seems as if G likes the words to be separated more than it did before. Typing in a five-word exact phrase that appears in one of your titles is not likely to bring you into the top results, unless you put quotes around the search.
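
Roughly, the difference between the two searches comes down to something like the toy scoring below; this is my own illustration, not how Google actually computes it.

# Toy illustration of exact-phrase match vs. word proximity.
import re

def positions(term, text):
    """Character offsets where `term` occurs in `text` (case-insensitive)."""
    return [m.start() for m in re.finditer(re.escape(term.lower()), text.lower())]

def phrase_match(phrase, text):
    """Quoted search: the exact phrase must appear."""
    return phrase.lower() in text.lower()

def min_gap(term_a, term_b, text):
    """Unquoted search: smallest distance between the two words, if both occur."""
    pa, pb = positions(term_a, text), positions(term_b, text)
    return min(abs(a - b) for a in pa for b in pb) if pa and pb else None

page = "Our blue handcrafted widgets ship worldwide. Blue widgets on sale now."
print(phrase_match("blue widgets", page))   # True: the exact phrase appears
print(min_gap("blue", "widgets", page))     # small gap: the words sit close together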

I don't think it is a filter, but something has changed very drastically. Such weight on domain name was getting silly - just buy 10 domain names and you could get to the number one position. In fact, companies were selling this "service" under a page-one Google guarantee. No wonder G changed it.

allanp73




msg:87995
 11:50 am on Mar 5, 2004 (gmt 0)

The way I beat the filter was to make my commercial site look like a directory. The filter looks at the site and filters on the commercial element. It is easy to determine which sites are commercial and which are directories. Before Scr**gle ceased to work, it demonstrated the filter at work. Brandy has removed the filter for certain city terms, but not most. The best way to find out how to beat the filter is to look at the sites that are still listed.
I found the way by noticing a commercial site which managed to reappear. It simply developed a directory structure. This means linking out to relevant sites and receiving links from relevant sites, and making sure these are not the same sites. It is best achieved by having multiple hosts and domains. I did it using 3 domains which were on two different servers and all had the same topic (but very different content). Two of the sites are filtered out; the other sits in the #1 spot for a competitive keyword. I know the filter is not based on over-optimizing, because the 3 sites share an equal level of optimization. It is only the linking structure that seems to trigger the filter. As I said, look at how your site links out to other sites in the same theme - that is the answer.
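
A schematic of the cross-linking layout described above; the domains are placeholders, and the structure rather than the names is the point.

# Cross-linking sketch: each site links out to on-topic pages and receives its
# inbound link from a *different* site in the set, so no two domains simply
# trade links directly. All domains are invented placeholders.
outbound_links = {
    'site-a.example': ['site-b.example', 'relevant-resource-1.example'],
    'site-b.example': ['site-c.example', 'relevant-resource-2.example'],
    'site-c.example': ['site-a.example', 'relevant-resource-3.example'],
}

for site, targets in outbound_links.items():
    inbound = [s for s, t in outbound_links.items() if site in t]
    print(site, 'links to', targets, '| linked from', inbound)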

steveb




msg:87996
 11:56 am on Mar 5, 2004 (gmt 0)

"Before Scr**gle ceased to work it demonstrated the filter at work"

Actually it showed plainly that there was no filter.
