
Google News Archive Forum

Beating the filter
Does anyone know how?
madman21
msg:87967
9:57 am on Mar 2, 2004 (gmt 0)

Google is filtering my site out of the SERPs for my main keyword. It is extremely relevant and I have plenty of on-topic content. This happened a few weeks before Brandy. I ranked about #9 for the keyword previously. I tweaked my site title and description and bam, filtered. If I run a search for Keyword +a I rank #1.
Any ideas on how to fix this?

landmark
msg:88057
1:17 pm on Mar 11, 2004 (gmt 0)

Granpops - you haven't lost any SERPs at all? How do you account for that? Are you talking about just one site? What kind of site architecture do you have? Backlinks? PageRank?

GranPops
msg:88058
1:25 pm on Mar 11, 2004 (gmt 0)

Lots of commercial sites for friends, as it's only a hobby for me.

However, as some of the seniors on here know, I know absolutely nothing about HTML, and could not produce a web site to save my life.

The only two things I rely on are the maths of the ALGO, and that no webmaster skills are used by the youngster who produces the pages for me.

And I mean NONE. I have a list of 37 "skills" that have been discussed on here, and they are all forbidden on my pages.

Old fashioned HTML produced by a 12 year old.

So, as the pages created by highly intelligent and creative webmasters go down, others move up without making any changes.

So I guess that rather than it being a case of "beating the filter", it may be more a matter of producing good text pages in the form that the visitor and the search engines want, rather than what a webmaster wants.

That's all I know

GranPops

SyntheticUpper
msg:88059
2:01 pm on Mar 11, 2004 (gmt 0)

Old fashioned HTML produced by a 12 year old.

I'm not sure I want to see webpages created by 12 year olds in the Google top ten ;)

Nevertheless, "What I had for school dinner yesterday" might be relevant for some...

Surely you must be kidding :)

Hissingsid
msg:88060
2:12 pm on Mar 11, 2004 (gmt 0)

Are you using relative or absolute urls for internal links.

Could you elaborate on the importance of this?

Hi,

Just a suspicion that if you have a dupe problem, it's best not to compound it by using relative URLs. If you use absolute URLs, Googlebot will only find the odd page as a dupe; if you use relative URLs, it could think that you have two entire duplicate sites.

By dupes I'm referring to the www and non-www versions of the same domain both resolving to the same index.html page with a 200 OK server response.
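A quick way to check for the situation Sid describes (a minimal Python sketch; example.com stands in for your own domain) is to request both hostnames and compare the raw status codes:

```python
# Check whether www and non-www both answer 200 OK (the duplicate-site
# situation described above). "example.com" is a placeholder domain.
import http.client

def status_of(host):
    """Raw HTTP status for GET / -- http.client never follows redirects."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", "/")
    return conn.getresponse().status

for host in ("example.com", "www.example.com"):
    print(host, status_of(host))

# Two 200s  -> a crawler sees two duplicate sites, one per hostname.
# 301 + 200 -> the duplicates are consolidated onto one canonical host.
```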

Best wishes

Sid

landmark
msg:88061
2:18 pm on Mar 11, 2004 (gmt 0)

GranPops, with all respect, from your posts it sounds like your results are just good luck and you have no real idea why you've done well.

GranPops
msg:88062
5:28 pm on Mar 11, 2004 (gmt 0)

Good luck maybe, but being a Professor of Mathematics might have a little to do with it. My hobby is attempting to create an ALGO to crack the G ALGO.

Maybe I should have added that I produced a mytown real estate site on reading of a so-called filter, and got No. 1 for the three cities chosen. All the leads I give away to those who used to be in the SERPs.

If you read a document from Stanford that is No. 1 in an extremely competitive search, it is No. 1 because of incredibly relevant backlinks.

However, my point is that it is a document that covers a particular topic in an interesting way. Every word that one can read is on topic, and here is the rub: there is nothing hidden in the code that is not on the page.

The ALGO is gradually downgrading pages that have these extra webmaster skills, leaving the clean pages to float to the top.

As for the 12-year-old, that was to make the point; she is actually 23, and perfectly capable of producing my mathematically designed page on the screen without resorting to anything hidden.

We have plenty of search terms at No.1 against millions, all chosen because they were discussed here on WW.

At least try it: create a sample page on any search term that is up against millions.

And try to remember "producing good text pages in the form that the visitor and the search engines want, rather than what a webmaster wants."

GranPops

landmark
msg:88063
5:39 pm on Mar 11, 2004 (gmt 0)

At least try it: create a sample page on any search term that is up against millions.

Try what? All you seem to be saying is that you create pages with very simple HTML. But you don't need to be a professor of mathematics to do that ;) Many pages that use simple HTML have been removed from the SERPs. Many people who understood the old G ALGO have been removed from the SERPs. What is special about your pages?

GranPops
msg:88064
5:51 pm on Mar 11, 2004 (gmt 0)

"All you seem to be saying is that you create pages with very simple HTML"

That is merely the final result

"But you don't need to be a professor of mathematics to do that"

Exactly - but it helps in constructing a page taking into account what we think this week's G ALGO is trying to exclude.

"Many pages that use simple HTML have been removed from the SERPs."

All those that stickied me admitted that there were "web design skills" in addition to what was on the page.

"Many people who understood the old G ALGO have been removed from the SERPs." The ALGO engineers are testing new ALGOs maybe 8 or 10 times a month.

GranPops

funandgames
msg:88065
7:14 pm on Mar 11, 2004 (gmt 0)

Grandpops has a point.

All 400 or so of our web pages contain nothing but HTML and lots of text.

The pages are top ten for all phrases (many competitive) and...

The pages don't get 'hit' by Google updates and in fact barely move from their top spots ever. The positions have stayed the same for five years or more.

allanp73
msg:88066
7:44 pm on Mar 11, 2004 (gmt 0)

Granpops,

All of my sites use basic HTML (no database and no junk code). I don't believe in using spammy tricks to get to the top, but these clean sites were hit hard by Google's filter. Even excellent competitors' sites were killed. For the most part, the major terms I target are dominated by quasi-relevant directories. I even see sites that cross-link to death ranking well. My approach to resolving the problem is to build good directories and hope that my rich content will once again rise to the top. I am also praying to the Google gods to wake up and see that the Hilltop algos are junk.

SyntheticUpper
msg:88067
7:49 pm on Mar 11, 2004 (gmt 0)

Dear GranPops,

On the one hand you're saying that your 'no tricks, simple html' naturalistic approach to web design automatically beats all filters due to its inherent innocence.

And then you grandly announce that you're a professor of mathematics, ahem, and that in fact, behind it all is an extremely cunning system: you've analysed the latest algo, and built it into your web pages.

Shome mishtake surely ;)

t2dman
msg:88068
9:09 pm on Mar 11, 2004 (gmt 0)

Have noticed the first large change in my SERPs since adding about 8 keyphrase outbound links: 54th to 1st for phrase = city type business. Outbound links to 2 x city, 1 x city type business, 1 x city type business's, 4 x type business's - all phrases I am trying to get to the top of.

Only problem: one of my clients, hosted on the same servers with links back to me, was first for the phrase and is now 60th.

Ahhh.

Still to see any large changes for other pages and sites. Google has cached the pages with the outbound links on; no significant movement.

coffeebean
msg:88069
10:40 pm on Mar 11, 2004 (gmt 0)

>Does the anchor text of the outbound links contain your term?
Yes, this is a must.

Have noticed the first large change in my SERPs since adding about 8 keyphrase outbound links: 54th to 1st for phrase = city type business. Outbound links to 2 x city, 1 x city type business, 1 x city type business's, 4 x type business's - all phrases I am trying to get to the top of.

So help me understand this: you place outbound links containing your exact target keyphrase as the anchor, correct?

1) Are these links to authority sites?
2) Aren't you linking to your competition?
3) Or are the linked-to sites only partially related and the anchors just variations or pieces of your keyphrase?

KevinC
msg:88070
11:40 pm on Mar 11, 2004 (gmt 0)

Well, after reading a few posts by allanp73, doing some research, and ignoring a few posts by others ;)

It definitely looks like Google is currently giving good weight to outbound links to authorities. I don't think it's necessarily directory-style sites that are being preferred, but sites that are well connected to authorities in their particular area.

Of course I'm just looking in my area - but I think allanp73 has produced some of the best advice I've seen regarding missing sites.

"extra webmaster skills"
What does this mean, by the way?

[edited by: KevinC at 2:17 am (utc) on Mar. 12, 2004]

SyntheticUpper
msg:88071
11:44 pm on Mar 11, 2004 (gmt 0)

Brett has always stated: have an outbound link to an authority site on every page.

allanp73
msg:88072
12:03 am on Mar 12, 2004 (gmt 0)

KevinC,

Thanks. It's nice to be appreciated.
Now if I could just get DMOZ to like me again.

SlyOldDog
msg:88073
12:11 am on Mar 12, 2004 (gmt 0)

Hey there Granpops

Build me a simple site and get it to #1 for my favourite 2-keyword combination for more than a month and I'll mail you a cheque for 2000 dollars a month. No bull. That's less than I have to spend on AdWords for those keywords.

We can't get past #10 and we've been trying for nearly a year.

______________________________________________________________

My personal take on the filter theory is that Google is using words on the page to link sites to a theme. This is why some sites appear filtered.

To see what I mean, consider a hypothetical search for, say, "banana processing". Google will want to present an even selection of sites covering the full spectrum from bananas to processing, containing both terms.

Now let's say that a bunch of these sites also have the word "apple" on the page. What if Google recognised the connection and decided to list only a fraction of those pages in the top 100? Is that a filter? Well, if you say so :)

Google could then go through and find the next most popular term which appears on the remaining pages in the selection. Say "food". So then, if Google listed only the most relevant page containing "food" and put the others to one side, we have the makings of what might be called a filter. But what is it really? Just a way of ensuring that themes are spread among the results.

So far as I understand it, this is how Latent Semantic Indexing could be applied to rank results.
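To make the idea concrete, here is a toy LSI sketch in Python (the term counts and pages are invented, and this is only an assumed mechanism, not anything confirmed about Google):

```python
# Toy Latent Semantic Indexing: pages sharing co-occurring vocabulary
# ("apple", "food") collapse onto the same latent theme, even though all
# of them match the literal query "banana processing".
import numpy as np

terms = ["banana", "processing", "apple", "food"]
counts = np.array([          # rows = terms, columns = five invented pages
    [2, 2, 2, 1, 0],         # banana
    [1, 1, 1, 2, 2],         # processing
    [3, 3, 3, 0, 0],         # apple
    [1, 1, 1, 0, 0],         # food
], dtype=float)

# A rank-2 SVD projects each page into a 2-D "theme" space.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
pages = Vt[:2].T             # one 2-D theme vector per page

def cos(a, b):               # cosine similarity in theme space
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

for i in range(len(pages)):
    for j in range(i + 1, len(pages)):
        print(f"page{i} vs page{j}: {cos(pages[i], pages[j]):+.2f}")

# Pages 0-2 come out nearly identical in theme space. A ranker that caps
# near-duplicate themes would keep one and demote the rest -- which, from
# the outside, looks exactly like a "filter".
```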

t2dman
msg:88074
12:40 am on Mar 12, 2004 (gmt 0)

Definitely, thanks to allanp73!

1) Are these links to authority sites?

Top SERPS on Google for each of the terms

2) Aren't you linking to your competition?
3) Or are the linked-to sites only partially related and the anchors just variations or pieces of your keyphrase?

Yes, but doesn't a magazine carry competitors' ads? If it gets us to the top, as compared to 54th or +1000, I am sure that losing several customers is not an issue. The benefit of being first is certainly worth it. I link to the most relevant sites for customers in the top ten/twenty for each of my target phrases; they are already top ten, and I get first as a result... It's them that lose out by my linking to them.

Until now I have not needed to link out much to get to first, and didn't want people looking elsewhere, like you mention. Now Google seems to require it, so now we link out.

Just looking forward to seeing what happens with all my other pages. And getting back the client's page that I have just lost.

KevinC
msg:88075
2:56 am on Mar 12, 2004 (gmt 0)

Hey t2dman,
So are you saying that you tried adding more outbounds on other pages and only one page made any big movement in the SERPs?

Drum
msg:88076
4:27 am on Mar 12, 2004 (gmt 0)

On March 5th I made the changes as allanp73 suggested.

My site had held the #1 or #2 spot for 2 years before the "filter" hit a few months ago. The filter has had me sitting at #37 for several weeks now.

Now, 6 days later, I am at #99. Sorry, but I went in the wrong direction.

I didn't change anything else on the entire site and I was not going after new links. My site does not have any spam tricks on it.

I hope I am the only one this has happened to.

It was worth a try.

phazex
msg:88077
7:22 am on Mar 12, 2004 (gmt 0)

I'm new to SEO'ing, and I'm wondering what is defined as an "authority" website. Thanks

-phazex

Hissingsid
msg:88078
8:30 am on Mar 12, 2004 (gmt 0)

It definitely looks like Google is currently giving good weight to outbound links to authorities. I don't think it's necessarily directory-style sites that are being preferred, but sites that are well connected to authorities in their particular area.

Of course I'm just looking in my area - but I think allanp73 has produced some of the best advice I've seen regarding missing sites.

Hi,

Three of my outbounds are to ODP, Google Directory and Yahoo categories that list my site. Take the hint Googlebot!

Some of the others are to high-ranking sites that offer the same service as me, but in different parts of the world. The product is one that is very geo-specific. Two are to other sites of mine, and two are to major organisation sites that do sell the service I sell, but it's not their main thing.

If you are careful in selecting the sites you link to, you can choose ones that will do you more good than harm. All of these are on my home page. I only have one or two outbounds on inner pages.

I have also broadened my theme a bit by adding pages on topics around the theme, and added links to these from my home page.

I can't definitely confirm that any of this has a cause-and-effect relationship, though. However, I am fairly confident that it has moved me up in Teoma from #7 to #1.

Best wishes

Sid

Hissingsid
msg:88079
8:32 am on Mar 12, 2004 (gmt 0)

Build me a simple site and get it to #1 for my favourite 2-keyword combination for more than a month and I'll mail you a cheque for 2000 dollars a month. No bull. That's less than I have to spend on AdWords for those keywords.

Hi SlyOldDog,

I wouldn't mind seeing that brief ;)

Best wishes

Sid

SlyOldDog
msg:88080
11:12 am on Mar 12, 2004 (gmt 0)

I'll give Granpops the first shot :)

KevinC
msg:88081
5:52 pm on Mar 12, 2004 (gmt 0)

Hey Hissingsid,

Teoma is "nice", but what about Google - did your changes cause any noticeable effect there?

Scarecrow
msg:88082
6:03 pm on Mar 12, 2004 (gmt 0)

The word "filter" is a handy euphemism for describing what no one can dispute is a two-step process since Florida.

The first step relies on precomputed variables. The old-style PageRank score is the purest example of this. The chief characteristic of a precomputed score is that it can be computed independently of knowing what terms users will be using in the search box. PageRank, for example, was computed without regard to any on-page content. All it used was external links pointing to the page.
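Step one can be made concrete with the textbook PageRank power iteration (a toy sketch; the three-page link graph is invented):

```python
# Old-style PageRank: computed from the link graph alone, with no
# on-page content and no knowledge of any query.
damping = 0.85
links = {                        # who links out to whom (invented graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
pr = {p: 1.0 / len(links) for p in links}

for _ in range(50):              # power iteration until scores settle
    pr = {
        p: (1 - damping) / len(links)
           + damping * sum(pr[q] / len(links[q]) for q in links if p in links[q])
        for p in links
    }

print({p: round(v, 3) for p, v in pr.items()})
# These scores exist before any search is run: a purely precomputed,
# query-independent variable.
```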

This first step may have been modified by now, so that some keywords are extracted from the page, and these are used to characterize the general content of the page. But without knowing the user's search terms, there is a limit to such categorization. One candidate for this possible modification is the use of anchor text in links to describe the content of a page.

_________

The second step is what we've seen since Florida, and is popularly referred to as a "filter." Let's say the user has their preferences set to show 20 results. The algo could take the top 60 results, ranked by the scores from the first step. The intention is to use the second step, the "filter," to scrape off two-thirds of these 60 results based on some new linguistic algorithm.

Two characteristics of this second step are these: 1) You already have the full-text of each page available, because at least one-third of them have to be decompressed to get the snippet anyhow; and 2) you know what terms the searcher entered.

Using some new linguistic algorithms, you now have a limited subset of pages and the user's search terms. It does not take that long to screen 60 pages. And if keywords for each page were pre-extracted in step one and stored separately from the page itself, it would take even less time.
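Put together, the two steps might look something like this sketch (the scores, keyword sets, and crude overlap "filter" are all invented stand-ins, with a pool of 3 and a cut to 2 mirroring the 60-to-20 example at toy scale):

```python
# Step 1 scores: precomputed, query-independent (PageRank-like, invented).
precomputed = {"pageA": 0.91, "pageB": 0.87, "pageC": 0.85,
               "pageD": 0.40, "pageE": 0.35}

# Keywords pre-extracted from each page, stored apart from the page text.
keywords = {
    "pageA": {"widgets", "buy", "cheap"},
    "pageB": {"widgets", "history", "manufacture"},
    "pageC": {"widgets", "buy", "discount"},
    "pageD": {"widgets", "repair"},
    "pageE": {"widgets", "collectors"},
}

def search(query_terms, show=2, pool=3):
    # Step 1: rank the whole index by the precomputed score, keep a pool.
    candidates = sorted(precomputed, key=precomputed.get, reverse=True)[:pool]
    # Step 2: the query-time "filter" rescores only the pool, using the
    # searcher's terms (a crude overlap in place of a real linguistic algo).
    def overlap(page):
        return len(query_terms & keywords[page]) / len(query_terms)
    return sorted(candidates, key=overlap, reverse=True)[:show]

print(search({"buy", "widgets"}))   # -> ['pageA', 'pageC']
# pageB survives step 1 on raw score but is scraped off in step 2, which is
# why a page can "disappear" for some queries while ranking fine for others.
```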

__________

The evidence for this two-step process comes from the "stupid filter tricks" that invoke advanced search options. These tricks apparently skip the second step, because the algorithms in this second step were not engineered to accommodate advanced search options.

I don't care if you call it a "filter" or not, but it's as good a word as any to describe this second step. What amazes me is that four months after Florida, so many pundits are still in denial that this is happening. Now that suggests a conspiracy to me.

moose606
msg:88083
8:11 pm on Mar 12, 2004 (gmt 0)

I am still confused by allanp73's strategy. Since sites A and C both link to B, and B doesn't link to either one, wouldn't it make sense that site B would be at the top of the SERPs? Wouldn't this make site B the "authority" among the sites? Does that mean that inbound links do not count, only how many outbound links you have to top-ranking sites? I would think it would be the opposite.
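The hub/authority distinction in Kleinberg's HITS algorithm, a relative of Hilltop (whether Google uses either is exactly what this thread is guessing at), may resolve the confusion: an authority is scored by the hubs pointing at it, while a hub is scored by the authorities it points to, so linking out raises your own hub score without touching anyone's authority score. A toy sketch:

```python
# HITS on the toy graph from the post: A and C link to B, B links to no one.
links = {"A": ["B"], "B": [], "C": ["B"]}

hub = {p: 1.0 for p in links}
auth = {p: 1.0 for p in links}

for _ in range(20):  # power iteration until the scores settle
    # authority = sum of hub scores of pages linking in
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    # hub = sum of authority scores of pages linked out to
    hub = {p: sum(auth[q] for q in links[p]) for p in links}
    # normalise both vectors so the iteration converges
    na = sum(v * v for v in auth.values()) ** 0.5 or 1.0
    nh = sum(v * v for v in hub.values()) ** 0.5 or 1.0
    auth = {p: v / na for p, v in auth.items()}
    hub = {p: v / nh for p, v in hub.items()}

print("authority:", {p: round(v, 2) for p, v in auth.items()})
print("hub:      ", {p: round(v, 2) for p, v in hub.items()})
# B is indeed the sole authority; A and C score only as hubs. A ranking
# blend that also rewards hub quality would still lift A and C for linking
# out, which matches what posters here are reporting.
```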

landmark
msg:88084
8:21 pm on Mar 12, 2004 (gmt 0)

For one 2-word phrase my site was #1 before Austin. Then it dropped to #186, but bit-by-bit rose back up to its current position of #27. Did I make any changes? Nope.

My point? That it is very hard to prove a statement like "I did X and it beat the filter" when I can say "I did nothing and it beat the filter".

My conclusion? The filter (or penalty) is a moving target. A target that moves like this is impossible to hit (or avoid) reliably.

I strongly believe that there is a non-deterministic element to Google's algorithm, which makes it impossible to guarantee that anyone can "beat the filter". That is why none of the theories presented (Hilltop, semantics, local rank, commercial filters, etc.) can adequately explain what is happening. I think that the real explanation is actually much simpler than all these highbrow theories.

itisgene
msg:88085
9:58 pm on Mar 12, 2004 (gmt 0)

I have been experimenting with a few things on one of my sites over the last few weeks.
For my two-word competitive keyword, I was #4-6 before and after Florida/Austin and became #3 after Brandy. When I changed the anchor text last week on one of the incoming links (also mine) to the site, I dropped to #12 within a few days. So I changed back to the original anchor text (with the exact target keywords) and added one more link to the target site. It came back to #3 yesterday and is #1 as of today.

All these experiments happened after Brandy, with Fresh updates only.

So, I think having a good targeted link from a decent (highly crawled) site will change the ranking quite dramatically. It depends on the industry and competitiveness, of course. Mine is a niche, but there are many (optimized) sites for the keywords.

Instead of giving outbound links, getting a decent link with targeted anchor text would do better IMHO.

Hissingsid
msg:88086
8:42 am on Mar 13, 2004 (gmt 0)

Teoma is "nice", but what about Google - did your changes cause any noticeable effect there?

I don't think that you can get past #1, can you ;o)

I can't honestly say that I did this stuff and it was these elements of what I did that got me back to #1. There's no proven cause-and-effect relationship in my case. I had worked my way up from #450 to #96 before Brandy; then Brandy put me straight back at #1, but other sites I look after plummeted at the same time.

The point is that the stuff I did was not harmful, but I don't know for sure if it helped. I wonder if Brandy was all about adding back in the influence of an ODP listing in a related-topic cat, and that ODP link pushed my site over the hurdle. The sites that did badly at Brandy in my case are not ODP listed, because the editors of those categories don't edit any more :o(

Best wishes

Sid

idoc
msg:88087
1:16 pm on Mar 13, 2004 (gmt 0)

"I wonder if Brandy was all about adding back in the influence of ODP listing in a related topic cat"

*I sincerely hope that is not the case* that the listing carries enough weight to influence the actual merchantability of a site.

<self sacred cow snip>
