My sites are all over the place. Algo or filter, it's the same thing, since the algo filters out whatever it deems irrelevant.
My sites keep dropping and then returning. Most of my expansion plans are based on getting good SERPs, and those seem unattainable in the current climate. I used to be there for both plural and singular; now I am there just for the singular, etc.
This is indeed a smart algo. An anti-webmaster algo. I am sure some people will not have the stomach for the fight.
I SEO 5 sites. Three are doing great. Two are filtered/algoed (it doesn't really matter which you call it). All use heavyish but legal SEO techniques: metas, h1s, keywords, link texts.
So what's the difference? All have good content. In my view, one bad one simply doesn't have enough text to balance out the SEO stuff (interior designers: designery style, lots of photos). The other has over-optimised noframes text to compensate for a mainly Flash opening page.
So I believe what I need to do to beat the filter/algo is get a better balance of SEO-type stuff to other content on both bad sites.
My two pence.
The problem for another set of sites we have is this duplicate-content penalty thing. See [webmasterworld.com...]
All the sites have been independently promoted in terms of links and content, yet Google has somehow connected them, and they do not appear on the same page of results. Now I am in the process of removing all similarities whatsoever. If it is the info from the toolbar, I guess there will be no return.
""Before Scr**gle ceased to work it demonstrated the filter at work"
Actually it showed plainly that there was no filter."
I know that you are delusional and believe the Scr**gle results were the same as the allinanchor results, but that was simply not the case. There are just too many examples of the filter acting only on a certain set of keyword phrases, rather than acting as an algo that would be applied to the SERPs as a whole.
Austin increased the usage of the filter. For example, in the real estate field, Florida hit the major cities + "real estate" phrases very hard, whereas smaller cities were unaffected. However, after Austin, the smaller cities and towns were affected, and secondary terms were also targeted, like "homes, condos, luxury," etc. Brandy backed off on a few major cities, but the majority of the effect of Austin is still in play.
It is obvious that the filter is applied to the results after they are compiled. Like a sieve removing unwanted particles, the filter removes commercial sites. If it targeted spam I would not complain, but it targets mainly honest, content-rich sites and leaves directories. Personally, I do not feel that DMOZ and directories like it should be considered the best results. If I wanted a directory I would just go to a directory in the first place; why waste my time with Google?
allanp73, almost everyone else has climbed off the filter dead horse, but you keep trying to ride it.
The Scr**** site showed clearly that results were not filtered but ranked completely differently. Sites that didn't fall or get lost all changed ranking relative to each other. Once you get over your delusion you should be able to start to understand what is going on better. And for heaven's sake, get out of your niche. Phenomena that may be occurring there are not occurring elsewhere.
Google always has and always will filter results for various things. But the few who still stubbornly think the recent major changes are the result of a filter really need to address their own tunnel vision before they address ways to work on their sites.
Is there nobody else but allanp73 who knows of a method to beat the filter/new-algo?
Actually, it seems that you are the one with tunnel vision. I work in hundreds of fields, for myself and for clients; real estate is just one field I work in.
I wanted to state that my method to get around the filter uses whitehat techniques and I would never want to employ spam or encourage others to spam Google. I believe in ranking high by developing quality sites. If Google likes directories, I build directories and good ones at that.
There is a lot of sense in your answer, and it fits with what I have seen since Florida. Folks who escaped Florida posted a lot of theories about the sites of those of us who did not. Austin changed some of those same folks' ideas. I never believed an algo change alone would hit selected keywords and industries and not everyone. My best guess now is: the regular old Google algo, with semantic filters applied after the fact, and then a Hilltop or bell-curve-type filter applied after the fact again, just to target those pages that do too well at passing the first-level semantic filters. So I guess I believe in two sets of filters, then... call them what you will. I call them candy-coated filters myself.
I have added 10,000 lesser-quality googlebot pages across varying domains to pass external anchor text. I am planning a themed small directory for the main domain and want to keep these googlebot pages external to the main site. Also, I have selected certain pages and reduced keyword density, both by reducing frequency on some and by increasing overall page size on others. I think I can find out how to pass Google's filters without adversely affecting the main site in Y and MSN. It is too early to see results yet, but that is the ultimate goal now: do no harm to the MSN and Y SERPs while trying to chase Google. Someone here referred to this as chasing a dog who is chasing his own tail, and to me that about sums it up right now.
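To put a number on the "reduce frequency vs. increase page size" tweak described above, here is a rough sketch of a keyword-density check. This is my own illustration, not anything Google has published; the tokenizer, the example page, and the phrase are all made up:

```python
import re

def keyword_density(page_text: str, phrase: str) -> float:
    """Rough keyword density: words belonging to occurrences of the
    phrase, as a percentage of the page's total word count."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words) if words else 0.0

# The two levers mentioned above: cut occurrences of the phrase,
# or grow the page with other copy so the same count weighs less.
page = "widget hire in widgetville " + "plain filler text " * 20
print(round(keyword_density(page, "widget"), 2))
```

Either lever lowers the figure; which one helps (if either) is exactly the guesswork the rest of this thread is about.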
Googleguy addressed this question himself:
|"Has Google applied some sort of OOP or filter to the algorithm since the Florida update or was the drastic change in SERPs purely the result of new ranking criteria?" |
With a pretty straight answer nearly 3 weeks ago:
|It's the second one. People post around here about filters, blocking, penalties, etc. etc. A far better explanation is "things which used to work before don't receive the same amount of credit now." It's natural for people who are way out there with their linking strategies or their page-building strategies to think of a drop as an over-optimization penalty, but it's more realistic to conclude that Google is weighting criteria differently so that over-optimized sites just aren't doing as well now. |
By all means, if you don't like Googleguy's answer, keep up the conspiracy theories...
I think a better discussion would be "WHICH of the 100 points of weighting criteria are being weighted differently...."
I think that the reason many are floundering to understand what is going on (myself included) is because there are so many things going on.
I think that the +www thing that folks point to as absolute proof of a filter is actually better proof of some form of pre selection for a range of terms but even if that is the case there are still other new things going on as well.
I've seen gateway pages stuffed with terms in the top 10 for very competitive commercial terms with very few backlinks, and I've seen pages with no associated terms on the page ranking on exceptionally relevant backlinks (not loads, just relevant). I've got one site that does very well for all of my target terms and one that does very badly (but did do well pre-Brandy). The latter does not come back for a term it was in the top 3 for, even when I do the +www thing.
If you are in the 1000 results available, you can improve your position by adding backlinks and by a range of on-page actions, including using stems of the search terms and other close semantic matches, adding outlinks with relevant anchor text, adding pages targeted at close semantic matches, changing the anchor text pointing to your page that is in the top 1000, etc.
Why you don't get into the top 1000 at all is a different matter. I've seen one example where the www and non-www versions of a domain, seen as duplicate subdomain sites, caused both versions to be dropped (not in the top 1000). The owner got his ISP to switch off the non-www version and got back into the top 10 for the other version within a couple of weeks.
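For reference, the more usual fix for the www/non-www duplicate problem, rather than having the ISP switch one hostname off entirely, is a server-side 301 redirect so only one version ever answers. A typical Apache .htaccess sketch (assuming mod_rewrite is enabled; example.com stands in for the real domain):

```apache
RewriteEngine On
# Send any request for the bare domain to the www hostname
# with a permanent (301) redirect, leaving one canonical version.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```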
I might be wrong, but this is my opinion, FWIW, and I must stress that these ideas are presented for discussion. I think one of the symptoms we are seeing is caused as follows: having duplicates and closely associated sites on the same topic leads to one of them potentially getting into pre selection while the others do not. One way to overcome this would be to dilute the cross-linking with outlinks to other on-topic sites, reduce the cross-linking, remove unique words that both sites share from one of them, and work on broadening the language of one of the sites. In short, make them look totally independent to a technology that can map link structures (and therefore spot too-tight clustering) and can fingerprint the on-page language. Oh, and look very carefully at your index page: do you have any outlinks from that page, or are you too introspective?
Does any of this ring any bells with anyone here?
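For what it's worth, the "fingerprint the on-page language" idea is easy to picture with word shingling, a standard duplicate-detection trick. A toy sketch of my own (the sample pages and the 3-word shingle size are arbitrary, and nobody outside Google knows what method or threshold, if any, it actually uses):

```python
def shingles(text: str, k: int = 3) -> set:
    """Split text into overlapping k-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets (0..1)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two near-duplicate pages differing only in the final word
# still share most of their shingles, so they score high.
site_a = "quality widget design services for homes and offices in widgetville"
site_b = "quality widget design services for homes and offices in gadgetville"
print(round(similarity(site_a, site_b), 2))
```

Broadening the language of one site, as suggested above, is exactly the kind of change that drives a score like this down.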
|I think that the +www thing that folks point to as absolute proof of a filter is actually better proof of some form of pre selection for a range of terms but even if that is the case there are still other new things going on as well. |
BINGO - most of the debate in this thread is about terminology.
I do not think there is a filter (but that's because of the way I would define "filter").
What Hissingsid is calling "some form of pre selection" is what others call a "filter" (I wouldn't)
I think the idea of a "filter", in which specific keywords have been targeted has been dismissed.
In its widest definition, PR could be considered a "filter".
Maybe we need to agree on some terminology.
Exactly. Every time these threads start to reach some meaningful discussion, they get sidetracked into the pro-filter/no-filter argument. Just ban the word 'filter' and substitute 'effect' or something... )
In order to beat the filter it is best you have control over 3+ sites on different servers. The idea is to create a directory/authority, where only one of the 3+ sites will really see the benefit.
Can you say a little more about creating a directory/authority?
Also do you mean that 2 of the sites would probably fail the filter?
|In order to beat the filter it is best you have control over 3+ sites on different servers. The idea is to create a directory/authority, where only one of the 3+ sites will really see the benefit. |
Absolutely no disrespect intended, but it's notable that since the recent 'improvements' in Google's algo, many of us who previously relied on content are now thinking like spammers. It's a crying shame, and the big spammers will always be ahead of us. I've bought a few extra domains myself. It shouldn't be necessary, and it didn't use to be necessary; I guess that's the nature of 'progress'. Hmmm..
"since the recent 'improvements' in Google's algo, many of us who previously relied on content, are now thinking like spammers"
I think that is the thing confusing people... I always thought of myself as a "white hat" because I don't cloak or redirect or steal content or otherwise sell family-type services with techniques that were once the mainstay of the "adult only" industries. I don't think Google differentiates us like that. I think that to them, if we attempt to manipulate their SERPs, we are all the same. It seems to me that they have targeted SEO as much as commercial terms, and that this end has taken priority over the quality of the SERPs themselves. IMHO it is a big mistake for a business to let outside forces have that much influence on the overall product it delivers. Just my two cents.
Luckily Yahoo is using PreFlorida Google technology. Things that won't work on Google anymore still work on Yahoo 'new' technology. :)
I agree that it would be nice if this conversation stuck more to its initial topic.
As I'm BUILDING the site, not just SEO-ing it, and running a business as well, I don't have enough time to experiment on how to beat it.
I can only give an example of why I feel there is something to beat. I'm in the travel industry, and my town guide is for the town Widgetvile in Neverlands. I have the domain widgetvile-neverlands.com and, of course, there are mentions of Widgetvile and Neverlands all over the place. Guess what my two problematic words are?
I rank extremely well for "widgetvile province", "province neverlands" and a few others, but I dropped from #2 to outside the first 1000 for "widgetvile neverlands" after Brandy. After some reduction of "widgetvile neverlands" across all parts of the page, I got back to #30.
OK, let's not call it a filter; call it an algo reducing the weight of parameters in certain cases. But there is certainly a need to SEO some pages in the opposite direction (artificially reducing keywords?) to avoid being hit by that part of the algo.
Don't know if I'm missing something, but can't you check the title theory by looking at what's coming up top in your SERPs? What are they using for titles?
They are using similar things, more or less the same phrase. They probably just avoided some poisonous combination or concentration (URL/title/headers/text). But guessing which exact ingredients, and in what concentration, become poisonous is a tricky business. Also, it's tricky to know where to stop decreasing the keywords, in order not to lose their initial relevancy.
I'll try to explain how I got around the filter so that it will be easy to reproduce.
In my case I was fortunate that the three sites I needed were already at my disposal. I had two sites for the same niche term (actually one site was for the husband (site A) and the other for his wife (site B), who both work in the same industry). The two sites had unique content but the same level of optimization (h1s, titles, etc.). The third site was a directory (site C) for a general term that links out to the niche sites. All sites had good outside links and PR4 or 5.
Before Florida, site A was #1 for two years; after Florida, it was not in the top 1000. Site B was built soon after Florida, so it wasn't ranked yet. Site C targeted a different set of terms.
On site A I added links out to various resources (weather, news, etc., related to the target city term, plus sites from different cities but the same theme), and I also linked to both sites B & C, all from the index page. Site C links to B, but site B links to neither A nor C.
Once Google updated, site A reappeared at #40, then gradually reached the #1 spot. I made no other changes except adding these outbound links.
I noticed that the internal pages of site A, which targeted other niche city terms, did not rank well. I added the directory-style links to these pages and suddenly they appeared for their target terms.
Now the site is #1 for primary and secondary terms.
To a robot the site would appear to be a directory.
I have repeated this experiment with clients' web sites with the same results; so far I have done so with 5 clients.
allanp73, do I understand correctly that neither B nor C links to A? If so, why are B and C needed for A to do well? Couldn't A have linked to any sites (as replacements for B and C)?
Caveman, sure, you're right. My fear is of accidentally benefiting a competitor (it's a very competitive industry), and I was being stingy about any potential loss in traffic. Still, it is useful to be able to control the content and thereby achieve maximum "themeness". Also, the way the links are set up, site A looks like an authority, in that it points to both relevant directories and same-topic sites.
Actually, with my clients' sites, for most of them I linked out to completely unaffiliated sites.
If you have no fear, then go ahead and link to your competitors from your main page.
|Luckily Yahoo is using PreFlorida Google technology. Things that won't work on Google anymore still work on Yahoo 'new' technology. :) |
It's amusing to see on Yahoo's masthead, just above the search box, the following:
|Instantly find the most popular hotels in 46,000 cities. Try it: City Name Hotels |
That's something that was nearly impossible to do in Google for several months. Intriguingly, at least in our location, the single/plural stemming has been disabled in the past few days and we can now actually find individual hotels again. Wonder if that heading is Y taking a subtle dig at G.
Thanks allanp73! I will give this a try.
allanp, i've recently added links on my homepages to charitable organizations that are related to the sites' topics. charities aren't competitors, so i don't mind linking to them. we'll see if this "linking to authority sites" has any effect on my serps. i assume it won't hurt my serps, and a little push for charity from my sites is a good thing all round. something others might want to consider...
I linked to four high ranking authority sites whose names happen to include the keyword I am being filtered for. My site is religious so I don't mind sending them traffic. I'll let you guys know if it works.
Good luck, madman21. I found it takes about 6-10 links out; it does drain some PR, so be careful.
Avoid crosslinking as well.
That's OK. I currently don't have any PR due to a domain name change. My old domain's -www is PR4 and +www is PR3; both are 301ed to www.newdomain.com.
By the way, my new domain name is the keyword for which I am being "Filtered". Any input on this?
Madman21 are you sure it is not a Florida-filter but rather a dup content filter?
I have a couple of sites for a location. One is finished and the other is just a page "under construction". The site with all the content is not in the first 3 pages for its keywords, but the "under construction" page is. The thing is that the "under construction" page gets a link from a high-PageRank page. Therefore Google is placing more relevance on a link from an authority site than on quality content... daft, eh?
Would a dup content filter apply to only one keyword?
My other keywords are ranking OK. Google dropped my old domain with the last update and picked up my new one. Googlebot spiders my page every day and has indexed about 10 of my pages.