Forum Moderators: open
As you can see above, I changed our home page and removed all repetitive keyword combinations. I also removed them from the title.
Our index page is now back from total oblivion (not in top 1000) to somewhere in the top 100. I wasn't anal enough to count the position.
So, Yankee thanks for the observation. It helped me out. Now I know that the filter can be tweaked and we don't have a total ban.
I'll wait a few days to see what happens and to be sure it's not just a fresh-listing position, and then I'll start adding phrases back one by one to improve the position.
This thread has been worth it for me.
Thanks guys.
After Florida, I lost all my two-keyword money combinations, not the three-word ones.
I initially dropped my KW density and didn't repeat any KW combination more than twice. My index page immediately began to go downhill, so I reversed the change, and now I am trying more KW text links, more outbound links, and higher KW density, not lower.
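For anyone adjusting density the same way, here's a rough sketch of how keyword-phrase density can be counted. This is plain Python and nothing official: the sample page text and the "blue widgets" phrase are made-up examples, and "density" here just means words covered by the phrase divided by total words (other definitions exist).

```python
import re

def phrase_density(text, phrase):
    """Rough keyword-phrase density: words covered by the phrase / total words."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count every position where the full phrase occurs as consecutive words.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return (hits * n) / len(words) if words else 0.0

page = ("Blue widgets for sale. Our blue widgets are the best blue widgets "
        "around, and we ship blue widgets worldwide.")
print(round(phrase_density(page, "blue widgets") * 100, 1))  # prints 42.1
```

That sample page would look heavily "over-optimized" by any measure; the same function on a de-optimized page gives you a number to compare against before and after an update.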
My fear is that G is forcing everyone to choose between optimizing and de-optimizing, so that you will only rank well on G. What happens when Yahoo drops G and starts to use INK? Do we then have to manage two sites, one for Google and one for the other SE?
Unfortunately, there is no way to check all backlinks, because there is no way to list all of them (neither Google nor alltheWeb.com will provide you with that information).
Interesting, though. We've seen lots of similar cases since Florida.
OK, but what if your destination is three words?
Here in Spain we have many different Costa ... ...
So this is a problem: we now have to write just Costa in our pages, without the rest.
Our destination is not a one-word city name like London or Paris; it's three words.
Usually we have:
keyword Costa ... ...
Now all the best sites for the region Costa ... ... have gone. We have to write this phrase because of our location, and Google is dropping all these sites because of this three-keyword location.
Google treats this as three words and because we repeat this twice in our page we have been dropped.
I don't see the logic of this filter...?
Now we have to write "come to the Costa and rent on the ... of the ...", which seems crazy.
This may be the reason that many real estate and travel sites have been badly affected.
Though I think your attempt to de-optimize to test Google's filter was a noble effort, I do not think this is what got you back in the SERPs. I noticed several sites coming and going which made no changes to their text. I had de-optimized too, and my site temporarily came back. But now it is gone again, and it is still de-optimized. I don't believe de-optimization ever worked. It might work only in those cases where the keyword density was so high, like 40-100%, that almost any search engine would consider the keyword usage spam, but lowering the keyword density below 5% is silly. I have personally given up on Google; when you look at the sites that rank well, you will see that they are predominantly directories. I think any attempt to get Google to like your site must focus on determining what elements Google looks for in a directory and emulating those characteristics.
I think any attempts to get Google to like your site must be focused on determining what elements Google looks for in a directory and emulate these characteristics.
Hi allanp73,
You are assuming that what you see in those directories that turn up at the top of the SERPs is what got them there, i.e. by inclusion. What if they are just left there because other sites were excluded?
For example, in my case it looks like, in order to get back to #1 for the top search term, if I follow what you suggest above, I need to develop 50,000-plus backlinks, have a page of little relevance to the subject, one link on that page to another page (again of little relevance) including the search term in the anchor text, and include my own Espotting ad on both pages.
Oh and give up all of my secondary terms on Google and all of my terms on Inktomi, Fast, Teoma and AltaVista.
It could be that some of those top ranking sites (Steveb's being a notable exception) are there simply by default.
Best wishes
Sid
PS Sorry if parts of this sound a bit sarcastic; I'm not trying to have a go at allanp73. I guess the frustration is getting to me.
For example, in my case it looks like, in order to get back to #1 for the top search term, if I follow what you suggest above, I need to develop 50,000-plus backlinks, have a page of little relevance to the subject, one link on that page to another page (again of little relevance) including the search term in the anchor text, and include my own Espotting ad on both pages.
Same with me, but the site above me is tripad... with over 1 million web pages and 202,000 backlinks, with the keyword written once.
I have to somehow create another 990,000 pages of some type of content, with only 2 pages related to the keyword, then get another 201,950 backlinks, and then I should be No.1 again :)
No, I won't go down that line. I will carry on building my site with 200 pages, each with relevant and unique content, then increase my backlinks from maybe 50 good, relevant web sites that I feel are worth linking to. (If Google doesn't like it, hard cheese; every other search engine likes it, and I am not going to change the way I have made web sites for the last 4 years just to please Google.)
Google has punished our site and my competitors' sites that provide a good local service and good solid information. I never thought I would feel sorry for my competitors, but I do. I know too well that they have worked hard and suddenly vanished for no reason at all.
One site I know of that vanished hardly links to anyone but was well ranked; they have now gone, so it wasn't a link issue. The keyword density was also the same as a competitor's site.
It does appear that your site requires more than 500 pages to survive; all sites with fewer than 100 pages that I know of have gone. A little unfair, I think, as there is only so much one can write about a blue widget!
he he. Thanks Alan, but I'd hardly call it that. Greed and desperation would be closer to the truth.
I have done this twice. Once a few days after Florida, and again 4 days ago. This last time I announced I was going to do it in this thread just so it would not look coincidental.
Both times as soon as I was crawled the site came back. The first time with all the other sites, and that's why I was sceptical and didn't try again for a while.
As you say, maybe it's a fresh listing. I'll let you know.
Though you seemed not to like my point, it does reflect the reality of the situation. Google has in many categories filtered out the sites that offer content and left only directories in their place. Now, getting re-established may not be as hard as you think. The question is: what does Google like about directories? How can we better approximate those qualities of a directory? Sure, directories have thousands of inbound links, but some of the high-ranking sites are low-PR sites, so this is not the only element. Sure, directories have low keyword density and little relevance, but there are sites still ranked with a reasonable keyword density, so this isn't necessarily the complete answer either. My main point was not to de-optimize based solely on keyword density, but instead to re-optimize based on what seems to work now.
Sorry I was in a very negative mood. I think I should be restrained from typing when that black cloud comes over.
I recently went back and read some of the early posts in that massive thread on the Florida update, which started (the thread, that is) around 14th November.
One of the first things that was noted was that it was index.html pages that were hit. My index.html page is the only one I have that ever ranked at #1 for the term affected. So I went back and looked afresh at my SERPs. Out of the top 100, only the #1 result is an index page with no supporting sub-pages. The notable things about this page are its very low density of the term searched for, high relevance only for the general "field" of the term, and a very high relative number of links to other domains (I think they are all owned by the same organisation but are brand-name domains, like brand-financial.com). Interestingly, it has a main.html and an index.html which are exactly the same page, duplicated and given another name.
The pages listed from directories that are in the top ten are all inner pages, as you would expect. From memory, virtually every page that dropped out of the top 100 was a domain default page like my own. If I search for one of my target secondary terms that I rank highly for, the SERPs show either one or two inner pages, or the default page supported by a very relevant page inset under that listing.
I am thinking that to get a high rank for my main target two-word term, I need to do more optimisation on inner pages, focusing on one page, and get some backlinks pointing at it with the right term in the anchor text.
Directories and big firm sites often have very general index pages not optimised for any particular terms with very specific inner pages.
A long, long time ago Brett wrote in his article on Theme Pyramids:
It may be a shock to some, but the index page has very little SEO ranking value. They rarely rank well -- you will rarely get them to rank well -- there is very little that you can do to change its ranking. The best you can hope for is that, if a couple of good directory listings come through, an engine or two may take notice of your root page. A well-ranking root index page is the exception rather than the rule. When one does rank well, it is usually as a result of external off-the-page factors.
In my niche this wasn't correct, and then, "wham bam thank you Florida man", it became 100% absolutely correct.
Could this be the key? Do any of the seniors/mods reading this thread know for sure that this is the key and were just waiting for the realisation to dawn on the rest of us poor suckers?
Steveb, what do you think?
Callum, you started a thread on this topic on May 7th 2002. What do you think now?
Brett, please give us the benefit of your wisdom?
Best wishes
Sid
It appears, at least in our niche market, that Google is expanding the filter.
Post-Florida, only our #1 money keyword - blue widgets - was hit.
As of today, blue widgeting is also being filtered; our site and all of our competitors are now gone for blue widgeting, replaced by redirects, educational sites, and mainly non-relevant sites.
I had hoped that things would start to turn around at Google, but it appears they are adding more "words" to their filter.
Anyone else seeing this?
it sure looks like the big G is plowing ahead with the "clean-up" so to speak.
In my area it looks like, if you're using a kw or kw phrase more than once in any tag or paragraph, you are subject to the hangman...
steveb,
I've got plenty of inner pages bringing in all sorts of traffic but the main kw phrase is awol.
The filter here is applied on a site wide basis, so regardless of whatever new pages and directories I create I don't stand an iceberg's chance in hell of getting results for that kw phrase.
This leaves me 2 choices:
1. Keep trying to figure out what makes the filter tick and risk losing my good results in all the other search engines
2. Create a new site and go for it again (lots of work)
Any ideas?
"Stemming" is when the root of a word (often a verb, sometimes a noun) is the focus of the search term.
For example, a search for "search engine failure" could return results for "search engine failures" if stemming is applied.
Similarly "screw seos" could return the same results as "screwing seos".
What Google has done here however does not appear to be pure stemming, although it's not clear exactly how they are now interpreting nouns.
I've seen numerous cases where "blue widget producers" is now coming up high in SERPs where previously "blue widget producer" was - and no longer IS. This suggests a filter.
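To make the stemming point concrete, here's a toy sketch. This is a naive suffix-stripper, nothing like a real stemmer such as Porter's, and the queries are just the "blue widget producer(s)" examples from above; it shows why a stemming engine would collapse both queries to the same terms while an exact-match engine treats them as different phrases.

```python
def naive_stem(word):
    """Toy stemmer: strip a few common English suffixes.
    Far cruder than a real Porter-style stemmer."""
    for suffix in ("ings", "ing", "ers", "er", "es", "s"):
        # Require a reasonable root length so "is" doesn't become "i".
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[:-len(suffix)]
    return word

def stemmed_terms(query):
    """Lowercase, split, and stem each word of a query."""
    return [naive_stem(w) for w in query.lower().split()]

print(stemmed_terms("blue widget producers"))  # ['blue', 'widget', 'produc']
print(stemmed_terms("blue widget producer"))   # ['blue', 'widget', 'produc']
# Under stemming, both queries reduce to the same terms and so match the
# same documents; exact matching would treat them as different phrases.
```

If Google were doing pure stemming, the singular and plural SERPs would look alike; the fact that one form ranks and the other is gone is what suggests a filter rather than stemming alone.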
customdy,
it sure looks like the big G is plowing ahead with the "clean-up" so to speak.
Yes, I thought they might have been spending this time tweaking the algo/filter so as not to punish good, clean, relevant sites, but instead it looks like they are adding more words to the filter; guess they don't see the problem. Blue widgeting was our #2 word. I know our #3 word; guess it is just a matter of time now.
That's the main problem I see with the new algo these days, and why I think it's niche sites that seem to be the worst "innocent bystander" casualties - the algo is too harsh on lots of repetitions of a kw phrase that might, of necessity, be repeated a lot on index pages and menu bars of niche sites. I'm sure Google will adjust this.
The "explanation" of the algo is pretty straightforward, really - Google has targeted excessive density at the centre of a nebulous "concept sphere".
The concept sphere includes stemming, but it is more than that - it also covers related phrases and broad topical categories. Think of these as stars orbiting the centre of a concept galaxy at various distances from the sun - the broad keyword "concept".
What is Google looking for? A neat spider's web that runs throughout the galaxy.
What is Google not looking for? A big dark clump at the centre.
It's that simple, really... it's just not something you can reduce to a formula. Which is great - that laundry list formula of SEO wasn't just wrecking the internet - it was also dull.
Once the few remaining bugs are ironed out, I think this thing is going to be great for the internet.
I'm convinced the G is targeting highly competitive words and word combinations, and they are probably adding more and more evil keywords to their dictionary by the minute.
theitboy,
The main kw phrase that has disappeared COMPLETELY is not in any menu at all, and is only used once as anchor text on the index page, but many external sites link to me with the same text.
I agree with you that Google is targeting excessive density for certain words or phrases, but what I'd really like to know is if the G is just looking at the index page (which I doubt) or whether it's a site wide accumulation.
This leaves me 2 choices:
1. Keep trying to figure out what makes the filter tick and risk losing my good results in all the other search engines
2. Create a new site and go for it again (lots of work)
Any ideas?
----
I concur; however, from a test that I just finished, I can tell you that making a new site really doesn't seem to make a difference.
I put one up three weeks ago thinking the same, boosted it with a few choice links, and already got it to a PR4 - basically the same thing.
It's in the same industry I believe you are in - travel. Sure, I can get good first-page rankings for lots of terms, some even for keyword 1 + keyword 2, but simply not the big money keywords.
Example: Paris Hotels - nada; Paris Resorts - no problem.
Again, IMHO it is a filter, and to make it for any big two-keyword terms, you'd better have hundreds (maybe more) of good-quality links coming into your site.
PR means diddly SQUAT.
For the two-keyword money terms: it's the links, the title set-up and, from what else I see, it has very little to do with what is actually on the page.
No way is it an OOP - It's simply a money keyword filter.
For example, I have attained a few #1 placements out of 1.7 to 2 million results with a title stuffed with three KW phrases, each using the same city name. The title exceeded 125 characters.
If it were OOP that page would get dumped.
I'm convinced the G is targeting highly competitive words and word combinations, and they are probably adding more and more evil keywords to their dictionary by the minute.
Totally agree. I could see that I may be over-optimized for blue widgets, but I definitely was not over-optimized for blue widgeting... It shows up only once on the index page and nowhere else. I do have separate instances of blue and widgeting, but not together, and only a few of widgeting - yet it was a good #2 money word. Now totally gone. At least I have a lot of company; almost all of my competition is also gone. There is still one competitor with a #3 position. It is a banner ad that he pays for on another site that does an auto-redirect to his site. Nice. Way to go, Google.
It's the links, the title set-up and, from what else I see, it has very little to do with what is actually on the page.
Trawler, I have to beg to differ a little. I have had a 6-year-old abandoned site - a very, very "honest" site - shoot from 300 or so to No.1, and it's not title-optimized; the keywords (a 3-word search term) aren't even all included in the title.
Stranger even, the 3-word search term that ranks this site #1 only appears in its verbatim form two or three times within the site's 10-page content. However, if a human editor were to review the site, it would definitely be listed as what it's presented to be; it's very focused on one topic/product only.
I will say that the site's link structure is very simple and easily "crawlable", which I'm certain has something to do with its sudden rise to the top; and its text content is very simple, all bold and centered.
I think that's also part of its sudden appeal to the new filter/new algo, or whatever it turns out to be... but I could be wrong. The site is listed as my homepage in my profile, and the three-word search term is in my "Interests", if you'd like to check them out.