The results are below, and somewhat detailed (lengthy), but I think looking through this thread they confirm a lot of what's being set forth.
If others here have noted similar things, or can rule this one out because all of their top 10 have matching URL, anchor text, and one of the search-term words in the domain name, I would very much like to hear about it.
I can't say our marketplace's SERPs rule out the matching URL + anchor + search term.
-But- in the four 3-word searches that dominate our site's industry, sites with matching anchor text + "three keywords" (adjacent, in order) in the title + again in content (adjacent & in order) are still maintaining top-10 positions in all four 3-word search terms.
What's puzzling (but I can't complain) is that my site doesn't comply with the above, and is No. 1. It's outranking sites with far better PR and far more keyword density (due to their page totals or better keyword optimization), and it uses frames, and old framesets at that!
I perused the various SERPs this AM, after reading Hissingids' post:
Here are the differences between my (now) No. 1 site and my competitors & my other "regular" sites, in the Top 10 under each SERP, examining all four SERPs of all four 3-word search terms:
a) My (suddenly) No. 1 site's keywords in the title are not adjacent or in order, nor are all three there;
b) The only place the keywords are adjacent & in order is in the two metas - 'description' & 'keywords';
c) My suddenly No. 1 site uses no alt tags whatsoever;
d) The keywords are never repeated in text (adjacent & in order or otherwise), except for one or two exceptions;
e) The site is very "honest", for lack of a better term - it doesn't repeat the search terms verbatim anywhere in the content. I used no optimization techniques when I wrote it (6 yrs. ago) other than a description meta and a keyword meta. Even the title isn't optimized; it's rather benign for sites in this industry, actually.
* Additionally, the two peripheral frames -have- in fact been crawled, and indexed! One of the SERPs for one of the four 3-word terms shows a page off this suddenly No. 1 site that is only linked to from a menu frame. And the page that links to the page appearing in the SERP is completely devoid of keywords, period.
Here's what's even stranger - The newly listed page contains the 3-word search term (not adjacent or in order) only once - in the title.
The menu frame that was crawled to get to the page mentioned above has no keywords in its title, no meta tags at all, no nothing. Only links.
So I think it's logical to postulate that:
1} Google can definitely crawl framesets, at least old-style ones,
2} Pages Gbot crawls and lists links from don't have to be optimized at all, but I'm guessing the linked-to pages that get indexed must at least have links to keyword-containing pages, and must have the keywords in the title.
...? Any comments on all this from more knowledgeable members?
It looks like most are relying on one page for all their traffic. That has always been a recipe for disaster in my books.
antrat, I think most of us would agree with you.
Although my bread 'n' butter site is still pulling in about 80% of the traffic it did before (the sub-pages all pull in equal amounts of visitors), it's the main KW phrase that is "banned" from the SERPs.
a_chameleon, what you are suggesting is that sites with matching anchor text, title and content are still ranking high in your field BUT your more "simple" site is miraculously ranking higher for who knows what reason.
I think the term "honest" is a good base from which to interpret Google's new algorithm. The question is, what exactly is "honest"?
My feeling is that Google is penalizing sites which it views as "dishonest", and may be basing this on an accumulation of overused SEO techniques.
Maybe using the term "blue widget production" in title, content, alt, anchor text and file name is OK as long as you don't overuse it in any one, or the total uses don't exceed some threshold point.
If I were a programmer trying to catch spam I would simply assign a point value system which recognizes known SEO techniques. Once those points reach a certain level (perhaps in proportion to words on a page, or even site-wide), the filter discredits the site. The higher the points, the lower the site in the SERPs.
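To make the idea concrete, here's a minimal sketch of such a point system. Every signal name, point value and threshold below is invented for illustration; nothing here is Google's actual algorithm:

```python
# Hypothetical sketch of the point-scoring filter described above.
# All signal names, point values and the threshold are invented.

# Points assigned to each recognizable SEO technique
SEO_POINTS = {
    "keyword_in_title": 1,
    "keyword_in_h1": 1,
    "keyword_in_alt": 2,
    "keyword_in_anchor": 3,
    "keyword_in_filename": 2,
}

PENALTY_THRESHOLD = 6  # beyond this, the page is treated as over-optimized


def seo_score(signals):
    """Sum points for every detected technique on the page."""
    return sum(SEO_POINTS.get(s, 0) for s in signals)


def rank_adjustment(signals):
    """Return a demotion amount: 0 means no penalty; higher = pushed lower."""
    score = seo_score(signals)
    return max(0, score - PENALTY_THRESHOLD)


# A page using the phrase in title, alt, anchor and filename:
print(rank_adjustment(["keyword_in_title", "keyword_in_alt",
                       "keyword_in_anchor", "keyword_in_filename"]))  # → 2
```

The "higher the points, the lower the site" behaviour falls out of the demotion amount growing with the score once the threshold is crossed.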
> If I were a programmer trying to catch spam I would simply assign a point value system which recognizes known SEO techniques. Once those points reach a certain level (perhaps in proportion to words on a page, or even site-wide), the filter discredits the site. The higher the points, the lower the site in the SERPs.
coughs loudly....
[webmasterworld.com...]
Judging from your post on the 18th of November, it looks like you have or had the same ideas as I do, which only reinforces the idea. I'd be curious to get it straight from the horse's mouth (not that I think programmers resemble horses, mind you): would such a system be practical for a search engine like Google?
If there IS indeed a point system being implemented here then we could scientifically test it out by removing "over-optimization" of one element at a time and then checking on new results after Freshbot appears.
A points system is the only way to explain why nobody can find one top ranked site which doesn't keep to every rule. It's out there in other areas fighting spam now. E-mail spam filters use point scoring, adding different points for each violation of their rules. And they work amazingly well.
> nobody can find one top ranked site which doesn't keep to every rule
Hi Sly',
What do you mean by every rule?
I have a funny feeling that we are finding lots of answers but we haven't found the right question yet.
When I explain what has happened to folks who don't know about it, the one thing that I find completely breaks all logic is the fact that my site is #460 for one search term but #1 for the exact same term when I make one of the words plural, and for virtually every other closely related term.
I go on to explain the lengths I've gone to to try and eradicate any potential duplication, and they ask why, when the problem only relates to two search terms. "Surely if you were the victim of some form of duplicate penalty it would apply to all terms," they say. And I can't explain the logic.
If Google wanted to eradicate some form of spam or generally improve results, surely that spam technique could apply to all search terms and surely the "improvement" in results should be applied to all search terms. If they wanted to penalise duplicate content then surely it is better to say so and let us get rid of it if it is causing a problem.
I'm now running out of ideas on what to do to my site to clean it up further. It now gleams with inner cleanliness, but no signs of any movement.
Becoming very depressed by all of this.
Best wishes
Sid
But I guess you don't optimize for every word on your pages :-)
> the fact that my site is #460 for one search term but #1 for the exact same term when I make one of the words plural and for virtually every other closely related term
Too much repetitious anchor text?
The theory behind it would be that if you over optimize for one phrase (you score too many SEO points for that phrase) you are dumped for that phrase.
Do you mean anchor text on the page or in backlinks to the page (including in site backlinks)?
If what GoogleGuy says is true then anchor text from outside domains cannot harm our sites, therefore it follows that anchor text on site might be seen as a spam technique.
Come to think of it, my main search phrase "blue widget production" (it seems like lots of people produce blue widgets round here, doesn't it!) got knocked shortly after I added another page and linked to it from my home page with that exact phrase. In fact, it is the only instance of that phrase being used on site to link to another page (it's actually a page that I have pop up via script, without toolbars, as a sort of explanation and definition of the term).
What's the score here? Has anyone else used the anchor text phrase (for which you no longer appear in SERPs) to link to another page on your site?
What I mean is that for every theory of why pages might not be at the top of SERPS when they used to be before Florida, there is always a page somewhere that defies the rule.
eg
Too many keywords in title - page somewhere still found at top of SERPs
Repetition of phrases - page somewhere still found at top of SERPs
Keyword density in anchor text - page somewhere still found at top of SERPs
Keyword density on page - page somewhere still found at top of SERPs
Target phrases in meta tags and on page - page somewhere still found at top of SERPs
.
.
.
etc.
So, what could possibly explain this? The only thing I can think of is a cumulative penalty: a few points for each misdemeanour, summing up to a value which says "SEO - delete for keywords".
Of course there might be mitigating factors too:
Big
Old
High PageRank
Listed in Dmoz
Each also scoring -ve points
Anyone else got a better explanation?
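The misdemeanour-plus-mitigation idea above could be sketched like this. All the factor names, point values and the threshold are made up purely to show the mechanism:

```python
# Hypothetical sketch of a cumulative penalty with mitigating factors
# scoring negative points. Every name and value here is invented.

MISDEMEANOUR_POINTS = {
    "keyword_stuffed_title": 2,
    "repeated_phrases": 2,
    "dense_anchor_text": 3,
    "high_keyword_density": 3,
}

MITIGATING_POINTS = {
    "big_site": -2,
    "old_domain": -2,
    "high_pagerank": -3,
    "listed_in_dmoz": -2,
}

DELETE_THRESHOLD = 5  # at/above this total: "SEO - delete for keywords"


def total_points(factors):
    """Sum positive misdemeanour points and negative mitigating points."""
    return sum(MISDEMEANOUR_POINTS.get(f, 0) + MITIGATING_POINTS.get(f, 0)
               for f in factors)


def filtered(factors):
    """True if the cumulative score trips the delete-for-keywords filter."""
    return total_points(factors) >= DELETE_THRESHOLD


# The same over-optimization, but an old high-PR site escapes the filter:
print(filtered(["dense_anchor_text", "high_keyword_density"]))                   # True  (6 >= 5)
print(filtered(["dense_anchor_text", "high_keyword_density", "high_pagerank"]))  # False (3)
```

A model like this would also explain why no single rule seems to hold: any one violation can be outweighed by enough mitigating factors, so counter-examples exist for every rule taken in isolation.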
Too much repetitious anchor text: five SEO points
Hi Dirkz,
Do you mean that the term affected is repeated too much in backlink anchor text, or that any text is repeated too much in backlink anchor text?
Sorry if I'm being pedantic but...
Most of the anchors that I've found on other sites use our trading identity name, which is like widgetsmart, and the affected term is widget financial. So in my case it is very rare for a backlink to use the affected term.
I've just been over my home page and have reduced the density of the two-word term to 2.5% in body text and 4.8% in on-page anchor text. It is still high in the title (28%) but does not occur in h tags.
The density of "financial" is: Anchor 4.8%, First word 8.6%, Body 8.6%, All words 10.6%.
And for "widget" it is now: Anchor 2.4%, First word 4.9%, Body 4.9%, All words 7.6%.
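For anyone checking their own figures, here's a minimal sketch of how density numbers like those above are usually computed: occurrences of a term divided by total words in a section. The sample text is made up for illustration, and real density tools differ in how they tokenize:

```python
# Minimal keyword-density calculation: term occurrences as a
# percentage of total words in a text section. Sample text invented.

def density(term, text):
    """Percentage of words in `text` that are occurrences of `term`."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(term.lower())
    return round(100 * hits / len(words), 1)


body = "widget financial services for every widget owner"
print(density("widget", body))  # 2 of 7 words → 28.6
```

Running the same function separately over the title, body, and anchor text gives per-section figures comparable to the ones quoted.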
I've eradicated any possible hint of duplication through redirects etc. and will now leave it for a while and see if what I've done helps.
I'll report back in due course.
Best wishes
Sid