This 69 message thread spans 3 pages.
Avoiding the over-optimisation penalty
...are the rules upside-down now?
This was posted under the "dropped-site checklist thread", but I'd like to know more about it:
Have you more aggressively optimized recently?
Internal changes that can lead to potential problems include:
• More aggressive kw optimization, e.g., changes to Titles, META's, <Hx> tags, placement and density of kw's, etc.
• Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's.
Given that the above were more-or-less the rules of SEO, I'm confused now - are we supposed to *stop* doing all the above?!? In which case, how do you indicate what your target keywords/subject is?
I've been out of the loop for a little while, so I would appreciate any pointers to threads on this subject (wish there was a search function for the forums!), or a re-visit to the topic by our learned members.
If these were the old "rules", what are the new ones:
1. Keywords in Title, Meta tags, H1 - all matching.
2. Keywords mixed into other Hx tags
3. Keywords at beginning, middle, end of page - good density, but natural
4. Keywords in alt text of images
5. Keywords in anchor text of links to your page, matching the keyword phrase used in your Title/Meta/H1 from above.
6. More links to the pages that you want to have more PR - i.e. link back to home page on every page, assuming that home page is important!
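As an aside, the on-page checks in the list above are mechanical enough to audit with a script. Below is a rough sketch in Python (stdlib only) that counts where a target phrase appears on a page. The sample HTML, the phrase, and the set of slots tracked are all illustrative; nothing here reflects any threshold Google actually uses.

```python
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Count where a target phrase shows up on a page: title, headings,
    anchor text, image alt attributes, and plain body copy."""

    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.stack = []  # currently open tags, innermost last
        self.hits = {"title": 0, "h1": 0, "h2": 0, "a": 0, "alt": 0, "body": 0}

    def handle_starttag(self, tag, attrs):
        if tag == "img":  # void element: count its alt text, don't track nesting
            alt = dict(attrs).get("alt") or ""
            self.hits["alt"] += alt.lower().count(self.phrase)
            return
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        n = data.lower().count(self.phrase)
        if n == 0:
            return
        for slot in ("title", "h1", "h2", "a"):  # credit the first matching slot
            if slot in self.stack:
                self.hits[slot] += n
                break
        else:
            self.hits["body"] += n

# Illustrative page: the same phrase in title, H1, anchor text, and alt -
# the pattern this thread suggests may now look "algo aware".
page = """<html><head><title>Blue Widgets</title></head>
<body><h1>Blue Widgets</h1>
<p>We sell blue widgets. Our blue widgets are great.</p>
<a href="/">Blue Widgets Home</a>
<img src="logo.gif" alt="blue widgets logo">
</body></html>"""

audit = KeywordAudit("blue widgets")
audit.feed(page)
print(audit.hits)
```

On the sample page this reports the phrase in every slot at once, which is exactly the "all matching" pattern items 1-5 describe.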
Comments appreciated, need to dig myself out of the SERPS hole I just fell into...any guidance on what the new rules of SEO are is much appreciated!
Vary the content slightly between the Title, Desc, and H1 - e.g. longest in Desc, medium-long in Title, short in H1. Avoid keyword stuffing in image link Alt tags beyond one word. I think the old way was to repeat the same phrase in all three.
If you spend the effort in fixing off-site stuff, it will be worth it.
Thanks for the reply!
Good tip for Title/MetaDesc/H1 - does anyone disagree with this, or is this now accepted practice? Repeating the same phrase in all 3 is indeed what I have done to date!
What do you mean by:
"Avoid keyword stuffing in image link Alt tags beyond one word." Do you mean only one word in the alt tag, or only one instance of keyword in the alt tag?
Also: "If you spend the effort in fixing off-site stuff, it will be worth it." Fixing which off-site stuff? Should I get people to put random text in the incoming links?
It seems that the rules are different if your home page has a PageRank of 6 or more. For lower PageRank sites, I believe that over-optimization can trip a filter that removes your page from the index for the optimized terms, or for all terms.
On 2 of my sites, with PR = 4, I had changed the link back to the home page to include anchor text that said "Site Name Home" instead of "Home". This may be why these sites were virtually dropped from the index in the past week.
Is it SPAM to link back to your home page with the site name? No... but Google's automated ranking algorithm may think so.
Where is the "dropped-site checklist thread"?
>Is it SPAM to link back to your home page with the site name?
>Google's automated ranking algorithm may think so.
Sorry, but that's nonsense.
>changed the link back to the home page to include
>anchor text that said "Site Name Home" instead of "Home".
>This may be why these sites were virtually dropped from
>the index in the past week.
Sure, you did check the other 99 possible factors, right!?
>Where is the "dropped-site checklist thread"?
I just completed some tweaking, including making a phrase in the global include header into a <h2> tag. Now I'm gone from G. Toast. The only pages found in the index are a pdf and a map, neither of which have the header. The site is new, was sandboxed for 3 weeks, then came out and was OK for 2 weeks. I guess it's possible that it's sandbox-flux, but my money is on OOP.
Back to undo the tweaks. Opinions on whether the recovery will be as swift as the dropping?
were you tweaking up or down? i mean, e.g. did you drop to an H2 from an H1, or go up to H2 from a P tag? either way, my sympathies :-(
My sites target a 2-word key phrase which appears in the title and other meta tags, as well as frequently on the page. I think that adding backlinks from every page on the site that include the site name (key phrase) helps to cause the over-optimisation penalty.
There is no other reason whatsoever for my sites to be dropped... but they were.
RossWal - Sounds like normal sandbox to me. If your phrase you added was a very competitive keyword phrase, it could have triggered something, but my money is on sandbox-flux.... if it exists?!
DVDBurning - I think you might be right, although this 'oop' seemed to die down a few months ago with old and very optimised sites coming back without changing them. Maybe it applies to a small range of keywords, and you have hit one of them.
Can anyone provide some recommendations for the maximum keyword or key phrase density that can be used without tripping the over-optimization filter?
My experience is that the only good way to get your site to come up for sure is to get good incoming links. I play with the onsite stuff, but we all know that our new sites don't come into play until we get those links.
If you take Google's perspective, a site that has no incoming links can't be that good, no matter how well it is SEOed.
I have seen evidence for what was said earlier about how, once you get past a certain PageRank, SEO work pays off more. It is like you have built some credibility with them, and now they will trust what you say.
Tweaked up. The phrase was removed from the logo image and placed in an H2 tag. That violates hierarchy in that the h2 now precedes the h1.
I lean towards OOP because the pages are gone. Not buried, gone. Only a couple remain, and coincidentally they don't have the new h2's. This is not a competitive phrase, but it is a location known to Local Search.
Anonymouse - There is a search feature on this forum:
use the G site-specific search - i.e
google penalty site:webmasterworld.com
Typed into G's search will search this site for your keywords.
forgive me if you already knew this - just thought it may help :-)
This thread is bogus.
When we talk of "over optimization" we are talking of:
- misleading cloaking.
- autogenerated content.
- redirect games
- lovely 3rd level domain playing with wildcard dns.
- insane keyword to stop word ratios.
- link pages for the sole purpose of pr.
We are not talking about h1 tags.
"- lovely 3rd level domain playing with wildcard dns."
Why does google allow this crap? Spammers are laughing at google.
>why does google allow it.
Because they are a biz model based on everyone else's content. That includes sleeping with some powerhouses like Amazon, Netscape, and a dozen other top-50 sites that use various tricks and gimmicks - just like you do.
"They don't produce anything. They exist off the production of others." - Carl Fox, Wall Street.
- misleading cloaking.
- autogenerated content.
- redirect games
- lovely 3rd level domain playing with wildcard dns.
- link pages for the sole purpose of pr.
I don't do any of this crap. My only concern is the keyword density. I'm not sure what you mean by keyword to stop word ratio, or why ignored stop words would be considered in the keyword density. I've reduced the keyword density... but now have to sit and wait to see what will happen.
Do you have some sense of the desired keyword / key phrase density, or what would be considered ridiculous?
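For what it's worth, nobody outside Google knows what density, if any, trips a filter, but the two metrics being argued about here - keyword density and the keyword-to-stop-word ratio Brett mentioned - are easy to compute yourself. A rough sketch in Python; the stop-word list and sample text are illustrative only, and the density formula is just one common convention, not anything Google has published.

```python
import re

# A tiny illustrative stop-word list; real lists run to a few hundred words.
STOP_WORDS = {"a", "an", "and", "are", "for", "in", "is", "of", "on",
              "our", "the", "to", "we", "with"}

def keyword_stats(text, phrase):
    """Return (phrase density in percent, keyword-to-stop-word ratio).

    Density here = words belonging to phrase occurrences / total words."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    total, span = len(words), len(target)
    # Count occurrences of the phrase as a consecutive word sequence.
    hits = sum(1 for i in range(total - span + 1) if words[i:i + span] == target)
    stops = sum(1 for w in words if w in STOP_WORDS)
    density = 100.0 * hits * span / total if total else 0.0
    ratio = hits * span / stops if stops else float("inf")
    return density, ratio

text = ("We sell blue widgets. Our blue widgets are the best blue "
        "widgets on the market, and our blue widgets ship for free.")
density, ratio = keyword_stats(text, "blue widgets")
print(f"density {density:.1f}%  kw:stop ratio {ratio:.2f}")
```

On the deliberately stuffed sample sentence the phrase accounts for over a third of the words; whatever the real threshold is, copy like that reads badly to humans long before it reads badly to an algorithm.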
"Because they are a biz model based on everyone elses content."
Yes, we all know that. But by including these third level domains they are displaying spam redirects, not content.
In reference to keyword density, I have tested and can confirm that when I put keywords in my page link text, I can add a higher percentage of each keyword (that is in link text) than if I just had plain text on the page, and it helps me tremendously.
In outbound links? To other pages within your site, or to other sites?
To both, as it does not really matter - they are both links to another page. To put it in perspective: can and should a search engine spider care whether a link goes to an outside page or an internal one? I don't think it should, as a link is just a link. A visitor to a website doesn't think about whether a link is outbound or internal, so why should we treat it any other way?
Brett - I don't understand your comment at all.
None of your examples are "over optimisation"...
"- misleading cloaking." - That's just plain pointless.
"- autogenerated content." - Can be good, can be bad, but other factors will determine if it is over-optimised or not.
"- redirect games" - This is not over-optimisation, it's just bad optimisation.
"- lovely 3rd level domain playing with wildcard dns." - Too complicated for "over optimisation".
"- link pages for the sole purpose of pr." - That's what link pages are for, let's not pretend otherwise!
IMHO "over-optimisation" means tactics that are good and fair, but done in an unnatural way which demonstrates an awareness of what the algo may be looking for. If Google sees too much emphasis on a keyword phrase, via a combination of h1, anchor, bold, title, etc., it will assume the webmaster is trying to get an advantage, not via pure content but via code manipulation, and Google appears (in some sectors) to ignore this code.
Over optimisation = Google has sensed you are algo aware.
Can anyone reference a thread wherein GoogleGuy or someone from Yahoo! references and validates "filters" and "penalties"?
I have a hard time believing that algos simply change their value system. I don't think they really penalize.
I could understand identifying cross-linking, but is this a penalty, or are unique environment links rewarded more?
I wish people would quit saying "over optimization penalty". There is not one. People have been saying this ever since Florida. Brett is right, but I think we should not call it over-optimization; it should be called black hat. If you repeat your word 50 times on a page that only has 100 words, you are not over-optimizing - you have a bad site.
Most of the time when people are talking about penalties or filters, they have a site that breaks all kinds of good-practice rules. They also have something like one or two backlinks with no anchor text. Unless you have a fair amount of backlinks with anchor text from different sites, you are not going to do well.
There are just so many unthought-out theories on this board. It is funny listening to amateur SEOs who have all kinds of weird theories - I have had a few myself. We are so superstitious. The SEO field has more unsubstantiated theories running around than the diet field. I know people who spend a ton of money and time on having their sites all registered at different places and hosted at numerous hosting companies; there is no proof that this is necessary. I even hear people say that cloaking helps you rank better. The list goes on.
grant, go back and read some of the Florida threads. That information is buried in a lot of threads. You could also type (site:webmasterworld.com GoogleGuy florida) into Google. Play around with that for a while. You could also add -2004 to narrow it down.
|I wish people would quit saying over optimization penalty. |
You hit the nail on the head :)
I believe, based on some articles I have read, that the new algorithm counts "on-site SEO tactics" as only 20% of the formula.
You need good themed links (on topic); relevance and authority are all major factors.
As GG said, what worked in the past doesn’t necessarily work now. In the past you could spam the hell out of a site, and it would get somewhere. I'm afraid that that sort of practice no longer works.
By decreasing KW density and other elements, you are shooting yourself in the foot with the other search engines.
That's my opinion :)
There's a thread running called "A dropped site checklist", which many people have been praising. I quoted that thread at the beginning of this post, as I wanted to dig deeper into what caveman was saying re: keyword density and backlinks.
My sites were sitting steady for months and rode out the Florida storm nicely - even good for me, as many competitors did not. A few weeks ago, I changed my Title/H1 tag/keyword phrase from a 3-word phrase to a 2-word phrase (which is in fact the main search phrase), and also added some site-wide links containing the 2-word phrase in the anchor text.
Wham! Goodbye to Top 10 SERPs - I thought that by more closely targeting the 2-word phrase I would move up to the 1-3 zone from the 4-6 zone, not off the page altogether!
Given that the site has a ton of solid backlinks and is a PR7, and nothing else changed, I think I can be pretty sure that the drop in the SERPs is related directly to my more aggressive targeting of the 2-word phrase. Note that I am not talking about outrageous overuse of keywords, simply the use of the keywords in what used to be "accepted practice" (as per my first post in this thread).
My main question, which does not appear to have been fully addressed, is "is this accepted practice no longer valid" - i.e. is it now a no-no to put your keywords in title/H1/top page/throughout content/in links/in anchor text - because Google then thinks you are "algo aware" and penalises you?
The follow-on questions, in which case, is "what the hell do you do to target your keywords"?!?
"is it now a no-no to put your keywords in title/H1/top page/throughout content/in links/in anchor text"
I agree with Brett. I don't put much stock in the oop. I think it is more spam techniques that trip up websites.
I have done some testing as far as keywords in H1, H2, title and keyword density go. I never did 'trip' a filter at Google. (Of course, the pages looked like crap.)
I have no answer as to why specific sites drop. One of my sites is buried in Google. I update it frequently, etc., but it's still buried. It's less optimized than my best site. Not sure why it happens.
I believe Brett's list would be a more legit set of reasons for a site to drop.
I think it is funny how webmasters who have not suffered the effects of this particular filter are so confident that it doesn't exist. If you had the same experience as AnonyMouse and me, you might feel differently. We are talking about sites that have good PR, were ranking highly in the SERPs, and have plenty of high-quality incoming links. We are not talking about spammy sites with auto-generated content or cloaking or anything that violates Google's stated TOS.
There have been too many webmasters reporting their experience here at WebmasterWorld to dismiss the fact that Google has implemented a filter that penalizes sites for optimizing too heavily on particular keywords or key phrases. It seems clear to me that internal backlinks that use the key phrase can help to trip the filter... that is to say that if you have every page on your site link back to the home page with a keyword combination instead of "Home" - EVEN IF THIS KEYWORD COMBINATION IS THE NAME OF YOUR SITE - you may get booted from the index. Of course, the keyword / key phrase density on the page is also a critical factor.
The only person who could say with certainty that this doesn't exist is someone who knows the recipe for the secret sauce... and I doubt we will hear from him. I appreciate the feedback from AnonyMouse and others who have confirmed what the problem may be for me.