A close look at what over optimization really is
brinked (5+ Year Member)
Msg#: 4442234 posted 1:51 am on Apr 18, 2012 (gmt 0)

I will not share my entire story about how I came into the SEO world, but let's just say that in order to have any type of success in this industry, I needed to overcome what turned out to be over optimization penalties.

Since then I have pretty much made a living buying established sites that suffer from obvious on-site over optimization, fixing them up and profiting big time. One site I paid $12,000 for; within 2 weeks it had recovered and was making a good $400-$600/day. That's the kind of money people can be missing out on over a simple over optimization penalty.

But what exactly is over optimization? More importantly, what is Google's latest update all about? While I do not have the answer to everyone's questions, I have a lot of experience with over optimization, and I think I might just be the most qualified person in the world when it comes to dealing with this type of penalty.

I know a lot of people feel vulnerable right now, and that is why I am making this post. Looking over the Google help forums, many of the sites I am seeing have blatant over optimization problems on site. One lady had a site that offered free widgets for kids. How do I know this? When I went to her site, "free widgets for kids" was in the domain name, the page title, and every other sentence on the home page. I did not dig any deeper into her site, and I am not saying this is the exact cause of her penalty, but I am pretty sure it is. If you read her content, it sounds spammy and unprofessional, and I have no idea how this site managed to elude over optimization penalties earlier.

In any case, here is a list of over optimization factors. Some are tried and proven, others are only partly proven, and some are just strong theories.


1. Keyword/phrase overuse. Known by SEO experts as keyword stuffing, this is the most common form of over optimization and also the easiest to recover from. When you are trying to rank for a specific phrase, you want Google to find that term. Many webmasters do this by placing the same term in the page title, URLs, meta tags, body text, anchor text, header tags, etc. It is important NOT to do this. Google will know what the subject of your site is without you having to repeat the same phrase over and over; that just gives a poor user experience. Not only that, Google will know you're trying to game the system. You might overcome this with strong content and a good backlink profile, but it will likely still hold you back in some way or another. (A rough counting sketch follows at the end of this post.)

2. Redirects. I saw a major competitor just lose its number 1 position after holding it steady for 3 years. This site was not only number 1 for its industry, it is also one of the top 500 most popular websites in the USA. It is a huge brand, and it has just dropped to position #7 after being #1 for 3 years. One thing I noticed about them is that they redirect several domains to their main site; they have bought out competitors over the years and just redirected them to their main site. Be careful not to redirect too many sites to your main business, and if you do, make sure to follow the procedure Google recommends.

3. Same/similar anchor text in backlinks. This is an oldie but a goodie. The best backlink profile is a well rounded, diverse and natural looking one, with links coming from many types of venues. If all of your links come from blogs, that looks pretty artificial - what are the odds that every site linking to you happens to be a blog? Add to that blogs that link out to lots of other unrelated sites, and it won't take Google's algorithm long to detect it. Aggressive reciprocal linking can hurt you as well.

4. Same niche, same server - This one I truly believe in, or it could just be me being paranoid. I have always believed that having 2 websites on the same server in the same or similar industries will cause one or both of them to be punished. For this reason, I leave no trail for Google to connect any 2 of my sites unless they are completely unrelated. I make sure to have different whois info, and I never use GA, AdSense or any other means by which Google could connect two similar sites. I have no concrete proof of this one, but it's something I feel strongly about.

5. Doorway/thin affiliate - Is the main goal of your site to send people to another site? Then Google may just decide to drop your site and favor the source site, since that would provide a better user experience. Counter this by offering something truly unique.

6. Link schemes/cheap backlink packages - Quite simply, don't waste your time. This does more harm than good, and even if you do get a good ranking fast, it will fade soon enough. Any backlink package is truly ridiculous, and you are playing with fire. If you see a package offering you more than 100 links for less than $10, stay far, far away. It should also go without saying that you should stay away from ANY link building scheme that involves gaining mass amounts of backlinks in a short period of time. Links are votes and should be earned.

This list was rushed and not proofread, so take it for what it is. I hope this insight can help some people. These are just the 6 most common factors that I see webmasters suffering from. Best of luck.
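For point 1, the simplest sanity check is just counting how often your exact phrase shows up in the title, the headings and the body text. Here is a minimal sketch of that idea in Python - the phrase and URL are made-up examples, and there is no magic threshold; it only surfaces the raw counts so you can judge for yourself.

# Rough count of how often an exact phrase appears in a page's
# title, headings and body. Purely illustrative - there is no
# published threshold, so treat the numbers as a smell test only.
import urllib.request
from html.parser import HTMLParser

class PhraseCounter(HTMLParser):
    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.section = None                      # tag we are currently inside
        self.counts = {"title": 0, "headings": 0, "body": 0}

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.section = "title"
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.section = "headings"

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "h2", "h3", "h4", "h5", "h6"):
            self.section = None

    def handle_data(self, data):
        hits = data.lower().count(self.phrase)
        if hits:
            self.counts[self.section or "body"] += hits

phrase = "free widgets for kids"                 # hypothetical target phrase
url = "http://www.example.com/"                  # hypothetical page
html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
counter = PhraseCounter(phrase)
counter.feed(html)
print(counter.counts)    # e.g. {'title': 1, 'headings': 2, 'body': 14}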

 

bekyed (10+ Year Member)
Msg#: 4442234 posted 9:12 am on Oct 24, 2012 (gmt 0)

Hi Brinked.

Our website is 10 years old, has the keyword in the domain, and has completely tanked from the no. 1 position to page 5, losing around 80% of its traffic from Google.

I sent in a request to Google, which came back with 'no spam actions found'.

I have de-optimised the website by removing the search term from the drop-downs, and the site has now dropped lower.

The keyword is still in the heading 1, but no longer in the title.

I have also noticed that no sites in our niche appear in the top 20 with the main keyword in the domain.

My next move is to remove the keyword from the page completely as the domain holds the strength.

Ironically, we are no. 1 in Bing for most terms.

Can anyone suggest anything more? I am completely lost.

Bek.

bekyed (10+ Year Member)
Msg#: 4442234 posted 10:06 am on Oct 24, 2012 (gmt 0)

And can I add that Google has taken away our title in the SERPs and replaced it with the URL, which is not helpful.
Can you shed some light on this please?

Thanks.

chalkywhite
Msg#: 4442234 posted 12:59 pm on Oct 24, 2012 (gmt 0)

I hope having the page title the same as the URL is not a big no-no, as WordPress does this by default.

diberry (WebmasterWorld Senior Member)
Msg#: 4442234 posted 1:45 pm on Oct 24, 2012 (gmt 0)

I hope having the page title the same as the URL is not a big no-no, as WordPress does this by default.


Actually, no: by default WordPress generates a nonsense string. In the Permalinks settings, we change it to "%postname%" to get the effect you're talking about. That said:

[support.google.com ]

A URL like http://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f41daa6f849f730f1 is much less appealing to users.

Consider using punctuation in your URLs. The URL http://www.example.com/green-dress.html is much more useful to us than http://www.example.com/greendress.html. We recommend that you use hyphens (-) instead of underscores (_) in your URLs.

Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.


Google is clearly advising we use a URL of words, and it seems natural enough for it to match the title, or at least include keywords that are relevant to the page.
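If you want the words-and-hyphens style URL Google describes without relying on WordPress's %postname% setting, turning a title into a slug is only a few lines. A minimal sketch in Python - the slugify function and the example title are just my own illustration, not anything Google or WordPress provides:

# Turn a post title into a hyphenated URL slug, along the lines of the
# green-dress.html example in Google's guidelines. Illustrative only.
import re
import unicodedata

def slugify(title):
    # Drop accents, lowercase, and keep only letters, digits, spaces and hyphens.
    ascii_title = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    cleaned = re.sub(r"[^a-z0-9\s-]", "", ascii_title.lower())
    # Collapse runs of whitespace and hyphens into single hyphens.
    return re.sub(r"[\s-]+", "-", cleaned).strip("-")

print(slugify("My Cousin's Green Poodle, Sassy"))   # my-cousins-green-poodle-sassy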

Ditto on images: [support.google.com ]

There's no quotable snippet, but the basic idea is that an image should be something like "green-poodle-sassy.jpg" with an alt tag of "My cousin's green poodle Sassy walking through a field of daisies." But the alt tag should NOT be "green poodle poodles dogs greens..." etc. for, like, a whole paragraph.

Remember too that the alt tag is what gets read out to a visually impaired visitor whose browser or screen reader supports that. It should sound natural, like how you'd describe the image to them.

It kind of breaks my heart to see people wondering about crap like this. If Google's getting this petty about stuff that's really a matter of best practice, then the next step would be "no site with an R in its name can rank" or something equally vapid. And even if you think Google might get to that point, or already has, my point is: we have to get off this ride at some point.

chalkywhite
Msg#: 4442234 posted 2:17 pm on Oct 24, 2012 (gmt 0)

You're correct, diberry. I'm on autopilot when I set up a WP site now; it's one of the first things I do, as most people do, I bet. If it is true, though, that they don't want us to have the title and URL the same, then it feels "unnatural" to make up a generic title or URL. I want my users to know EXACTLY what the page is about.

santapaws (5+ Year Member)
Msg#: 4442234 posted 2:29 pm on Oct 24, 2012 (gmt 0)

diberry, I think you are hitting the nail on the head there. They pretty much keep adding something new and random for no purpose other than keeping anyone trying to make money from their SERPs chasing their tail, while only the sites they want to rank get a free pass. It's easier to simply rank the right sites than to actually get the algo right; they gave that up a long time ago, when it became a never-ending game and new spam just replaced old spam. What you have now is a bunch of guys who are deciding for themselves what kind of sites THEY (personally) want to see, and setting the many algos and bolt-ons to achieve that aim.

diberry (WebmasterWorld Senior Member)
Msg#: 4442234 posted 4:13 pm on Oct 24, 2012 (gmt 0)

You're correct, diberry. I'm on autopilot when I set up a WP site now; it's one of the first things I do, as most people do, I bet.


I see very few WP sites using the default permalink structure. I really believe for user experience, it's better to have words, and ideally those words should match the article title rather than being just keywords. As long as your titles aren't just your keyphrases, this shouldn't be seen as over-optimization - or at least, I wouldn't expect it to be. I yield to anyone who knows better.

Santapaws, that's just it. For example, Google says a new update is about blahblah, and most of us really don't have a clue what that means. So we start pruning our content, merging articles, getting rid of backlinks... and things get worse instead of better. I realize some people around here are able to figure out what Google wants and address it. But the rest of us? Probably better off doing nothing until we really are convinced we've figured out the problem, or found someone who can give us solid suggestions of how to address it.

One thing that most of us CAN do, whether or not we'll ever understand anything Google's updates are about, is concentrate on best practices and user metrics. No, doing these things does not mean you'll stay in Google's good graces. But if you're in this game for the long haul, it might be the better option in the end. After all, what Google advises you to do one year might be considered spamming a few years later. Maybe Google prefers sites that ignore them and just do their own thing consistently, and do it well.

santapaws (5+ Year Member)
Msg#: 4442234 posted 4:22 pm on Oct 24, 2012 (gmt 0)

For example, scraping: you can't do that, and when I'm personally scraped I get hammered by the duplicate content, apparently because I don't have enough (cough, cough) good signals. But two of the biggest sites out there scrape massive amounts of content: Google and Wikipedia. Yes, I know they do OTHER things too, but clearly you CAN rank with scraped content IF it suits them. My point is that it is being made more and more impossible for anything other than a major player to rank BECAUSE of the so-called guidelines. I think that's how they want it. De-optimise to the point of self-destruction and they have the job done for them, no algo required. So it's now more about the whitelist that lets some sites through than anything else. Of course, your vertical will see this play out in a bigger or smaller way depending on whether it's a targeted niche.

gouri (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 3:09 am on Nov 1, 2012 (gmt 0)

I think I both reduced duplication amongst my H1, title, and meta description tags, and deoptimized by removing each page's main keyword from regular text and anchor text on the page.

@Tonearm,

The main keyword that you removed from anchor text on the page:

(1) Is that link in the body text of the page?

and

(2) Does that link go to another page on the site, and did it used to contain a keyword that the page you are linking from is trying to rank for? E.g. the page you are linking from is trying to rank for Blue Tools, the anchor text going to the other page was Sharpened Blue Tools, and you changed it to something that didn't contain Blue Tools - so it would be Sharpened plus some other words, or a phrase related to Sharpened Blue Tools that didn't contain Blue Tools?

Tonearm (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 4442234 posted 6:58 pm on Nov 1, 2012 (gmt 0)

The links were in the body text. I didn't remove Blue Tools from the page that was linked to.

gouri (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 2:53 am on Nov 2, 2012 (gmt 0)

@Tonearm,

The links were in the body text. I didn't remove Blue Tools from the page that was linked to.

Thanks for the response.

I think I both reduced duplication amongst my H1, title, and meta description tags, and deoptimized by removing each page's main keyword from regular text and anchor text on the page.

Can you give me an example of removing the main keyword from the anchor text the way that you did on your site? Also, the keyword that you removed from the anchor text, what page does that link go to? Home page to inner page? Inner page to inner page?

Martin Ice Web (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 3:13 pm on Nov 17, 2012 (gmt 0)

So after a nice afternoon of doing some re-queries around "keyword", I notice that when I search for a widget using queries of two or more words that contain a similar key but not "keyword" itself, I always find my site in the top ten. If I use "keyword" to search for the widget, my site seems to suffer a -30 penalty. A comparison with WMT keywords shows me that this is the most important keyword for my site. I can reproduce this with several widgets that contain the keyword; it is always the same. This key had a count of over 140,000 occurrences, due to navigation descriptions, on a 15,000 page site. I was able to reduce it to about 12,000 occurrences.
Traffic has improved a little bit, but how can an ecom site avoid using the most important keyword if it describes the widget?

Tonearm (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 4442234 posted 10:29 pm on Nov 17, 2012 (gmt 0)

@gouri
Can you give me an example of removing the main keyword from the anchor text the way that you did on your site? Also, the keyword that you removed from the anchor text, what page does that link go to? Home page to inner page? Inner page to inner page?

I changed links like Purple Widgets to Purple. The links were inner page to inner page.

@Martin Ice Web
Please let us know how that goes. I'm very curious to find out how Google responds to what you did.

Bewenched (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 1:45 am on Nov 18, 2012 (gmt 0)

how can an ecom site avoid using the most important keyword if it describes the widget?


Exactly! Ecommerce sites have to give more descriptive link information when site users are drilling down to what they're really looking for. It's unreasonable to think that we should dumb things down for the sake of search engines. Unless they expect us to cloak just for them.

diberry (WebmasterWorld Senior Member)
Msg#: 4442234 posted 2:57 pm on Nov 18, 2012 (gmt 0)

MartinIceWeb, I've got a similar thing going on. My top keyword is xxx, and I have a lot of pages that use it, and most of them rank well enough. But one particular page of mine has a "nowhere to be found, not even in the omitted section" penalty that seems to be based around that word.

diberry (WebmasterWorld Senior Member)
Msg#: 4442234 posted 6:52 pm on Nov 18, 2012 (gmt 0)

Er, my top keyword is not actually "xxx" in case it wasn't obvious that was just a stand-in for the actual word. Sorry! :D

epmaniac
Msg#: 4442234 posted 8:27 pm on Nov 18, 2012 (gmt 0)

@brinked and others about point number 4

If two sites are in the same niche on the same server and IP, but they are geo-targeted to two different countries... do you guys think it would still hurt?

gouri (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 7:23 pm on Dec 2, 2012 (gmt 0)

On some of my pages, I have many words that contain a hyphen.

Is a word with a hyphen considered one word or two?

Would long-distance runner be considered two words or three?

I need to know this because I have to know how many words are on a page to calculate keyword density. I don't want to over optimize.

I know that keyword density may not be considered as important now as it was years ago, but I think it is still good to have an idea of what the keyword density of a phrase is on a page.
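Whichever convention you settle on for hyphens, the density calculation itself is easy to script, so at least the numbers stay consistent from page to page. A minimal sketch - the function and its split_hyphens flag are my own invention for illustration; it treats a hyphenated term as one word by default and as two when the flag is set, and any threshold you apply is entirely up to you:

# Keyword density with an explicit choice about hyphenated words.
# Whether "long-distance" counts as one word or two is a convention
# you pick; the point is to apply it the same way on every page.
import re

def keyword_density(text, phrase, split_hyphens=False):
    word_re = r"[a-z0-9']+" if split_hyphens else r"[a-z0-9'-]+"
    words = re.findall(word_re, text.lower())
    phrase_words = re.findall(word_re, phrase.lower())
    if not words or not phrase_words:
        return 0.0
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * hits * len(phrase_words) / len(words)

sample = "A long-distance runner trains for long-distance races."
print(keyword_density(sample, "long-distance runner"))        # hyphenated term = one word
print(keyword_density(sample, "long-distance runner", True))  # hyphenated term = two words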

Sgt_Kickaxe (WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time)
Msg#: 4442234 posted 12:35 pm on Dec 3, 2012 (gmt 0)

Over optimization has to be off-site related only; there doesn't seem to be much you can do to a site's code to make any impact anymore.

Proof: switch between a template that is horrendous in terms of SEO - full of code bloat, no description tags, tables instead of CSS, poor usability, etc. - and one that is ultra-lean, fast, easy to use and covers the important metrics, and you will find no difference in traffic from Google. Occasionally there is a brief, temporary dip in traffic while Google evaluates the layout change, but it always comes back quickly.

Martin Ice Web (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 12:58 pm on Dec 3, 2012 (gmt 0)

@tonearm,

Cutting down the overall count for a specific keyword ended in two separate results:
- widgets where the keyword stands alone - most are gone
- widgets where the keyword is part of a longer word - went up in rankings if it is not the keyword itself

Because of this, I have started to retitle similar widgets with more powerful titles. The users must think I am not quite right in the head. But we are at a point where building a site for users does not work anymore; we have to build sites for Google, unless you are eBay or Amazon.

Tonearm (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 4442234 posted 7:19 pm on Dec 3, 2012 (gmt 0)

@Sgt_Kickaxe,

full of code bloat

tables instead of css

Is this the code-to-content thing?


@Martin Ice Web,

most are gone

Do you mean they appeared in Google's search results before but now do not?

taberstruths
Msg#: 4442234 posted 10:16 pm on Dec 3, 2012 (gmt 0)

Well, I am doing a test on this concept. I never keyword stuffed - I was always at 2% or below - however I did use the old standard of putting the keyword phrase in the Title (H1), H2 and H3. I also used bold, italics, and underlining of keywords one time each.

I have removed all the bold, italics and underlines. I have limited the exact match keyword phrase to the Title (H1) and once in either an H2 or an H3. I have also limited the use of the keyword phrase to at most 3 times overall. I replaced the other instances of the keyword phrase in the title or content with either a partial match or a related (LSI) phrase.

We shall see if it makes a difference. Too soon to tell yet since I only started about 3 days ago, but I am seeing some promising preliminary results already.

Martin Ice Web (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 9:44 am on Dec 4, 2012 (gmt 0)

@Tonearm,

By gone I meant they suffer a -30 penalty - and I do mean a penalty, because more than one result then appears in a row on page 3.

Now, get this: I have a second site with the same widgets, all with different titles, descriptions.... but the keyword "widget" is prominent and counts >75,000.

Many of the widgets from site 1 that suffer the -30 penalty are on page 1 from the second site.

Explanation? None. Maybe every site is treated in a different way?

Tonearm (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 4442234 posted 11:50 pm on Dec 4, 2012 (gmt 0)

Nice experiment taberstruths. Please let us know how it goes.

Martin Ice Web, how long has your experiment been running now? Are you planning to change everything back to avoid the -30 penalty? If it hasn't been too long, it could still come around.

taberstruths
Msg#: 4442234 posted 3:04 am on Dec 5, 2012 (gmt 0)

I just finished re-doing 78 pages - approximately 10% of my site. These were the biggest traffic pages up until the first Penguin.

Up until now, I have been frustrated trying to figure out how to fix this problem. After reading this post and thinking about how my pages have reacted to updates post-Penguin, I came up with a working theory.

Forgive me for putting this in an algebraic formula, but I hope it makes clear what I think might be happening with Penguin, an over-optimization penalty, or both in combination.

On-page optimization score = exact match keywords that are optimized in some way: Title (H1), H2-H6 tags, image URL, image alt tag, density, meta description, meta keywords, etc. (more than likely these are given different weights)

Off-page optimization score = percentage of exact match keyword links pointing at the page, the trust factor of the domains pointing at your page, and possibly a tie between social signals and the total number of links pointing at the page.

So the equation would go as follows:
On-page score x Off-page score = Penguin score.

The higher the Penguin score, the higher the Penguin penalty.

This would explain why we can't get a handle on which single thing will affect ranking.
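To make the multiplication concrete, here is the working theory as a toy calculation. Every weight, input and function name below is invented purely for illustration - the only idea taken from above is that the on-page and off-page scores multiply, so a page that is heavy on both sides scores far worse than one that is heavy on just one:

# Toy version of "on-page score x off-page score = penguin score".
# All weights and sample inputs are made up for illustration only.

def on_page_score(title_hits, heading_hits, body_density_pct, alt_hits):
    # Weighted count of exact-match keyword use in on-page elements.
    return 3 * title_hits + 2 * heading_hits + body_density_pct + alt_hits

def off_page_score(exact_anchor_pct, low_trust_link_pct):
    # Share of backlinks with exact-match anchors plus low-trust links.
    return exact_anchor_pct + 0.5 * low_trust_link_pct

def penguin_score(on_page, off_page):
    # Multiplicative: heavy on-page AND heavy off-page is far worse
    # than being heavy on only one side.
    return on_page * off_page

on = on_page_score(title_hits=1, heading_hits=4, body_density_pct=5.0, alt_hits=3)
off = off_page_score(exact_anchor_pct=60.0, low_trust_link_pct=30.0)
print(on, off, penguin_score(on, off))   # 19.0 75.0 1425.0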

Anyway, I will let you know how things go. I know I had too many exact match links. I have worked to change that with social links, which are the easiest to build, but I haven't seen a full recovery.

So far it has caused rankings to fluctuate, but it is too soon to tell if it is a long term gain or loss. Some have gained and some have lost.

Shatner
Msg#: 4442234 posted 9:21 am on Dec 5, 2012 (gmt 0)

Brinked, great stuff as always.

Martin Ice Web (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 4442234 posted 10:11 am on Dec 5, 2012 (gmt 0)

@tonearm,

It's now over 4 months - I think that's enough time.
As g$$gle rewrites most of the -30 penalty titles, I will write new titles for these widgets.

I wonder what g$$gle will do when I use the rewritten title? Will they rewrite it again?

taberstruths
Msg#: 4442234 posted 3:05 am on Dec 7, 2012 (gmt 0)

Quick preliminary results so far for the test pages.
I call anything with a change of less than 10% in either direction unchanged, to account for flux etc.

Pages affected negatively 11
Pages affected positively 25
Pages unchanged 6
Revived from no traffic 4

The rest do not have enough data to compile yet.
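For anyone who wants to replicate the bookkeeping, the 10% rule is simple to apply to a before/after traffic export. A minimal sketch - the page names and figures are invented; only the classification rule comes from above:

# Bucket pages by traffic change, treating anything within +/-10%
# as unchanged, per the rule above. Sample figures are invented.
from collections import Counter

def classify(before, after):
    if before == 0:
        return "revived" if after > 0 else "unchanged"
    change = (after - before) / before
    if change > 0.10:
        return "positive"
    if change < -0.10:
        return "negative"
    return "unchanged"

pages = {                       # page: (visits before, visits after)
    "/red-widgets": (120, 160),
    "/blue-widgets": (80, 30),
    "/green-widgets": (50, 52),
    "/old-page": (0, 15),
}
print(Counter(classify(b, a) for b, a in pages.values()))
# e.g. Counter({'positive': 1, 'negative': 1, 'unchanged': 1, 'revived': 1})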

Here is what I have noticed. It seems that most of the pages have gained 2-5 places in the SERPs. It is hard to tell for sure, since the data sets in the SERPs keep changing. They have also started getting more traffic for synonym keywords. Long tail traffic has also shot up for the pages.

If my page was geared for "red widgets for homes", then long tails would be "the best red widgets for homes" and "red widgets for homes and gardens".

However, understand that my site is an informational site and not a buyer-keyword site.

santapaws (5+ Year Member)
Msg#: 4442234 posted 10:29 am on Dec 7, 2012 (gmt 0)

My daughter came home crying last night. I asked her what's wrong. She said Google had sponsored her local gymnastics competition and she was disqualified for having top marks. She was told that scoring high in every area where the judges marked her meant she was trying to manipulate their subjective opinion. Her coach told her to forget about trying to impress the judges: just go out and do what you think the spectators would like to see, and success will follow.

snowbunny (5+ Year Member)
Msg#: 4442234 posted 11:46 am on Dec 7, 2012 (gmt 0)

We have a successful UK site and are trying to build up a US equivalent (totally different content, same niche). A while ago we found that UK visitors were entering the dotcom domain (now used for the US site) in their address bar, and coming across the US site by mistake. We made the homepage of the dotcom site a 'flag page' which allows visitors to make the choice between the UK domain and the US one (deeper on the dotcom domain). Google alerted us to an issue - they thought the flag page was a doorway page. We sent them a response, saying it should be perfectly valid, but never heard back, and feel that our US site may be being penalised, as it's not doing well from an SEO perspective. What do you advise?

Tonearm (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 4442234 posted 9:49 pm on Dec 7, 2012 (gmt 0)

snowbunny, I recommend starting a new thread for your topic as it's not related to overoptimization.

taberstruths, very interesting results so far. How long has the experiment been running?

Martin Ice Web, I agree that 4 months is enough time.
