
Google SEO News and Discussion Forum

Over Optimization June 2 Penalty - steps to reverse it
newborn




msg:4151943
 2:47 pm on Jun 13, 2010 (gmt 0)

My business model was building mini-websites with strong content; we chose topics that we could cover fully in 5 to 10 pages.

Many of these were locally based, and we ranked well for them. We had close to 100 web properties and growing. We did use AdSense and other methods of monetizing the websites, but they were not MFA sites: we sold our own ebooks, offered free downloads of guides, etc., and updated the sites whenever there were relevant changes. All the local sites were on the same IP as well, because we weren't engaging in any spam practices and the content was well written and not spun.

We invested heavily in that arena.

In May we hired a "reputable" link building team through an outsourcing website; they indicated the method they were going to use was article marketing. We agreed, as our own link building was quite slow and we were hearing so much about Caffeine.

On June 2 all traffic fell, and I mean off a cliff. I was stunned. We did not know what was happening, as most of our domains targeted long-tail traffic and the keywords were even in the domain names; we had thought the Google Mayday update would help us.

I have checked, and all the sites are still in the index; none are de-indexed, as they all show up both on an exact search and on a site:mydomainname query. The cache date does not seem to change, though.

After discussing it with the link builders, they said they used an automated link building service to distribute the articles. I cried foul and demanded a refund, which they gave me; however, it was nowhere close to the revenue being lost as a result of their actions.

Reading this forum, I have seen others describe the June 2 update as an over-optimization update and suggest several methods to regain rankings.

I have listed these below and wanted to hear thoughts on which would be appropriate to pursue:

- Get more authority in bound links and wait 2 weeks to 4 months
- Add more content and get more authority inbound links and wait 2 weeks to 4 months
- Dump those sites, take the content and create new websites
- Send Google a reinclusion request and wait, doing nothing else

If anyone has been hit by the June 2 update and has seen a return of normal traffic, can they weigh in here...

 

Mikey85




msg:4152772
 7:47 am on Jun 15, 2010 (gmt 0)

Our organic traffic dropped 80% on June 2 too. We have now made several improvements and continue adding fresh content.

We have sent a re-inclusion request describing all our improvements; now let's wait and pray.

caribguy




msg:4152818
 10:34 am on Jun 15, 2010 (gmt 0)

I wasn't optimizing for it, or out building links, or doing anything at all really except tracking it. And, I'm the first to admit I wasn't strictly the most relevant site for it in the first place.


We also lost a 'trophy keyword' like the one you described, but this happened between May 18 and 21. This niche site had ranked between positions 6 and 30 for this very generic term for the past two years, but hardly received any traffic from it.

Date / Position
May 21, 2010 9,999
May 17, 2010 15
May 16, 2010 15
May 13, 2010 14
May 10, 2010 14
May 7, 2010 10
May 5, 2010 13
May 2, 2010 11
Apr 28, 2010 8

Interestingly, the site just started ranking for another generic term, one that is very closely related to the first but used less often.

Date / Position
Apr 25, 2010 7
Apr 21, 2010 14
Apr 20, 2010 10
Apr 17, 2010 12
Apr 15, 2010 12
Apr 13, 2010 12
Apr 11, 2010 14
Apr 8, 2010 9,999

The site has very strong unique niche content and has otherwise been completely stable throughout the Mayday and June updates.

Hissingsid




msg:4152828
 11:21 am on Jun 15, 2010 (gmt 0)

Are these micro sites too focused on keywords and not semantically rich?

I'm wondering if they could look just ever so slightly too much like keyword-stuffed, auto-generated pages. If that is the case and I had one of these sites, I'd be doing some synonym and semantics research around the keywords I'm targeting and replacing some of the stuffing with richer language.

kd454




msg:4152887
 12:58 pm on Jun 15, 2010 (gmt 0)

Are these micro sites too focused on keywords and not semantically rich?


Mine are not keyword stuffed; the content is written for the reader. I gave my writer the topic, and she wrote the articles. She has a master's degree in education and knows nothing about keyword density, etc.

ammanu




msg:4152962
 3:22 pm on Jun 15, 2010 (gmt 0)

Same thing for 40 websites: -99% on June 2.

I don't know what happened. The content is unique; I have a network of 80 "small" sites, with the link building done the same way, the same keyword density, and so on.

I saw in Webmaster Tools that Google has dropped the backlinks from directory and social bookmark sites.

Any suggestions?

kd454




msg:4152968
 3:29 pm on Jun 15, 2010 (gmt 0)

I saw in Webmaster Tools that Google has dropped the backlinks from directory and social bookmark sites.


You need better links. The issue with trying to build quality links to 40 websites is whether the cost is worth the end results. I am thinking it is not cost-effective for most niche sites, and Google knows this. They really thought this one through.

Hissingsid




msg:4153031
 4:44 pm on Jun 15, 2010 (gmt 0)

Mine are not keyword stuffed; the content is written for the reader. I gave my writer the topic, and she wrote the articles. She has a master's degree in education and knows nothing about keyword density, etc.


I'm not saying that they are stuffed deliberately, just that they may appear to be. Have you analysed them properly?

I guess I was a little naive to assume that people in this thread were discussing the topic in the thread title, "Over Optimization June 2 Penalty - steps to reverse it"; perhaps not.

All those years ago, the Florida update caused many of us similar problems. People here tended to fall into one of two camps: those who thought it was an over-optimization penalty issue and those who thought it was about semantics. I was in the latter group. If you have an over-optimization problem, one of the ways you can try to overcome it is semantic richness.

Cheers

Sid

tedster




msg:4153041
 5:22 pm on Jun 15, 2010 (gmt 0)

It seems to me that what Google is trying for could be called an "Only Optimization Penalty" ;)

kd454




msg:4153166
 8:11 pm on Jun 15, 2010 (gmt 0)

The main thing about the niche sites was to buy the exact keyword in the domain name, e.g. thebluewidget.com, with low competition for that keyword. Create a site of about 5 to 10 pages, add a few links via "article marketing" (junk links), get to the first page, and "cash in", right? ;) All in a short period of time (a couple of months).

Google was placing such a high value on the "exact" keywords in the domain that it was very easy to rank for that keyword (even with a very poor one-page site). I believe they devalued the domain name, thus pushing the sites back to where they should be, most of them off the map.

I am sure there is more to it than just this (site age?), but it has to be part of the puzzle. Two of my small sites have about 10 pages and some decent links coming to them, and they are doing OK; not as well as before, but still better than the others with poor links.

I will take a look at the keywords in the articles for over-optimization, but I have looked at some sites with WAY over-optimized pages (keywords stuffed into almost every sentence) that have been ranking for high-traffic keywords, and they are still going strong. These pages have a lot of links coming to them and seem unaffected; they are actually doing better than they were previously.

internetheaven




msg:4153226
 9:37 pm on Jun 15, 2010 (gmt 0)

So is this NOT an OOP (over-optimization penalty)? Is it something to do with link building?

My hit came on June 4th: 80% of traffic lost. There are two differences between that site and another site of mine that was not affected:

1. The penalised site uses Google WMT.
2. The penalised site has a resources/traffic exchange directory section (resources section uses nofollow on outbounds).

Apart from that, the two sites are practically identical in theme, SEO, setup, layout, monetization etc. etc.

CainIV




msg:4153316
 1:36 am on Jun 16, 2010 (gmt 0)


Are these micro sites too focused on keywords and not semantically rich?


Hissing, I wonder this too.

It might stand to reason that curtailing machine/auto-generated long-tail landing pages on the monster websites (in my estimation, among the first things to go) would involve assessing both duplication and over-optimization on those pages, and analyzing whether or not they are too focused on one particular long-tail money phrase.

Logically, it could be possible that the same change in the algorithm affected other websites in the same fashion.

CainIV




msg:4153319
 1:45 am on Jun 16, 2010 (gmt 0)

Just felt it was important to tag on this quote from that article to shed some light on the frustration this update is causing:

That's what we would expect from Mayday. Users don't care if your site has many items, they care about descriptive content.


Uh, huh? The last time I checked, when I visited an ecommerce website looking for red shoes, I wanted a wide variety of red shoes, not one item with a Pulitzer-prize-winning descriptive paragraph...

Planet13




msg:4153729
 5:27 pm on Jun 16, 2010 (gmt 0)

Are these micro sites too focused on keywords and not semantically rich?


Not to hijack this thread, but could someone give an example of 'Semantically Rich' versus not Semantically Rich?

As for the June 2 update, it is hard to say. Since May, my traffic has been fluctuating a lot: up 10 or 15% one day, down 20% the next. May saw -4% on one site and -8% on another compared to April (both ecommerce sites with several hundred pages).

I am kind of scared to look at Google Analytics for the last two weeks, but I can tell you that sales have been down, down, down!

Mikey85




msg:4153840
 10:28 pm on Jun 16, 2010 (gmt 0)

Our traffic has been down 80% since June 2, although we have daily fresh, unique content, and spammy competitors are still ranking well...

tedster




msg:4153880
 11:59 pm on Jun 16, 2010 (gmt 0)

Welcome to the forums, Planet13

could someone give an example of 'Semantically Rich' versus not Semantically Rich?

See Phrase Based Multiple Indexing and Keyword Co-Occurrence [webmasterworld.com] for one discussion and
Phrase Based Indexing and Retrieval [webmasterworld.com] for more background.

Hissingsid




msg:4154073
 8:27 am on Jun 17, 2010 (gmt 0)

My first step in creating semantically rich pages is to take all of the keywords I'm targeting and do ~keyword (synonym) searches on Google with my preferences set to 100 results per page. IMHO, synonyms are particularly strong in reversing over-optimisation, as they can be direct replacements for the keywords you are targeting. I also do the same with the synonyms discovered along the way, so I find synonyms of synonyms.
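
As a rough illustration of that step, here is a minimal Python sketch; the keyword list is a hypothetical example, "~" was Google's synonym operator at the time, and num=100 mirrors the 100-results-per-page preference:

from urllib.parse import quote_plus

# Hypothetical seed keywords; replace with the terms you're targeting.
keywords = ["widgets", "gadgets"]

for kw in keywords:
    # "~kw" asked Google to fold synonyms of kw into the results;
    # num=100 requests 100 results per page.
    print("http://www.google.com/search?num=100&q=" + quote_plus("~" + kw))

Each printed URL is one synonym search you can open and scan.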

Stems such as keywording, keywords and keyworded are also powerful and can be easily woven into your text.

Then start to look for natural co-occurrence. I kind of have a running brainstorm on this as I'm editing text.

I also hypothesise that certain parts of the page are weighted both from a keyword and a semantics point of view. Page title, headings, bolded text and anchor text have a greater weight than standard paragraphs. In effect you have to think about this in 3 dimensions.

Don't forget that the guys at Google are not as clever as they like to think they are, and as a result you don't have to go too far towards perfection to have a good effect. Also, although they have sorted out some of the US vs UK English keyword semantics, this does not appear to extend to synonyms; so in the UK you can use a US synonym of a very specific UK word, and that will help disambiguate the keywords you are targeting.

Cheers

Sid

Robert Charlton




msg:4154081
 8:37 am on Jun 17, 2010 (gmt 0)

I would be careful about overdoing the synonyms. Too much can get you filtered very quickly.

Read the page out loud. Stuffed synonyms sound as bad as stuffed keywords.

Hissingsid




msg:4154083
 8:44 am on Jun 17, 2010 (gmt 0)

Hi Robert,

Good warning, but if you have already been filtered and are trying to recover, using a smattering of synonyms to replace some of the keywords can't harm, as you are already filtered for "Over Optimization" (see thread title).

Also, too much of anything is never a safe thing.
Cheers

Sid

Hissingsid




msg:4154084
 8:45 am on Jun 17, 2010 (gmt 0)

Not sure I made it clear how I use the synonym search. Having set the preferences to 100 results, I just scan the serps for bold words and put those on my list to be used as replacements on my pages.

Also, there's another dimension: site semantics as well as page semantics. You want each page to have natural semantics, and the mini semantic web that is your site to be natural as well.
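
If you'd rather not rely on eyeballing alone, here's a rough sketch in standard-library Python that tallies the bolded terms for you; "serp.html" is a hypothetical saved copy of a results page, and it assumes the matched terms are marked with <b> or <em> tags in the snippets:

from collections import Counter
from html.parser import HTMLParser

class BoldTermCollector(HTMLParser):
    # Collects text that appears inside <b>, <strong> or <em> tags.
    def __init__(self):
        super().__init__()
        self.depth = 0              # nesting depth inside bold/emphasis tags
        self.terms = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in ("b", "strong", "em"):
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in ("b", "strong", "em"):
            self.depth = max(0, self.depth - 1)

    def handle_data(self, data):
        if self.depth:
            term = data.strip().lower()
            if term:
                self.terms[term] += 1

collector = BoldTermCollector()
with open("serp.html", encoding="utf-8") as f:  # saved results page
    collector.feed(f.read())
for term, count in collector.terms.most_common(30):
    print(count, term)

The most frequently bolded terms become the candidates for your replacement list.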

Cheers

Sid

Robert Charlton




msg:4154123
 10:56 am on Jun 17, 2010 (gmt 0)

...too much of anything is never a safe thing...

Sid - I fully agree, but let's say that lurking beneath any legitimate approach in SEO there is a strong temptation beckoning to a newbie to overdo it and get into trouble... and I felt that a warning might be prudent. ;)

...if you have already been filtered and are trying to recover, using a smattering of synonyms to replace some of the keywords can't harm, as you are already filtered for "Over Optimization" (see thread title).

I agree to a point. The use of synonyms is a component of good writing, and, with an over-optimization penalty, reducing overused vocabulary is not a bad idea. But if you have overused keywords to the degree that you're filtered, merely replacing some of them with synonyms is likely to keep you filtered. At least, that's been my experience. In such situations, it may be better to do some real editing as well.

Also, the newbie might not confine this technique to situations described by the thread title, but might instead start with a blank page and a synonym search, and try to stuff the page from the beginning.

Robert Charlton




msg:4154127
 11:03 am on Jun 17, 2010 (gmt 0)

Having set the preferences to 100 results, I just scan the serps for bold words and put those on my list to be used as replacements on my pages.

Re the ~keyword search... one search type that I've used to identify synonyms produces some extremely strange results when tried under the new algo.

In the past, I would modify a search for [adjective keyword] to include synonyms (by marking one word with the tilde operator) and exclude either the adjective or the keyword (with the minus operator), so I'd see only the results with the synonyms... i.e., I'd search either for...

~adjective keyword -adjective

or...

adjective ~keyword -keyword
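
To make that concrete with the widget vocabulary used elsewhere in this thread (a hypothetical Python helper; ~ and - are the synonym and exclusion operators just described):

def synonym_probe_queries(adjective, keyword):
    # One variant pulls synonyms of the adjective, the other synonyms of
    # the keyword; "-word" excludes the literal term so that only the
    # synonym matches remain in the results.
    return ["~%s %s -%s" % (adjective, keyword, adjective),
            "%s ~%s -%s" % (adjective, keyword, keyword)]

print(synonym_probe_queries("red", "widgets"))
# ['~red widgets -red', 'red ~widgets -widgets']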

If you try this now with many common competitive two-word search phrases, you're going to get some very bizarre results. Such searches bring up an unusual proportion of extremely raunchy pages... my guess is pages that are stuffed with terms from popular searches that have been analyzed this way, probably in an attempt to optimize them for these phrases.

Note that Norton Site Safety flags enough pages in these serps as unsafe that I'd hesitate to click on what comes up... there's a lot of malware in there... but the titles and snippets should give you a pretty good idea what's being done.

Planet13




msg:4154378
 6:53 pm on Jun 17, 2010 (gmt 0)

Thank you, tedster and Hissingsid, for your explanations.

...can't harm, as you are already filtered for "Over Optimization" (see thread title)...


OK, is there any "test" for over-optimization (aside from the "read it aloud - if it sounds spammy, it probably is" self-test)?

tedster




msg:4154445
 9:47 pm on Jun 17, 2010 (gmt 0)

No precise test that I can think of. Look out for repeated keywords in lots of your on-page anchor text. That has been one very sensitive spot.
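
As a quick self-check on that point, something along these lines will tally your on-page anchor text so the repeats stand out (a standard-library Python sketch; "page.html" is a hypothetical saved copy of your page):

from collections import Counter
from html.parser import HTMLParser

class AnchorTextCounter(HTMLParser):
    # Counts how often each distinct anchor text appears on a page.
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.buffer = []
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.buffer = []

    def handle_endtag(self, tag):
        if tag == "a" and self.in_anchor:
            self.in_anchor = False
            text = " ".join("".join(self.buffer).split()).lower()
            if text:
                self.counts[text] += 1

    def handle_data(self, data):
        if self.in_anchor:
            self.buffer.append(data)

counter = AnchorTextCounter()
with open("page.html", encoding="utf-8") as f:  # saved copy of your page
    counter.feed(f.read())
for text, n in counter.counts.most_common():
    if n > 1:  # repeated anchor text is the sensitive spot
        print(n, text)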

In the case of this thread, we haven't really established that there's an over-optimization problem, however. It could very well be a low level of some other signal (brand value?), not enough depth of content, or something like that.

Also, the opening post talks about a micro-site strategy. With the Mayday update, Google did take special aim at micro-sites. They see this as a ranking strategy in many cases, rather than a service to the user.

Hissingsid




msg:4154457
 10:30 pm on Jun 17, 2010 (gmt 0)

One approach I sometimes take is to view my pages through a Lynx browser or Lynx simulator. I then print out the page and highlight the target keywords. This, coupled with tedster's advice regarding anchor text, can go a long way toward showing you potentially problematic patterns.
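
A rough way to automate the same idea in Python (assuming the lynx browser is installed; the URL and keyword list below are placeholders):

import subprocess

def keyword_report(url, keywords):
    # "lynx -dump" renders the page as plain text, roughly what a
    # text-only user agent sees.
    text = subprocess.run(["lynx", "-dump", url],
                          capture_output=True, text=True).stdout.lower()
    total_words = len(text.split()) or 1
    for kw in keywords:
        hits = text.count(kw.lower())  # crude substring count
        print("%-20s %3d hits in %d words" % (kw, hits, total_words))

keyword_report("http://www.example.com/", ["blue widgets", "widgets"])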

Robert Charlton




msg:4154473
 11:26 pm on Jun 17, 2010 (gmt 0)

(aside from the "read it aloud - if it sounds spammy, it probably is" self-test)

I feel that the "read it aloud" test I suggested doesn't really work in the case of repeated anchor text, reportedly a common trigger for ranking drops. Here the repetition is a visual cue as much as a verbal cue, if not more so. But from the visual standpoint, the "good for the user" test is ambiguous too. Which is more user friendly, a list of nav links like this?...

- Red Widgets
- Blue Widgets
- Green Widgets
- Yellow Widgets
- Orange Widgets
- Purple Widgets
- Fuchsia Widgets
- Mauve Widgets

... or a list of nav links like this?...

Widgets
- Red
- Blue
- Green
- Yellow
- Orange
- Purple
- Fuchsia
- Mauve

I've seen it argued that, in the case of a longer list, the nav with the repeated keyword is more helpful to the user. My own feeling is that in most cases the repeated keyword looks spammy and is not all that helpful, but I have seen examples where the page reads better with the keyword repeated.

I'm still not seeing any consistency about when an over-optimization filter is applied to this kind of navigation. I've seen sites using both types of lists in the serps and doing well. Tests I've run on live sites have been necessarily partial and inconclusive.

Since the new algos, has anyone with this kind of nav list seen any changes?

kidder




msg:4154638
 8:13 am on Jun 18, 2010 (gmt 0)

I use a micro-site strategy to target different brands of the same product, and I try to maintain a level of quality, but reading this thread has me wondering about this strategy now.... Most of the sites I'm up against are large and well branded, but in every case we are competing with "internal" pages from these larger sites. I think our user experience is unique, but when it's repeated over the different sites I think it could start to look spammy. We are only running about 10 sites... Opinions?

newborn




msg:4154995
 6:49 pm on Jun 18, 2010 (gmt 0)

OK, so here's the thing: it's those spammy links that caused it. Too many links too quickly, and low-quality ones at that. Some of the sites that were previously PR0 have gone grey bar.

So something is wrong. I'm hoping it's not based on IP. My concern is that these links are already out there and I have no control over getting them to GO AWAY; are the sites now doomed?

How do I get rid of these toxic links!

proboscis




msg:4155005
 7:18 pm on Jun 18, 2010 (gmt 0)

I have a site that took a good hit on June 1. It's not a micro-site (it has around 1,000 pages), but it is a niche site focusing on one area, and each page deals with a more specific topic within the niche.

This same site was also hit in June of 2008; at that time, some people thought it was due to an over-optimization penalty.

Also, Webmaster Tools shows that the average positions of the top search queries for this site have moved up, in some cases from the second page onto the first, but the impressions and clicks have decreased on each individual query. I thought that was weird.

Robert Charlton




msg:4155077
 10:55 pm on Jun 18, 2010 (gmt 0)

newborn - To bring this back to the subject of your sites... the phrase "over optimization" calls back to discussions of the -950 penalty, and that's led to some side discussion about on-page factors... which, as you say, aren't the likely cause. I got caught up in the side issues too... sorry about that.

...it's those spammy links that caused it. Too many links too quickly, and low-quality ones at that....

....I'm hoping it's not based on IP....

Yes, I think you are right that it's about the new inbound links.

It may also be about IP. If the "automated link building service" relied on its own network of sites to any great degree, chances are the links were also coming from related IPs... and since all of your sites are on the same IP, Google would be seeing an inappropriate degree of coordination.

It may also be that there were groups of links from pages or domains common to your group of sites; that too would be seen as a sign of coordinated linking, which Google treats as a sign of spam.

The speed and timing of the links might also have been considered here and seen as coordinated linking.

Article promotion, if the same article was used multiple times, also leaves a distinctive footprint that Google might view as another degree of coordination.

(I'm assuming, btw, that the articles are not the same as any content on your sites. If they are, the problem may be at least in part a dupe-filtering issue.)

Re the IP issue... after the Florida update, I saw "sister sites"... same ownership, different companies, but hosted and promoted together... take a dive. Moving some of the sites to different hosts (and perhaps getting some additional inbounds... that was in process at the same time) suddenly snapped the sites back up. I'm not at all sure Google would react the same way now if you moved hosting on your sites. You might want to try it one site at a time.

I think I'd first see if you can have the link builders get rid of the articles and inbound links. Possibly their automation can be made to work in reverse.

After that, you might want to try the reconsideration route. If your mini-websites are all in the same niche, though, or feed into other websites as doorway sites, it's likely that the reconsideration route won't work. It will depend a lot on how much independence, apart from hosting, your mini-websites actually had.

newborn




msg:4155095
 11:49 pm on Jun 18, 2010 (gmt 0)

Robert, I really don't think that they can/will remove the articles with the links. So I might be dead in the water.

I have started to add fresh content to a few, and I think that if I can get the sites up to 100+ pages of good content and then get some high-PR authority inbound links, then maybe, just maybe, they might come back to life.

I was hoping that 100 pages would be adequate if I covered as much as possible in terms of content. I am also trying the silo approach, where I created a few folders to host pages in, and I am moving away from the every-page-links-to-every-page structure, though all pages still link to the home page and their silo's landing page.

But when I realized that a few of the sites had gone from PR0 to grey bar, I became too despondent to do anything. I really feel like giving up on them. When I do a site:mysite.com/net/org query they come up, but when I search for the domain name itself they do not show up.

I am still lost....

Is adding fresh content and changing the site structure really going to help, or is the only option a reinclusion request?

tedster




msg:4155096
 12:00 am on Jun 19, 2010 (gmt 0)

So I might be dead in the water.

I've seen several successful reconsideration requests in this type of situation, and even in much worse ones. Explain what happened, give Google a list of the bad links (if it's big, post the list at a URL and just include the link in the request), and let them know that you no longer contract work to this third party.
