Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 195 message thread spans 7 pages: < < 195 ( 1 2 3 4 5 [6] 7 > >     
Documenting my attempt to re-rank after "Farmer" update
Dead_Elvis




msg:4275674
 1:58 am on Mar 3, 2011 (gmt 0)

OK, so for the sake of all who have been hit by the so-called "Farmer" update, I thought it might be useful to document my own attempt to increase my website's "quality" measurement (whatever that means) in an attempt to gain back some of my previous good ranking in the SERPs.

My site is rather small compared to many mentioned on these boards –– just 700 or so pages –– and at its peak it never made me more than $1,000 in any one month, but in this economy that lost $1,000 is kicking my butt.

The site I am referencing is built on WordPress as a CMS, has been copied copiously by scrapers in the past, and is a review/news-type site, where I post long, honest, and well-written reviews, news, how-tos, etc. about my favorite types of widgets.

I have never paid for a link, and I do not run any sort of linking campaigns, though the site is plenty popular, gets a fair amount of natural links, and has nearly 7,000 RSS subscribers.

Obviously, we know very little about the particulars of this latest Google update, but the consensus seems to be that this is a radical change in the algo, and we can't expect our rankings to just suddenly reverse any time soon.

With that said, I thought I'd document my meager attempt to gain back some of my good graces with Google.

After reading as much as I can find on the subject, I came across this article posted in one of the other threads here, and it seemed like a good place to start.

[searchengineland.com]

The quote that jumped out at me was this:

Sites that believe they have been adversely impacted by the change should be sure to extensively evaluate their site quality. In particular, it’s important to note that low quality pages on one part of a site can impact the overall ranking of that site.


With that in mind I decided it was time to clean up my site... but where to start? Like most of the good folks here, I tend to think everything I write is pretty good, and I certainly don't copy and paste other people's work. To the extent that it's possible, my pages are unique and well-written.

I finally decided to go through my Google Webmaster Tools account, and take a look at exactly which pages lost ranking, paying particular interest to the pages that lost BIG. I'm talking drops of 50, 100, even 200 or 300 places.

Luckily there were only a handful of these.

As I began to dig through them, it became apparent to me that at least 80% of these pages were very obviously "thin." Much thinner than I even remember writing.

So, for lack of a better idea, I canned them; my hope being that by removing entirely the pages that have suffered the most in this update, my entire website will look better to Google.

All told I threw away approximately 40 pages. Now, this is WordPress, so "throwing away" just means moving them to the trash, where they are no longer viewable online but can be easily retrieved should I decide I want them indexed again.

I cannot think of any other ways to "clean up" my site... I long ago blocked things like tag pages, category pages, etc. from being indexed, so there is really nothing extraneous from my site in Google's index.

Now I just cross my fingers and pray. Nothing I've done is irreversible, so what do I have to lose... my already-tanked rankings?

I'll let you know how it turns out.
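A side note on the tag/category blocking Dead_Elvis mentions: a common way to do this (the exact mechanism on his site is an assumption, not stated in the post) is a robots meta tag on the archive templates, which keeps those pages crawlable but out of the index:

```html
<!-- placed in the <head> of tag/category archive pages:
     keep the page out of the index, but still follow its links -->
<meta name="robots" content="noindex,follow">
```

The blunter alternative is a robots.txt Disallow on those paths, which stops crawling entirely rather than just indexing.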

 

walkman




msg:4279371
 4:46 am on Mar 10, 2011 (gmt 0)

I predict that Google will do a re-calculation on 3/22-24. Why? Every month they do one around that time from what I can tell. I remember many major changes starting the last week of the month.

gyppo




msg:4279382
 5:11 am on Mar 10, 2011 (gmt 0)

To be honest I've got sites with really really thin content & heaps of ads that have been fine.

So there really has to be something more to it than that, I'm leaning towards duplication between pages.

koan




msg:4279412
 6:21 am on Mar 10, 2011 (gmt 0)

I've got sites with really really thin content


We tend to lose sight of the fact that only 12% of keywords have been affected, possibly those that attract spammers most (think diets, dating, forex trading, etc.). So it's hard to say if an unaffected site passed the test or just isn't part of the targeted topics.

walkman




msg:4279415
 6:27 am on Mar 10, 2011 (gmt 0)

Koan, we've seen travel, home improvement, review, answers, programming, etc. affected.

nuthin




msg:4279423
 7:06 am on Mar 10, 2011 (gmt 0)

I'm seeing local results affected across the board. Newish, sub-par clients (less than six months old) getting filtered out, and dodgy "classified"/scraper sites starting to move in.

Just waiting it out, as the index is unlikely to be stable at the moment.

deadsea




msg:4279680
 4:22 pm on Mar 10, 2011 (gmt 0)

I typically work on mega sites with millions of pages. Of those pages some of the pages are super high quality and some are super low quality. The super low quality pages are in areas like forums (there are a lot of inane topics), user profiles, and thin product reviews.

None of the sites like this that I work with have been hit by this update. Clearly having some thin content is not enough to get you hit. I especially think that it may have to do with pagerank sculpting. The sites I work with certainly don't feature the bad content or send a lot of link juice to it. Perhaps Google is measuring how many times your site ranks for popular terms with thin content? Perhaps they are measuring user experience from the SERPs as the primary metric for being penalized. Maybe they boosted the impact of a user coming back to the SERPs and clicking on some other site.

That way they wouldn't have to try to detect "thin" content through lexical analysis; they could infer it through user behavior. It would also explain why excessive ads appear to be hurting sites.

valex




msg:4279725
 5:11 pm on Mar 10, 2011 (gmt 0)

Clearly having some thin content is not enough to get you hit. I especially think that it may have to do with pagerank sculpting. The sites I work with certainly don't feature the bad content or send a lot of link juice to it.


This is what I'm trying to do right now: removing the links on the home page to all of the pages with thin content, which I suspect have killed the site's good ranking. I also disallowed these pages, and they started dropping from the index. I have reduced the total number of internal links on the home page and left just some important and really good pages there. The result: for one major keyword my site climbed this morning from page 55 to page 26 (the home page used to rank #6 on page 1 for this KW 11 days ago, then it dropped to page 55-56). Some other KWs are performing a little bit better today, but still not as they used to. Maybe these signs mean nothing, but I'll keep working in this direction.

falsepositive




msg:4280789
 7:32 pm on Mar 12, 2011 (gmt 0)

After assessing my site, I realized one thing: I suck at on-site SEO. I think this algo is telling me just that. I had focused most of my efforts on "growing" (producing more content), monetization, and off-site SEO (which I am good at). My link profile is enviable and I smoke my competitors here. What I did not realize was how poorly my site stacked up against my competitors in terms of on-site SEO, which I never bothered with. It looks to me like I would benefit from a better-optimized site design -- I have around 1,500 pages, but a lot of the older content is thin. I have pages with tons of links on them (I never cleaned them out) and wow, I realize now that if I were a robot, I'd hate my site too.

This was a wake-up call for me. I couldn't grow in a way that was healthy for a robot if I kept doing what I was doing. So G was just reminding me that I have to practice proper optimization techniques if I am to grow in a way that is not so haphazard. I would think of this algo as a way to measure my quality metrics -- how good are we at following the Google webmaster guidelines?

On that note, I am wondering whether scrapers are coming out ahead because their sites are "lighter," with less bloat and with copy that is easier for a bot to digest. I can see how this can happen now. If Google can't tell who was the original, then they'd much prefer showing the content that is on the better-optimized site -- less bloat, less fuss. Also, if my old page was buried deep in my site, the bot may not see it well or follow it. If it finds the copy on a cleaner site, then that copy appears because the bot is "less confused" and is able to follow it through.

The reason a lot of authoritative old sites are hit? Convoluted sites, badly designed, poor navigation, etc. Their content is found elsewhere (albeit stolen) and taken to appear in sites that are clean, easier to digest by both bots and humans. It IS unfair in a way, but if you've got great content buried under chaos, then nobody can really read that stuff anyway -- might as well show something else?

I see the logic there. I've probably tripped Google with "mixed signals" as collateral damage. But I didn't help them clear me 100% either...

Off to work I go!

Jane_Doe




msg:4280797
 7:45 pm on Mar 12, 2011 (gmt 0)

I have one page with nothing but thin content, nothing but ads above the fold and affiliate links below and it is doing great. Overall though that site is pretty solid with lots of nice content and has had a traffic increase since Panda. I definitely think this time they are looking at sites as a whole. I have other sites with better links and more content, but definitely lots of low quality pages and any pages like that have been trounced.

The reason a lot of authoritative old sites are hit? Convoluted sites, badly designed, poor navigation, etc. Their content is found elsewhere (albeit stolen) and taken to appear in sites that are clean, easier to digest by both bots and humans. It IS unfair in a way, but if you've got great content buried under chaos, then nobody can really read that stuff anyway -- might as well show something else?


I think those are all great points. I am going to rewrite most of the content on the pages that took a hit rather than send out 3 million DMCA requests. I just checked one unique line from a single page and found 70 matching results on other sites. I do think this algo change might have been unfair to older sites because of the copied-content issue. But it is what it is.

Also, some dirty tricks from competitors have come into play more again. I can't say specifically what they are because it would just tip them off that it's working, but I do hope Google takes more into account that we can't control the external factors influencing our rankings.

falsepositive




msg:4280802
 8:03 pm on Mar 12, 2011 (gmt 0)

Jane_Doe, I would suspect frequency of spidering is a factor here. If you are a bigger site, your site is spidered more often, so there's more risk of getting hit by Panda. Could this explain why crobb was only affected later? I was hit right away, but I am spidered quickly and often. If smaller sites are also in violation of Panda, it's possible that they escape notice till later. If big sites drop off the map, then these small sites enjoy the limelight, but it could just mean they haven't been updated yet for this algo. Or am I just barking up the wrong tree with all this speculation?

Jane_Doe




msg:4280811
 8:41 pm on Mar 12, 2011 (gmt 0)

Jane_Doe, I would suspect frequency of spidering is a factor here. If you are a bigger site, your site is spidered more often, so there's more risk of getting hit by Panda.


I don't know for sure, but I do not get the sense that spidering frequency is that major a factor here. I think the older authority sites more easily stayed on top just by getting lots of low-quality scraper and content farm links. Now that those links have been devalued, it makes it harder for those sites to rank. Add in more piecemeal content, copied content, old links that once went to legitimate sites but now point to male enhancement products, etc., and those sites are struggling more these days.

My pages that got hit were overdue for rewrites anyway.

dickbaker




msg:4280860
 11:28 pm on Mar 12, 2011 (gmt 0)

I just did a search for 14 phrases I track daily to watch for changes. Results are pretty much the same except for two pages. One of those pages moved from #42 to #29, and the other from #21 to #16. All of these 14 phrases were page one prior to this update.

For the page that moved from #42 to #29, I had disallowed many thin pages in the robots file. The page that moved from #21 to #16 is one I haven't worked on yet, but it's the same brand, and the sub-pages are in the same "/Acme/" directory. I don't know if the move up is attributable or not to the disallow.

One manufacturer page I've been doing extensive work on hasn't changed in the results. The cache is the old version of the page, but the preview shows the new version. I also beefed up the sub-pages for that manufacturer yesterday and Thursday, so it will be interesting to see what happens. This particular brand is one that has ranked well for more phrases than any other and sends more traffic than any other.

Pages that I did a noindex on back on 3/3 are still showing when I do a site: search.
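One thing worth checking here (a general crawling gotcha, not something dickbaker confirms applies to his site): if a page carries a noindex meta tag but its path is also Disallowed in robots.txt, the crawler can never fetch the page to see the tag, so it can linger in a site: search indefinitely. The path below is made up for illustration:

```
# robots.txt
User-agent: *
Disallow: /thin-reviews/
# Any <meta name="robots" content="noindex"> on pages under
# /thin-reviews/ will never be seen, because crawling is blocked.
# For noindex to take effect, the pages must remain crawlable.
```

Even when pages are crawlable, noindex only takes effect after the next recrawl of each page, which can take a while on deep pages.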

Dead_Elvis




msg:4281184
 1:26 am on Mar 14, 2011 (gmt 0)

I am noticing something intriguing today:

While my traffic has fallen badly, and over the last few days it seems to have dropped even farther, I can say one thing for sure –– my earnings honestly haven't dropped in a way that is commensurate with the amount of traffic I've lost.

No, unfortunately, I'm not one who systematically follows keywords in the SERPS, or even carefully analyzes analytics, so I can't say exactly why my earnings haven't fallen so badly, but I can only guess that for whatever reason, the traffic I've lost wasn't "converting-traffic" to begin with.

This seems particularly odd to me considering that I've mostly lost US traffic, and my site is an affiliate site, with nearly ALL earnings coming from US customers only, and possibly a few Canadian folks as well.

Has anyone else seen this? That their earnings haven't dropped much in spite of the huge loss of traffic?

tedster




msg:4281194
 2:08 am on Mar 14, 2011 (gmt 0)

Most definitely. I work with one site that saw a 19% drop in Google search traffic immediately after the update and yet sales from Google traffic remained STEADY.

walkman




msg:4281198
 2:20 am on Mar 14, 2011 (gmt 0)

Dead_Elvis,
the same happened to me during the Instant Search rollout. Traffic fluctuated or went down a bit, but my sales actually increased. Apparently that sent people to my best-performing pages :). See how many of those performers are making you money, and whether they are the same as before.

mslina2002




msg:4281201
 2:31 am on Mar 14, 2011 (gmt 0)

While my traffic has fallen badly, and over the last few days it seems to have dropped even farther, I can say one thing for sure –– my earnings honestly haven't dropped in a way that is commensurate with the amount of traffic I've lost.

No, unfortunately, I'm not one who systematically follows keywords in the SERPS, or even carefully analyzes analytics, so I can't say exactly why my earnings haven't fallen so badly, but I can only guess that for whatever reason, the traffic I've lost wasn't "converting-traffic" to begin with.

This seems particularly odd to me considering that I've mostly lost US traffic, and my site is an affiliate site, with nearly ALL earnings coming from US customers only, and possibly a few Canadian folks as well.


I could have written the same post. My traffic dropped 25% after Panda, though it seems even worse yesterday and today. Perhaps also SPRING BREAK has started and college kids are on break. Conversions have been up since Panda, so earnings (affiliate) have not dropped. I lost rankings for my vanity keywords from page 1 to page 2, but my long-tail and datafeed pages have been doing well.

dickbaker




msg:4281249
 5:05 am on Mar 14, 2011 (gmt 0)

I'm running a ranking report right now, and it's confirming something I've noticed about my site, but can't quite figure out.

On my site I offer advertising for retail widget stores. I use the phrases "widget shops" and "widget stores", but the word "shops" appears much more frequently.

For years I ranked #1 or #2 for "[insert any US state here] widget shops", as well as "[insert city name here] widget shops". Now my rankings for those types of phrases have fallen into the basement.

However, searches for those phrases using "stores" rather than "shops" are still giving excellent results: #1 to #10, usually.

I wonder if going overboard with the word "shops" got me de-ranked for that?
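For anyone wanting to sanity-check whether one word is wildly over-represented the way dickbaker suspects "shops" is on his pages, here is a rough sketch. The sample text is invented, and raw word density is only a crude proxy for whatever Google actually measures:

```python
import re
from collections import Counter

def term_density(text, terms):
    """Fraction of all words accounted for by each term of interest."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words) or 1  # avoid division by zero on empty text
    return {t: counts[t] / total for t in terms}

# Hypothetical page copy, heavy on "shops" the way dickbaker describes.
page_text = "Find widget shops near you. Our widget shops directory lists shops and stores."
print(term_density(page_text, ["shops", "stores"]))
```

Running this over each page and comparing the density of the suspect term against its synonyms would show quickly whether "shops" dwarfs "stores" site-wide.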

dazzlindonna




msg:4281381
 1:44 pm on Mar 14, 2011 (gmt 0)

Just a quick update: Today is the first time I've seen a new cache since 2/28 (and I didn't make changes till after 2/28). No improvements in ranking though. So either I changed things that aren't helping, or the effect isn't immediate.

indyank




msg:4281407
 2:22 pm on Mar 14, 2011 (gmt 0)

@dazzlindonna I too saw a new cache for the home page of one of the affected sites today. The previous cache was dated March 3. I didn't see any new cache until yesterday, but the new cache is dated March 12. I am keeping my fingers crossed.

SouthAmericaLiving




msg:4281417
 2:38 pm on Mar 14, 2011 (gmt 0)

Excellent, Dead_Elvis -- what great advice; I am going to check my Webmaster Tools account now.

As well as the meta tag 'ignore' info from SEOPTI:
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">

Gracias!

walkman




msg:4281447
 3:28 pm on Mar 14, 2011 (gmt 0)

This either isn't fast or I didn't change what Google hates. My homepage gets a fresh tag daily but I've noticed no improvement.

falsepositive




msg:4281455
 3:39 pm on Mar 14, 2011 (gmt 0)

Wanted to report something else I just don't understand. I have two sites in the same niche; one survived Panda, the other did not. The one that survived is smaller and has lousier content. The one that got hit is the one I spend most of my time building out as top quality.

I've been reading a lot about the Panda update and what to look out for. They say user metrics counted for this. But by all measures, the site that survived is totally inferior to the site that got hit (user-experience metrics prove this), by a huge margin. I compared time on site and bounce rate head to head.

As I've been mentioning earlier, I only see the big site getting copied extensively by copyright infringers. Its only sin (in my opinion as someone who runs these 2 sites) was that it was the popular one so it got scraped a lot.

I'm of the mind that my main issue is an external duplication problem.

If it comes to this, then why, from what I've read in the past, was Matt Cutts saying "Don't worry about scrapers"? If I had known it would come to this, I would have been very fierce about protecting my content. There was a lackadaisical approach to handling duplicates in the past. Why didn't they just warn webmasters about the importance of protecting their work? Prevention would have gone such a long way.

Buying/selling links was a big no-no. Google and SEOers alike could have made this matter just as big a deal as link commerce. Or did I just not get the memo?

And another question: I wonder how much "user metrics" have to do with this? Or am I looking at the wrong numbers? Again, I've been checking my Google Analytics, and the site that still stands does not hold a candle to the one that got impacted.

I think this is what puzzles a lot of webmasters with multiple sites: what is the difference in profile between their sites? For me, both visually and analytically, one is inferior to the other. The inferior one was not scraped and therefore still stands. It's the only conclusion I keep coming up with.

Thanks to Tedster for providing additional insights on the duplication matter as well.

TheMadScientist




msg:4281456
 3:39 pm on Mar 14, 2011 (gmt 0)

Crazy thought about getting back 'in' for those who dropped ... If the results now are considered to be better than they were before, don't you have to 'out-do' the sites currently ranking to get back to the top? What I'm getting at is it might be more than 'finding and fixing' to get there, it might take more of a complete overhaul, imo.

walkman




msg:4281460
 3:43 pm on Mar 14, 2011 (gmt 0)

TheMadScientist, I've checked my competitors, and nothing that we have discussed makes sense in itself; that's why I focused on 'thin' pages, especially since Google mentioned it.

Regarding improving results: there are so many pages that even if Google penalizes sites with a +/-20% margin of error, results will improve.

dickbaker




msg:4281568
 6:20 pm on Mar 14, 2011 (gmt 0)

What I'm getting at is it might be more than 'finding and fixing' to get there...


That's what I've been thinking. If I have a page that was once #7 and is now #49, I think it's going to have to be pretty spectacular to get back to page one, even though what's on page one is nothing to write home about.

The only other thing I can think of is that Google goes back to check on demoted pages, sees they've been improved, and gives some or all of the ranking back. Something tells me that's not going to happen, though.

crobb305




msg:4281587
 7:05 pm on Mar 14, 2011 (gmt 0)

I finally decided to go through my Google Webmaster Tools account, and take a look at exactly which pages lost ranking, paying particular interest to the pages that lost BIG. I'm talking drops of 50, 100, even 200 or 300 places.


How do you find data for individual page rankings in WMT? Or do you mean Google Analytics? I don't see any stratification of data by page (or I am just not finding it).

Edit: Nevermind. I just found it! Search Queries >>> "Top Pages". I have never seen that before. :)
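For anyone else digging into that "Top Pages" report, a sketch of triaging an export by position change, in the spirit of Dead_Elvis's "find the pages that lost BIG" approach. The column names and page paths below are invented stand-ins; a real WMT export's headers will differ:

```python
import csv
import io

# Invented sample standing in for an exported "Top Pages" CSV.
sample_export = """page,avg_position_before,avg_position_after
/widget-review,7,9
/thin-howto,12,214
/buyers-guide,3,4
"""

rows = csv.DictReader(io.StringIO(sample_export))
# Positive delta = the page fell (a larger position number is worse).
deltas = sorted(
    ((r["page"], int(r["avg_position_after"]) - int(r["avg_position_before"]))
     for r in rows),
    key=lambda pair: -pair[1],
)
for page, delta in deltas:
    print(f"{page}: {delta:+d}")
```

Sorting by raw position delta surfaces the handful of pages that fell hundreds of places first, which is exactly the triage described earlier in the thread.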

crobb305




msg:4281591
 7:13 pm on Mar 14, 2011 (gmt 0)

Wow, looking at the actual pages that took a hit (in WMT), some of my top content pages were knocked down 200 to 300 positions. No ads, no duplication. This is a BIG mess. I just searched some of the content from the top three articles that dropped the most (fell 200+ spots), and they are the only copies in the index. They haven't even been duplicated. No ads on the pages. So how can Google take it upon themselves to tell my visitors that the pages are "thin" content? That's quite arrogant of them. This algorithm = BIG FAIL.

walkman




msg:4281597
 7:33 pm on Mar 14, 2011 (gmt 0)

WMT data is weird: I went up in ranking (according to them), but on most keywords where I supposedly went up, my impressions and clicks decreased by as much as 10%-65%.

Filter or something that's not showing in WMT? I should be rolling in money according to WMT ;)

Edit: Never mind, I need to filter US results alone.
Maybe I'm doing it wrong, but on most keywords I had a drop of one to the low teens in position, and, as expected, a massive drop in impressions/clicks.

[edited by: walkman at 8:06 pm (utc) on Mar 14, 2011]

crobb305




msg:4281602
 7:39 pm on Mar 14, 2011 (gmt 0)

Walkman, I see the same thing! Impressions increased, but ranking positions fell.

I'm trying to use the data as best I can. I am breaking it down into smaller time frames. For example, if I break it down to Feb 24 through Mar 1, I see pages SOAR in rankings, +40 to +200 spots. When I look at just the day before and the day after my penalty hit, those SAME pages dropped by up to 2x the magnitude.

VERY odd that the pages would surge, then drop, then take down the whole site. Maybe something is really broken? Maybe something is triggering a penalty for exceeding a certain threshold in performance improvements? I saw traffic increase 30% to 40% during the last 2 weeks of Feb. Then penalty hit. Oddly enough, before I got hit with a -50 over a year ago, I saw a surge in traffic for the 2 weeks leading up to it. Coincidence?

proboscis




msg:4281611
 8:10 pm on Mar 14, 2011 (gmt 0)

As I've been mentioning earlier, I only see the big site getting copied extensively by copyright infringers. Its only sin (in my opinion as someone who runs these 2 sites) was that it was the popular one so it got scraped a lot.


Yeah, the change was supposed to "affect sites that copy others' content," but it's not always working. With one of my sites, it's putting the copied content first.

Sometimes my site is not even on the first page when searching for an exact snippet. I've gained and lost over the years for various terms, but always, always I've been first for exact snippets, as it should be, because it's my original writing.

Oh well, I guess they're working on it. I have some hope but I don't know...

econman




msg:4281655
 10:05 pm on Mar 14, 2011 (gmt 0)

I saw traffic increase 30% to 40% during the last 2 weeks of Feb. Then penalty hit. Oddly enough, before I got hit with a -50 over a year ago, I saw a surge in traffic for the 2 weeks leading up to it. Coincidence?


Some possibilities come to mind:

a. Perhaps whatever is happening to your site is unrelated to Panda.

b. Perhaps some pages were being "tested" in high positions briefly. If so, those pages have now dropped back to lower positions, while Google studies the data it collected during this brief testing period.

c. Perhaps you did something unusual that caused the site to shoot up the rankings (say, building 100,000 dodgy links); it was spectacularly successful for a short time, but an algorithm, or a human review, slapped the site back down once Google figured out what you did.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved