This 195-message thread spans 7 pages.
|Documenting my attempt to re-rank after "Farmer" update|
OK, so for the sake of all who have been hit by the so-called "Farmer" update, I thought it might be useful to document my own attempt to increase my website's "Quality" measurement (whatever that means) in an attempt to regain some of my previous good ranking in the SERPs.
My site is rather small compared to many mentioned on these boards –– just 700 or so pages –– and at its peak it never made me more than $1,000 in any one-month period, but in this economy that lost $1,000 is kicking my butt.
The site I am referencing is built on WordPress as a CMS, has been copied copiously by scrapers in the past, and is a review/news-type site, where I post long, honest, and well-written reviews, news, how-tos, etc. about my favorite types of widgets.
I have never paid for a link, and I do not run any sort of linking campaigns, though the site is plenty popular, gets a fair amount of natural links, and has nearly 7,000 RSS subscribers.
Obviously, we know very little about the particulars of this latest Google update, but the consensus seems to be that this is a radical change in the algo, and we can't expect our rankings to just suddenly reverse any time soon.
With that said, I thought I'd document my meager attempt to gain back some of my good graces with Google.
After reading as much as I could find on the subject, I came across this article posted in one of the other threads here, and it seemed like a good place to start.
The quote that jumped out at me was this:
|Sites that believe they have been adversely impacted by the change should be sure to extensively evaluate their site quality. In particular, it’s important to note that low quality pages on one part of a site can impact the overall ranking of that site. |
With that in mind I decided it was time to clean up my site... but where to start? Like most of the good folks here, I tend to think everything I write is pretty good, and I certainly don't copy and paste other people's work. To the extent that it's possible, my pages are unique and well-written.
I finally decided to go through my Google Webmaster Tools account and take a look at exactly which pages lost ranking, paying particular attention to the pages that lost BIG. I'm talking drops of 50, 100, even 200 or 300 places.
Luckily there were only a handful of these.
As I began to dig through them, it became apparent to me that at least 80% of these pages were very obviously "thin." Much thinner than I even remember writing.
So, for lack of a better idea, I canned them; my hope being that by removing entirely the pages that have suffered the most in this update, my entire website will look better to Google.
All told I threw away approx 40 pages. Now this is WordPress, so "throwing away" just means moving them to the trash, where they are no longer viewable online, but can be easily retrieved should I decide I want them indexed again.
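The triage described above -- export ranking data from before and after the update, then flag the pages that fell furthest -- is easy to script. Here's a minimal sketch; the CSV column names ("page", "position") are an assumption, so adjust them to match whatever your export actually contains:

```python
import csv

def biggest_droppers(before_csv, after_csv, threshold=50):
    """Compare two ranking exports and list pages whose average
    position fell by more than `threshold` places."""
    def load(path):
        # Assumed columns: "page" and "position" (adjust to your export).
        with open(path, newline="") as f:
            return {row["page"]: float(row["position"])
                    for row in csv.DictReader(f)}
    before, after = load(before_csv), load(after_csv)
    drops = [(page, after[page] - pos)
             for page, pos in before.items()
             if page in after and after[page] - pos > threshold]
    # Worst offenders first.
    return sorted(drops, key=lambda kv: -kv[1])
```

The output is the short list of "biggest losers" worth reviewing by hand, which is exactly where the thin pages turned up in my case.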
I cannot think of any other ways to "clean up" my site... I long ago blocked things like tag pages, category pages, etc. from being indexed, so there is really nothing extraneous from my site in Google's index.
Now I just cross my fingers and pray. Nothing I've done is irreversible, so what did I have to lose... my already tanked rankings?
I'll let you know how it turns out.
|All told I threw away approx 40 pages. Now this is WordPress, so "throwing away" just means moving them to the trash, where they are no longer viewable online, but can be easily retrieved should I decide I want them indexed again. |
Hi Dead_Elvis, interesting idea.
What HTTP status code are requests to those URLs returning now?
Yes, please do post a follow-up. Thank you.
Hi GlobalMax, the pages are now returning a standard 404 error. I don't think I'm going to bother requesting removal via Google... I'm not in a big enough rush, and with a site my size it gets crawled pretty quickly.
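For anyone wanting to verify what their deleted URLs actually return, a quick stdlib-only check might look like this (a sketch; note that `urlopen` raises an exception for 4xx/5xx responses rather than returning them):

```python
from urllib import error, request

def status_of(url):
    """Return the HTTP status code a URL currently serves.
    urlopen raises HTTPError for 4xx/5xx codes, so catch it
    and report the code instead of crashing."""
    try:
        with request.urlopen(url) as resp:
            return resp.status
    except error.HTTPError as e:
        return e.code
```

Running it over the list of trashed URLs confirms each one really serves a 404 and not, say, a soft 200 from the theme.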
I was really surprised to see just how short or "thin" some of these pages were. They were totally legitimate, and they said exactly what needed to be said (i.e., they weren't lacking info for their subject -- the subject just happened to be short), but clearly Google didn't like them.
There were a few longer posts in the list of worst offenders, but they were a minority. All in all, these were pretty thin pages. None of them were painful to discard.
I will keep you informed as I see how this works out. *gulp*
Finally someone who talks about how to regain rankings -- thanks, Elvis.
I will also block a whole section with thousands of thinner URLs; I will use:
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
The question is, how long will the new algo need until the on-site changes are re-evaluated... no one knows.
My cat is staring at me at the moment but I think it just can't hurt since this meta tag always did a good job. Probably I will start a "cat rescue" site if my cat is right on this ;)
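If you go the noindex route across thousands of URLs, it's worth sanity-checking that the tag actually made it into the rendered source. A minimal sketch (the regex is deliberately loose, and it still won't catch every attribute ordering a theme might emit):

```python
import re

# Loose pattern: matches name/content in the common order,
# case-insensitively; quoting style may vary between themes.
NOINDEX_RE = re.compile(
    r'<meta\s+name=["\']robots["\']\s+content=["\'][^"\']*noindex',
    re.IGNORECASE)

def has_noindex(html):
    """True if the page source carries a robots noindex meta tag."""
    return bool(NOINDEX_RE.search(html))
```

Fetch a sample of pages from the blocked section and assert `has_noindex` is true for each before assuming the section is handled.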
Beautiful. You're being very smart and proactive - please do keep us updated.
I think you've got a great idea and are on the right path ... Love what you have bolded, because I think it's a great thing to notice and point out, and the 'step back' view you took of your site is great.
The following sentence stood out to me, and I think it affects quite a few people here, maybe more than they think (much like the content you had that was thinner than you thought), so I wanted to point it out.
|To the extent that it's possible, my pages are unique and well-written. |
The bold italic text would concern me as a site owner wanting to rank. So, here's the challenge as I see it for many webmasters, yourself included: How can your site be totally different or unique?
What value can your site offer that others don't, setting you apart from them, other than content that's unique 'to the extent it's possible'? What completely unique 'thing' can your site offer that people can't find anywhere else? What can it do differently that makes it stand out?
I know those are tough questions to answer, believe me, I know...
I see tons of thin, I mean really thin, URLs ranking after this algo change. In the local business area there are sites which have only an address and a Google map -- no semantic content at all. Not a single sentence throughout a whole site with thousands of URLs! Just think about it.
Therefore I have hesitated to remove thinner URLs.
Google's algo can't evaluate uniqueness. Probably in 2100+, but not now. This is not AI; this is just a plain dumb algo change.
It is the same as with the -950 re-ranking: tweaking a single part of the site will release it from jail. It's just plain stupid BS; that's why they hide all the specifics of this change.
BTW, you don't need value, you just need to adapt to this change.
Let's play Sherlock Holmes and finally game their dumb algo changes that a 5-year-old could code.
[edited by: SEOPTI at 6:08 am (utc) on Mar 3, 2011]
Here's what I'm doing:
- Merging and purging similar articles.
- Getting nice writer bios. I have writer bio pages for everyone, but I'm going to include them on the actual articles, so people can see their PhDs, careers at space agencies, and work as physics teachers.
- Better internal links. I'm going to do a big round of internal linking to make sure that every article is well supported internally.
- Improve content quality. I'm going to look for lower quality, thin content pages and improve them as best I can. I'll also significantly improve the articles that were already ranking well, to make them really good, and worthy of rank 1.
- Sources. I'm going to have a writer go through my articles and find good sources for the things we say, especially sources that go back to scientific journals, etc.
- Create some new content that's extremely high quality. Detailed on-hand reviews, educational resources for teachers, etc.
- More graphics, pull quotes, infographics, photos, and text design to make the pages look better.
I came here to document my own changes, and saw this thread, and thought, wow, that's almost exactly what I was going to write. For the same reason as Dead_Elvis (that quote), I decided to just dump a large portion of my site. After thinking about it, I'd created content that I thought the users might want (and I still think that), but frankly, those parts of the site don't make me a dime richer anyway, and since they could be considered thin, I dumped them. I've pared down my already fairly small site (300 pages or so) to about 1/3 of that. I kept the things people truly came for (which are also the things that created income). The rest is gone. I may add a thing or two back in over time, probably enhanced, but I may not. We'll see. For now, we'll see how a very streamlined site bounces back...or not.
Removed an entire section on 2/25 (a day after) via Webmaster Central, but no real improvement, even though the pages don't appear in the G index. Now I am letting G re-crawl the removed pages, but they have a noindex. As of today probably 1/3 of them have been picked up by Google.
Maybe Google needs a few days to reprocess them, assuming this will help of course :) ? Any idea how long it takes them to do a small recalculation?
The first step I've taken is to rearrange AdSense so it's not aggressive, along with some internal links; CTR is low now, but I have improved page views, time on site, and bounce rate by 20%.
Next steps I am thinking:
- identify thin pages and improve them
- reduce the number of internal links (I see sites doing well have a small number of internal links)
Best of luck with this; keep us updated :)
And this analysis by SEOmoz might be useful: [seomoz.org...]
They have a fair bit of data and analytical power available and draw some possibly useful conclusions.
Insofar as I can tell, none of my sites were hit, and a few have experienced small bumps upward.
However, this last update was enough to make me sit up and take notice, and think about what I can do pro-actively so that I do not get hit in the future.
So the first thing I am doing is taking the same advice I've been spouting in the AdSense forum for years - take an objective look at my site, and measure percentages.
If nav + ads + header + footer > content; then fix.
If nav + ads + header + footer < content; probably OK.
(that's probably not ALL I will do, but it's where I will start)
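The rule of thumb above can be written as a tiny helper. This is a sketch, assuming you've already counted words per page section by whatever means you like (the section names here are placeholders, not anything Google has published):

```python
def looks_thin(word_counts, content_key="content"):
    """Apply the rule of thumb: flag a page when the combined word
    count of everything that isn't main content (nav, ads, header,
    footer, ...) outweighs the content itself."""
    content = word_counts.get(content_key, 0)
    chrome = sum(n for key, n in word_counts.items() if key != content_key)
    return chrome > content
```

Word count is a crude proxy -- rendered area or link density might be better signals -- but it's enough to rank a site's pages from thinnest to thickest and pick a starting point.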
I refuse to jump on the "thin" reasoning here. A page that says
"lady gaga - bad romance. 320kbps stereo MP3 file"
can be exactly what the searcher wanted.
Google just underestimated the complexity of their own algo change in the long tail, and how many innocent small sites would be hit by cleaning the stuff out. They did a pretty good job in many areas, but for some previously solid "thin" pages (gateways to GREAT photos, PDF downloads, video -- heck, even Linux disc images) their update does not fit...
By the way: I have not changed much on my site and got a very good start of the day, almost on the same level pre-farmer. Maybe that is a good sign and I will start to counteract next Monday, when I am done reading all your posts ;-) I will share my findings, then...
just 2 cents!
Elvis, I have been engaged in a similar exercise, but when I look at pages that ranked before and after (I have lost about 20-30% of traffic from a website that is 10-plus years old, has 100% original content, and has never been affected before), I do not see a pattern that thin pages are affected. For example, a page with an image of Kim Kardashian in a dress that I like, with a simple comment like "I love the color" or "the cut" or something simple like that, is ranking just fine, but a page with a serious analysis of the economy is no longer ranking.
If I were you, I would hold off on making these changes because it is too early to know what is happening.
Removing redundant pagination pages (I had thousands of those) could also help... too long a pagination doesn't make sense.
|I do not see a pattern that thin pages are affected. For example, a page with an image of Kim Kardashian in a dress that I like with simple comment like I love the color or cut or something simple like that is ranking just fine, but a page with a serious analysis of the economy is no longer ranking |
Google has publicly stated that some low-quality pages on your site can affect the whole site:
|In particular, it’s important to note that low quality pages on one part of a site can impact the overall ranking of that site. |
Therefore, it doesn't make much sense to try to see which of your pages were impacted more or less by this algo update.
Hi skweb, I definitely hear what you are saying here, but I can only judge based on my own site, and after digging through the most badly damaged pages it is clear to me that for my site anyway, the so-called "thinnest" pages were the ones that were hit.
Yes, I lost some longer pages as well (shall I call them thicker?), but since the algo can apparently cause an entire site to be demoted (see the quote above), that doesn't seem that crazy.
I've only removed the "thin" pages, not the thick ones that also lost rank, and as I said, I don't have much to lose by removing them. I can easily add them back in, and they were very clearly NOT my money makers anyway.
This seems like a very minor change to my site when looked at from the bigger picture, and honestly, the changes DO make my site stronger IMHO. I don't see any downside to it.
I didn't throw out the baby with the bathwater, as the old saying goes ;)
TheMadScientist, your points are dead-on and well taken. It is clear to me that I need to think outside the box on this and find ways to make my pages even more truly unique.
So far I haven't figured out what that will be, but I've set my mind on it as the goal, and I suppose that's where I've got to start!
Thanks Dead_Elvis ... I'm glad you got the point and if that's your goal and what you keep asking yourself, there's no telling what you might dream up ... One of the sites I started came from a 'fluke' idea ... Here's the crazy story:
I was building a site with a friend and we were going to do a member section ... We decided it was best separated from the original, so we started building two: one for one specific 'thing' and the other we decided to make for members.
The non-member site was completely original, but we wanted to make the member site 'unique' or 'different' too ... There were quite a few sites that had member areas in the niche and offered some of the other features we decided we should offer ... We expanded on one 'common feature' to make ours different and more 'fun' / 'usable' but we still didn't have quite enough to make it something we thought people would go 'oh, this is a 'must join' site' ... We kept trying to come up with something no one else was doing and had an idea to do something we both wanted and thought would be cool for the members ... When we got the idea done and working and started using it, people we 'shared' the idea with started saying, 'Huh? How did you?' and we knew we had a winner, so we expanded on it to make it more accessible for everyone...
Anyway, that's the story behind the site in my profile ... It wasn't 'just an idea' it was an idea that came from one idea and one improvement, which led another idea and another improvement and ended up being something totally different than either of us had thought of doing when we started and happens to be a totally unique idea...
The point of the story is: There's no telling what you'll come up with if you just keep asking yourself, 'What can I do different?' and keep asking and asking and trying to improve on the idea you just had throughout the day...
Interesting. I, too, took a step back to assess the quality of my site vs my peers and found a few things. There are takeaways to this event. First, it tells me that my overall position is not deserved based on certain factors. This algo is about quality and whether you are answering the needs of users.
I'm guessing that large authority sites are held to a certain standard that's different from smaller sites. My authority site has much, much better content than my smaller sites, but it is also the site I learned on, whereas the smaller sites were built when I had more experience.
The difference is that my authority site tends to be more "quirky" while my smaller sites are more commercial, maybe even more dry. It would be a shame if I had to change my style because of such an update. But there could be other factors too. The quirkiness made me stray from usefulness, perhaps. I saw that my older pages had a lot more "musings" and rambling that were on the shallow side, vs more useful, informational material. Should I be penalized for that?
I also monetize my authority site a little harder as it attracts more ads, but the difference between this and my smaller sites is negligible, in terms of ad density or layout.
I can also see that this is a message that says, "well if you are an authority site, you better deserve it". That may ring true. I certainly question whether the algo should have penalized the big site for being interesting/different/quirky vs less useful in some areas as opposed to my more boring/straightforward/on-topic pages on my smaller sites that are now booming in traffic.
I would love to know what it is they are looking for. I am sure I've tripped some of their quality factors somehow, and maybe SIZE is one of them. Smaller sites that are just as quirky are not penalized. Smaller sites with more ads are not penalized. So really, what is it? The bigger you get, the more important you get, does that mean we have to act like the other big guys too and be much more "pro" about things? Or I suppose, one way to look at it is that we are "whittled" down to size and now simply reflect the traffic we deserve to get...
|I would love to know what it is they are looking for. I am sure I've tripped some of their quality factors somehow, and maybe SIZE is one of them. |
I don't think it is size per se, but size plus diversity of content. Remember, the focus is to go after content farms, so you have to think about what their attributes are.
I was wondering about quality in the other thread. Please define "quality".
I think it would be good for Google to provide a new definition for this, as their current guidelines on quality are not good enough.
@elsewhen Help me out on this - I'm not quite following the logic.
I don't see any "black or white" aspect to the Farm update changes - pages are affected to many different degrees, and even within one site, some pages can go up while others go down.
As I see it, even if weak pages do help drag down good pages on the site, it still seems to me that addressing the "Biggest Loser" pages first makes very good sense. Otherwise a webmaster would just be lost in the swamp and throw up their hands.
In other words, even though there is a site-wide component at work, the greatest power of the update is focused at individual URLs.
tedster, you explained my thoughts exactly.
Sitting and waiting for this algo to change seems the most risky move of all. I see no reason to believe my pages are just going to start ranking again out of the blue.
Taking into consideration the fact that the "Farmer" algo appears to be bringing down good pages as well, I have to assume that it will do the most damage to what it considers truly "bad" pages... hence I've only removed those pages with massive falls in the SERPs; my working assumption being that pages which fell only a small amount are more likely to be collateral damage.
For my experiment I only removed pages that had fallen massively in the SERPS -- those in the 50, 100, 200, and 300 range.
|my authority site tends to be more "quirky" while my smaller sites are more commercial |
Yes, and also noting Jane_Doe's comment below.
Are your smaller sites more likely to feature content that is narrowly focused, i.e. one single topic or a handful of closely related topics? Is their site structure and navigation easier to follow than that of your older, larger site?
In the Webmaster Forums thread that was referenced by TheMadScientist [webmasterworld.com...] one of the posts mentioned a non-profit that got hammered. Apparently, their site grew quite haphazardly - not only in terms of scope, but also across three separate tld's. Check to see if you can find any parallels with your situation.
it seems that the issue is with the interpretation of this:
|In particular, it’s important to note that low quality pages on one part of a site can impact the overall ranking of that site. |
if google had complete confidence in their ability to gauge page quality, then this would have been a page-by-page update, but the quote above indicates that google is looking at an overall profile and then applying that to an entire site. yes, they have to look at the quality of individual articles, but they don't seem to have the confidence in the algo to apply the down-weightings on a page-by-page basis.
my understanding of the ramifications of this is that a very-low quality article could rank well on nytimes (a site that was not flagged as low-quality), but a very high-quality article on a site flagged as low-quality would not be likely to rank.
yes, it is individual pages that need to be acted upon, but i am not convinced yet that the pages that lost the most traffic, or even the pages that lost the most ranking positions are the offending pages. ranking drops are not just about a particular site - they involve all those sites that rank near you. so a drop from 6th place to 11th place could be caused by exactly the same down-weighting as a ranking drop from 6th place to 7th place. in the first case, there were just a bunch of competing pages right behind the 6th position - and they were right there to move up into the vacuum created by your downweighting.
my sense is that google is really trying to gauge quality... if we are to take their word for it, then the types of things they are down-weighting are:
- low-value add
- copy content from other websites
- just not very useful
the sites that are not getting down-weighted:
- original content
- in-depth reports
- thoughtful analysis
due to the scale at issue, and due to google's culture, they are making these assessments algorithmically. my research is leading me to believe that they are using a much deeper definition of "original content." it is not just about strings-of-words anymore, but rather whether sites are bringing more to the table than the other sites they are competing with in the SERPs.
|my sense is that google is really trying to gauge quality... if we are to take their word for it, then the types of things they are down-weighting are: |
* low-value add...
but elsewhen, that is exactly what I am trying to take care of -- and by examining which pages have fallen most drastically, I've discovered that they are indeed the same pages that happen to fit your description.
In a word, "thin."
Removing said "thin" pages seems an excellent first step. There is only one way out of a swamp, and that is by taking one step at a time.
I can't think of a better way to start improving quality than to start with examining the pages themselves.
Obviously, it would be something to look at on a site-by-site and page-by-page basis, but ... The OP in this thread said they looked into the pages that dropped the most and they happened to be thinner than even they remembered ... I think there's quite a bit to be gained from the statement Dead_Elvis makes in the OP and the 'biggest droppers' is definitely where I would start looking for answers.
Sometimes I think people over-think the situation...
I'd keep it simple for finding a starting point.
A few more thoughts.
As I was chopping out sections of my site, I started finding some things I'd completely forgotten about. When you've added to a site over many years, things tend to be forgotten. Some of the things I found:
- a few articles from years ago that I'd used from an article directory
- a handful of pages that were original and hand-written, but definitely were content-farmish in nature: "5 Great Gifts For the Golfer In Your Life", "What Gifts To Get A Teenage Girl", that kind of thing...
- a section of short, stub-like, definition pages
It's not going to kill me or the site to dump those, and content can always be added back in later if I so choose.
But those *might* be a few of the kinds of things to look for. Maybe. :)