This 195 message thread spans 7 pages.
|Documenting my attempt to re-rank after "Farmer" update|
OK, so for the sake of all who have been hit by the so-called "Farmer" update, I thought it might be useful to document my own attempt to increase my website's "Quality" measurement (whatever that means) and gain back some of my previous good ranking in the SERPs.
My site is rather small compared to many mentioned on these boards, just 700 or so pages, and at its peak it never made me more than $1,000 in any one-month period, but in this economy that lost $1,000 is kicking my butt.
The site I am referencing is built on WordPress as a CMS, has been copied copiously by scrapers in the past, and is a review/news-type site where I post long, honest, well-written reviews, news, how-tos, etc. about my favorite types of widgets.
I have never paid for a link, and I do not run any sort of linking campaigns, though the site is plenty popular, gets a fair amount of natural links, and has nearly 7,000 RSS subscribers.
Obviously, we know very little about the particulars of this latest Google update, but the consensus seems to be that this is a radical change in the algo, and we can't expect our rankings to just suddenly reverse any time soon.
With that said, I thought I'd document my meager attempt to gain back some of my good graces with Google.
After reading as much as I can find on the subject, I came across this article posted in one of the other threads here, and it seemed like a good place to start.
The quote that jumped out at me was this:
|Sites that believe they have been adversely impacted by the change should be sure to extensively evaluate their site quality. In particular, it's important to note that low quality pages on one part of a site can impact the overall ranking of that site. |
With that in mind, I decided it was time to clean up my site... but where to start? Like most of the good folks here, I tend to think everything I write is pretty good, and I certainly don't copy and paste other people's work, etc. To the extent that it's possible, my pages are unique and well-written.
I finally decided to go through my Google Webmaster Tools account and take a look at exactly which pages lost ranking, paying particular attention to the pages that lost BIG. I'm talking drops of 50, 100, even 200 or 300 places.
Luckily there were only a handful of these.
As I began to dig through them, it became apparent to me that at least 80% of these pages were very obviously "thin." Much thinner than I even remember writing.
So, for lack of a better idea, I canned them; my hope being that by removing entirely the pages that have suffered the most in this update, my entire website will look better to Google.
All told, I threw away approximately 40 pages. Now, this is WordPress, so "throwing away" just means moving them to the trash, where they are no longer viewable online but can be easily retrieved should I decide I want them indexed again.
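The triage above (pull each page's pre- and post-update position, keep only the big losers) is easy to script. A minimal sketch, assuming you've recorded the positions by hand; the rows below are illustrative, not real Webmaster Tools export data:

```python
# Hypothetical per-page ranking data; column names are my own invention.
rows = [
    {"page": "/widget-review", "old_position": 4, "new_position": 210},
    {"page": "/widget-news", "old_position": 7, "new_position": 12},
    {"page": "/widget-howto", "old_position": 9, "new_position": 160},
]

def position_drop(row):
    """How many places the page fell (positive = worse)."""
    return row["new_position"] - row["old_position"]

# Keep only the pages that lost 50 places or more, worst first.
big_losers = sorted(
    (r for r in rows if position_drop(r) >= 50),
    key=position_drop,
    reverse=True,
)

for r in big_losers:
    print(r["page"], position_drop(r))
```

With the sample data, only the review and how-to pages make the "lost BIG" bucket; the news page, down just 5 places, is ignored.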
I cannot think of any other ways to "clean up" my site... I long ago blocked things like Tag Pages, Category Pages, etc. from being indexed, so there is really nothing extraneous from my site in Google's index.
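For anyone wanting to double-check that archive pages like these really are blocked, a quick way is to look for a noindex robots meta tag in the page HTML. A minimal sketch (it only parses HTML you feed it; in practice you'd fetch each archive URL first):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives.append((a.get("content") or "").lower())

def is_noindexed(html):
    """True if any robots meta tag on the page contains 'noindex'."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)

# Example: a tag-archive page that should be blocked from the index
blocked = '<html><head><meta name="robots" content="noindex,follow"></head><body></body></html>'
plain = '<html><head><title>A post</title></head><body></body></html>'
print(is_noindexed(blocked), is_noindexed(plain))
```

Note this only covers the meta tag route; pages blocked via robots.txt or an X-Robots-Tag header would need a separate check.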
Now I just cross my fingers and pray. Nothing I've done is irreversible, so what did I have to lose... my already-tanked rankings?
I'll let you know how it turns out.
|I saw traffic increase 30% to 40% during the last 2 weeks of Feb. Then penalty hit. Oddly enough, before I got hit with a -50 over a year ago, I saw a surge in traffic for the 2 weeks leading up to it. Coincidence? |
Some possibilities come to mind:
a. Perhaps whatever is happening to your site is unrelated to Panda.
b. Perhaps some pages were being "tested" in high positions briefly. If so, those pages have now dropped back to lower positions, while Google studies the data it collected during this brief testing period.
c. Perhaps you did something unusual that caused the site to shoot up the rankings (say, built 100,000 dodgy links); it was spectacularly successful for a short time, but an algorithm, or a human review, slapped the site back down once Google figured out what you did.
|Perhaps you did something unusual that caused the site to shoot up the rankings (say, built 100,000 dodgy links); it was spectacularly successful for a short time, but an algorithm, or a human review, slapped the site back down once Google figured out what you did. |
No link building so nothing dodgy going on. The increase in traffic coincided exactly with the initial reports of the algorithm. So perhaps my increases were the result of others in my niche getting hit. Maybe my turn was coming...and I didn't get hit until March 11.
I am reviewing the WMT data and taking a close look at the pages that took the hardest hit.

The page that took the absolute hardest hit on my site (-300) was a pure content page (no ads) with no duplication anywhere on the net (I searched snippets in both Y! and G). However, I discovered a spelling error in my page header <H3>, and the title/description were identical (only 6 words long). So despite the content, perhaps those errors were the culprit.

The second biggest hit (-200) also had a spelling error in the description. No ads on that page, and no duplication elsewhere. The next in line (-100) are my 5 monetized pages (thinnest content). I have some other pages that fell -50 to -80, and those tended to link more heavily to the monetized pages.
That's all I've discovered so far, and I've been studying these data for 3 hours now.
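Identical or very short titles and descriptions like the ones described above are easy to flag in bulk once you have the metadata for each page. A rough sketch; the page data and the 10-word threshold are made up for illustration:

```python
# Hypothetical per-page metadata; in practice you'd pull this from a
# crawl of your own sitemap.
pages = {
    "/widget-review": {
        "title": "Best Blue Widget Review",
        "description": "Best Blue Widget Review",
    },
    "/widget-news": {
        "title": "Widget News for March",
        "description": "All the widget news that is fit to print for this first week of March.",
    },
}

def problems(meta):
    """Flag suspicious title/description patterns for one page."""
    issues = []
    if meta["title"].strip().lower() == meta["description"].strip().lower():
        issues.append("title == description")
    if len(meta["description"].split()) < 10:  # arbitrary threshold
        issues.append("short description")
    return issues

for path, meta in pages.items():
    for issue in problems(meta):
        print(path, issue)
```

With the sample data, only the review page is flagged (on both counts); the news page passes clean.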
Are you filtering for web traffic only (in WMT)? I find that I get strange results if I include image searches.
Well, I just got whacked again. A phrase for which I ranked #7 on 2/23/11, and which dropped to #49, had regained some ground to #29. It's now #61.
A phrase for which I ranked #6 pre-update, and which had fallen to #28, had gained ground, too, and moved up to #16 yesterday. Tonight it's #34.
Another phrase, for which I ranked #11, fell to #80 after the update, then moved up some, is now at #110.
Another one was #11, was knocked down to #63, and is now at #115.
There's a lot more than just these four. #3 to #37 to #60 today, etc.
Just from a quick look, it seems like phrases for pages that I've been reworking are either staying the same, or not being hit anywhere near as hard as others. The ones being hit really hard are ones I haven't gotten to yet.
Oops. I take back the part about phrases for pages I've worked on. #11 to #113 to #148 tonight, with others taking hits, too.
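Swings like these are easier to see over time with even a crude log of (date, phrase, position). A minimal sketch; the phrase and the dated positions below are illustrative, not a real rank tracker:

```python
# A hand-kept rank log: (date, phrase, position). Data is made up.
history = [
    ("2011-02-23", "blue widgets", 7),
    ("2011-03-01", "blue widgets", 49),
    ("2011-03-10", "blue widgets", 29),
    ("2011-03-15", "blue widgets", 61),
]

def swings(entries):
    """Day-to-day position changes for one phrase (positive = dropped)."""
    positions = [pos for _, _, pos in entries]
    return [b - a for a, b in zip(positions, positions[1:])]

print(swings(history))
```

The output for the sample data, [42, -20, 32], is exactly the drop/recover/drop pattern being described: down 42 places, back up 20, then down another 32.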
|Oops. I take back the part about phrases for pages I've worked on. #11 to #113 to #148 tonight, with others taking hits, too. |
I see the same thing. Stuff starting to sneak back and then dropping again tonight. If they made it too easy to figure out the new algorithm they would take all the fun out of it. :)
I'm staying the course with my changes. They are going to do whatever they need to do to make the cause-and-effect relationship unclear, like the way they don't show all of the links and the way they vary the SERPs throughout the day.
I have my sites and pages on the older sites that didn't tank as models, including one site where the income almost doubled. I'd be really surprised if the changes I'm making to the older sites don't end up helping in the long run. I expect that they will show ranking dips in the short term just to make it harder to reverse engineer what they are doing. But with everything that has been posted I think the collective wisdom on the SEO forums and blogs has things pretty much figured out.
|... and the way they vary the SERPs throughout the day. |
I don't know if they do that purposely to confuse people, but it works... lol. I do remember reading about "day-part dependent" SERPs in one of their patent applications from a few years ago, so my guess is that's the cause. It also produces what is probably a desirable side effect: throwing off those who try to manipulate the SERPs, or to determine cause and effect from them.
|... and the way they vary the SERPs throughout the day. |
Is some of that a function of multiple data centers not being updated to the same level all the time?
In the last few days I have seen a difference of a couple pages in rankings just by doing an F5 refresh on searches for some of my terms. That makes it hard to figure out exactly where you stand.
I wonder if 404 errors are going to hurt your rankings as well; you would think Google has set Googlebot to weed out old sites with a lot of broken links and removed pages.
Not sure if this is a better solution, but here's a suggestion: since you are using WordPress, rather than sending the offending low-rank pages to the trash and having to retrieve them later, why not install the WordPress SEO plugin by Yoast? It creates your sitemap.xml, then gives you advanced per-post options to set a "nofollow" meta tag, mark pages "never" to be crawled, and remove/leave them out of your sitemap.
I know Google may still look at "nofollow" pages (they'd have to, or all the #*$! sites could sneak all kinds of horrid things past the bot), but this may be a better way to keep the low-rank pages out of the quality calculation for your site as a whole, while not serving a 404 error that could be detrimental.
Plus, it keeps the pages up; someone may enjoy them ;)
|I wonder if 404 errors are going to hurt your page rankings as well - as you would think Google has set the Googlebot to get rid of old sites with a lot of broken links/pages removed, etc. |
If you have them linked internally, it can make your site look "bad", but otherwise it isn't your fault.
I just 301 redirect my 404 error pages, if there are any on my sites. I think it's a better strategy, but someone might disagree. I never delete a page; every page is part of the bigger picture, and deleting doesn't do any good for any site. A nofollow attribute is enough for me to show G that I don't think these pages are important for my website. If the page is that bad, then noindex will do (why publish it in the first place?).
To those hurt, is Googlebot visiting your sites with the same frequency even now?
|To those hurt, is Googlebot visiting your sites with the same frequency even now? |
The Gbot usually crawls just 300-500 pages a day. Yesterday it came around 1,200 times. I've been changing a lot of content on pages this week, and have been using Twitter to tweet the URLs.
The last time I was crawled more than 1,000 times a day was on March 8th which, if I recall correctly, was right before some of my rankings changed significantly, for good and for bad.
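Crawl frequency like this can be pulled straight from your server's access log. A minimal sketch, assuming combined-format (Apache/Nginx) log lines; the sample lines are made up, and note that a serious check should also verify the IP via reverse DNS, since anyone can spoof the Googlebot user-agent:

```python
import re
from collections import Counter

# Made-up combined-format access log lines for illustration.
log_lines = [
    '66.249.66.1 - - [08/Mar/2011:06:25:24 +0000] "GET /widget-review HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [08/Mar/2011:09:11:02 +0000] "GET /widget-news HTTP/1.1" 200 4200 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [08/Mar/2011:09:12:00 +0000] "GET / HTTP/1.1" 200 1000 "-" "Mozilla/5.0"',
]

# Pull the dd/Mon/yyyy date out of the bracketed timestamp.
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

hits_per_day = Counter()
for line in log_lines:
    if "Googlebot" in line:
        m = date_re.search(line)
        if m:
            hits_per_day[m.group(1)] += 1

print(dict(hits_per_day))
```

Run against a real log (one line per request), this gives the per-day Googlebot totals the poster is describing; a sudden jump from ~400 to 1,200 would stand out immediately.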
New update coming... next Thurs-Friday. I've got a feeling and have posted it.
|I just 301 redirect my 404 error pages - if there are any on my sites. I think it's a better strategy, but someone might disagree. I never delete a page, every page is part of the bigger picture and deleting doesn't do any good to any site. |
It can be risky if your server doesn't reply with a 404 response for URLs that never existed. In other words, there's a difference between redirecting URLs that you removed and redirecting EVERY request that your server sees as Not Found.
In the second case, badly formed links aimed at your site can become like a ticking time bomb, piling up duplicate content.
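One way to tell which case your server is in is to request a path that cannot possibly exist and see what comes back. A minimal sketch (the example domain is a placeholder; a healthy server answers 404, while a 200 means every bad URL gets a page, i.e. a "soft 404"):

```python
import uuid
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import HTTPError

def probe_url(base_url):
    """Build a URL that cannot possibly exist on the site."""
    return urljoin(base_url, "/" + uuid.uuid4().hex)

def soft_404_check(base_url):
    """Return the status code the server sends for a nonexistent URL.

    404 is the healthy answer; 200 (or a blanket redirect to the home
    page) means badly formed inbound links all resolve to real pages,
    piling up the duplicate content described above.
    """
    try:
        with urlopen(probe_url(base_url)) as resp:
            return resp.status
    except HTTPError as e:
        return e.code

# Usage against your own site (not run here):
# print(soft_404_check("http://www.example.com/"))
```

A blanket 301 of everything Not Found to the home page would also fail this probe, which is exactly the risky configuration the post warns about.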