| This 195 message thread spans 7 pages: < < 195 ( 1 2  4 5 6 7 ) > > || |
|Documenting my attempt to re-rank after "Farmer" update|
| 1:58 am on Mar 3, 2011 (gmt 0)|
Ok, so for the sake of all who have been hit by the so-called "Farmer" update, I thought it might be useful to document my own attempt to increase my website's "quality" measurement (whatever that means) in an attempt to gain back some of my previous good ranking in the SERPs.
My site is rather small compared to many mentioned on these boards - just 700 or so pages - and at its peak it has never made me more than $1,000 in any one month, but in this economy that lost $1,000 is kicking my butt.
The site I am referencing is built on WordPress as a CMS, has been copied copiously by scrapers in the past, and is a review/news-type site, where I post long, honest and well-written reviews, news, how-tos, etc. about my favorite types of widgets.
I have never paid for a link, and I do not run any sort of linking campaigns, though the site is plenty popular, gets a fair amount of natural links, and has nearly 7,000 RSS subscribers.
Obviously, we know very little about the particulars of this latest Google update, but the consensus seems to be that this is a radical change in the algo, and we can't expect our rankings to just suddenly reverse any time soon.
With that said, I thought I'd document my meager attempt to gain back some of my good graces with Google.
After reading as much as I can find on the subject, I came across this article posted in one of the other threads here, and it seemed like a good place to start.
The quote that jumped out at me was this:
|Sites that believe they have been adversely impacted by the change should be sure to extensively evaluate their site quality. In particular, it’s important to note that low quality pages on one part of a site can impact the overall ranking of that site. |
With that in mind I decided it was time to clean up my site... but where to start? Like most of the good folks here, I tend to think everything I write is pretty good, and I certainly don't copy and paste other people's work. To the extent that it's possible, my pages are unique and well-written.
I finally decided to go through my Google Webmaster Tools account and take a look at exactly which pages lost ranking, paying particular attention to the pages that lost BIG. I'm talking drops of 50, 100, even 200 or 300 places.
Luckily there were only a handful of these.
As I began to dig through them, it became apparent to me that at least 80% of these pages were very obviously "thin." Much thinner than I even remember writing.
So, for lack of a better idea, I canned them; my hope being that by removing entirely the pages that have suffered the most in this update, my entire website will look better to Google.
All told I threw away approximately 40 pages. Now, this is WordPress, so "throwing away" just means moving them to the trash, where they are no longer viewable online but can be easily retrieved should I decide I want them indexed again.
I cannot think of any other ways to "clean up" my site... I long ago blocked things like tag pages, category pages, etc. from being indexed, so there is really nothing extraneous from my site in Google's index.
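For anyone wondering how I blocked those: it's just a robots meta tag output on the tag and category archive templates. Roughly like this (a sketch only; most WordPress SEO plugins can add this for you):

```html
<!-- Output in the <head> of tag/category archive pages only:
     keeps them out of the index, but lets spiders follow the
     links through to the real posts. -->
<meta name="robots" content="noindex,follow">
```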
Now I just cross my fingers and pray. Nothing I've done is irreversible, so what do I have to lose... my already tanked rankings?
I'll let you know how it turns out.
| 4:15 pm on Mar 6, 2011 (gmt 0)|
|you can have 8 banner ads and be /look fine |
Agreed, but I think G doesn't want the ads in the content area. AskTheBuilder has ad blocks within the content.
See this url:
Above the fold in the Chrome browser: three ad units, and the ads appear to take up more space than the content.
Whereas cultofmac guy seems to have moved all his ads to the right-hand sidebar and he is back in the SERPs.
Both useful unique sites with good content and good backlinks - the only difference being ad placement.
Edit: Sorry, AskTheBuilder had four ad units above the fold; I forgot to count the one in the header. That's too much for above the fold.
| 4:21 pm on Mar 6, 2011 (gmt 0)|
|For one of my keywords, the positions are constantly changing every couple of days. Before the algo update, the page was ranked second, then moved down to 10, then 7th, and then 11th and now it settled at 7th since yesterday. |
I've seen this pattern too. It jumps up and down during the day and after a while it settles on a position. I'm now trying to get it to go up and down again and move up in the SERPs. This is a highly competitive keyword and it amazed me how high it got during this up-and-down period.
It may be important to note that we haven't had the algo change here yet.
Edit: If you cannot have ads in the content, what's the point of having ads anyway? It's also the position the AdSense team recommends putting them in.
[edited by: Globetrotter at 4:25 pm (utc) on Mar 6, 2011]
| 4:24 pm on Mar 6, 2011 (gmt 0)|
|It's a shame that AskTheBuilder was hit. |
Not from my perspective. That site has implemented their AdSense in a way that I would classify as a click trap. I'd also call it a "greedy" AdSense implementation.
|the pages are now returning a standard 404 error. I don't think I'm going to bother requesting a removal via Google... I'm not in a big enough rush, and with a site my size it gets crawled pretty quickly. |
You'll want to 301 those to an appropriate replacement or send a 410 Gone response. Google will continue to request those URIs if they are returning a 404. Your best bet is to preserve your site equity and 410 those puppies if they are gone forever. It acts just like a URI Removal Request. I've seen documents Gone in less than 48 hours after implementation.
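If you're on Apache, it's a couple of lines in .htaccess (the paths here are made up; substitute your own trashed URIs):

```apache
# mod_alias: answer "410 Gone" for documents removed forever.
Redirect gone /reviews/old-thin-post/
Redirect gone /how-to/another-canned-page/

# Or catch a whole batch of removed URIs with a pattern:
RedirectMatch gone ^/tag/.*$
```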
| 4:39 pm on Mar 6, 2011 (gmt 0)|
@AlyssaS, I don't think CultOfMac changed the ads; check the historical screenshots on the domain tools dot com website.
| 4:40 pm on Mar 6, 2011 (gmt 0)|
|If you cannot have Ads in the content what's the point of having ads anyway? It's also the position the Adsense team recommend to put it there |
Readers think they look spammy. I mean, look at that URL posted above for askthebuilder. He obviously thinks people visited his page to look at his ads, whereas they visited to find some info.
Remember the AOL v Google battle back in the day, when AOL's homepage was all winking gizmos and you couldn't find the thing you were looking for, the search box? And how G defeated them?
Well the Plaster-the-ads-all-over-the-place brigade are being AOL here. They have long forgotten the user.
The old-fashioned approach to ads was to put them in your sidebar, and then if the reader didn't find what they were looking for in your content, or if the ad was a product mentioned in the content, they had the chance of clicking an ad.
Now it's like the content IS the ads.
Also - I don't get the reluctance to reduce ad units. There are only a small number of quality relevant advertisers for each keyword, and they will be in your first ad block. After that, they diminish in quality and your final ad block will have 5 cent ads in it. So remove them, they probably aren't helping you financially anyway.
| 4:48 pm on Mar 6, 2011 (gmt 0)|
|@AlyssaS, I don't think CultOfMac changed the ads, check the historical screenshots in domain tools dot com website. |
That's interesting, so they were a genuine false positive, which may be why they got restored. I wonder what it is about their site that got them caught in the first place.
| 4:51 pm on Mar 6, 2011 (gmt 0)|
|Agree, but I think G don't want the ads in the content area. |
Never generalize it as "G don't want the ads in the content area."
The G AdSense/AdWords teams always wanted ads to be placed above the fold and in the content area. The G search team was fine with it all this time, and suddenly they come up with this new rule and go on to implement it in their algo, without even bothering to give webmasters a warning.
G can have two different teams or two independent teams internally, but to an outsider G is one company. They cannot pretend to the outside world to be two separate companies.
It is a real shame that they now want to penalize webmasters for something they encouraged all along.
If ads in the content area are spammy, they should have made it clear to webmasters not to place them there.
To me, the article from Vanessa Fox almost confirms that ads played a role in this penalizing algorithm. G has totally messed up webmasters with this act without even alerting them about it.
This is the worst I have seen from G as a company, and it has totally lost its reputation with this, however hard they try to put up a face through people like Vanessa. This is simply not good PR.
[edited by: indyank at 5:03 pm (utc) on Mar 6, 2011]
| 4:55 pm on Mar 6, 2011 (gmt 0)|
Well, I still believe they made a manual change. They really don't want bad press from Cult guy. Remember, Matt tweeted him back.
| 5:03 pm on Mar 6, 2011 (gmt 0)|
|Also - I don't get the reluctance to reduce ad units. There are only a small number of quality relevant advertisers for each keyword, and they will be in your first ad block. After that, they diminish in quality and your final ad block will have 5 cent ads in it. So remove them, they probably aren't helping you financially anyway. |
Best advice I've seen so far!
I've worked with a WebmasterWorld member, who is probably following this topic, and that is exactly what we did, quite a while ago: reduced the number of ad units. While there were short term losses, the long term gains are off the charts. Their traffic continues to rise. In fact, this update actually rewarded them. I'm looking at a nice little spike starting on Feb 24, 2011 and showing an upward trend from that point forward. It's a quality site and well done from a usability and accessibility standpoint.
I was impressed by this member's enthusiasm to implement ALL the changes I suggested. I know you're reading! ;)
What did we do?
1. Valid HTML platform
2. Pure HTML
3. Subtle ad unit placement
4. Fewer ad units
5. Content first - Users first
If I visit a page on your site, the first thing I should see is what I'm there looking for, the primary content of the page. If my viewport is filled with mostly ads, in positions where I'm usually expecting to see on site navigation, I usually hit my back button and contribute to the high bounce rate for that destination. ;)
From the looks of things, when discussing AdSense, many of these folks have gotten really greedy in their implementations. Remember that whole concept of banner blindness? Well, that line became very blurred when AdSense came on the scene.
Oh, and you folks at AdSense? Maybe you need to stop sending out emails promoting the placement of more ad units? You've been providing folks with all the rope they need to hang themselves. You're responsible for the AdSense Farms that are now finally being purged from the indexes.
| 5:47 pm on Mar 6, 2011 (gmt 0)|
If you're a site that gets its income from AdSense/affiliates, just like a newspaper (or at least a site that "sells" its knowledge instead of products), how would you ever be able to produce great quality content and earn back your invested time/money? To me it seems impossible without AdSense in one of the spots the AdSense team recommends.
| 8:46 pm on Mar 6, 2011 (gmt 0)|
| 10:11 pm on Mar 6, 2011 (gmt 0)|
Thank you for the YT link, Browsee.
The point of my comment, and I believe it is in line with indyank's quote
|Never do a nofollow for the sake of preserving PR to your site as google has long changed the way they distribute link juice.By having a nofollow you are just wasting the juice.It is better to credit it to the source. |
Of course, this doesn't mean that you should never use nofollow. But it should be rare, and there should never be a situation where most of your stories refer to sources that you don't trust.
is that while it may well be technically possible to implement a js-based nofollow scheme (!), I don't believe that it is recommended as some sort of cure all method.
If you just wanted to alert G to the fact that it should not pass PR juice to a class of external links (e.g. user generated content, paid links), then I'd just stick with the suggestion by P1R above to use VALID and PURE HTML.
Nothing wrong with rel=nofollow in those situations, but you may also want to examine the use of a redirect that G is not allowed to crawl.
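To illustrate the two options (example.com and /out/ are just placeholders):

```html
<!-- Option 1: a plain nofollow on the untrusted link -->
<a href="http://example.com/untrusted" rel="nofollow">source</a>

<!-- Option 2: route it through a local redirect script instead -->
<a href="/out/?to=http%3A%2F%2Fexample.com%2Funtrusted">source</a>
```

Option 2 only works if the redirect script itself is off-limits to spiders, e.g. in robots.txt:

```
User-agent: *
Disallow: /out/
```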
| 6:57 am on Mar 7, 2011 (gmt 0)|
Trimming the thin pages, with or without the Farmer algo, is a good idea. Imagine writing a technical paper, perhaps a thesis, with one word in 50 mis-spelled. That's the effect, for me, of running into a 'thin' 400 word post / page at intervals as I am reading online. Yank 'em and be thankful that Google caused you to review your site. Once they are gone, if the site doesn't begin to percolate upwards, you will at least know that all the remaining pages are strong, so some other factor must need attention.
My traffic hasn't been affected at all. It was, and remains, dismal. :-)
| 2:15 pm on Mar 7, 2011 (gmt 0)|
I do not think it is about ads; we definitely do not overuse them on any of our sites. Can anyone confirm that after the ads were removed or moved down, the traffic from Google came back to where it was?
| 2:56 pm on Mar 7, 2011 (gmt 0)|
If you have strong content and haven't overdone the ads (and you can see lots of sites that still rank despite being slathered with ads), there are likely other issues at play. For example, if your content was plagiarized or scraped and ended up in 'article directories', Google may be throwing your baby out with the bathwater.
| 4:01 pm on Mar 7, 2011 (gmt 0)|
NixRenewbie, those were my thoughts exactly, and they are the reason I created this thread.
While I consider my site strong as a whole, this algo change has caused me to re-evaluate a lot of my older pages, and what I discovered was that many of them no longer live up to my expectations of a high-quality page.
I'm actually kind of enjoying the process of cleaning up the site.
It's nice to know, anyway, that anywhere a person lands on my site they will find a page worth reading. Anyway, I don't see how cleaning up my site can cause any harm.
At this point I'm doing it as much for myself as for Google or any other search engine.
| 4:44 pm on Mar 7, 2011 (gmt 0)|
Truly, one Google hand doesn't know what the other is doing.
EzineArticles and AskTheBuilder have both been hit by the Farmer update (so Google Search evidently hates them) but these same sites are also featured on the Adsense site as success stories (so evidently team Adsense loves them - MFA / spam / thin content = success).
Who could argue that the indiscriminate granting of AdSense accounts to any and all sites isn't really the biggest factor in the proliferation of spam sites on the web?
What team AdSense does is just like what happens in other areas of business (handgun manufacturers, banking/credit industries, fast food, etc.). They make a legal product without any concern for, or sharing any responsibility about, how it is used or how it affects society as a whole.
Google search may try to be "Do no evil." The Adsense side is simply American business as usual. Screw everyone, we just care about hauling in the money.
| 7:09 pm on Mar 8, 2011 (gmt 0)|
I'm just wondering if there's been any conclusions reached anywhere on this forum about UGC, such as comments about products. I'm thinking that, unless such UGC has been found to be detrimental, it might help give otherwise similar pages some diversity.
One product line I have photos and specifications for has very few differences between models. It's extremely hard to write fresh content for each model when there's a minor difference from one to the next.
| 9:43 pm on Mar 8, 2011 (gmt 0)|
I've been going through my sites that had issues and the pages that stopped ranking well really weren't all that great. Plus I guess they were dragging down some of my better pages. In a way this is good for me that this all happened because a lot of stuff has come up in these threads that I hadn't thought of on my own. It has been a bit like being connected to SEO Borg collective lately.
I am hopeful that in the long term I'll have better rankings than even before Panda after I get all of the rewrites done. Some of the pages had been slowly losing ranking over the years and I hadn't noticed until last week because the better pages were more than making up for it. I hadn't touched these sites much in awhile so they really needed some overhauling.
I know many took offense at the tone of Vanessa Fox's article but she really has some good ideas in there. Use webmaster tools, set the filters to U.S. and see what your top and bottom pages are and analyze the differences. Some pages are going to be at the bottom because of obscure topics or a lack of links or position, but check for stuff that normally should be ranking that isn't.
| 12:36 am on Mar 9, 2011 (gmt 0)|
First post, long time reader.
For me, thin content would be system-driven or CMS-driven, such as pagination pages and tag pages. Both seem to be areas where a lot of pages were globbed together. Liberal usage of tags formed (UGLY) lists where the KW is the header and not much else is there except page titles and links.
As I looked around at other large sites that had lost traffic this was somewhat common.
I removed several thousand tag pages and have been waiting on G to remove them from the index. So far only down about 200 since the 27th.
I have also noticed my index rate is slower than before. Perhaps because the tag and thin pages reduced my site's authority?!?
Lastly, in SERPs where I did rank on the first page or high on the second page, pages from sites with very high Alexa rank (old pages attached to what I call giant tier-one sites) now rank alongside pages that are often not much more than signatures in older forums that have my KW as the title. The oldest I have seen so far is from 2006 (stale!)
I have also seen doorway pages and domain-KW URLs move ahead, as well as aged old sites that have not ranked for years but now are tops.
I can agree with thin content being a potential factor in why I might have dropped (15-20 percent of traffic), but to have junk pop up high at the same time is very odd.
My feeling is the more competitive terms tanked the most, if there is a site-wide penalty or a tweaked algo where my thin content caused me to drop. More competitive, more dropping; less competitive, less drop. Long tails are OK.
Will comment when all removed pages are no longer in the index or if I have something of value to add.
All the best
| 3:11 am on Mar 9, 2011 (gmt 0)|
I have to disagree with the AdSense speculation that too many ads reduce rank (on solid content sites).
Reason: code-wise, AdSense may indeed trigger some page slowdowns; a SINGLE ad unit will cause up to 9 browser requests per pageload, including...
- /pagead/abglogo/abg-en-100c-000000.png (this one is called twice)
| 3:19 am on Mar 9, 2011 (gmt 0)|
I have spent half the day going through a site that was hit with 50% reduction in traffic.
I found about 20 "thin" articles and replaced them all with fresh new original articles, all over 500 words. Looking back at these articles, they were pretty much worthless, and I could see why they would be an issue.
I also noindexed the contact page and any other pages that were not needed.
There is nothing else I can see to do at this point.
I will report back if there are any changes.
I will give this about 10 days; the next step is to reduce AdSense units. There are only 2 on each page, but I will go to 1 if this does not help.
| 3:25 am on Mar 9, 2011 (gmt 0)|
Interesting thread. I also read all those articles and came to the conclusion that I had to merge together thin pages (that had lost ranking) for my main website (7 years old, PR6).
I did this two days ago. About 20 pages total that were once ranking top 20, but had lost ranking after the update.
I checked them again tonight and... They had lost even more ranking after being merged together.
I took thin pages and combined them with related pages to make longer ones, then dumped the ones that never had good ranking.
Went from bad to worse.
I'm not changing them back. They were quality to begin with, just short because all that could be said was said.
| 3:49 am on Mar 9, 2011 (gmt 0)|
Skimming through the Webmaster Central thread that asks for people to submit sites they think got hit for no reason, I noticed this post by Wysz (the Google employee who started the thread). While most of it contains words we've seen elsewhere, one sentence jumped out at me, and I've bolded it here.
|Our recent update is designed to reduce rankings for low-quality sites, so the key thing for webmasters to do is make sure their sites are the highest quality possible. We looked at a variety of signals to detect low quality sites. Bear in mind that people searching on Google typically don't want to see shallow or poorly written content, content that’s copied from other websites, or information that are just not that useful. In addition, it's important for webmasters to know that low quality content on part of a site can impact a site's ranking as a whole. For this reason, if you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
In light of that, I'm hopeful that I'm on the right path.
Here's the link to that section of the long thread: [google.com...]
| 3:57 am on Mar 9, 2011 (gmt 0)|
|Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
Looking at that statement, I wonder if it's enough to noindex or use a robots disallow for thin pages that need to be worked on later, or if they should be yanked now.
| 4:04 am on Mar 9, 2011 (gmt 0)|
Personally, I not only yanked mine, but I even made sure they all return 410 codes. If I ever decide to reconstruct any of that content, I'll not only redo the content so it's much much better, but I'll present it as a completely new url. The old is gone forever.
Whether or not noindex or a robots disallow is enough...maybe not.
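One caveat if anyone does go the noindex route: don't combine it with a robots.txt Disallow on the same URIs. A disallowed page is never fetched, so the spider never sees the tag. The noindex only works on a crawlable page (illustrative snippet):

```html
<!-- On each thin page you want dropped from the index but kept
     live for visitors. Only effective if the page remains
     crawlable, i.e. NOT blocked in robots.txt. -->
<meta name="robots" content="noindex">
```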
| 4:09 am on Mar 9, 2011 (gmt 0)|
With the explosive growth of the web in the last three years I think that G (and other SEs, too) are looking for ways to REDUCE the incredible clutter they must index to stay current. If they can scare the bejezzus out of webmasters to REFRAIN from posting say, a billion or so pages of thin content, reduce the number of paginations (consolidate, ie. a single full article per page), or refine a domain's offering, then Panda Farm has done its job. First the US, then the rest of the world, BWAHAHAHAHA!
Seriously, I think the web is getting too big for even Google to handle and this is just the beginning of "what won't make it into our index". But instead of human editors (which cost real money) they are doing it with nerds at the 'plex and a zillion lines of code...which is significantly more cost effective.
| 4:30 am on Mar 9, 2011 (gmt 0)|
|Personally, I not only yanked mine, but I even made sure they all return 410 codes. If I ever decide to reconstruct any of that content, I'll not only redo the content so it's much much better, but I'll present it as a completely new url. The old is gone forever. |
|Whether or not noindex or a robots disallow is enough...maybe not. |
This update has reduced my overall traffic by 20%-30%, meaning that Yahoo, Bing, other search engines and direct traffic are still delivering visitors. I'd hate to give those search engines and people 404 errors just to maybe do it the way Google wants, or maybe not.
| 4:35 am on Mar 9, 2011 (gmt 0)|
Bing and Yahoo are too important to cold-shoulder. Take that into consideration when playing with Google's algos. Any changes I make are commonsense changes... and despite how I might sound from time to time, some make sense... so a dozen or so really thin pages have been combined into a more interesting page...and it's a 410 for the deleted URLs...
| 4:44 am on Mar 9, 2011 (gmt 0)|
|Bing and Yahoo are too important to cold-shoulder. Take that into consideration when playing with Google's algos. |
That is very true, and that's why I always build a site to appeal to the common denominator and then tweak it to get as high as possible in Google SERPs without losing the ground gained in other engines.
| 4:55 am on Mar 9, 2011 (gmt 0)|
Well, both of you agree that it's important not to "cold shoulder" Bing and Yahoo. But how can that be done without removing pages that are already in those two engines' indexes? They don't seem to have a problem with those pages from my site.
That's why I thought the robots disallow for Googlebot was the best way, as it doesn't affect other SEs. Maybe I'm wrong but, if so, what's right?
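For clarity, here's the sort of robots.txt I mean: a Googlebot-only group that leaves Bing and Yahoo crawling as before (the path is hypothetical):

```
# Googlebot obeys only its own group when one exists:
User-agent: Googlebot
Disallow: /thin-content/

# Everyone else remains unrestricted:
User-agent: *
Disallow:
```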