| This 195 message thread spans 7 pages |
|Documenting my attempt to re-rank after "Farmer" update|
Ok, so for the sake of all who have been hit by the so-called "Farmer" update, I thought it might be useful to document my own attempt to increase my website's "quality" measurement (whatever that means), in an attempt to regain some of my previous good ranking in the SERPs.
My site is rather small compared to many mentioned on these boards - just 700 or so pages - and at its peak it never made me more than $1,000 in any one month, but in this economy that lost $1,000 is kicking my butt.
The site I am referencing is built on Wordpress as a CMS, has been copied copiously by scrapers in the past, and is a review/news-type site, where I post long, honest, well-written reviews, news, how-tos, etc. about my favorite types of widgets.
I have never paid for a link, and I do not run any sort of linking campaigns, though the site is plenty popular, gets a fair amount of natural links, and has nearly 7,000 RSS subscribers.
Obviously, we know very little about the particulars of this latest Google update, but the consensus seems to be that this is a radical change in the algo, and we can't expect our rankings to just suddenly reverse any time soon.
With that said, I thought I'd document my meager attempt to gain back some of my good graces with Google.
After reading as much as I can find on the subject, I came across this article posted in one of the other threads here, and it seemed like a good place to start.
The quote that jumped out at me was this:
|Sites that believe they have been adversely impacted by the change should be sure to extensively evaluate their site quality. In particular, it's important to note that low quality pages on one part of a site can impact the overall ranking of that site. |
With that in mind I decided it was time to clean up my site... but where to start? Like most of the good folks here, I tend to think everything I write is pretty good, and I certainly don't copy and paste other people's work. To the extent that it's possible, my pages are unique and well-written.
I finally decided to go through my Google Webmaster Tools account and take a look at exactly which pages lost ranking, paying particular attention to the pages that lost BIG. I'm talking drops of 50, 100, even 200 or 300 places.
Luckily there were only a handful of these.
As I began to dig through them, it became apparent to me that at least 80% of these pages were very obviously "thin." Much thinner than I even remember writing.
So, for lack of a better idea, I canned them; my hope being that by removing entirely the pages that have suffered the most in this update, my entire website will look better to Google.
All told I threw away approximately 40 pages. Now, this is Wordpress, so "throwing away" just means moving them to the trash, where they are no longer viewable online but can be easily retrieved should I decide I want them indexed again.
I cannot think of any other ways to "clean up" my site... I long ago blocked things like tag pages and category pages from being indexed, so there is really nothing extraneous from my site in Google's index.
Now I just cross my fingers and pray. Nothing I've done is irreversible, so what do I have to lose... my already tanked rankings?
I'll let you know how it turns out.
Well, both of you agree that it's important not to "cold shoulder" Bing and Yahoo. But how can that be done without removing pages that are already in those two engines' indexes? They don't seem to have a problem with those pages from my site.
That's why I thought the robots disallow for Googlebots was the best way, as it didn't affect other SE's. Maybe I'm wrong but, if so, what's right?
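For what it's worth, that per-bot robots.txt idea can be sanity-checked with Python's standard-library parser. A minimal sketch - the paths below are made-up examples, not anyone's actual URLs:

```python
# Hypothetical robots.txt that blocks only Googlebot from a few thin
# pages while leaving other crawlers (bingbot, slurp) unaffected.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /old-thin-article/
Disallow: /expired-contest/

User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "/old-thin-article/"))  # -> False (blocked)
print(rp.can_fetch("bingbot", "/old-thin-article/"))    # -> True (allowed)
```

The empty `Disallow:` under `User-agent: *` means "allow everything" for every other crawler, so only Googlebot is turned away from those paths.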
@dickbaker ... I don't think there is a right way any longer. Note that I said COMMON SENSE in regards to a very small number of pages on one of my sites. That works, and also works for my visitors, but chasing Google's shifty algos does not. Bing and Yahoo... and whoever is left (Blekko, Cuil?) are still considerable and MIGHT, some day, be a force to reckon with. The baby-and-bathwater thing again. If it is working in B and Y but G has tanked, how willing are webmasters to diss the former for the latter? That will be each webmaster's decision... I just know where I'm going these days, and it is not down Giggle's (sic) hilarious path.
|That's why I thought the robots disallow for Googlebots was the best way, as it didn't affect other SE's. Maybe I'm wrong but, if so, what's right? |
No one except the Google people know the answer to that for sure. I have thought of that as well.
It boils down to whether Google thinks that the webmaster is complying with the new rules of the game by not trying to get thin pages or duplicate content indexed, or if the new algo thinks, "aha, he's trying to play tricks by keeping the page and the links to the page, but telling us not to look at the page."
I'm trying to think like a Googlebot, but my head hurts.
I'm trying to figure this one out myself too. What I did today: I found 3 half-page articles I wrote a long, long time ago that were related to a page off my homepage, and combined them into one good page. So I basically scrapped 3 pages and made the page off my homepage bigger and better. If I notice a change in rankings for this page I will post it here.
This week I'll try to find more pages like this, and I guess I'll wait and see. Can't hurt.
Here's one more thing I would like to point out. After the algo update I made the following change, with no results either way. I had my site title in an h1 tag across the entire site, with each article title in an h3 tag.
I eliminated the h1 tag that my site name was in and replaced it with an image of my site name. Then I took the h3 tag that held the title of each article and replaced it with an h1 tag.
I saw no difference in rankings on any search engine after the recache. I wonder if search engines even pay attention to these tags anymore. You would have thought I would have seen some kind of change in keyword rankings.
This experiment was done on a 420 page site with about 65000 unique visitors monthly from search results.
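For clarity, the markup change described above looks roughly like this; the site and article names are placeholders, not the poster's actual pages:

```html
<!-- Before: site name in an h1 on every page, article title in an h3 -->
<h1>Example Widget Reviews</h1>
<h3>Review of the Acme Widget 3000</h3>

<!-- After: site name as an image, article title promoted to the h1 -->
<img src="/images/site-logo.png" alt="Example Widget Reviews">
<h1>Review of the Acme Widget 3000</h1>
```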
I think they will be careful not to make it too easy for site owners to see cause and effect between site changes and ranking changes for a while. I suspect they don't want people to learn and publicize how to game the new algorithm right away.
I personally think that, because it is apparently about quality, they should tell every webmaster what Google expects them to do. Who would not want more quality on the web :)
I think they have told us what to do. Just without details. ;-)
|I think they have told us what to do. Just without details. ;-) |
You're right. I'm going to noindex these pages, disallow them with a robots file, then remove the pages from my site along with any links to them.
Can't be too sure. Of course, I may get hit with a duplicate slim-to-no-content penalty or, even worse, a "you should have had a page for this topic" penalty. ;)
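One side note on stacking those measures: a noindex meta tag only works if the crawler is allowed to fetch the page, so pairing it with a robots.txt disallow means Googlebot may never see the tag at all. A minimal sketch of the tag itself:

```html
<!-- Goes in the <head> of a page you want dropped from the index.
     If robots.txt blocks this URL, crawlers never fetch the page
     and therefore never see this tag. -->
<meta name="robots" content="noindex, follow">
```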
luke175 posted a link in the AdWord forum. The page is written by a guy that was banned from AdWords and contains information directly from Google about their perception of page quality.
He posted the explanation from a Google employee listing specific percentages for ad-to-content ratio, etc. I know organic pages won't be weighted exactly the same, but it is well worth a read... [andrewhansen.name ]
dickbaker, I definitely hear what you're saying. I know in my case that going back through my older pages I found a bunch that were not only "thin" but also rather embarrassing.
I feel better having deleted them - whether Bing or Yahoo liked them or not - than having people surf onto my site via one of those cruddy pages.
All of those pages now serve a 410.
Stronger pages, ones which I am proud of and think are good - whether or not Google likes them - have been left in their place.
For me, what this algo change has done is cause me to re-consider a lot of my oldest content. To my chagrin, much of that old content was garbage. YMMV.
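A 410 like the one mentioned above is typically configured at the server. A minimal Apache .htaccess sketch using mod_alias's `Redirect gone` directive - the paths are hypothetical examples, not the poster's actual URLs:

```apache
# Return "410 Gone" for deleted thin pages (paths are made-up examples)
Redirect gone /old-thin-review/
Redirect gone /expired-contest-2009/
```

Unlike a 404, a 410 tells crawlers the removal is deliberate and permanent.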
I have also been racking my brain to figure out what to do.
First, I looked at all the nofollows on my site which were affiliate links. I know Google does not like nofollows.
The odd thing is, one of the sites listed on the "winners" list has a bunch of nofollows on it. The site seems a wreck, with no consistency throughout. Sometimes the entire navbar on the left is nofollowed, and sometimes it is dofollow. snopes[dot]com . Look under "Odd news".
I have also moved my Adsense block from the sweet spot to my sidebar, and I'm thinking about putting up a leaderboard.
I am not rewriting any content because I strongly feel that it is all relevant and since it is all original, I stand by it.
Very interesting read there, burcot... only problem is, the share of my users with that screen resolution is getting smaller and smaller. Now about 15%, with 80% being larger.
Why that Adsense rep wanted more ads is beyond me. Adding a vertical leaderboard (160 x 600, right side) did absolutely nothing to my earnings when I instituted it about 10 weeks ago. I may just go back to an in-content div only; that would raise my content-to-ads ratio above the fold to about 70-30.
I wonder if images have an effect. Would that be considered content? I have at least one image above the fold on most pages that basically establishes upon landing on the page "this is what the page is about" and directly relates to the title and H1 tag.
Need your opinion here as well. I run a blog, and while it is pretty content-rich -- I write very in-depth pieces of 1,000 words each -- I also have these smaller "thin" articles, if you will, that are either announcements, expired giveaways/contests, greetings during holidays, etc. Now, all the other blogs in my niche do the same thing, and these are part of interacting with our communities.
I am wondering what to do with them and whether they should be removed. I feel like they shouldn't, since these are dated posts and have lots of interaction with the community. But then again, they may no longer be relevant, since they are older announcements.
The question here is: would this algo downgrade you because of historical data like this? I would think this would be more of a "relevance" issue rather than a "quality" issue. The only problem here may be that they are thinner posts. What would be your suggestion to handle such posts like these? Again, this is very common among blogs, in order to engage community. While I realize Google no longer wants "clutter", I'd like to know what the best way to handle cleanup for this stuff would be.
|if you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
Straight from the horse's mouth on the Google Help forum.
Dang! netmeg beat me to the punch again.
|I think they have told us what to do. |
In every google video that I have seen Matt Cutts in, he always says "just make great content."
He also said that the Panda / Farmer update filter worked fine in the case of suite101 losing 94% of its traffic. So the lesson here, I guess, would be to see what suite101 was doing and then do the opposite.
For the suite101 website, the big problem I see is the left side of the article pages. It has a lot of ad units, which can be confused with site navigation. Probably many users click back, or go to Google search again, after clicking an ad.
IMHO, displaying categories on the left side and fewer ads on the right side should solve their problem (if it can be that simple).
Anyway, Matt Cutts says in the Wired interview that he understands why suite101 is penalised; he is pretty sure about this site.
Maybe you can add other things that are wrong with this site, to make a list of possible factors.
I am still working to restore my traffic.
|Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
That could be a recipe for disaster. If your content has already been copied by other sites, and you put it on another site, how is Google going to know you didn't copy it, too? And then penalize that site?!
I don't know if this topic should get a new thread (mods may move if they wish), but I've been tracking the changes made by EzineArticles.com.
If you Google the news for that site, there is an interview on the WSJ site with EA's CEO: [online.wsj.com...]
Today I found out that Mr. Knight, who may or may not be following this thread, is apparently changing his site like the webmasters here. Allegedly, according to one webmaster (found via Google News), articles are being deleted and even user accounts terminated! I'll wait for confirmation on that; you can imagine some webmasters will be upset if it's true, and will speak out very soon.
But what Mr. Knight said was that they have increased the minimum article length from 250 words to 400, and now require links below the articles to be to related content.
"Mr. Knight said he believed part of the reason that Google's new search algorithm demoted Ezinearticles.com articles is that much of the content had already been published on the personal websites of its writers."
He also admitted some articles were permitted which should not have been published: "We know we accepted some of it we shouldn't have."
Perhaps these are the ones which are now disappearing from his site, if indeed they are being removed, seen by Google as poisonous to the entire site.
Hopefully all the Acai Berry nonsense will fly into the trash. :-)
UPDATE: According to the company's blog, they are now blocking all *future* articles on the Berry. I guess that's a start.
[edited by: potentialgeek at 7:24 pm (utc) on Mar 9, 2011]
|That could be a recipe for disaster. If your content has already been copied by other sites, and you put it on another site, how is Google going to know you didn't copy it, too? And then penalize that site?! |
LOW QUALITY CONTENT.
Has anyone gotten their rankings back? On 2/25 I deleted my 'thin' content pages via Webmaster Central, but no response in the Google SERPs. Of course something else might be at play on my site, but I am curious to see if anyone got their rankings back (cultofmac excluded) by making changes. If so, how long did you have to wait?
@walkman, I have yet to hear about someone who got their rankings back (even after making fixes to their site). I would rejoice once I hear that, knowing that it is actually possible to crawl out of the abyss. I've been talking to many site owners and nobody I know has regained rankings, unfortunately. Like you, I wonder how long it will be before we get any positive word from anyone.
LOL, Planet13, I beat netmeg by about half a day! I mentioned it earlier in this thread, but no matter. It should probably be mentioned several times to get the message across. Also, the link that burcot mentioned earlier, regarding ad to content ratio, was very interesting. If we were to pretend that the same type of thinking was applied on the organic side, it could make a difference in how we view our pages. Worth reading.
@potentialgeek what if you moved the light pages to a new domain/subdomain and 301'd the old URLs?
In my case I have a section of my site with duplicated and/or little content on each page. It's a work in progress, in which I'm adding bits and pieces of content as I go. I want to keep the section and evolve it; moving it and redirecting the old URLs might be the way to lift the devaluation of the good content while keeping the section that's not so good.
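If one went the subdomain route, the old URLs could be 301'd with a single mod_alias line in .htaccess. A sketch with placeholder names - "drafts" and example.com are illustrative, not real:

```apache
# Permanently (301) redirect the work-in-progress section to a subdomain.
# The path and hostname here are hypothetical examples.
Redirect permanent /drafts/ http://drafts.example.com/drafts/
```

mod_alias preserves the remainder of the path, so /drafts/widget-notes/ would redirect to the matching URL on the subdomain.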
potentialgeek, I see duplicate content in the top results, sites with 0 original content. I do not think this Panda update is focused on copied pages. But the amount of text on each page makes sense to me as a factor.
Then it's more likely that Google has a set x-week penalty, or maybe they haven't really done a serious recalculation of the SERPs.
|LOL, Planet13, I beat netmeg by about half a day! |
That's why you're Dazzlin'
|Then it's more likely that Google has a set x-week penalty, or maybe they haven't really done a serious recalculation of the SERPs. |
I've seen nothing to indicate the former, and I suspect tweaks and recalculations are ongoing.
@ Netmeg "I've seen nothing to indicate the former, and I suspect tweaks and recalculations are ongoing."
Maybe, but I find it unlikely that no one here fixed their site enough to convince Google. Unless they aren't sharing results....
Thanks, burcot, for that article link. I think it contains gold!
|Unless they aren't sharing results.... |
Geeze, ya think that might be the case? :)
I'm guessing that anyone who figures out how to "fix" this issue, or how to work around it, probably sees that process as a potential gold mine and isn't likely to be posting the process in an open forum any time soon.
At least not in language any clearer than the typical talk from a G employee.