| 4:55 am on Mar 9, 2011 (gmt 0)|
Well, both of you agree that it's important not to "cold shoulder" Bing and Yahoo. But how can that be done without removing pages that are already in those two engines' indexes? They don't seem to have a problem with those pages from my site.
That's why I thought the robots disallow for Googlebots was the best way, as it didn't affect other SE's. Maybe I'm wrong but, if so, what's right?
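For what it's worth, a Googlebot-only disallow can be sanity-checked locally with Python's standard-library robots.txt parser. This is just a sketch; the `/thin-pages/` directory is a hypothetical stand-in for whatever section you'd block:

```python
from urllib import robotparser

# Hypothetical robots.txt: blocks only Googlebot from one directory,
# leaves every other crawler (Bingbot, Slurp, etc.) unrestricted.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /thin-pages/

User-agent: *
Disallow:
"""

def can_fetch(agent: str, url: str) -> bool:
    """Return True if the given user agent may fetch the URL under ROBOTS_TXT."""
    rp = robotparser.RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())
    return rp.can_fetch(agent, url)

print(can_fetch("Googlebot", "http://example.com/thin-pages/a.html"))  # False
print(can_fetch("Bingbot", "http://example.com/thin-pages/a.html"))    # True
```

So the rule does exactly what the poster describes: Google is shut out of that section while Bing and Yahoo keep crawling it.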
| 5:47 am on Mar 9, 2011 (gmt 0)|
@dickbaker ... I don't think there is a right way any longer. Note that I said COMMONSENSE in regards to a very small number of pages on one of my sites. That works, and also works for my visitors, but chasing Google's shifty algos does not. Bing and Yahoo... and whoever is left (Blekko, Cuil?) are still considerable and MIGHT some day be a force to reckon with. The baby and bathwater thing again. If it is working in B and Y but G has tanked, how willing are webmasters to diss the former for the latter? That will be each webmaster's decision... I just know where I'm going these days and it is not down Giggle's (sic) hilarious path.
| 6:41 am on Mar 9, 2011 (gmt 0)|
|That's why I thought the robots disallow for Googlebots was the best way, as it didn't affect other SE's. Maybe I'm wrong but, if so, what's right? |
No one except the Google people knows the answer to that for sure. I have thought about that as well.
| 6:53 am on Mar 9, 2011 (gmt 0)|
It boils down to whether Google thinks that the webmaster is complying with the new rules of the game by not trying to get thin pages or duplicate content indexed, or if the new algo thinks, "aha, he's trying to play tricks by keeping the page and the links to the page, but telling us not to look at the page."
I'm trying to think like a Googlebot, but my head hurts.
| 7:18 am on Mar 9, 2011 (gmt 0)|
I'm trying to figure this one out myself too. What I did today was find three half-page articles I wrote a long, long time ago that were related to a page off my homepage, and combine them into one good page. So I basically scrapped three pages and made the page off my homepage bigger and better. If I notice a change in rankings for this page I will post it here.
This week I'll try to find more pages like this and I guess I'll wait and see. Can't hurt.
| 7:38 am on Mar 9, 2011 (gmt 0)|
Here's one more thing I would like to point out that might help. After the algo update I made the following change, with no results either way. I had my site title in an h1 tag across the entire site, with my article title in an h3 tag.
I eliminated the h1 tag my site name was in and replaced it with an image of my site name. Then I took the h3 tag that held the title of each article and replaced it with an h1 tag.
I saw no difference in rankings on any search engine after the recache. I wonder if search engines even pay attention to these tags anymore. You would have thought I would have seen some kind of change in keyword rankings.
This experiment was done on a 420 page site with about 65000 unique visitors monthly from search results.
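The heading swap described above looks roughly like this in markup (the site and article names are hypothetical, just to illustrate the change):

```html
<!-- Before: site name in the h1 on every page, article title demoted to h3 -->
<h1>Example Widget Site</h1>
<h3>How to Paint a Widget</h3>

<!-- After: site name moved into an image, article title promoted to the h1 -->
<img src="/images/site-name.png" alt="Example Widget Site">
<h1>How to Paint a Widget</h1>
```

The "after" version gives each page a unique h1 instead of a sitewide one, which is presumably what the experiment was testing.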
| 8:22 am on Mar 9, 2011 (gmt 0)|
I think they will be careful not to make it too easy for site owners to see cause and effect between site changes and ranking changes for a while. I suspect they don't want people to learn and publicize how to game the new algorithm right away.
| 1:43 pm on Mar 9, 2011 (gmt 0)|
I personally think that, because it is apparently about quality, they should tell every webmaster what Google expects them to do. Who would not want more quality on the web :)
| 2:19 pm on Mar 9, 2011 (gmt 0)|
I think they have told us what to do. Just without details. ;-)
| 4:04 pm on Mar 9, 2011 (gmt 0)|
|I think they have told us what to do. Just without details. ;-) |
You're right. I'm going to noindex these pages, disallow them with a robots file, then remove the pages from my site along with any links to them.
Can't be too sure. Of course, I may get hit with a duplicate slim-to-no-content penalty or, even worse, a "you should have had a page for this topic" penalty. ;)
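One practical wrinkle with stacking all of those measures, noted here as an aside: a robots.txt Disallow stops the crawler from fetching the page at all, so it never gets to see an on-page noindex tag. If you want the page dropped from the index (rather than just uncrawled), the meta tag alone is the usual route:

```html
<!-- Tells any engine that actually fetches this page not to index it.
     Pairing this with a robots.txt Disallow is self-defeating: a blocked
     crawler never reads the tag. Pick one mechanism per page. -->
<meta name="robots" content="noindex">
```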
| 4:27 pm on Mar 9, 2011 (gmt 0)|
luke175 posted a link in the AdWords forum. The page is written by a guy who was banned from AdWords and contains information directly from Google about their perception of page quality.
He posted the explanation from a Google employee listing specific percentages for ad-to-content ratio, etc. I know organic pages won't be weighted exactly the same, but it is well worth a read... [andrewhansen.name ]
| 4:31 pm on Mar 9, 2011 (gmt 0)|
dickbaker, I definitely hear what you're saying. I know in my case that going back through my older pages I found a bunch that were not only "thin" but also rather embarrassing.
I feel better having deleted them, whether Bing or Yahoo liked them or not, than having people surf onto my site via one of those cruddy pages.
All of those pages now serve a 410.
Stronger pages, ones which I am proud of and think are good, whether or not Google likes them, have been left in their place.
For me, what this algo change has done is cause me to re-consider a lot of my oldest content. To my chagrin, much of that old content was garbage. YMMV.
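For anyone wanting to serve a 410 for deleted pages the same way, one common approach under Apache is mod_alias's `Redirect gone` directive. A minimal sketch; the file paths are hypothetical:

```apache
# .htaccess — return "410 Gone" for thin pages that were deleted outright,
# telling crawlers the removal is permanent (vs. a 404's "maybe temporary").
Redirect gone /old-thin-page.html
Redirect gone /another-weak-article.html
```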
| 4:45 pm on Mar 9, 2011 (gmt 0)|
I have also been racking my brain to figure out what to do.
First, I looked at all the nofollows on my site which were affiliate links. I know Google does not like nofollows.
The odd thing is that one of the sites on the "winners" list has a bunch of nofollows on it. The site seems a wreck, with no consistency throughout. Sometimes the entire navbar on the left is nofollowed and sometimes it is dofollow. snopes[dot]com . Look under "Odd news".
I have also moved my AdSense block from the sweet spot to my sidebar, and am thinking about putting up a leaderboard.
I am not rewriting any content because I strongly feel that it is all relevant and since it is all original, I stand by it.
| 5:22 pm on Mar 9, 2011 (gmt 0)|
Very interesting read there, burcot... the only problem is that the share of my users at that screen resolution is getting smaller and smaller: now about 15%, with 80% at larger resolutions.
Why that AdSense rep wanted more ads is beyond me. Adding a vertical leaderboard (right side), a 160 x 600, did absolutely nothing to my earnings when I instituted it about 10 weeks ago. I may just go back to the in-content div only; that would raise my content-to-ads ratio above the fold to about 70-30.
I wonder if images have an effect. Would that be considered content? I have at least one image above the fold on most pages that basically establishes upon landing on the page "this is what the page is about" and directly relates to the title and H1 tag.
| 5:23 pm on Mar 9, 2011 (gmt 0)|
Need your opinion here as well. I run a blog, and while it is pretty content-rich -- I write very in-depth pieces of 1,000 words each -- I also have these smaller "thin" articles, if you will, that are either announcements, expired giveaways/contests, holiday greetings, etc. All the other blogs in my niche do the same thing, and these are part of interacting with our communities.
I am wondering what to do with them and whether they should be removed. I feel like they shouldn't be, since these are dated posts with lots of community interaction. But then again, they may no longer be relevant since they are older announcements.
The question here is: would this algo downgrade you because of historical data like this? I would think this would be more of a "relevance" issue rather than a "quality" issue. The only problem here may be that they are thinner posts. What would be your suggestion to handle such posts like these? Again, this is very common among blogs, in order to engage community. While I realize Google no longer wants "clutter", I'd like to know what the best way to handle cleanup for this stuff would be.
| 5:34 pm on Mar 9, 2011 (gmt 0)|
|if you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
Straight from the horse's mouth on the Google Help forum.
| 5:58 pm on Mar 9, 2011 (gmt 0)|
Dang! netmeg beat me to the punch again.
|I think they have told us what to do. |
In every google video that I have seen Matt Cutts in, he always says "just make great content."
He also said that the Panda / Farmer update filter worked fine in the case of suite101 losing 94% of its traffic. So the lesson here I guess would be just see what suite101 was doing and then do the opposite.
| 6:41 pm on Mar 9, 2011 (gmt 0)|
For the suite101 website, the big problem I see is the left side of the article pages. It has a lot of ad units and is easily confused with the site navigation. Probably many users click back, or go back to Google search, after clicking an ad.
IMHO, displaying categories on the left side and fewer ads on the right side should solve their problem (if it can be that simple).
Anyway, Matt Cutts said in a Wired interview that he understands why suite101 is penalised; he is pretty sure about this site.
Maybe you can add other things that are wrong with this site, to make a list of possible factors.
I am still working to restore my traffic.
| 6:52 pm on Mar 9, 2011 (gmt 0)|
|Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
That could be a recipe for disaster. If your content has already been copied by other sites, and you put it on another site, how is Google going to know you didn't copy it, too? And then penalize that site?!
| 7:00 pm on Mar 9, 2011 (gmt 0)|
I don't know if this topic should get a new thread (mods may move if they wish), but I've been tracking the changes made by EzineArticles.com.
If you Google the news for that site, there is an interview on the WSJ site with EA's CEO: [online.wsj.com...]
Today I find out Mr. Knight, who may or may not be following this thread, is apparently changing his site like the webmasters here. Allegedly, according to one webmaster (found via Google News), articles are being deleted and even user accounts terminated! I'll wait for confirmation on that; you can imagine some webmasters will be upset if it's true and speak out very soon.
But what Mr. Knight said was that they have increased the minimum article length from 250 words to 400, and now require links below the articles to be to related content.
"Mr. Knight said he believed part of the reason that Google's new search algorithm demoted Ezinearticles.com articles is that much of the content had already been published on the personal websites of its writers."
He also admitted some articles were permitted which should not have been published: "We know we accepted some of it we shouldn't have."
Perhaps these are the ones which are now disappearing from his site, if indeed they are being removed, seen by Google as poisonous to the entire site.
Hopefully all the Acai Berry nonsense will fly into the trash. :-)
UPDATE-According to the company's blog, they are now blocking all *future* articles on the Berry. I guess that's a start.
[edited by: potentialgeek at 7:24 pm (utc) on Mar 9, 2011]
| 7:09 pm on Mar 9, 2011 (gmt 0)|
|That could be a recipe for disaster. If your content has already been copied by other sites, and you put it on another site, how is Google going to know you didn't copy it, too? And then penalize that site?! |
LOW QUALITY CONTENT.
| 7:10 pm on Mar 9, 2011 (gmt 0)|
Has anyone gotten their rankings back? On 2/25 I deleted my 'thin' content pages via Webmaster Central, but no response in the Google SERPs. Of course something else might be at play on my site, but I am curious to see if anyone got their rankings back (cultofmac excluded) by making changes. If so, how long did you have to wait?
| 7:22 pm on Mar 9, 2011 (gmt 0)|
@walkman, I have yet to hear about someone who got their rankings back (even after making fixes to their site). I would rejoice once I hear that, knowing that it is actually possible to crawl out of the abyss. I've been talking to many site owners and nobody I know has regained rankings, unfortunately. Like you, I wonder how long it will be before we get any positive word from anyone.
| 7:29 pm on Mar 9, 2011 (gmt 0)|
LOL, Planet13, I beat netmeg by about half a day! I mentioned it earlier in this thread, but no matter. It should probably be mentioned several times to get the message across. Also, the link that burcot mentioned earlier, regarding ad to content ratio, was very interesting. If we were to pretend that the same type of thinking was applied on the organic side, it could make a difference in how we view our pages. Worth reading.
| 7:30 pm on Mar 9, 2011 (gmt 0)|
@potentialgeek what if you moved the light pages to new domain/subdomain and 301'd the old urls?
In my case I have a section of my site with duplicated and/or little content on each page. It's a work in progress in which I'm adding bits and pieces of content as I go. I want to keep the section and evolve it; moving it and redirecting the old URLs might be the way to lift the devaluation of the good content while keeping the section that's not so good.
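If you do try moving the section, the 301s can be done under Apache with mod_rewrite. A sketch only; the `drafts` directory and subdomain are hypothetical names for the work-in-progress section:

```apache
# .htaccess on the main site — permanently redirect the work-in-progress
# section to a separate subdomain, preserving the rest of each URL.
RewriteEngine On
RewriteRule ^drafts/(.*)$ http://drafts.example.com/$1 [R=301,L]
```

Whether a 301'd page carries its devaluation along with it is exactly the open question in this thread, so treat this as mechanics, not a guarantee.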
| 7:31 pm on Mar 9, 2011 (gmt 0)|
potentialgeek, I see duplicate content in the top results, sites with zero original content. I do not think this Panda update is focused on copied pages. But the amount of text on each page makes sense to me.
| 7:49 pm on Mar 9, 2011 (gmt 0)|
then it's more likely that Google has a set x-week penalty, or maybe they haven't really done a serious recalculation of the SERPs.
| 8:00 pm on Mar 9, 2011 (gmt 0)|
|LOL, Planet13, I beat netmeg by about half a day! |
That's why you're Dazzlin'
|then it's more likely that Google has a set x weeks penalty or maybe they haven't really done a serious re-calculation of the SERPS. |
I've seen nothing to indicate the former, and I suspect tweaks and recalculations are ongoing.
| 8:09 pm on Mar 9, 2011 (gmt 0)|
@ Netmeg "I've seen nothing to indicate the former, and I suspect tweaks and recalculations are ongoing."
Maybe, but I find it unlikely that no one here fixed their site enough to convince Google. Unless they aren't sharing results....
| 8:10 pm on Mar 9, 2011 (gmt 0)|
Thanks, burcot, for that article link. I think it contains gold!
| 8:23 pm on Mar 9, 2011 (gmt 0)|
|Unless they aren't sharing results.... |
Geeze, ya think that might be the case? :)
I'm guessing that anyone who figures out how to "fix" this issue, or how to work around it, probably sees that process as a potential gold mine and isn't likely to be posting the process in an open forum any time soon.
At least not in language any clearer than the typical talk from a G employee.
| This 195 message thread spans 7 pages. |