
Google SEO News and Discussion Forum

A Panda Protection Plan?
Sgt_Kickaxe
msg:4483399 - 7:11 am on Aug 10, 2012 (gmt 0)

We all have pages that, without warning, fall out of favor in Google, and we've all been frustrated trying to figure out why. Google won't tell you why a page suddenly receives zero visitors, and it's clear, since Panda/Penguin, that individual pages can work against the rest of your site. This being the case, I'm trying to be proactive with a plan.

Using Google Analytics, I set up a method of identifying pages that receive no traffic from search within a given 14-day period; these are automatically moved into a category on my site reserved for news-type articles - articles that aren't meant to challenge my top page for any given keyword. All articles within this category have a Google noindex meta tag applied.
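
Roughly, the moving parts could look like the sketch below. The GA Core Reporting API (v3) query is real, but ALL_SITE_PATHS and move_to_noindexed_news_category() are placeholder names - GA can only report on pages it actually saw hits for, so the zero-traffic set has to come from diffing against your own URL list (sitemap or CMS).

import requests

GA_ENDPOINT = "https://www.googleapis.com/analytics/v3/data/ga"

def organic_paths(profile_id, token, start, end):
    """Return the set of page paths that got at least one organic search visit."""
    params = {
        "ids": "ga:%s" % profile_id,
        "start-date": start,            # e.g. "2012-07-27"
        "end-date": end,                # e.g. "2012-08-10" (a 14-day window)
        "metrics": "ga:visits",
        "dimensions": "ga:pagePath",
        "filters": "ga:medium==organic",
        "max-results": "10000",
        "access_token": token,          # OAuth2 token obtained elsewhere
    }
    rows = requests.get(GA_ENDPOINT, params=params).json().get("rows", [])
    return {path for path, visits in rows}

visited = organic_paths("12345678", "ya29.EXAMPLE", "2012-07-27", "2012-08-10")
for path in ALL_SITE_PATHS - visited:   # ALL_SITE_PATHS: hypothetical full URL list
    # recategorise and emit <meta name="robots" content="noindex">
    move_to_noindexed_news_category(path)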

The result is that if Google isn't sending traffic, the article is de-indexed, and I can either leave it that way or modify it and send it back into circulation. I mean, if Google won't send traffic to it and it *might* be working against the rest of my site, it should be deindexed, right?

See any problems with this plan?

 

claaarky
msg:4483447 - 10:16 am on Aug 10, 2012 (gmt 0)

Sgt_Kickaxe, I think the concept of automating your site based on GA stats is an awesome idea as a means to protect it from Panda (and, I believe, as a means of maximising rankings for your most important pages).

It's just a matter of getting the right process for assessing what is bad content and then dealing with it in the right way. I'm not sure that a page is bad just because it didn't get search engine traffic within a couple of weeks. I'd base the judgement more on whether the new page was popular with people already on your site (did it get lots of internal visits, did they spend much time on the page, did they stay on your site....assuming it's a page that you wouldn't want people leaving from). I'm not sure two weeks is enough time either, although it may depend on your volumes of pageviews.

Personally, I don't believe noindexing is a way to protect your site from a bad page. I think you either need to remove it, or noindex it and move it somewhere people can't easily find it while you improve it (but Google can, so it knows you've de-indexed it). I think if people can find a bad page, bad user stats are being produced, which can harm your site.

The problem with automating it completely is that every page is different and, because we don't have access to the stats of our competitors' sites, we don't know where the line is (in terms of stats) between a good page and a bad one for a particular topic/keyword/whatever.

I'm working on a similar idea but it's more of a flagging system. I have some basic parameters that, for my site, hint at a bad page, and it makes me aware of pages slipping towards the bad end of the scale so we can look at what's causing it and decide what to do. However, we are finding that it's still very complex - pages that were great last month (according to our stats) can suddenly become bad pages, and the cause can be anything from competitors undercutting us on that product to a drop in interest due to seasonality, bad weather or world events.
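
To give a feel for it, here's a stripped-down sketch of that kind of flagging logic. The PageStats fields and the multipliers are invented for illustration (ours are tuned to our site, and as noted above nobody knows where the real line is); comparing against your own site median is just one way to avoid picking absolute thresholds.

from collections import namedtuple

PageStats = namedtuple("PageStats", "path bounce_rate avg_time_on_page exit_rate pageviews")

def quality_flags(page, site_median):
    """Return the reasons a page looks like it is slipping towards 'bad'."""
    flags = []
    if page.pageviews < 50:
        flags.append("too little data to judge")            # don't react to noise
    if page.bounce_rate > site_median.bounce_rate * 1.5:
        flags.append("bounce rate well above site median")
    if page.avg_time_on_page < site_median.avg_time_on_page * 0.5:
        flags.append("visitors leaving unusually quickly")
    if page.exit_rate > site_median.exit_rate * 1.5:
        flags.append("high exit rate")
    return flags

# Flag for human review rather than automatic action - a flag can mean
# seasonality or a competitor undercutting you, not just bad content.
for page in monthly_stats:                    # monthly_stats: hypothetical GA export
    reasons = quality_flags(page, site_median)
    if reasons:
        print(page.path, "->", "; ".join(reasons))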

What to do in this situation is proving to be a challenge, but we do see that our rankings can be negatively affected within a couple of weeks if we hang on to pages the system has flagged as having shifted into the low quality camp - and that's before considering the possible consequences at the next Panda refresh.

We're introducing new pages very slowly now, giving them lots of internal promotion as well to push lots of traffic in so we can make a quicker assessment and decide whether to keep them or drop them. We've only been doing this for a few weeks and although we've not seen any improvement from the last Panda refresh, we are seeing the rankings of some of our category pages improve as a result of this overall approach, which I'm hoping is a sign we're moving in the right direction.

tedster
msg:4483461 - 10:44 am on Aug 10, 2012 (gmt 0)

Nice one, claaarky. Have you discovered any pattern in the pages you've flagged? Are they "shallow content" in any way that you can detect, compared to pages that continue to rank?

claaarky
msg:4483491 - 12:06 pm on Aug 10, 2012 (gmt 0)

Thanks tedster....well, this is the thing that convinces me that employing user metrics as a Panda and ranking mechanism is such a brilliant solution for Google. ANYTHING can be great today and terrible tomorrow. It all depends on millions of constantly changing factors.

If we add a new product which is similar to one we already have, the stats of the existing one can plummet, or the new one can struggle if the existing one is a better choice. If a competitor undercuts us it affects our stats. If a new competitor arrives or improves in ranking, it affects our stats. Anything that we or other webmasters do, combined with everything that is going on in the world, can negatively or positively impact our stats. We can rise and fall without changing a thing, because everything around us is changing.

netmeg
msg:4483504 - 12:40 pm on Aug 10, 2012 (gmt 0)

That's not far off from what I do for my sites, except in my case it's not so much non-visited pages as expired or canceled events that are not likely to be resurrected; we slap a NOINDEX on them for Google and they're rendered "inactive" which means they don't show up in any navigation or links or widgets on the site. You could still find them if you go direct, but almost no one ever does.
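
In code terms the pattern might look something like this - a sketch, not necessarily netmeg's actual implementation, and the Event fields are hypothetical. The point is that an inactive event still resolves at its URL for direct visitors, but carries noindex and is dropped from navigation, links and widgets.

from datetime import date

def is_inactive(event):
    # expired or canceled, with no real chance of being resurrected
    return event.canceled or (event.end_date < date.today() and not event.recurring)

def robots_meta(event):
    """Emit noindex for Google on inactive events only."""
    if is_inactive(event):
        return '<meta name="robots" content="noindex">'
    return ""                      # active events stay indexable

def nav_events(all_events):
    """Only active events appear in navigation, widgets and internal links."""
    return [e for e in all_events if not is_inactive(e)]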

claaarky
msg:4483567 - 3:09 pm on Aug 10, 2012 (gmt 0)

Tedster, just to answer your question a bit more: yes, there are patterns (ignoring the cases where good pages go bad because of events outside of our website). It basically boils down to Amit Singhal's list, to be honest, but I'll try to put it in my own words!

In general, the low quality pages - shallow, whatever you want to call them (I haven't yet thought of a word or phrase that perfectly sums up what we're seeing except 'bad user metrics') - are caused by the following:
- low quality images (slightly blurry),
- images that just aren't very interesting or engaging,
- anything that isn't 100% appropriate to the page (text or images - e.g. an image of an eagle accompanied by text that mostly talks about birds but doesn't actually mention eagles),
- anything that's badly written (spelling mistakes, bad grammar, or lacking in any real helpful detail; for example, you could describe a TV as rectangular, flat and sexy OR as LED, 50 inch screen, 3D, with a 5 year guarantee... people unsurprisingly prefer the second version),
- long pages that don't do a good job of helping the user work through the content in small, digestible chunks,
- inaccurate page titles (these mislead the visitor and potentially affect trust),
- a product that is bad value (overpriced for what it is, or available cheaper elsewhere).

Those are the biggies. It seems that they all erode a visitor's trust and engagement in the site. In some sections of our site we had lots of pages where the user metrics were on the border of acceptability, but we found they had a cumulative grinding down effect so, although our flagging system said they were okay individually, collectively they were bad news. There would still be certain pages with slightly worse user metrics in amongst them but these would be quite random - almost like people left once they couldn't take any more rather than the page they actually left from being the main problem.

Pages that have none of these issues have great user metrics. However, we do have some pages that DO have one or more of these issues but also have great user metrics - these tend to be where we have products better priced than anyone else (i.e. the price makes it great quality and compensates/overrides the negative factors). We still improved these pages on the basis they'd be even more popular if we did.

Another interesting observation is that bad product reviews really improve engagement and, ironically, conversion. It's like the fact we're prepared to show bad reviews gives people more confidence in us. We added a really slating review to one quite popular product and gritted our teeth, but it suddenly started selling like hot cakes. The number of visitors and external entrances remained the same initially, but as sales grew the page began rising in the rankings (ranking followed popularity, rather than the other way round). However, after a few weeks the sales fell away again and then the ranking dropped as well. This seems to confirm people have more confidence in information that is up to date or recent and that trust can be gained in some surprising ways! It also seems to confirm that pages people like and are engaged with will rank better as a result.

Basically, our observations are simply that running an online business is now no different to running a bricks and mortar business. That's an observation I'm liking a lot. It means I don't have to figure out Google - I need to figure out people and I have a chance of understanding how people work, because I am one. People are far more complex and constantly changing than any computer system could ever be, but I still feel I have a better chance now.

Panthro
msg:4483570 - 3:14 pm on Aug 10, 2012 (gmt 0)

I think it's a great idea, especially bc it seems to isolate one particular factor. Doo eeet

Zivush
msg:4483608 - 4:51 pm on Aug 10, 2012 (gmt 0)

Interesting insight, claaarky.
Google has an important advantage.
They can make a comparison not just within one single source (a site), but across pages on similar topics all around the web.
Pages aren't equal when it comes to user engagement. Some are less attractive than others no matter how much work you put into them.

Sgt_Kickaxe
msg:4483755 - 12:59 am on Aug 11, 2012 (gmt 0)

Another concern is having a lot of, or a high percentage of, noindexed content. Is it possible Google sends fewer visitors to sites with lots of non-indexed content, since they know a visitor may end up on one of those pages within a click or two? It would be counter-productive to be thinning the herd with noindex if that lowers your overall trust factor (if there is one). Does anyone with a LOT of noindex pages see any negative effects from having them?

indyank
msg:4483869 - 3:27 pm on Aug 11, 2012 (gmt 0)

Does anyone with a LOT of noindex pages see any negative effects from having them?


I have wanted to raise this for a long time, and it's great that you brought it up. Yes, I do have a feeling that Google has basically brought about a change to pages tagged "noindex, follow" or just "noindex". My understanding is that they just don't pass on PR from such pages, as opposed to how that meta directive should actually work. But their logic could be that if you don't want us to show those pages in our index, we will also ignore any link juice flowing through or out of those pages via internal or external links.

But more data points from other folks would help.
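
For reference, the two variants under discussion are shown below. In theory both treat links identically, since "follow" is the robots meta default; the question above is whether Google still honours that in practice.

# What the page's <head> would carry in each case; in theory both keep the
# page out of the index while links on it are still followed and pass PR,
# because "follow" is the robots meta default.
NOINDEX_FOLLOW = '<meta name="robots" content="noindex, follow">'
NOINDEX_ONLY = '<meta name="robots" content="noindex">'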

indyank
msg:4483870 - 3:36 pm on Aug 11, 2012 (gmt 0)

or lacking in any real helpful detail; for example, you could describe a TV as rectangular, flat and sexy OR as LED, 50 inch screen, 3D, with a 5 year guarantee... people unsurprisingly prefer the second version

The former is more informal, while the latter is formal. But when you have UGC or user reviews, you typically get the informal kind, and that's the way most people tend to describe products or things they like to their friends and others. So why do you feel that people always prefer the second version, and that Google would follow them?

claaarky
msg:4483945 - 10:33 pm on Aug 11, 2012 (gmt 0)

Indyank, my example was based on a product description, not a review. By comparing the stats of product pages that people do and don't like, and by experimenting with my content, I have found that people prefer product descriptions that are specific and detailed as opposed to vague, fluffy and salesy. These days I never assume I know what people prefer - I do what I think is right, then alter it if my stats tell me I got it wrong.

Review sites may see the complete opposite. Every site is different, but this is a pattern I have noticed, and it ties in strongly with one of Amit Singhal's Panda guidelines, which suggests to me it might be a pattern Google found as well (and therefore could be part of the Panda puzzle).

zeus
msg:4484008 - 9:02 am on Aug 12, 2012 (gmt 0)

An old article that gets no visits, or just a few, can still be a good article, so that would be a bad way to rank pages, or to force a webmaster to delete it or move it somewhere else. A page not getting visited doesn't mean it's bad, so that's also a bad way of ranking; there must be something else to Panda (I hope). That said, when you look at the pages that rank, quality has gone down since Panda, along with all those same-domain-stuffed search results.

claaarky
msg:4484022 - 10:04 am on Aug 12, 2012 (gmt 0)

Zeus, the way I see it, if a page gets no visits it's either of no interest to your visitors or it's hidden away because you don't want people finding it on your site (or the navigation doesn't help people find it easily). What's the point in having great content that nobody sees? If it really is great, it needs promoting on the site so people can find it easily.

santapaws
msg:4484023 - 10:36 am on Aug 12, 2012 (gmt 0)

Because the people left who give a %^&* about their sites often have a pyramid-style architecture, where it takes more and more clicks to find pages deep down the pyramid. Each level by nature gets fewer clicks, but that doesn't make them bad pages - just not built for the masses, but for someone, somewhere, someday.

onebuyone
msg:4484024 - 10:42 am on Aug 12, 2012 (gmt 0)

santapaws: a page is bad when it doesn't get any visits other than Google SERP clicks.

zeus
msg:4484065 - 2:50 pm on Aug 12, 2012 (gmt 0)

claaarky - say, for example, you have a site about widgets in every colour that exists, and blue is more popular than pink, but the pink article is still good, just not that popular - what about that? Or, if we're talking fashion, there are old styles that aren't that interesting any more, but are still good fashion.

claaarky
msg:4484113 - 7:27 pm on Aug 12, 2012 (gmt 0)

Zeus, my stats show a marked decline when a page becomes "not that interesting any more". Would a fashion retailer continue to stock a product that people weren't that interested in any more? I don't think so - they'd clear it out to make room for the hot fashion items that sell much better. To take the example to an extreme, think of a fashion shop that only stocks items people aren't that interested in compared to one that only stocks the latest hottest fashion items. Which one would most people say is better quality? I think the latter.

Taking the pink/blue widgets example, my stats show that people don't like wading through lots of pages that are basically the same subject/product. They seem to prefer distinct, simple choices. One page on widgets that gives the option to buy it in blue or pink is better than one page for each colour.
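
For what it's worth, one common way to implement that single-page approach when colour-variant URLs already exist is a rel=canonical from each variant to the main product page, so only one version competes in the index. This is a generic sketch - not necessarily how claaarky's site works - and the domain and URL pattern are placeholders.

def canonical_url(path):
    # e.g. /widgets/super-widget?colour=pink -> /widgets/super-widget
    return "https://www.example.com" + path.split("?")[0]

def canonical_link_tag(path):
    """The tag each colour variant would carry in its <head>."""
    return '<link rel="canonical" href="%s">' % canonical_url(path)

print(canonical_link_tag("/widgets/super-widget?colour=pink"))
# -> <link rel="canonical" href="https://www.example.com/widgets/super-widget">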

Dan01
msg:4484121 - 8:54 pm on Aug 12, 2012 (gmt 0)

An old article that gets no visits, or just a few, can still be a good article, so that would be a bad way to rank pages, or to force a webmaster to delete it or move it somewhere else. A page not getting visited doesn't mean it's bad


It is hard to tell whether a page is bringing down your whole site or not, so we usually just delete it. Like SGT, we use Analytics data, but over the last six months, not 14 days. If a page didn't receive any traffic, it got deleted unless we found a link to it. Over the last year our site went from 22K pages to 14K.

We are still purging. I don't compare traffic per se, but SERPs. Day-to-day traffic fluctuations don't tell us much. A few of our SERPs did move up, but we still have a ways to go.

Now we are going through the remaining pages one by one to determine their quality, traffic, etc. The ones that don't measure up, we just delete.
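
As a rough sketch, the purge rule described above might look like the following; pageviews_6mo and inbound_links are hypothetical stand-ins for a GA export and a backlink report.

def purge_candidates(all_paths, pageviews_6mo, inbound_links, min_monthly_visits=3):
    """List pages to delete: six months with too little traffic and no known links."""
    doomed = []
    for path in all_paths:
        visits = pageviews_6mo.get(path, 0)
        if visits >= min_monthly_visits * 6:   # keep pages getting 3+ visits/month
            continue
        if inbound_links.get(path):            # "unless we found a link to it"
            continue
        doomed.append(path)
    return doomed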

onebuyone
msg:4484123 - 9:44 pm on Aug 12, 2012 (gmt 0)

Dan01: You should delete/noindex pages which are getting traffic but are not performing well (high bounce rate, etc.), because those are the pages that are hurting you - not the ones that aren't ranking at all, because from the Google algo's POV such pages don't exist.

Stop wasting organic traffic on low quality pages, even if they have a high position in the SERPs for some reason.

zeus
msg:4484131 - 10:31 pm on Aug 12, 2012 (gmt 0)

With the fashion example, I meant it more in a historical way, not selling. How about news sites - they have extremely old articles which no one looks at, so why do they rank? We are on the same page, but I just can't/won't believe it's as extreme as you say.

Another thing: when would you say an article/page has a bad exit rate - 75-100%? Also, what about those sites where users get to a single page via a search result, get their info and leave? I would say that's a perfect site.

claaarky
msg:4484132 - 10:40 pm on Aug 12, 2012 (gmt 0)

Err, onebuyone, are you serious?!

I would NOT recommend deleting pages that are bringing in traffic and ranking well. If the stats look bad, I'd work on improving them, but I think if they rank well, Google is saying your competitors' stats look worse.

claaarky
msg:4484139 - 11:31 pm on Aug 12, 2012 (gmt 0)

Zeus, I see what you are saying. I think pages that have great stats historically but no visitors now (internal or external) are not a Panda issue. Pages with bad stats are the problem.

I agree a page with a 75-100% exit rate is not necessarily a bad page. I believe it all depends on how your page stats compare to your competitors' (for all metrics, not just exit rate). You can't know what your competitors' stats are, so all you can do is look at your pages with the worst stats and decide whether to remove or improve.

grippo
msg:4484144 - 12:20 am on Aug 13, 2012 (gmt 0)

Hi Sgt_Kickaxe, very interesting thread so far. I'm a little embarrassed by the question, but have to ask: how do you get a GA report for pages with no traffic from SEs? By definition, if they have no traffic, they won't show in GA.

Dan01
msg:4484153 - 1:03 am on Aug 13, 2012 (gmt 0)

Dan01: You should delete/noindex pages which are getting traffic but are not performing well (high bounce rate, etc.), because those are the pages that are hurting you - not the ones that aren't ranking at all, because from the Google algo's POV such pages don't exist.


Thanks for your comment onebuyone.

We already deleted the pages with no traffic. Now we are deleting the articles with very little traffic. We don't want to delete pages that get 3+ visits every month.

We'll look at bounce rate down the line.

NoIndex vs Deleting:

SGT was asking if noindex was a good strategy. Personally, I agree with the person who said in some future Panda iteration these pages could bring us down too.

Dan01
msg:4484154 - 1:09 am on Aug 13, 2012 (gmt 0)

I believe it all depends on how your page stats compare to your competitors'.


I agree with that. If the page is ranking way down, especially on a long-tail SERP, then perhaps the article should go.

In some cases a page can be improved. In our case, we don't have time to improve 14K pages, so deleting poor performers is quicker.

gregorysmith
msg:4484301 - 2:36 pm on Aug 13, 2012 (gmt 0)

In many cases, I would suggest simply starting over and doing a 301.

Zivush
msg:4484335 - 3:59 pm on Aug 13, 2012 (gmt 0)

Speaking of deleting pages,
I know an article directory which was one of the biggest on the web before Panda 1.
They deleted hundreds of thousands of pages.
They also created sub-domains for authors so their quality pages could be easily distinguished.
Did it help them? Only for a few months, before they were hit again by the October 2011 Panda.

claaarky
msg:4484345 - 4:14 pm on Aug 13, 2012 (gmt 0)

Zivush, when that kind of sub-domain solution was first mentioned, I thought it wouldn't be long before Google jumped on it. Logically, if content is bad and it's one click away from pages that rank well, Google is going to demote the ranking pages so its users don't come into contact with the bad content they link to.

I think moving content to another domain or a subdomain will only work if the original domain no longer links to that content, otherwise it just remains a click away and the problem still exists.

This of course would mean it would be risky linking to low quality sites.

1script
msg:4484490 - 3:27 am on Aug 14, 2012 (gmt 0)

Interesting idea, but I'm curious why you based this on the assumption that Google operates on 14-day cycles (1/2 of 28? sorry, could not resist...). I've had pages drop, then come back up on a keyword, and it happened much more slowly than in 14 days. So, in theory at least, you could kill a page that might still rank at some point in the future. Additionally, it takes Google some time to de-index pages, perhaps longer than 14 days, so this 14-day feedback loop seems way too short and could get the whole system out of whack.

If I were doing it, I'd do it at a glacial speed - perhaps wait half a year after the last visitor from Google came to that page. But then, that would greatly diminish the effectiveness of these efforts.

It sounds like you've been doing it for a while now - have you noticed anything particular about the pages that drop? Were they shallow content (whatever the definition), duplicates, pages with no IBLs, etc. to begin with?
