20,000 Page Website Hit by Panda

Dan01

6:44 am on Sep 1, 2011 (gmt 0)



About a month ago I noticed our main site finally got hit by the Panda update. We accepted guest posts for years. Many of those posts were probably submitted to other sites as well, or may be considered low quality.

We have started deleting tons of posts.

We created a spreadsheet of all the pages that received traffic over the last nine months. Now we are cross-checking each page against that list, one by one. If a page didn't receive any traffic, we are deleting it.

We expect to delete about 8K to 10K pages by January (hopefully).
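In case it helps anyone doing the same cleanup, the cross-check can be scripted rather than done by hand. Here is a rough Python sketch; the file names and column headers (pages.txt, analytics_export.csv, "Page", "Visits") are just placeholders for whatever your own export actually uses:

    import csv

    # Pages that received at least one visit according to the analytics export.
    # "Page" and "Visits" are placeholder column names for the real export headers.
    visited = set()
    with open("analytics_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            visits = int(row["Visits"].replace(",", "") or 0)
            if visits > 0:
                visited.add(row["Page"].strip())

    # Full list of pages on the site, one URL path per line.
    with open("pages.txt") as f:
        all_pages = {line.strip() for line in f if line.strip()}

    # Candidates for deletion: pages with no recorded visits in the period.
    for page in sorted(all_pages - visited):
        print(page)

The output is just the list of zero-traffic pages, which we still review by hand before deleting anything.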

Here are the questions:

1) Is there a threshold ratio of low-quality to high-quality pages? In other words, if we delete 8,000 pages but leave several low-quality pages on the site, would we still be hit by Panda? How does the algo work?

2) Is it better to delete from oldest to newest, or does the order matter at all? I am not sure it does, but I was thinking that if Google crawls from newest to oldest, the SE will notice the difference more quickly.

We are still getting some traffic, but it has been cut in half. For instance, looking back at our traffic in Analytics, a page that received 50 visits from 20 keywords in January now receives 20 visits from 8 keywords.

3) My wife and I see things a little differently. I delete articles even if they got only one or two visits over the last nine months. She won't; if a page had even one visit, she keeps it. Who is right? I don't want to have to go through this process again if it isn't fixed the first time around.

manny123

10:52 pm on Dec 26, 2011 (gmt 0)

5+ Year Member



"Google finds very large sites highly suspicious, and subdirectories with many pages, too."

I haven't seen anything that indicates a large number of pages is itself part of the problem. It is the content on those pages and users' interaction with it that seem to matter most.

Dan01

2:24 am on Dec 27, 2011 (gmt 0)



Most of the guest authors contributed content to get a link back to their site. Back in 2009 we stopped allowing submissions, but there are still tons of pages linking to other sites; perhaps some of those sites are low-quality.

Here is my question:

1) Can adding a nofollow to all of those pages help us recover from Panda?

The reason I ask is that I thought part of Panda (or perhaps another algo change) had to do with who we link to. If you link to junk sites, you are perceived as junk. Would nofollow help mitigate that?
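To be clear, the mechanical part is easy; I'd batch it over the article pages with something like the sketch below (BeautifulSoup, with example.com standing in for our real domain). The question is whether it actually helps with Panda.

    from urllib.parse import urlparse
    from bs4 import BeautifulSoup

    OUR_DOMAIN = "example.com"  # placeholder for our actual domain

    def add_nofollow(html: str) -> str:
        """Add rel="nofollow" to every link pointing off OUR_DOMAIN."""
        soup = BeautifulSoup(html, "html.parser")
        for a in soup.find_all("a", href=True):
            host = urlparse(a["href"]).netloc
            if host and not host.endswith(OUR_DOMAIN):
                rel = a.get("rel", [])
                if "nofollow" not in rel:
                    rel.append("nofollow")
                    a["rel"] = rel
        return str(soup)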

Pjman

2:00 pm on Dec 27, 2011 (gmt 0)



@Dan01

Adding nofollow is probably worth a shot.

But the way I think MC explained it, the only way to improve your site's quality assessment is to noindex or remove the low-quality page itself.
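If you go the noindex route, the tag itself is just a robots meta in the head of each low-quality page. Here is a rough sketch of batching it with a script, purely as an illustration (BeautifulSoup is my assumption for how you'd do it, not anything MC prescribed):

    from bs4 import BeautifulSoup

    def add_noindex(html: str) -> str:
        """Insert <meta name="robots" content="noindex"> into the page head."""
        soup = BeautifulSoup(html, "html.parser")
        if soup.head is None:
            return html  # skip fragments without a <head>
        meta = soup.new_tag("meta")
        meta["name"] = "robots"
        meta["content"] = "noindex"
        soup.head.insert(0, meta)
        return str(soup)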

StoutFiles

2:27 pm on Dec 27, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



There's no way to avoid Panda forever... they'll come out with some Dragon update and you'll all get hit again. While fighting MFA sites, which they only publicly fight to appease the sellers, they are slowly steering everything toward only the large, established sites.

Google is handing its first-page results to all the major players, and the rest of you fight for the scraps. It seems every search I do comes back with the ad links first, then sites like Wikipedia, Amazon, eBay, exact-match keyword domains, etc.