
Google SEO News and Discussion Forum

Recovered from Panda - and how I did it
brinked




msg:4302357
 6:42 am on Apr 22, 2011 (gmt 0)

Alright, so one of my sites just recovered from Panda. It's a site I bought a few months ago and have mostly neglected, but it earned a decent income (all AdSense). The site had 3 AdSense ads on every page; I removed them all, since after the Panda update it was only making about $5 a day anyway.

The site has about 80 pages, all with unique content. It's an old site that really should never have been hit, IMO. Here is what I did:

- removed all AdSense ads from every page

- updated the homepage text a little: added some new text and tweaked some old text

- removed duplicate meta descriptions (there were 6 pages that had identical meta descriptions)

That is all. I am encouraged that it only took about a week for the panda to get its paws off. I wish my other site had recovered too, but it's much bigger, with many more pages. I am not sure if it was one thing or a series of things. To be more clear about the ads: there was one AdSense ad at the top posing as top navigation, and then two of the smaller banner units, one above the content and one below it.
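
For anyone who wants to run the same duplicate-meta-description check on their own site, here is a minimal Python sketch. The URL list is a placeholder and the parsing is deliberately simplistic, so treat it as an illustration rather than a finished tool:

# Sketch: flag pages that share an identical meta description.
# The URLs below are placeholders; substitute your own page list.
from collections import defaultdict
from urllib.request import urlopen
import re

urls = [
    "http://www.example.com/",
    "http://www.example.com/page-a.html",
    "http://www.example.com/page-b.html",
]

desc_re = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

seen = defaultdict(list)  # description text -> URLs that use it
for url in urls:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    match = desc_re.search(html)
    if match:
        seen[match.group(1).strip()].append(url)

for desc, pages in seen.items():
    if len(pages) > 1:  # identical description on 2+ pages
        print("DUPLICATE on %d pages: %s..." % (len(pages), desc[:60]))
        for page in pages:
            print("    " + page)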

 

danijelzi




msg:4303000
 5:05 pm on Apr 23, 2011 (gmt 0)

I forgot to mention one thing which could be important. One of these articles got a great number of backlinks from relevant and reputable sites. The other two articles were not so popular among other sites. Also, I got a large number of backlinks to another article, which is still buried somewhere in the SERPs.

Mr3putt




msg:4303101
 9:25 pm on Apr 23, 2011 (gmt 0)

Any image-rich and text-poor sites survive Panda?


For my main website, individual pages consist of about 350-500 words along with anywhere from 10 to 300 images.

My gallery is around 20,000 images, but my image-to-content ratio is about 50/50.

I was hit by Panda 1 and Panda 2 with a total traffic loss of about 60%. My other website is set up the EXACT same way; the only difference is it's about 1/5 of the size. It gained 30% traffic after Panda 1 and dropped around 30% after Panda 2, netting out to no change from its pre-Feb 24th levels.

The sites combined came close to a six-figure income for 2010 with AdSense. I have removed AdSense to see what will come of it and am exploring other monetization options. The way Google seems to be going, I might as well start investing in selling my own products; maybe I can get the reputation of a big brand in my niche and climb up the SERPs. ;)

That was a long response to your question, which I only partially answered, given that my image-to-content ratio is 50/50. It was therapeutic for me, though. Drugs and sex have been my answer to Panda. Cheers to getting back on the bandwagon of figuring out what Mr. Panda is thinking.
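
If you want to put a rough number on an image-to-content ratio like the 50/50 figure above, here is a crude Python sketch. The URL is a placeholder and the HTML stripping is simplistic, so the counts are only approximate:

# Sketch: rough words-vs-images count for one page, to eyeball
# the kind of image/text ratio described above.
import re
from urllib.request import urlopen

def text_image_stats(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    images = len(re.findall(r"<img\b", html, re.IGNORECASE))
    # Strip scripts/styles/tags, then count words in what's left.
    body = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    body = re.sub(r"(?s)<[^>]+>", " ", body)
    words = len(body.split())
    return words, images

words, images = text_image_stats("http://www.example.com/gallery-page.html")
print("words: %d, images: %d, words per image: %.1f"
      % (words, images, words / max(images, 1)))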

My_Media




msg:4303113
 9:52 pm on Apr 23, 2011 (gmt 0)

Here is my take on the Panda update.
I have more than 50 sites, and only one large one, with 60k daily uniques. All my smaller sites gained 30%+ across both Panda updates, while my flagship was left hanging on life support (-65%).
I can see that Google attacked the mid-to-larger sites and let the smaller ones rank up to take our previous keyword positions. I think this algo only applied to sites with a large number of links (in and out) and heavy traffic. I am sure later they will come up with Panda 3 and smaller sites will get hit too.
But one thing that is really hard to swallow is that second-language sites are ranking higher and higher on our keywords. Now that is a death blow to me, not just my site.

brinked




msg:4303115
 9:54 pm on Apr 23, 2011 (gmt 0)

For everyone who is asking...

YES, websites loaded with lots of ads all over are still ranking.

YES, websites with image/media content are still ranking.

YES, websites linking to affiliate programs are still ranking.

YES, websites that scrape content or have duplicate content are still ranking.

And YES, websites with thin/shallow content are still ranking.

You will not be pandalized for having a lot of ads alone. There are other factors (probably many) that go into this. If your site is on the brink of tipping the scale toward Panda and you have poorly placed ads, those ads are what can tip you over the edge.

I also do not think it's a matter of Panda or no Panda; I think there are different severity levels of Panda. I have seen some sites get removed completely from Google's index. Basically, the more poor-quality content you have, the more you will be affected by Panda.

I also think that Google kind of slowed everything down before Panda, such as not updating the SERPs as much. This way, when Panda was released, they essentially refreshed their index, making Panda that much harder to manipulate. Maybe a lot of people were affected when Panda was released because their poor backlinks finally caught up to them, or something else. This is just a theory, of course.
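
As an illustration of auditing ad counts page by page, here is a minimal Python sketch. The "show_ads.js" needle matches the classic AdSense loader script, which appears once per ad unit; if your ad code differs, adjust it. This is my assumption for illustration, not something specified above:

# Sketch: count AdSense ad units per page by counting the classic
# loader script include. Adjust AD_NEEDLE to match your own ad code.
from urllib.request import urlopen

AD_NEEDLE = "show_ads.js"  # one include per classic AdSense unit

def ad_units(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    return html.count(AD_NEEDLE)

for url in ["http://www.example.com/", "http://www.example.com/article.html"]:
    print(url, "->", ad_units(url), "ad unit(s)")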

miozio




msg:4303186
 1:57 am on Apr 24, 2011 (gmt 0)

@My_Media: "I can see that Google attacked the mid-to-larger sites and let the smaller ones rank up to take our previous keyword positions. I think this algo only applied to sites with a large number of links (in and out) and heavy traffic."

Yes, it looks like you are right about that. The remaining sites are huge monsters and tiny scrapers. And I see some of the scrapers disappearing, probably because we sent a wave of reports to Google!

kd454




msg:4303187
 2:01 am on Apr 24, 2011 (gmt 0)

Maybe a lot of people were affected when Panda was released because their poor backlinks finally caught up to them, or something else. This is just a theory, of course.


I have added some fresh backlinks to a site that was affected by Panda 2 and am getting some SERPs back, so I am thinking this is not just a theory.

crobb305




msg:4303189
 2:42 am on Apr 24, 2011 (gmt 0)

You will not be pandalized for having a lot of ads alone. There are other factors (probably many) that go into this. If your site is on the brink of tipping the scale toward Panda and you have poorly placed ads, those ads are what can tip you over the edge.


Brinked, I understand your frustration :)
I/we have been discussing/proposing a number of possible factors for 6+ weeks. Nine times out of ten, each seems to get dismissed with "Google can't possibly hold that against you," or "How can Google possibly know about that?" We are talking about complex mathematical/statistical models, and it is the aggregation of an unknown number of signals in multiple algorithms that may be used to rank pages.

I have been improving my site based on everything that I believe could A) improve my site's quality and B) be direct signals of quality in a computer program created by humans.

brinked




msg:4303195
 4:32 am on Apr 24, 2011 (gmt 0)

crobb, it's not really frustration. I have been able to get out of every penalty Google has thrown at me. I obsess over it until I figure it out, and for the last 6+ years I have always figured it out. I have always been able to build a website from scratch and know exactly how to get it to rank for exactly what I need it to rank for.

I have always been able to take on clients and tell them: this is why you're not ranking, and this is exactly what you need to do. And it always works.

This Panda situation changes the rules. I need to understand it, and I will. I have already made great progress and I will continue to do so. I come up with theories; a lot of them are wrong, but I end up having enough to finally stumble upon the actual solution.

I have already gotten one website out of Panda and I am on the verge of getting another one out as well. I will do it; it's just a matter of when.

I am a problem solver. I can't go to sleep at night not knowing how to do something. A lot of programmers have come to me and said, "no, you can't do that, it's not possible to code that," and then I will come up with a way to do exactly what I want to do. It's just in my nature, I guess. This is a sport for me.

tedster




msg:4303201
 5:28 am on Apr 24, 2011 (gmt 0)

I have been able to get out of every penalty google has thrown at me.

I suggest you let go of the "penalty" model when it comes to understanding Panda. Yes, Panda does lower rankings for some sites, but it does this as part of the ranking algorithm. A penalty is imposed on top of a ranking score, but Panda is used to GENERATE the ranking score.

This means that improving rankings is not a matter of removing an infraction of some kind. As crobb305 said "it is the aggregation of an unknown number of signals in multiple algorithms that may be used to rank pages."

With a true penalty, it is possible (to a degree at least) to reverse engineer the factors and discover "the cause" of a penalty. With a core change to the ranking algorithm such as Panda, we are faced with something much more complex than that. I'm thinking of Panda as a new module in the ranking algorithm, parallel to a few other modules that score for relevance, trust, and citations (PR).

The reports we're now seeing about improved rankings support that mental model. One person built new backlinks. Another person removed ads above the fold. Yet another addressed duplicate content issues. And in almost every case of reported success, rankings IMPROVED but they were not "restored" to pre-Panda levels the way they would have been in the case of a true penalty.

Shatner




msg:4303237
 9:05 am on Apr 24, 2011 (gmt 0)

>>>The reports we're now seeing about improved rankings support that mental model.

We haven't really seen reports of more than a single, isolated page recovering, have we? It's not like anyone's entire site has really improved its overall rankings.

Just saying: even there, we don't really know what's going on, and the fact that this is the case 100% supports what you're saying, tedster.

londrum




msg:4303241
 9:21 am on Apr 24, 2011 (gmt 0)

If you can't find anything wrong with your site, then maybe the best thing to do is just sit it out and wait. We know that Google is crawling very old pages (at least a year old) because they are showing up as 404s in WMT. That is bound to mess up the SERPs. Maybe all your new pages haven't been weighted properly yet, and not just your pages but also the pages that link to you.

If we just wait for all the iterations to play themselves out, then maybe we will see some improvement.

Bigger sites will see improvement first because their stuff gets crawled quicker.
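
One way to see which old URLs Googlebot is re-crawling, beyond waiting on WMT, is to grep your own access logs. A minimal Python sketch, assuming a combined-format Apache log at a placeholder path:

# Sketch: pull Googlebot requests that returned 404 out of an
# Apache access log, to see which old URLs Google is re-crawling.
# The log path and format are assumptions; adjust for your server.
import re

line_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

with open("access.log") as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        m = line_re.search(line)
        if m and m.group(2) == "404":
            print(m.group(1))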

brinked




msg:4303245
 9:51 am on Apr 24, 2011 (gmt 0)

tedster, I completely agree.

Google is so complex that it's important not to try to understand what it's doing, but rather to take a look at your site and find out why Google might not be in favor of it.

When you study as many websites as I do, you start to notice trends. It's important to look at sites that rank and see what they are doing and what you are not.

The first real penalty that frustrated me to no end was the over-optimization penalty. I never thought in my wildest dreams there was such a thing. You read about SEO and you get the feeling you need to put important keywords everywhere: in your title tag, meta tags, header tags, etc.

It is very important to keep an open mind when reviewing your own site. Google Panda is about quality. Be critical of your own site and look for areas that you feel may not be of high quality. Study known pandalized sites until a pattern starts to present itself.

It's better to focus on good practices than on what Google is doing. It's like driving your car: you can try to understand where cops set up traps and always be on the lookout for patrol cars, or you can just not go too far over the speed limit and not worry about cops being nearby.

RichTC




msg:4303247
 9:57 am on Apr 24, 2011 (gmt 0)

I suggest you let go of the "penalty" model when it comes to understanding Panda. Yes, Panda does lower rankings for some sites, but it does this as part of the ranking algorithm. A penalty is imposed on top of a ranking score, but Panda is used to GENERATE the ranking score.


Because no two sites are identical, I can see why this is your view. However, you can have sites that are similar in structure. Clearly, if one has 4 AdSense units on its pages and the other does not, and the AdSense one is hit and the other is not, I would class that as a penalty, with having too many AdSense units as the most likely cause. I admit that it COULD be the straw that broke the camel's back, but it's still a devaluation or penalty as far as a webmaster is concerned.

The problem here is that with all "penalty" issues, removing the problem doesn't guarantee success. It can take months to recover, as we all know, as the SERPs update, and this is what's so annoying about it. If AdSense placement on a page is a factor (which clearly it is, hence the revised AdSense placement guidelines), the AdSense team should have been telling its clients, "a new scoring metric could harm your site for having too many AdSense blocks; we suggest you revise your layout". But they didn't; they kept pushing webmasters to increase ad exposure for them, only to then take a smack!

As far as I'm concerned it is a penalty, and utter contempt by Google for webmasters, many of whom have worked with them to help increase earnings (it is a two-way street, or so webmasters thought).

It's what action you take now going forward that counts, and if that means removing parts of your pages or the AdSense blocks that "push your site over the edge", then so be it. But the reps should have been better informed that AdSense layout was going to be a likely factor and discussed this with their webmaster clients in advance, unless of course "the select" were informed?

As I see it, if you are going to keep feeding a lion every day, you have to accept the risk that one day that lion might bite your arm off or eat you whole. That's the price you pay, and a lot of webmasters have been bitten as a result of this update. And I'm not convinced quality has been improved one bit as a result, if that was the objective.

walkman




msg:4303253
 10:14 am on Apr 24, 2011 (gmt 0)

The reports we're now seeing about improved rankings support that mental model. One person built new backlinks. Another person removed ads above the fold. Yet another addressed duplicate content issues. And in almost every case of reported success, rankings IMPROVED but they were not "restored" to pre-Panda levels the way they would have been in the case of a true penalty.


OR, Panda hit, and after Panda we kept losing more and more users as Google "fixed" it. Maybe they are backtracking on some things, or some non-core things have been fixed by users. But no one has come out of the 2/24 Panda so far. Brinked got hit on 4/11; we know it was an algo change, but that's it.

danijelzi




msg:4303255
 10:19 am on Apr 24, 2011 (gmt 0)

If you can't find anything wrong with your site, then maybe the best thing to do is just sit it out and wait. We know that Google is crawling very old pages (at least a year old) because they are showing up as 404s in WMT.


I have noticed that I get Google organic traffic mostly for articles about products that are a couple of years old. I have the impression that Google at this moment doesn't know where to rank most of my newer articles.

AlyssaS




msg:4303290
 1:04 pm on Apr 24, 2011 (gmt 0)

We haven't really seen reports of more than a single, isolated page recovering, have we? It's not like anyone's entire site has really improved its overall rankings.


Just because it's not reported, doesn't mean it hasn't happened.

Cracking Panda would be a major competitive advantage. You'd have to be foolish to then go on a forum and spout about everything you had found out; your advantage would get competed away, because you would have just handed the crown jewels to people who would never have worked it out themselves. Loose lips sink ships, and all that.

Get yourself a test site, and start testing stuff and documenting your tests. When something works, implement it on your main site. When it doesn't, well, you've acquired some new knowledge. Setting up your own lab and tests is the only real way to be successful at this.
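
A bare-bones way to do the documenting part, in Python; the file name and fields here are invented for illustration, not a prescribed format:

# Sketch: a minimal experiment log for the test-site approach above:
# one row per change, so you can line changes up against ranking
# moves later. File name and field layout are my own invention.
import csv
from datetime import date

def log_change(site, change, keyword, rank_before):
    with open("panda_tests.csv", "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), site,
                                change, keyword, rank_before])

log_change("test-site.example.com", "removed 2 of 3 ad units",
           "blue widgets", 38)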

pageoneresults




msg:4303293
 1:40 pm on Apr 24, 2011 (gmt 0)

I have been improving my site based on everything that I believe could A) improve my site's quality and B) be direct signals of quality in a computer program created by humans.


Have you checked your DNS? Have you checked server performance? Are your pages link-heavy (internal)? How many round trips to the server do your documents make when requested? Are you on a shared IP where more than a comfortable percentage of the other sites are questionable?

I personally feel there are many things being overlooked with this Panda update. I don't see anyone discussing site performance, etc. Google is really BIG on site speed. Why do you think we have all the site-speed tools available to us?

I've been reviewing sites hit by the Pandaemic, and my findings are a bit different from what most are discussing. Many of the discussions I see are related to "what's on the surface". Ever take a look at what's down below? Many of the losers that were publicly posted have more than a handful of technical challenges. Some of them are downright bandwidth-abusive when it comes to the User-Agent and the User.
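
A quick-and-dirty Python sketch of two of those checks, DNS lookup time and server response time; a real audit would use proper tooling, and the URL is a placeholder:

# Sketch: rough DNS and fetch timings for one URL. This only shows
# the idea; it ignores caching, redirects, and connection reuse.
import socket
import time
from urllib.request import urlopen
from urllib.parse import urlparse

url = "http://www.example.com/"
host = urlparse(url).hostname

t0 = time.time()
socket.gethostbyname(host)   # DNS lookup
t_dns = time.time() - t0

t0 = time.time()
body = urlopen(url).read()   # full fetch of the HTML document
t_fetch = time.time() - t0

print("DNS: %.0f ms, fetch: %.0f ms, %d bytes"
      % (t_dns * 1000, t_fetch * 1000, len(body)))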

tangor




msg:4303300
 1:58 pm on Apr 24, 2011 (gmt 0)

Panda is not done. The web is a Very Big Place and even the Mighty Goog can't be in all places at once (hence the continental-sized roll-outs). When all is in place comes Phase Three.

walkman




msg:4303648
 3:21 pm on Apr 25, 2011 (gmt 0)

Just because it's not reported, doesn't mean it hasn't happened.

Yep, and just because people report being hit by Panda and we see stats, doesn't mean it has actually happened. Since we're speaking theoretically and all.

danijelzi




msg:4303705
 5:45 pm on Apr 25, 2011 (gmt 0)

One of my ~2000 articles got to #2 today for various keywords mentioned in the text. The Google cache for that page is April 24 and shows my new template with a reduced number of sitewide internal links. However, I'm outranked for the article title by scrapers. Most of the other articles are still pandalized, except for 2 or 3 that now rank normally (on the first page) and some others that were never affected by Panda. I'd also like a separate thread for movement reports. Thanks.

dibbern2




msg:4303809
 9:00 pm on Apr 25, 2011 (gmt 0)

The more I work on this, the more the puzzle pieces reveal themselves.

supercyberbob




msg:4303817
 9:31 pm on Apr 25, 2011 (gmt 0)

The more we get together the happier we'll be.

Wlauzon




msg:4303832
 9:48 pm on Apr 25, 2011 (gmt 0)

I am only going by what I found when a competitor's site got hit hard, so don't take this as gospel.

In his case, he had a lot of links from ad farms. Some he had set up himself (or at least they showed the same domain owner). The majority of those farms had far more than 3 ads; in one case, 15 ads on one page. Most were in the 6-9 range.

In addition, a lot of the farms had "near" duplicate content: very similar but not exact text, probably a 90-95% match.

Just going from that, I am guessing that those two factors are major in what Google is looking at for Panda.
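
For what it's worth, here is a crude Python way to put a percentage on "near duplicate" text between two pages. difflib's ratio is only a rough proxy for similarity, and the URLs are placeholders:

# Sketch: estimate how similar two pages' visible text is, along
# the lines of the 90-95% match described above.
import difflib
import re
from urllib.request import urlopen

def page_text(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    return re.sub(r"(?s)<[^>]+>", " ", html)

a = page_text("http://site-a.example.com/article.html")
b = page_text("http://site-b.example.com/article.html")
print("similarity: %.0f%%"
      % (100 * difflib.SequenceMatcher(None, a, b).ratio()))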

crobb305




msg:4303836
 9:56 pm on Apr 25, 2011 (gmt 0)

I'd also like a separate thread for movement reports. Thanks.


danijelzi, did you see this thread? [webmasterworld.com...]

DirigoDev




msg:4303875
 11:15 pm on Apr 25, 2011 (gmt 0)

I've analyzed 121 health websites (any site that moved 5 ranking positions in Hitwise week over week, 4/9 to 4/16). In my opinion, about 50 of these sites should not have been hit by Panda. Many of them have quality, original content running ~800 to ~2500 words. The big losers, for the most part, have lopsided link ratios (i.e. the ratio of inbound to outbound links is very uneven). It cuts both ways: too many inbound or too many outbound links. The winners all have very tight in-to-out ratios, in both directions.

Anyone have insight?
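
The outbound half of that ratio is easy to measure yourself; the inbound count has to come from your own link data, so it is passed in by hand in this Python sketch (the URL and inbound figure are placeholders):

# Sketch: the in/out link ratio described above. Outbound external
# links are counted from the page; the inbound count comes from
# your own data (WMT, etc.) and is an assumed figure here.
import re
from urllib.request import urlopen
from urllib.parse import urlparse

def outbound_links(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    my_host = urlparse(url).hostname
    hrefs = re.findall(r'href=["\'](https?://[^"\']+)', html, re.IGNORECASE)
    return sum(1 for h in hrefs if urlparse(h).hostname != my_host)

inbound = 140  # placeholder: from your own link data
out = outbound_links("http://www.example.com/article.html")
print("in: %d, out: %d, ratio: %.2f" % (inbound, out, inbound / max(out, 1)))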

whatson




msg:4303954
 4:47 am on Apr 26, 2011 (gmt 0)

I think that's just it: lots of thick, unique content. Not just for search results, but pages with information that your visitors might want to know.

I don't think ads play a major part in this, if any.

As for how they determine the source of the information, i.e. who wrote the initial content: it is most likely the copy indexed earliest in Google.

It is the only consistent pattern I can see. Perhaps some people who thought they were getting unique content written were actually getting content scraped from other sites.

Errioxa




msg:4304022
 10:33 am on Apr 26, 2011 (gmt 0)

Having many ads could be correlated with bounce rate and time on site.

If you have lots of ads, a higher volume of users will exit your site.

dataguy




msg:4304085
 1:40 pm on Apr 26, 2011 (gmt 0)

@DirigoDev I think you could be on to something; I hope you post more of your findings. Actually, I'd love to see a thread dedicated to your findings.

Your explanation could actually explain every aspect of this update that I see. For instance, assuming Googlebot can read JavaScript, the number of ads on a page would count toward the number of outbound links on that page, skewing the in/out ratio.

'Thin content' pages, page stubs and dupe-content pages wouldn't be likely to have inbound links, so they would skew the in/out ratio.

eHow, which was helped by the update, pays its writers, so their articles don't contain outbound links, which would help balance the in/out ratio. On the flip side, HuffPo, which was hurt by the update, adds 'related links' to each of its articles. I don't know of any other news site that does this. This could skew the in/out ratio.

I would guess that outbound link counts hurt more, or that there's a threshold of some sort. I'd love to see more of your findings.

Leosghost




msg:4304093
 1:52 pm on Apr 26, 2011 (gmt 0)

"assuming the Googlebot can read javascript"

They can do lots of things that at one time or another they choose not to use..which is why white text on white ground etc and other "spam techniques" get picked up on some sites etc and not on others..

And they serve most of their products via javascript..even their feeds to "premium" networks are mainly seen in javascript in the page source..for reasons that some should think about closely.

Read some of Google's source code on their pages..what they don't know about javascript, you could write on the back of a postage stamp.

pageoneresults




msg:4304098
 1:54 pm on Apr 26, 2011 (gmt 0)

eHow, which was helped by the update, pays its writers, so their articles don't contain outbound links, which would help balance the in/out ratio. On the flip side, HuffPo, which was hurt by the update, adds 'related links' to each of its articles. I don't know of any other news site that does this. This could skew the in/out ratio.


From a technical perspective, eHow is one of the fastest and most optimized sites I've seen on the list. They make minimal HTTP requests (30 for the home page), use CSS sprites effectively, and their server response times are blazing fast.

Huffington Post is a crawling nightmare: 225 HTTP requests weighing in at 2,651,821 bytes. The site is loaded with internal and external links, over 400 on the home page alone. The hard-coded internal 301s are a bit excessive. Their external linking practices leave much room for improvement; many of their externals are hard-coded 301s. I call these lackadaisical 301s, and they are rampant on larger sites. Many of the sites that took a hit have similar issues. The Huffington Post should be fined for user-agent abuse. That site is rife with errors; heck, they have 611 errors in their CSS file. I've never seen that many before.
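
You can approximate a request count like those above by counting resource references in the HTML. Real measurement tools count actual requests, so this Python sketch is only a proxy, and the URL is a placeholder:

# Sketch: approximate per-page HTTP request count by counting
# distinct src/stylesheet references in the HTML. Ignores requests
# triggered by CSS and JavaScript, so it undercounts.
import re
from urllib.request import urlopen

def resource_refs(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    srcs = re.findall(r'\bsrc=["\']([^"\']+)', html, re.IGNORECASE)
    css = re.findall(r'<link[^>]+href=["\']([^"\']+)', html, re.IGNORECASE)
    return 1 + len(set(srcs + css))  # +1 for the HTML document itself

print(resource_refs("http://www.example.com/"), "request(s), roughly")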

DirigoDev




msg:4304101
 1:59 pm on Apr 26, 2011 (gmt 0)

assuming Googlebot can read JavaScript


I can confirm that as of 4/11, Google can read JavaScript. Just after the 4/11 update, we started to see our original publication dates show up in the SERPs. Previously the SERPs showed our last-modified dates.

Here's the kicker: the original publication dates are in a JavaScript document.write script. This means that Google is reading the scripts. I cannot pinpoint an exact date; the first time we noticed the changed dates in the SERPs was 4/11. I'm pretty sure it was rolled out with the Panda update.

Anyone?
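
If you want to check whether a date on your own pages is only reachable through a script block (the document.write case described above), here is a small Python sketch; the URL and date string are placeholders:

# Sketch: test whether a date string appears only inside <script>
# blocks (e.g. a document.write call) versus in the static HTML.
import re
from urllib.request import urlopen

url = "http://www.example.com/article.html"
pub_date = "April 22, 2011"  # placeholder: your page's displayed date

html = urlopen(url).read().decode("utf-8", errors="replace")
scripts = " ".join(re.findall(r"(?s)<script\b.*?</script>", html,
                              re.IGNORECASE))
static = re.sub(r"(?s)<script\b.*?</script>", " ", html)

print("in script blocks:", pub_date in scripts)
print("in static HTML:  ", pub_date in static)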
