You might be affected by Panda if you do the following
brinked
msg:4306379
9:01 pm on Apr 30, 2011 (gmt 0)

I have been studying the effects of Panda ever since April 11. I have reviewed many sites that were negatively impacted, and I have even managed to recover a few. I have noticed many trends.

You may have been hit by panda if you do or have done the following:

1. Own one or more similar sites which all link back to your "main" business. We know that JCP was hit when Google found out they had many sites which all linked back to JCP. If you own one or more similar sites, it's best to follow good practice and take them down or merge them into your main site. I have seen several webmasters get hit for multiple sites ranking for the same/similar terms.

2. Content placement. This falls in line with my ads theory. The location of your content is becoming a very real factor for Google. This is backed up by Google's Instant Previews in the SERP pages, which outline exactly where the text is found. If someone searches for awesome widgets, and you have the words awesome + widgets way down on your page, Google might knock you back because that content will not be easy for the searcher to find.

3. Content stuffing. A very strong characteristic of an MFA (made-for-AdSense) site. Some of us can't help the temptation to try to fit all the keywords we are targeting into one small paragraph or a few sentences. Write your content for your visitors, not Google. In this case, some de-optimization is better than optimization. You can still rank for terms without having certain strong keywords found on your page.

4. Giving priority placement to ads over content. This is a big one. Browsing the Google webmaster forum, you see so many people complain they were hit and have no idea why. Then you visit their site, click an article, and the article is pushed all the way down by a huge AdSense block. This tells the visitor "click my ads" instead of "read my content". If your site is optimized for ad clicks rather than for quality content, you may very well have a problem.

5. Over-optimization and repetition. Panda is simply an update to Google's already highly complex algorithm. Over-optimization is an old factor, but it is always being updated, and Panda is no exception. Google is getting better at sniffing out webmasters who blatantly try to game their system, and they do not like it. Do not put your main keywords all over the place (titles, alt text, header tags, etc.); focus on having appropriate content.

6. Too many useless pages. Do you have a ton of useless pages with little to no useful content? Google may very well see this as an attempt to rank for many premium search phrases. If you have questionable content or complete sections of content that offer little to no actual value to the reader, it is probably best to delete or block it. Do you have a user-based website? Do your member profiles all really need to be in Google's index? Do you have tons of useless tag pages?
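For the "block it" option, here is a minimal sketch of what I mean, assuming a PHP site where you can inspect the request path (the /tag/ and /profile/ patterns are just examples; adjust them to your own URL scheme):

<?php
// Illustrative only: keep low-value pages (tag archives, member
// profiles) out of Google's index while still letting their links
// be followed. Emit this inside the <head> of the page template.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// Example URL patterns -- replace with your own.
$lowValue = (strpos($path, '/tag/') === 0)
         || (strpos($path, '/profile/') === 0);

if ($lowValue) {
    echo '<meta name="robots" content="noindex,follow">' . "\n";
}
?>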

It is important to have someone who is not biased review your site. On so many websites I have reviewed, the owner swears up and down he/she has no idea why their "extremely high quality and unique website" was hit; then when I take a look, I can spot many suspicious practices almost instantly.

Google Panda is not entirely about duplicate content; if that were the case, all of these scraper sites would not still be ranking.

I hope this helps push many people in the right direction. If you feel this information has helped you, please respond, and if you recover, please report back with full details.

 

tedster
msg:4306396
11:24 pm on Apr 30, 2011 (gmt 0)

Thanks for that summary. It is not easy to digest and condense all the Panda information.

the temptation to try to fit all the keywords we are targeting into one small paragraph or a few sentences

And that comes from following SEO "advice" that is 5-10 years behind the times. Google definitely has challenges and poor results for some queries, but their analysis is still miles beyond what it used to be, back when all the "basic SEO advice" was first published, and then recycled, and recycled, and...

TheMadScientist
msg:4306399
11:32 pm on Apr 30, 2011 (gmt 0)

Nice post brinked ... I just skimmed it, but it looks like a +1er and some good info / summary ... I'll come back to it later.

A couple of interesting notes:
Google says they do not have an 'over-optimization' penalty (5), per se, but that could possibly be translated to 'a bit on the spammy side' ... I think 2, 3, and 4 are all good points, and 6 is definitely something people should look at ... 1 is actually the really interesting one to me, because I know some things I don't know if I can post about because of the TOS and the Overstock issue, but there's a BIG site I know of that might use a strategy like 1. Anyway, like I said, I basically skimmed and read the bold, so I can't comment much more on the full post right now, and I'll think about 1 and try to figure out if there's a way I can post about it. Looks good though.

[edited by: TheMadScientist at 11:40 pm (utc) on Apr 30, 2011]

Leosghost
msg:4306402
11:40 pm on Apr 30, 2011 (gmt 0)

There are a lot of folks who have spent the last 2 months denying #2 and #4... to the point where it was not worth my time repeating that they are major factors in many cases... and that "preview" was a foretaste of them (Google) looking at what visitors saw.

Hopefully you'll have better luck than I did... and the thread can run without getting sidetracked with "It can't be ads, because all websites run ads", etc.,

or "Google is broken", etc. etc.

It is important to have someone who is not biased review your site. On so many websites I have reviewed, the owner swears up and down he/she has no idea why their "extremely high quality and unique website" was hit; then when I take a look, I can spot many suspicious practices almost instantly.

A definite +1 for that. And I would add: have the site tested for usability too, and for overall impression, look, and feel.

ascensions
msg:4306409
11:58 pm on Apr 30, 2011 (gmt 0)

I believe the ads are the #1 factor; however, I'd love to see an example where one of the Panda winners was displaying the opposite behavior.

Planet13
msg:4306416
1:16 am on May 1, 2011 (gmt 0)

I believe the ads are the #1 factor; however, I'd love to see an example where one of the Panda winners was displaying the opposite behavior.


I don't know if this fits exactly with your question, but...

I was searching for something on Google (don't remember what it was now), and about.com came up number 1.

So, when I go to the about.com page, the first thing I get is a CSS popup window ad (I think it was CSS, might be JavaScript - but you get the idea). I have to close it to look at the content of the page.

Then there was about two paragraphs of text, maybe 70 words total, then a link to page 2.

I didn't go on to page 2, because I saw that the article was going to span 7 more pages, and probably they would all follow the same format.

Of course, about.com has lots of ads surrounding the content - not to mention that popup.

So, either about.com is whitelisted, or Google has a more subtle policy than Ads = Bad.

ascensions
msg:4306421
1:49 am on May 1, 2011 (gmt 0)

But about.com got hit, according to SEW and NYT...

...and I don't think ads surrounding content are necessarily bad. I do think where they appear on the screen has a lot to do with it. Right-aligned ads seemed to be favored, not only on Google properties but on many of the "winners". (For above-the-fold placement.)

I assume you are granted some level of "trust", if you will, based on other factors, with respect to how many ads you can have, how far left they can sit, and how many can be above the fold. "A-list" sites (news orgs.) would of course be allowed more leniency than "B-F" sites, but I think the safe call is that if you're hit by Panda, you're likely not an "A-list" site, and you should probably limit the number of ads as well as their position to (oh, say) the right alignment.

I think the whole "thin" content thing was misinterpreted by webmasters as meaning solely a lack of text, but I'm more inclined to believe "thin" means a lack of text in the presence of ads.

I'm seeing observable effects from word-counting pages and dynamically eliminating ads, with PHP, on pages smaller than 250 words.
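Roughly, the idea looks like this - a simplified sketch, not my production code; the 250-word threshold, $articleHtml, and the ad include are placeholders:

<?php
// Sketch: suppress ad blocks on pages below a word-count threshold,
// so thin pages carry no ads at all.
function page_supports_ads($contentHtml, $minWords = 250) {
    $words = str_word_count(strip_tags($contentHtml));
    return $words >= $minWords;
}

// In the template ($articleHtml holds the page's main content):
if (page_supports_ads($articleHtml)) {
    include 'adsense_block.php'; // placeholder for the real ad unit
}
?>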

I just had my sitelinks come back tonight, hoping to see this theory play out.

tedster
msg:4306422
1:52 am on May 1, 2011 (gmt 0)

the whole "thin" content thing was misappropriated by webmaster

Absolutely. Google engineers never used the word "thin"; they used the word "shallow".

AlyssaS
msg:4306424
2:07 am on May 1, 2011 (gmt 0)

I would add "pages with duplicate subject matter" as a subset of #6, "too many useless pages".

For instance, eHow has pages and pages on how to boil an egg. Each has a very slightly different title - "how to boil the perfect hard-boiled egg", "how to boil the perfect soft-boiled egg", "how to hard boil an egg", "how to boil an egg in a microwave".

They all have slightly different titles, and the text is worded differently so they pass Copyscape and are supposedly "unique" - but arguably they are all about the same thing and could be condensed into a single 1,500-word page with subheadings for hard-boiled, soft-boiled, microwave-boiled, and so on.

I think if you have too many of these overlapping-subject pages, you get flagged, as this is not the natural way to explore a subject.

It costs G time and money to index these pages, and when every site is producing 15 pages on something that warrants a single page, it adds up to a lot of garbage.

Leosghost
msg:4306426
2:16 am on May 1, 2011 (gmt 0)

I get no popup on about.com... at least Opera doesn't say it blocked one... and IIRC, about.com is a premium partner, so the ads-to-content ratio allowed before getting more than a tap from Panda may well be a little higher.

That said, their ads are way less intrusive than many (bearing in mind that due to personalisation and IBA, none of us will be seeing the same ads, even "signed out"), and their ads take up only about 30% of the total viewing area that Google talks about in their "best practices" re AdSense and ad placement.

Also, I just had to run way more searches than I previously would, for the kind of things they used to be in the top 5 for, to get about.com to show up in the top 10. They definitely did get hit... but not "mauled" ;-)

kd454
msg:4306428
2:46 am on May 1, 2011 (gmt 0)

4. Giving priority placement to ads over content.

I am thinking this is what happened to my sites. Spent the entire week fixing this and still not fully finished.

I am seeing some slight improvement on the first site I did, on Monday. Hoping next week will show whether this plays out. I went page by page on the site I have done so far, and this is the only thing I could find.

Getting some sitelinks back on some sites so hopefully this is a positive sign.

Putting ads inline with the content up top looks like crap and I can see why they would have an issue with this.

Funny, my CTR jumped up on the sites where I moved the AdSense a few paragraphs below the fold; too bad the ole AdSense team was pushing this placement... grrrrr

I have looked at hundreds of sites affected, and this was a common theme among many. Then I went and looked at sites that took the top positions, and all either had no AdSense, or it was below the fold or not obstructing the content.

Another thing I noticed: if a site looked cluttered and was not easy to navigate, that was a factor.

It is important to have someone who is not biased review your site.


Did this, and it became obvious AdSense placement was the #1 factor :) When the wife told me she would back out of that page without a thought due to the ads, the light went off.

brinked
msg:4306431
3:01 am on May 1, 2011 (gmt 0)

When someone clicks an article and is brought to a page where all they see are ads, with the content pushed below the ads, the user gets the sense there is no story, and a lot of people's instinct is to click on one of the ads in an attempt to get what they wanted.

This is partly Google's fault. If they strongly feel this way, they should do away with the block/square ads, because they are very bulky and it's a challenge to place them in such a way that they won't conflict with the page content.

bramley
msg:4306439
4:06 am on May 1, 2011 (gmt 0)

In short, shallowness is the key, per page and per site. Shallowness here being the ratio of useful content, sensible layout, and navigation versus advertising (and its position), content that is repetitive or stuffed (oversight or gaming), and content not of real general interest (pages that should be noindexed to reduce site-wide shallowness).

To put it another way, shallowness is the absence of quality.

oh well, forget it ... whatever ...

[edited by: bramley at 4:52 am (utc) on May 1, 2011]

crobb305
msg:4306443
4:24 am on May 1, 2011 (gmt 0)

Good summary, Brinked. I am seeing recovery (as I have been reporting), but I feel like I am walking on eggshells with Google. I fear that if I sneeze while looking at my website, it will fall back into penalty.

crobb305
msg:4306444
4:29 am on May 1, 2011 (gmt 0)

Google says they do not have an 'over-optimization' penalty (5), per se, but that could possibly be translated to 'a bit on the spammy side'


Exactly. They have phrase-based document scoring in place (I see it in action on my site right now for one of my phrases). Per their patent, a document can be labeled as "spam" when phrases occur outside of the "expected" probability distribution. [seobythesea.com...]

"a spam document will have an excessive number of related phrases, for example on the order of between 100 and 1000 related phrases. Thus, the present invention takes advantage of this discovery by identifying as spam documents those documents that have a statistically significant deviation in the number of related phrases relative to an expected number of related phrases for documents in the document collection."

Note that there is a big difference between being labeled a "spam" document and simply not ranking well. Phrase-based document scoring might trigger a ranking reduction for a phrase (again, when it occurs more or less than expected); then, at some point, it could peg the whole document as spam, and you get spanked.
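As a toy illustration of the patent's idea (nothing like Google's actual implementation - the expected mean, the deviation threshold, and the phrase list are made-up; assume $documentText and $relatedPhrases hold the page text and the phrase list):

<?php
// Toy model: flag a document whose related-phrase count deviates
// significantly from the expected count for the collection.
function related_phrase_count($text, array $phrases) {
    $count = 0;
    foreach ($phrases as $phrase) {
        $count += substr_count(strtolower($text), strtolower($phrase));
    }
    return $count;
}

$expectedMean  = 8.0; // assumed average related-phrase count
$expectedSigma = 3.0; // assumed standard deviation

$count  = related_phrase_count($documentText, $relatedPhrases);
$zScore = ($count - $expectedMean) / $expectedSigma;

// A statistically significant excess earns the "spam" label.
$looksLikeSpam = ($zScore > 3.0);
?>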

Planet13
msg:4306447
5:32 am on May 1, 2011 (gmt 0)

@AlyssaS:

For instance, eHow has pages and pages on how to boil an egg.

But didn't eHow come out as a winner? I am certainly seeing it a lot more in the SERPs...

@crobb305

I would like to hear more about what you are finding regarding phrase-based scoring.

"...a spam document will have an excessive number of related phrases, for example on the order of between 100 and 1000 related phrases."

The odd thing about that (related to AlyssaS' post) is that eHow seems to use a lot of related phrases (e.g., how to boil an egg, how to microwave an egg, etc.).

And also I am seeing LOTS of keyword-stuffed pages still ranking well - and even lots of keyword-stuffed hidden-text pages doing really well. It's like 1999 all over again...

caribguy
msg:4306455
7:22 am on May 1, 2011 (gmt 0)

Brinked, brilliant again; AlyssaS, thank you... Both confirm my findings. I wish I had time to benchmark other niches or high-volume sites.

"Make sites for people" seems to be the new (old) mantra. Don't worry too much 'about' exceptions; that's not you.

shallow
msg:4306497
12:02 pm on May 1, 2011 (gmt 0)

Giving priority placement to ads over content. This is a big one.


About two years ago, an AdSense Optimization team member reviewed my site, made suggestions as to ad placement and "approved" the layout. I also hired a consultant who suggested where I place ads.

The main AdSense ad (728 x 90) on every page, except the home page, is under the header area and above the article title and text. I usually have a photo at or near the beginning of an article.

Article text always shows above the fold (though a good part of the text is usually below the fold, depending on the size of the photo or illustration used). My site is an educational site about all aspects of a class of widgets: explaining features, how to use, etc.

Post-Panda: Is everyone now saying there should be no ads above the fold?
Should I remove the leaderboard?

I have three AdSense ads on a page: top, right sidebar, and at the end of the article.

Any advice would be much appreciated.

walkman
msg:4306502
12:28 pm on May 1, 2011 (gmt 0)

Too many tag pages are enough to do you in, although some sites I know have escaped that. Suppose I add [panda, thin content, penalty] as tags to this thread; for all three, all you see is the same sentence with a slightly different title. Before Panda, Google would have just ignored them; now it looks like they slam you with a penalty and drive you out of business.

The horrible news is that Google has no intention of taking changes into consideration for the time being. I made my changes on 2/25 and more 2-3 weeks later, and just lost another 30% of traffic.

scooterdude
msg:4306503
12:32 pm on May 1, 2011 (gmt 0)

Such definitive statements of knowledge :)

Actually, I think you should all read Planet13's post above.

I have used Google relentlessly, even when they were bad to me, yet I've never seen the SERPs so bad for all queries, especially the techie ones I do.

Now I find myself using Yahoo UK more and more.

Why?

Because Bing's SERPs are so, so close to the rubbish I see in Google.

ascensions
msg:4306504
12:34 pm on May 1, 2011 (gmt 0)

@Shallow - From what I'm seeing, your leaderboard should be fine as long as it's not in between the title and the text. If it's above the title, it should be fine... as should the other placements. The ones Google seems to be targeting are the left-floated/aligned 336px/300px boxes everyone is/was using, which push content down.

Where I'd be careful is with how many ads you have relative to the amount of content. Say 1 ad per 250 words. If you have 3 ads on a page with 100 words, I'd question that. Well, I wouldn't... The Google Regime would.
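In code terms, the rule of thumb I'm describing is just this (the 250 figure is my own guess, not anything Google has published; $pageHtml and $adCount are placeholders):

<?php
// Rule of thumb: allow at most one ad unit per 250 words of content.
$words  = str_word_count(strip_tags($pageHtml));
$maxAds = (int) floor($words / 250);

// e.g. 3 ads on a 100-word page: floor(100 / 250) = 0 allowed -> fails.
$tooManyAds = ($adCount > $maxAds);
?>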

I think another important element here is screen width. Almost all the winners I've seen are sitting at a width of no less than 975px, which means more content can be above the fold even with ads.

Of course, I really must criticize Google on this: if this is the main thrust of Panda, it does little to improve the quality of the web; it merely makes things look better. More to the point, the overall web will be less diverse and contain less content because of their censoring of those who don't adhere to their model. Which is the reason we're seeing scrapers, and information unrelated to our searches.

The other day I took my CR-48 on a road trip, and on searches for which I knew the expected SERP, I got rubbish. Couldn't even find the addresses of local restaurants. Heck, couldn't even find their websites. It kept bringing stuff up in Vermont, even though I'm in NC... and I kept adding "Greensboro, NC" to the search phrase. If this current algorithm is working for Google, good for them, but I say this with no cynicism: I don't like the direction Google has gone with this, and I find it completely out of character for the company. If I owned stock, I'd sell it based on what I've seen in the last three months. I would have been reluctant to say that when it first happened, but the apparent behavior of a hubris-blinded Google, unwilling to change in light of the obvious overwhelming evidence to the contrary, suggests to me something isn't right here.

If Microsoft were smart, they'd use this discontent to entice the many AdSense publishers into their program. Google needs to step up, show some responsibility for their internal customers (us), and throw us a bone.

londrum
msg:4306507
12:49 pm on May 1, 2011 (gmt 0)

For instance, eHow has pages and pages on how to boil an egg. Each has a very slightly different title - "how to boil the perfect hard-boiled egg", "how to boil the perfect soft-boiled egg", "how to hard boil an egg", "how to boil an egg in a microwave"... It costs G time and money to index these pages...


At the end of the day, this is what every algo change is about now, in my opinion. It's got very little to do with providing better quality SERPs, and more to do with reducing costs for Google.

Imagine how much it costs Google to send out spiders and read billions and billions of pages every day, and store all the different permutations. It must be astronomical. And the web is getting bigger every day; the costs just spiral up and up.
So what does Google do? They make suggestions as to what webmasters should do to get better rankings... which happen to save them a packet of money too.

Think about it... what suggestions has Google made in the last year or so?

1) Speed of pages. They "suggested" that speed might play a part in the algo... so every webmaster immediately went out, reduced their page weight, and offloaded loads of stuff. There was never any evidence that speed played a part when they announced it, and even now they are only saying that it will be used as a "tiebreaker". A tiebreaker? How many different ways are there to rank pages these days? There are hundreds of different on-page and off-page factors.
Google is not dumb enough to demote a brilliant page ten places down the SERPs just because it takes half a second more to load.
So the only effect this had was reducing Google's costs -- both in spidering the web and in storing its pages.

2) Punishing "thin" pages. Apparently the new rule is this: if your site has some thin pages, then your entire site will get punished (unless your "trust" is so high that you can get away with it).

But there is no sense in demoting an entire site (which might otherwise be good) because 5% of it is thin. It makes no sense. The obvious thing would be to just demote those thin pages. Why would Google demote the good ones as well? That is like throwing the baby out with the bathwater.
For example, lots of blogs have tag pages. It's normal. But even if the actual posts and content are the best in the world, of Pulitzer Prize quality, and written by William Shakespeare himself, Google is still "suggesting" that they will demote your entire site because 5% of it is tag pages.
That just makes no sense whatsoever. Nobody can suggest otherwise.
But, lo and behold, webmasters all around the world have been panicked into binning thousands upon thousands of pages from their sites and noindexing thousands upon thousands more...
...saving Google money on spidering and storing the web.

That is what it is all about. We are just doing Google's bidding.

And as for duplicate content... I think Google has just given up trying to work out who wrote what first. And in a way I don't blame them.

If a good site nicks something from a rubbish one, why should they have to rank the rubbish one first? That is not what users want. They'd rather visit the better site, even if the material is second-hand.
If Google has determined that the second site is better, what do they actually get by ranking the original writer first? Nothing. Zip. The only people who care are the writers themselves, but Google cares about the searchers, not the writers.
And remember that they are not legally obliged to do it... it costs them money to work out who wrote it first, because they have to store snapshots of the web from way back when. In my opinion, they have just washed their hands of the problem to save themselves money.

That is what every algo change is about now... ways to save Google money.

AlyssaS
msg:4306514
1:23 pm on May 1, 2011 (gmt 0)

@AlyssaS:

For instance, eHow has pages and pages on how to boil an egg.

But didn't eHow come out as a winner? I am certainly seeing it a lot more in the SERPs...


They dropped in Panda II. For some terms I watch, they used to hold the top three positions with their "multiple versions of the same page". Now they have just one entry at #5.

I think HubPages has similar issues to eHow, with umpteen pages on the same subjects, all worded differently so as to be "unique", but all essentially saying the same thing.

P.S. I know some people are taking the SERPs we are seeing just now as gospel, but I think we are in some sort of transition, with a lot of testing going on.

ascensions
msg:4306515
1:42 pm on May 1, 2011 (gmt 0)

Just my opinion, but I don't think crawling is a serious cost problem at this time, given their infrastructure's capacity.

I also believe the size of the web in its totality is much smaller than any of us anticipate. Matt Cutts has said they could literally download the entire thing in one instantaneous crawl if the end publisher/user's equipment could handle it.

There just aren't enough people publishing content on the web. That's exactly why Panda is broken. Sure, it may have strangled some of the rubbish, but a lot of good content got caught in the net. If things don't change, it's also going to be much more prohibitive for new publishers to come on the scene.

walkman
msg:4306517
1:49 pm on May 1, 2011 (gmt 0)

They dropped in Panda II. For some terms I watch, they used to hold the top three positions with their "multiple versions of the same page". Now they have just one entry at #5.

There is no doubt in my mind that G set out to purposefully hurt eHow in Panda II, given the horrible press and blog reactions Google got when eHow actually rose on Panda. It was everyone's rallying cry: if content is supposedly important, HTF is a "How to boil water" eHow article well researched?

Also, eHow's case destroyed everything that Google said about Panda; it was utterly embarrassing. But the fact is that they passed Panda algorithmically.

Next, I hear eHow might reveal the ice-making formula in a series of articles :)

IMO, Goog has no problem with crawling; it isn't about cost, more about managing what they find.

AlyssaS
msg:4306535
3:19 pm on May 1, 2011 (gmt 0)

I wish I hadn't used ehow as my example as it always sidetracks the discussion.

I'll try again. What I was trying to say was that while you still see sites with multiple listings in the SERPs, they are now for pages with related subjects, not the exact same subject.

Before, a site might get three listings for "blue widgets", "best blue widget", and "my blue widgets" - pages which all had blue widgets as their subject matter and all said the same thing worded differently, usually featuring the exact same brand.

Now if a site has multiple listings, the related pages seem to be actually related: so for the term "blue widgets", you get "blue widgets brand x", "blue widgets brand y", "blue widgets brand z". The pages are different as they are talking about different brands.

Anyone else seeing this? (I'm sorry if I'm not explaining this properly - it's hard to do without citing real examples).

netmeg
msg:4306543
3:46 pm on May 1, 2011 (gmt 0)


P.S. I know some people are taking the SERPs we are seeing just now as gospel, but I think we are in some sort of transition, with a lot of testing going on.


I agree with this (and pretty much everything else said by AlyssaS) 100%.

shallow
msg:4306544
3:47 pm on May 1, 2011 (gmt 0)

@ascensions, thank you very much for the information; it was very helpful.

You wrote in another post:

If things don't change, it's also going to be much more prohibitive for new publishers to come on the scene.


If things don't change, it's going to be prohibitive for older publishers (individuals, not corporations) to stay on the scene, or at least to maintain any type of enthusiasm for it. For the past four years I've paid web designers and web developers a lot of money to improve my site; I've also paid writers for unique and well-written articles (in addition to my own).

When my income, all within less than a few weeks, drops 40-70% on any given day, there are no funds left for improvement (not to mention how this affects other aspects of my life).

Besides, why should I invest any more money in my site when Google may come along whenever they want and take the wind out of it again?

Roaming Gnome
msg:4306546
4:07 pm on May 1, 2011 (gmt 0)

Thank you, Brinked. A lot of what you posted is in line with what I figured also. +1

Leosghost
msg:4306547
4:09 pm on May 1, 2011 (gmt 0)

Anyone else seeing this?

Yes... and I agree with Netmeg. AlyssaS's observations and analysis of the "non-ads" part of Panda, particularly in relation to multiple listings, are very, very close to what I'm seeing and have been seeing for a while... and it definitely is not done yet. We are going to be dealing with this being refined all year, and maybe beyond.

The system has changed radically.

There appears to be a lot of A-B (and A-B-C, etc.) testing going on. Certain KW or KW1+KW2 searches get different SERPs depending on the time of day (it appears still to be cycling around 12 hours), and the number of results returned oscillates, but rarely moves much from what appears to be a set upper and lower limit. The top position moves on each swing, but there is way more movement from positions 2 to 5, and again from 7 to 10 (I'm not looking at page 2, because IME average surfers don't go there enough for me to be interested). Some junk gets put in sometimes, but 12 hours later it is gone, and it isn't always the same junk. It isn't broken; it's morphing.
