You might be affected by Panda if you do the following
brinked
msg:4306379
9:01 pm on Apr 30, 2011 (gmt 0)

I have been studying the effects of Panda ever since April 11. I have reviewed many sites that were negatively impacted, and I have even managed to recover a few. I have noticed many trends.

You may have been hit by Panda if you do or have done the following:

1. Own one or more similar sites which all link back to your "main" business. We know that JCP was hit when Google found out they had many sites which all link back to JCP. If you own one or more similar sites, it's best to follow good practice and take them down or combine them into your main site. I have seen several webmasters get hit for multiple sites ranking for the same or similar terms.

2. Content placement. This falls in line with my ads theory. The location of your content is becoming a very real factor to Google. This is backed up by Google's Instant Previews in the SERPs, which outline exactly where on the page the text is found. If someone searches for awesome widgets and you have the words awesome + widgets way down on your page, Google might knock you back because that content will not be easy for the searcher to find. (A rough audit sketch follows this list.)

3. Content stuffing. A very strong characteristic of an MFA (made-for-AdSense) site. Some of us can't resist the temptation to fit all the keywords we are targeting into one small paragraph or a few sentences. Write your content for your visitors, not for Google. In this case, some de-optimization is better than optimization. You can still rank for terms without having certain strong keywords on your page.

4. Giving priority placement to ads over content. This is a big one. Browsing the Google webmaster forum, you see so many people complaining they were hit and have no idea why. Then you visit their site, click an article, and the article is pushed all the way down by a huge AdSense block. This tells the visitor "click my ads" instead of "read my content". If your site is optimized for ad clicks rather than for quality content, you may very well have a problem.

5. Over-optimization and repetition. Panda is simply an update to Google's already highly complex algorithm. Over-optimization is an old factor, but it is always being refined, and Panda is no exception. Google is getting better at sniffing out webmasters who blatantly try to game its system, and it does not like it. Do not put your main keywords all over the place (titles, alt text, header tags, etc.); focus on having appropriate content.

6. Too many useless pages. Do you have a ton of useless pages with little to no useful content? Google may very well see this as an attempt to rank for many premium search phrases. If you have questionable content, or complete sections of content that offer little to no actual value to the reader, it is probably best to delete or block it. Do you have a user-based website? Do your member profiles all really need to be in Google's index? Do you have tons of useless tag pages?
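
Here is the audit sketch referenced above: minimal Python, assuming the requests and beautifulsoup4 packages are installed and a hypothetical urls.txt listing your pages (one URL per line). The thresholds are guesses; tune them for your own site.

# Flags thin pages and pages where a target phrase first appears far
# down the visible text. urls.txt is a hypothetical file of page URLs.
import requests
from bs4 import BeautifulSoup

TARGET_PHRASE = "awesome widgets"   # hypothetical example phrase
MIN_WORDS = 300                     # thin-page threshold (a guess)

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()                          # drop non-visible text
    text = " ".join(soup.get_text().split())
    word_count = len(text.split())
    pos = text.lower().find(TARGET_PHRASE)
    if word_count < MIN_WORDS:
        print(f"THIN ({word_count} words): {url}")
    if pos < 0:
        print(f"PHRASE MISSING: {url}")
    elif pos > 2000:                             # buried ~2,000+ chars down
        print(f"PHRASE BURIED (char {pos}): {url}")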

It is important to have someone who is not biased review your site. On so many websites I have reviewed, the owner swears up and down that he or she has no idea why their "extremely high quality and unique website" was hit; then, when I take a look, I can spot many suspicious practices almost instantly.

Google Panda is not entirely about duplicate content; if it were, all of these scraper scripts would not still be ranking.

I hope this helps push many people in the right direction. If you feel this information has helped you, please respond, and if you recover, please report back with full details.

 

ascensions
msg:4306551
4:34 pm on May 1, 2011 (gmt 0)

Besides, why should I invest any more money in my site when Google may come along whenever they want and take the wind out of it again?


Fully agree. In fact, I went from writing 2,500-word pieces several times a week to none since this happened. At first I spent lots of time trying to "fix" things; now I'm just not very motivated.

When I think about writing, I end up second-guessing half the content because I think "Will Google further penalize me?" It's rather unsettling.

I'm at the point where, if revenue doesn't change, my next step will be to pull all AdSense; AdSense that used to earn me $$$$.$$ a month before March. If there's no money, I'll also no-index the site from Google, not to be vindictive but to see if it has any effect on things.

I ask Google: is this the behavior you intend to create? I'm quite certain many of you aren't far behind me in my thinking.

shallow
msg:4306567
5:33 pm on May 1, 2011 (gmt 0)

I went from writing 2,500-word pieces several times a week to none since this happened.


I started publishing 1-3 articles a week about a year ago. Now, with rare exception, I'll publish one a month.

I'm at the point where, if revenue doesn't change, my next step will be to pull all AdSense; AdSense that used to earn me $$$$.$$ a month before March.


Same here, though I don't think it's AdSense per se. I think it's the dramatic change in traffic as a result of Panda. I note some of my other affiliate ad sales are down too. In my case, Feb-May tends to be the slowest time of the year, so that is adding to the problem, but it's only a very minor part of it... time will tell.

Other than writing and creating graphics, I do not have the skills to work on the back-end of my site. So I hire a very fine web developer who does back-end work (when I have the money to afford him).

So I'm playing with the AdSense ad colors a bit.

For what it's worth, I changed the font color of the title on my text ads from one that blended in to one with higher contrast (e.g. from blue to orange).

CTR returned to previous levels, though not income, because of the large drop in traffic and lower pay-per-click. Well, at least one thing went up instead of down.

crobb305
msg:4306569
5:45 pm on May 1, 2011 (gmt 0)

Do you have a ton of useless pages with little to no useful content?

I think this is becoming easier to detect. With people sharing content on the web, I think we might expect to find a different inbound link distribution than we did 5 years ago. Nowadays, people will share the direct URL rather than just the homepage. So sites that have been built with inbound links only to the homepage probably need to rethink this model. By eliminating useless pages (especially those with no inbound links even after 1+ years), you might improve your inbound link distribution.
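
As a rough illustration of checking that distribution, here is a minimal sketch. It assumes a hypothetical backlinks.csv export from your backlink tool, one row per inbound link, with a "target" column.

# What share of inbound links point at the homepage vs. deep pages?
# backlinks.csv is a hypothetical export with a "target" column.
import csv
from urllib.parse import urlparse

homepage_hits = 0
deep_hits = 0

with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        if urlparse(row["target"]).path in ("", "/"):
            homepage_hits += 1
        else:
            deep_hits += 1

total = homepage_hits + deep_hits
if total:
    print(f"homepage: {homepage_hits / total:.0%}, deep pages: {deep_hits / total:.0%}")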

suggy
msg:4306876
11:10 am on May 2, 2011 (gmt 0)

"Giving priority placement to ads over content" = WRONG

I think you are confusing symptoms with the disease here.

I don't think Google cares where you have ads on your page, or where your content is. They don't need to. They only need to know how a sample of visitors reacted to your page (in its entirety). Bounce rate wraps all this placement nonsense up nicely (and includes relevance). As does a visitor hitting an AdSense link <1 sec after arriving ;-). After all, isn't that what MFA sites are aiming for?!

My website is a commerce site with no ads at all and was hit hard by Panda. In my case, I don't think we were hit by this part of the algo. More likely, we had too many derivative landing pages.

My take on Panda:

1. Page content metrics act as a filter to trap high-probability spam (pans individual pages).

2. A page-level user interaction metric, i.e. do people click on this link, and what happens when we send them there? (pans individual pages).

3. A site quality metric based on the site-wide average of the above two metrics (downgrades site authority, so even your good pages slide 10 places).

Of course, this is all speculation!
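
To make point 3 concrete, here is a toy numeric sketch of the idea. Every page, score, and weight in it is invented purely for illustration; the real metrics, whatever they are, are Google's secret.

# Toy model: a site-level score computed as the average of per-page
# scores, then applied as a demotion even to the good pages.
page_scores = {
    "/good-guide": 0.90,
    "/ok-article": 0.60,
    "/thin-tag-page": 0.10,
    "/doorway": 0.05,
}

site_quality = sum(page_scores.values()) / len(page_scores)  # 0.4125

for page, score in page_scores.items():
    effective = score * site_quality   # even a 0.90 page slides
    print(f"{page}: page={score:.2f} site={site_quality:.2f} effective={effective:.2f}")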

superclown2
msg:4306883
11:38 am on May 2, 2011 (gmt 0)

1. Own one or more similar sites which all link back to your "main" business. We know that JCP was hit when Google found out they had many sites which all link back to JCP. If you own one or more similar sites, it's best to follow good practice and take them down or combine them into your main site. I have seen several webmasters get hit for multiple sites ranking for the same or similar terms.


So how would Google know, if the sites were on different servers with different listed owners? Do they know more about us than we think they do, I wonder?

anand84
msg:4306895
12:29 pm on May 2, 2011 (gmt 0)

They all have slightly different titles, and the text is worded differently so they pass Copyscape and are supposedly "unique", but arguably they are about the same thing and could all be condensed into a single 1,500-word page with subtitles for hard-boiled, soft-boiled, microwave-boiled and so on.


While that appears to be what Google is trying to do here, I think it makes sense (as a Google searcher) to prefer articles that answer my questions to the point. As a user, I would like the answer to "When is Earth Day" to be a result that talks about exactly that instead of, say, a Wikipedia page giving me a comprehensive history of the event, where I have to search for my answer.

londrum
msg:4306905
1:13 pm on May 2, 2011 (gmt 0)

While that appears to be what Google is trying to do here, I think it makes sense (as a Google searcher) to prefer articles that answer my questions to the point. As a user, I would like the answer to "When is Earth Day" to be a result that talks about exactly that instead of, say, a Wikipedia page giving me a comprehensive history of the event, where I have to search for my answer.


There is no future in these kinds of searches for webmasters, because Google is increasingly providing the answers itself. They don't want to send the visitors anywhere.

For example, search for "eiffel tower height".

Google provides the actual answer at the top of the SERPs, which it has scraped from the sites below.

maha
msg:4306938
2:59 pm on May 2, 2011 (gmt 0)

1. Own one or more similar sites which all link back to your "main" business. We know that JCP was hit when Google found out they had many sites which all link back to JCP. If you own one or more similar sites, it's best to follow good practice and take them down or combine them into your main site. I have seen several webmasters get hit for multiple sites ranking for the same or similar terms.


How does G determine who actually owns these "similar" sites? Does that mean I can create similar sites and link to my competitors and bring them down?

bhartzer
msg:4306942
3:06 pm on May 2, 2011 (gmt 0)

How does G determine who actually owns these "similar" sites?

Google does have access to the domain whois data, so that might be one way.
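
For what it's worth, here is a rough way to eyeball that yourself. A sketch only: it shells out to the standard whois command-line client (which must be installed), whois output formats vary by registry, and privacy-protected registrations won't reveal much.

# Compare registrant lines across domains via the system whois client.
import subprocess

domains = ["example.com", "example.net"]   # hypothetical domains

for domain in domains:
    out = subprocess.run(["whois", domain], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "Registrant" in line or "Email" in line:
            print(domain, "|", line.strip())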

ascensions
msg:4307042
7:39 pm on May 2, 2011 (gmt 0)

My website is a commerce site with no ads at all and was hit hard by Panda. In my case, I don't think we were hit by this part of the algo. More likely, we had too many derivative landing pages.


It could be the landing pages, but do the links on your site lead to any site that sells something?

I found that "ads" aren't necessarily traditional ads: some of my most heavily affected pages were quality pages where I linked to the domain where I sell my book. A real book, in bookstores and on Amazon... but Google treated my ordinary links recommending my book as ads. No heavy keywords or anything like that. If you link to a site selling something... Google considers it an ad now.

suggy
msg:4307303
12:15 pm on May 3, 2011 (gmt 0)

Hi ascensions

It's 99.9% pure-play e-commerce. We do the selling. We are McAfee Secure and PCI compliant, use VeriSign EV certificates, and don't use an out-of-the-box CMS or basket.

I say 99.9% because we have obscured affiliate links on about 5 out of 30,000 pages. I struggle to believe this is the cause of our problems!

The main problem is that we allowed on-site search pages to be indexed.

The secondary one was that we sliced our product categories up thinner and thinner to target more specific searches and optimised those pages with a few hundred words of content. You end up with pages that have far too much in common: the same links and pictures, and just a few words of derivative content (which is also too similar).

Finally, we got to like the idea that the site had such authority that all we needed to do was write a page for mid-tail search terms, point a few internal links at it, and we would rank. We built silos of these inter-linked pages. Panda threw them all out.

The key in all this is "written for search engines" + too similar / not unique enough. It seems better to have no words on the page than 250-500 "derivative" ones.

suggy
msg:4307304
12:18 pm on May 3, 2011 (gmt 0)

PS: I strongly believe that the reason plenty of unloved dross ranks right now is that all the real competition has shot itself in the foot! In short, we're all Pandalized (to some extent or other) and they are not!

ascensions
msg:4307332
1:52 pm on May 3, 2011 (gmt 0)

@Suggy

One of my first thoughts was that a portion of the algorithm is based on link structure. Whereas in previous models linking visualized as a sideways triangle with the tip pointing to the homepage, it's now the opposite: the tip points to the terminus page. It fits with all the winners. Sites that use intermediary "topic pages" or "search pages" would definitely suffer, as their primary design is to mathematically "spread out" whatever authority the homepage has.

It's my opinion that Google wants us to keep such pages out of the index, perhaps providing a path for indexing that doesn't equalize link weight across pages but emphasizes the newest and most relevant content first.

In other words, if every page has approximately the same authority due to a (negative) linking pyramid, then the site is most likely designed to be a "farm". On the other hand, if the link map shows that the freshest and most relevant content is commonly the most authoritative, and that authority then flows to other pages, that, to me, would indicate quality.

londrum
msg:4307336
2:04 pm on May 3, 2011 (gmt 0)

The problem with that theory is that it ignores bazillions of user searches.

If I have a page which just lists all the shoes I sell, then you're saying that Google will consider it a "farm" page and favour the actual individual shoe pages it leads to instead.

But what if a user searches for "shoes"? He doesn't want a page with one shoe on it. He wants a page that lists the whole lot, so he can view a selection.

The same goes for "hotels" and any number of other search terms.

suggy
msg:4307379
3:33 pm on May 3, 2011 (gmt 0)

Hi @ascensions

I think I understand what you mean. Basically, a large chunk of content that appears like a growth on the side (or even a smaller one, maybe) stinks of landing pages / doorway pages?

This would be logical, since these pages do not appear within the narrative of the site (if they are out on a limb, what chance does the average visitor who didn't land on one have of seeing them?!).

Not sure I fully understand how the triangle could be completely upside down / reversed / inverted, though?

Suggy

ascensions
msg:4307383
3:42 pm on May 3, 2011 (gmt 0)

I think of it like this: the way SEOs used to build sites is a hollow pyramid (the homepage being the top, topic pages at the bottom). Start pouring water (authority) in the top, and all the topic pages at the bottom get some (almost equally).

Now flip that pyramid upside down and pour the water in from what was the bottom, and suddenly things change. Why? Because it's a lot easier to build links to a single homepage and let authority trickle down (the right-side-up pyramid). Links pointing directly to the topic pages (the bottom of the pyramid), however, are much harder to build, and are therefore more likely to be natural.
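
A toy PageRank run shows the difference between the two pyramids. A sketch only: it assumes the networkx package, and both link graphs are invented.

# Same internal structure, different external links: all to the
# homepage (right-side-up pyramid) vs. all to topic pages (inverted).
import networkx as nx

pages = ["home", "topic1", "topic2", "topic3"]
internal = [("home", t) for t in pages[1:]]    # home links to each topic

g1 = nx.DiGraph(internal + [(f"ext{i}", "home") for i in range(6)])
g2 = nx.DiGraph(internal + [(f"ext{i}", pages[1 + i % 3]) for i in range(6)])

for label, g in [("external links to homepage", g1),
                 ("external links to topic pages", g2)]:
    pr = nx.pagerank(g)
    print(label, {p: round(pr[p], 3) for p in pages})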

suggy
msg:4307401
4:10 pm on May 3, 2011 (gmt 0)

Ah, got it. You're talking about external links; I am talking about internal link structure. Sorry.

It's an interesting point on external links, though. However, I think even 10 years down the line it's still mainly in Google's dreams that mainstream users deep-link to specific pages. Many journalists, etc., still don't know their way around a hyperlink and default to sticking the domain name in and hoping someone will make it clickable.

That aside, I did notice around this time last year that Google suddenly seemed less satisfied with our home page. It had ranked well for a broad range of extremely competitive keywords, but suddenly it was as if Google were saying "yes, that's fine, but have you got a more specific page we can list?". Hence why we started our own small-scale content farming: to give Google a better page match!

We broke the golden rule (write for users, not Google) and now it's bitten us on the jacksy!

ascensions
msg:4307406
4:15 pm on May 3, 2011 (gmt 0)

Yeah, totally agree on the journalist thing. I began wondering some time ago whether a mere reference to a website could be used as a virtual link. I've found no proof of that, but that would be one of my first thoughts as a Google engineer: design a way to track what material is being referenced without URLs.

DirigoDev
msg:4307519
8:46 pm on May 3, 2011 (gmt 0)

Giving priority placement to ads over content. This is a big one.


I'm in the health category. I had my team look at revenue model (e.g. direct selling vs. an advertising model: AdSense/banners). Out of the top 200 health information sites by traffic, 22 direct sellers (those with stores or shops) moved down and 24 direct sellers moved up. Sixteen moved down significantly and 10 moved up significantly.

Of the advertising-model sites (e.g. WebMD microsites or About subdomains...), 48 moved down significantly and 54 moved up significantly. I'm using Hitwise to identify winners and losers.

I'm not seeing anything here. I'm going to take another pass through the data and look at the number of AdSense links and the number of banners.
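
When I do, the plan looks roughly like this. A sketch only: it assumes pandas and a hypothetical panda_moves.csv compiled from the Hitwise pulls, with rank_change, adsense_links, and banners columns.

# Does ad load correlate with Panda movement?
import pandas as pd

df = pd.read_csv("panda_moves.csv")
print(df[["rank_change", "adsense_links", "banners"]].corr())

# Compare average ad counts between significant losers and gainers.
losers = df[df["rank_change"] < -10]
gainers = df[df["rank_change"] > 10]
print("losers:", losers[["adsense_links", "banners"]].mean().to_dict())
print("gainers:", gainers[["adsense_links", "banners"]].mean().to_dict())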

Whitey
msg:4307531
9:04 pm on May 3, 2011 (gmt 0)

6. Too many useless pages. Do you have a ton of useless pages with little to no useful content? Google may very well see this as an attempt to rank for many premium search phrases. If you have questionable content, or complete sections of content that offer little to no actual value to the reader, it is probably best to delete or block it. Do you have a user-based website? Do your member profiles all really need to be in Google's index? Do you have tons of useless tag pages?

I checked affected and non-affected sites with substantial repetitive blocks of content; on both, they rank successfully for keyword combinations within the blocks and for the blocks themselves.

Are there any reported systems of analysis to pinpoint the drops?

DirigoDev
msg:4307599
1:50 am on May 4, 2011 (gmt 0)

But about.com got hit, according to SEW and NYT...


This might be true in aggregate, but it is not so universally. The New York Times Co. confirmed in its Q1 2011 earnings conference call that traffic was down due to Panda. But this is not the complete story.

I've been analyzing WebMD and About in the health information category for weeks, and I can assure you that some sites gained significantly while others declined significantly. The same goes for WebMD microsites. I keep asking WHY. I've not found any answers yet. Nothing in the presentation layer of the websites explains the phenomenon.

SIGNIFICANT DECREASE:
Alcoholism
Arthritis
Men's Health
Orthopedics
Pediatrics
Sports Medicine

NEUTRAL:
Thyroid
Women's Health

SIGNIFICANT GAIN:
Cancer
Colon Cancer
Diabetes
First Aid
Heartburn
Irritable Bowel
Depression
Dermatology
Bipolar
Psychology

I've also looked at authorship. About uses 'Guides' who author most of the articles. Whether the Guide is a Ph.D., medical doctor, nurse, or a person affected by a health condition (e.g. Buddy T., the alcoholism Guide, is in recovery from the experience of living with an alcoholic) does not correlate with an increase or decrease in position. Nor does the writing style per the Gunning fog index. Nor does the design or the number of ads, which are similar across the different sites. Very perplexing.
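
For anyone who wants to replicate the readability check: the Gunning fog index is 0.4 x (average sentence length + percentage of words with three or more syllables). A minimal sketch, with a crude vowel-group heuristic standing in for a proper syllable counter:

# Gunning fog: 0.4 * ((words/sentences) + 100 * (complex_words/words)),
# where "complex" means 3+ syllables (crudely estimated here).
import re

def syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if syllables(w) >= 3]
    return 0.4 * (len(words) / sentences + 100 * len(complex_words) / len(words))

sample = "The patient presented with idiopathic thrombocytopenic purpura. Treatment began."
print(round(gunning_fog(sample), 1))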

coachm
msg:4307615
3:23 am on May 4, 2011 (gmt 0)

Looking at the list, I'd bet we were hit due to inter-linked sites, and perhaps duplicate content. We built our sites for visitors, and there's overlap between the niches, which is why we have semi-duplicated original content. It's kind of like having a site for tennis and a site for baseball, and putting the same relevant articles on sports psychology on both sites.

We wanted a distinct look and feel for our various sites so as not to completely confuse people, and we also didn't want to shuttle people from one site to the other and have them get totally lost.

I'm afraid I'm not all that keen on changing our practices to suit Google's ideas of what's appropriate for our visitors. I don't mind being flexible if I'm told directly, but I can't see playing guessing games and screwing around with visitors.

Anyway, for whatever reason, traffic is better today, but generally we lost about 30-40%.

workingNOMAD
msg:4307708
9:58 am on May 4, 2011 (gmt 0)

One site of mine has a combination of all of 1-6 and actually went up when Panda struck the UK on 12 April (and it's not spammy, just a very old site, rarely updated).

Pjman
msg:4307736
11:35 am on May 4, 2011 (gmt 0)

@DirigoDev

Your analysis is great! I'm in the education niche. I too see very little rhyme or reason in who was hit and who wasn't in our niche. The only thing that is certain is that brands came out of nowhere.

Scholastic, which has never competed (SERP-wise) at all in the 12 years I have been doing this, seems to come up for everything. I feel that 10% of those rankings are warranted; they have some good stuff. But the remaining 90% of their rankings that jumped are just fluff that is useless to most. There is better stuff in the page 2 and 3 results. A few other brands seem to be mixed in there at the top of the results, for no reason whatsoever.

SirTox
msg:4308073
11:16 pm on May 4, 2011 (gmt 0)

I'm seeing a slight recovery from Panda today. Traffic from G is up about 25% over this past Monday. I was hit on April 11th, and since then I have been making many changes to my site. I thought I would share those changes.

    -5,000 post site. Deleted or rewrote pages with fewer than 100 words. I've been trying to get each post over 300 words and add an image or two of related widgets where possible. I was shocked to find that I had a few pages with only 50-80 words. If I can't write enough detail on the topic, the post gets deleted. I'm down to 4,200 pages and counting.

    -Corrected any spelling mistakes I could find. I found about 20.

    -Corrected any outbound broken links or redirections I could find.

    -Removed the tag cloud and all links to tag pages. I did this Tuesday, and it seemed to have an effect right away. This leads me to believe a plain, straight path to all the topic pages on your site matters a lot. No more having several tag pages about Widget A and Widget B, each with the same mixed-and-matched links to posts. I would imagine having too many landing pages that link to the same pages matters as well.

    -Added a Facebook Like button. This probably did nothing, but it's worth noting that this caused a tremendous spike in Likes as you would imagine. So this matters if you believe Facebook Likes make a difference.

    -Changed my Facebook page to have a facebook.com/sitename link instead of the random number it originally assigns.

    -Added about 10 videos to my YouTube channel. A couple of them have become pretty popular.

    -Wrote one guest post for a huge site that gets massive traffic and sells widgets in my niche. Note: I am not counting the inbound traffic from this post toward the 25% recovery.

    -Lowered the number of "Related Post" links at the bottom of the page from 6 to 3.

    -Added "Featured Posts" to my sidebar, toward the bottom. This increased average pageviews by about 0.8 per person.


It's all still a shot in the dark even with this info, but maybe some of you guys can take this information and see a slight recovery like I have.

Marvin Hlavac
msg:4308181
8:56 am on May 5, 2011 (gmt 0)

-Removed the tag cloud and all links to tag pages.

Did you remove the tag pages themselves too, or only the links to them?

SirTox
msg:4308355
5:12 pm on May 5, 2011 (gmt 0)

I didn't delete the actual tag pages. Just all the links to them. I've been thinking of setting them to noindex. They have never been in my sitemap.

Currently, I am going through my tag pages and removing the redundant ones or ones with 1 or 2 links on them.
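
For what it's worth, this is roughly how I'm finding the redundant ones. A sketch only: tags.txt is a hypothetical list of my tag-page URLs, the "/posts/" permalink pattern is invented and site-specific, and it assumes the requests and beautifulsoup4 packages.

# Flag tag pages that list only 1-2 posts.
import requests
from bs4 import BeautifulSoup

with open("tags.txt") as f:
    tag_urls = [line.strip() for line in f if line.strip()]

for url in tag_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    post_links = [a for a in soup.find_all("a", href=True)
                  if "/posts/" in a["href"]]   # hypothetical URL pattern
    if len(post_links) <= 2:
        print(f"redundant tag page ({len(post_links)} posts): {url}")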

tedster
msg:4308387
5:49 pm on May 5, 2011 (gmt 0)

If you've got no links remaining to the tag pages, then they are pure orphans and might even be seen as attempted doorway pages. If it's not practical to delete them for some reason, then definitely no-index or even robots.txt disallow. But best of all, remove them.

walkman
msg:4308405
6:30 pm on May 5, 2011 (gmt 0)

I agree with tedster. I would put a noindex or even a 410/404, let Google visit them, and THEN remove the links. Or put a noindex/410 and add their URLs to a sitemap. You want Google to visit them ASAP.
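
If your site runs on a framework, the idea looks something like this. A minimal Flask sketch; the route and the removed-tag list are hypothetical.

# "410 first, unlink later": dead tag slugs return Gone so Googlebot
# can see it, while the pages stay linked/sitemapped until crawled.
from flask import Flask, abort

app = Flask(__name__)
REMOVED_TAGS = {"widget-a", "widget-b"}   # hypothetical dead tag slugs

@app.route("/tag/<slug>")
def tag_page(slug):
    if slug in REMOVED_TAGS:
        abort(410)   # Gone: an explicit signal the page was removed
    return f"posts tagged {slug}"   # normal tag-page rendering goes here

if __name__ == "__main__":
    app.run()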

crobb305
msg:4308424
7:13 pm on May 5, 2011 (gmt 0)

If you've got no links remaining to the tag pages, then they are pure orphans and might even be seen as attempted doorway pages. If it's not practical to delete them for some reason, then definitely no-index or even robots.txt disallow. But best of all, remove them.


This is a very interesting point Ted, and something that never occurred to me (the "orphan" aspect); however, I did make some changes with something similar in mind (but I went the opposite direction with robots.txt because I think my disallow may have been part of my Panda problem):

When Panda began, I discussed a problem that I found in my site:example.com search. For many years, I have housed my affiliate links in a redirect file. While my site is not a "heavy" affiliate site, it sure looked that way at the time of Panda because of my robots.txt deny. At any one time I have 2 or 3 valid affiliate links. However, since this redirect file has always been denied to Googlebot, Google kept track of all the redirect parameters I have ever used and kept adding them to the index. Over the years, the number of parameters indexed in Google kept growing and growing. Even when I deleted them from the redirect file (causing a 404), Googlebot wouldn't know about it, because it was denied access. And even though I stopped linking to those deleted redirects, Google kept them in the index and kept trying to request them, according to my WMT data.

Google indexed those links without a description (thin?), and used the anchor as the title, but never knew when I removed links from the redirect file (it couldn't discover the 404). So at the time of Panda, I had dozens of old redirect links indexed in Google with no pages linking to them. They were orphaned (and 404s to boot). So I opened up my redirect file to let Googlebot sniff through it. All of the dead/orphaned redirect links were de-indexed, with only the 2 valid ones remaining. This may have been part of my problem with Pandalization: dozens of orphaned redirect links (similar to tags). I have kept the file open to Google.

I think tags and redirects may operate in the same way when it comes to Panda, particularly if the files are blocked to Googlebot and dead tags/redirects appear to be orphaned (not to mention "thin" due to the fact that Google can't crawl the content).

I hope this makes sense.
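
For what it's worth, a crawlable version of such a redirect script behaves roughly like this. A sketch, not my actual file: the slugs and URLs are invented, and the point is simply that dead entries return a real 404 that Googlebot can see, because the script is not blocked in robots.txt.

# Live slugs 301 to the merchant; deleted slugs return a visible 404.
from flask import Flask, abort, redirect

app = Flask(__name__)
AFFILIATE_LINKS = {   # hypothetical current affiliate links
    "my-book": "https://www.example-bookstore.com/?aff=123",
    "hosting": "https://www.example-host.com/?aff=456",
}

@app.route("/go/<slug>")
def go(slug):
    target = AFFILIATE_LINKS.get(slug)
    if target is None:
        abort(404)   # deleted redirects now 404 where Googlebot can see it
    return redirect(target, code=301)

if __name__ == "__main__":
    app.run()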

DirigoDev
msg:4308504
10:23 pm on May 5, 2011 (gmt 0)

If you've got no links remaining to the tag pages, then they are pure orphans and might even be seen as attempted doorway pages.


I have about 500 pages of great original, authoritative content. Most are 1,800-4,500 words. For all of these pages I have stronger pages targeting very similar keyword terms. I'm working on making these pages orphans (e.g. removing all links to them except from our Google Search Appliance, sitemap.xml, and human sitemap). This seems safe to me. My users will find the pages via website search or in the SERPs.

The content was generated by an e-mail newsletter team. We've just kept shoveling content onto the site at a rate of ~250 pages per year, because of a newsletter every two weeks. When the team writes a new newsletter, they don't put much thought into what content is already on the site. The whole process starts with a new outline. Directories have just grown without any thought to looking like a farm or confusing the engines about which pages are most important. Most of these pages have between 100 and 3,500 FB likes. Until Panda 2, life was real good.

My idea to remedy the situation is to unlink these 500 pages. Thoughts?
