Matt Cutts and Amit Singhal Share Insider Detail on Panda Update
tedster




msg:4276281
 10:54 pm on Mar 3, 2011 (gmt 0)

Senior member g1smd pointed out this link in another thread - and it's a juicy one. The Panda That Hates Farms [wired.com]

Wired Magazine interviewed both Matt Cutts and Amit Singhal and in the process got some helpful insight into the Farm Update. I note that some of the speculation we've had at WebmasterWorld is confirmed:

Outside quality raters were involved at the beginning
...we used our standard evaluation system that we've developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: "Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?"


Excessive ads were part of the early definition
There was an engineer who came up with a rigorous set of questions, everything from: "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?"


The update is algorithmic, not manual
...we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons.

 

freejung




msg:4277477
 4:56 pm on Mar 6, 2011 (gmt 0)

Wow! Take a gander at some of those sites that people are reporting losses for. Go ahead and drill down to the MFA pages. I visited more than a few, and all of them would have failed my initial sniff test. Some of those folks have definitely stretched the limits of AdSense.

I agree that many of these sites have too much ad space, but there are several notable exceptions that lead me to suspect that ad space is not the primary factor here.

The one common thing I'm seeing in these sites has to do with overall look-and-feel and is hard to define, but it's basically this: upon landing on one of the internal pages of these sites, it's not immediately obvious that you've found the answer to your query (even in cases where a good answer really is there). This comes down to layout, typesetting and placement, which when badly done create an overall impression of confusion.

My hypothesis, based on this, is that user interaction data of some kind is the primary factor here. People land on these pages and react negatively in some way that Google can measure as indicating "this page doesn't immediately appear to be what I want."
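
None of us can see whatever interaction data Google actually uses, but you can at least measure the "is the answer visible on landing?" part yourself. Here's a rough sketch using the classic ga.js event call; the element id "main-answer" and the event names are just labels I made up for illustration - point it at whatever block on your page actually answers the visitor's query.

<script type="text/javascript">
// Assumes the standard asynchronous ga.js snippet is already on the page,
// so the _gaq queue exists. The id "main-answer" is only an example.
window.onload = function() {
  var el = document.getElementById('main-answer');
  if (!el || !el.getBoundingClientRect) return;
  var viewport = window.innerHeight || document.documentElement.clientHeight;
  var visible = el.getBoundingClientRect().top < viewport;
  // Report whether the answer block was on screen without scrolling.
  _gaq.push(['_trackEvent', 'Above Fold', visible ? 'Visible' : 'Hidden']);
};
</script>

Crossed with the browser-size data GA already collects, that would tell you on what share of real visits the "answer" is actually above the fold. One caveat: in ga.js an event counts as an interaction, so firing this on every pageview will also push your reported bounce rate toward zero - treat it as a temporary diagnostic rather than something you leave running.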

indyank




msg:4277489
 5:22 pm on Mar 6, 2011 (gmt 0)

What kind of bounce rate do webmasters here consider normal? I guess it would be helpful if people could share theirs.

I see a site with a bounce rate of 60-65% being whacked by this algo.

freejung




msg:4277490
 5:26 pm on Mar 6, 2011 (gmt 0)

To elaborate on the above post:

This issue of the overall impression of the page as a landing page is something I've been working on for my site recently, as it was a common criticism from testers. I've made several changes over the past year or so that are designed to make it more immediately obvious that the searcher is in the right place: I changed the color scheme to look more like other sites in the genre, I made titles stand out more, and I put several graphical elements above the fold that make it more visually clear what the site is about. I gained traffic in this update, so perhaps my efforts have paid off.

Because it's something I've been thinking about, I noticed that most of these sites that lost traffic seem to have problems in this area. For example, one of them is an art site, optimized for art-related keywords -- but most of the pages have few or no actual artistic images above the fold. If I'm looking for art and I land on a page that's almost all text, with maybe one tiny image above the fold, I'm going to look elsewhere. I don't want text, I want art! I bet if that site were to put one or two large relevant images at the top of each page, it would do just fine.

I'm thinking this has more to do with visual design and overall look than anything else. If you've lost in this update, maybe consider hiring a top-notch graphic/layout designer.

indyank




msg:4277491
 5:28 pm on Mar 6, 2011 (gmt 0)

However, I will add that bounce rates are gamed by most big sites. If you look for a product, they never make it available on the first page; there is always a separate page to buy things.

Smaller sites tend to give it away on the landing page.

Bounce rate is a noisy signal and far from a perfect one.

I think G is redefining a lot of things these days that they had ignored in the past for good reasons. Is it because the people working on it now are different from those who worked on it earlier?

freejung




msg:4277492
 5:31 pm on Mar 6, 2011 (gmt 0)

indyank, I doubt it's as simple as just measuring bounce rates, but in the spirit of sharing potentially useful data: my bounce rate is about 40% and has not changed significantly in years. The changes I speak of above did not affect bounce rate (that was kind of disappointing, I thought they would) but did have some impact on average pageviews and time on site.

venti




msg:4277493
 5:31 pm on Mar 6, 2011 (gmt 0)

Agreed with the sentiments about bounce rate. Often larger sites will make you click through to get the information you are looking for (e.g. click here for the price or phone number). I'm of a mind to provide that to end users without the hassle. This affects both our bounce rate and time on site, but I feel it provides the better user experience. It would be a shame if that were somehow a negative in Google's eyes.

hyperkik




msg:4277495
 5:31 pm on Mar 6, 2011 (gmt 0)

I agree that many of these sites have too much ad space, but there are several notable exceptions that lead me to suspect that ad space is not the primary factor here.

The impact of the update is not necessarily site-wide. I have one site that was hit in part. The section that was hit (down 40-50%) had very different HTML and CSS than the section that was not, but used the same advertisers and ad networks. Most of the pages in the affected section had fewer ad units per page than the unaffected section. I've recoded the pages and should know, soon enough, if that makes a difference.

indyank




msg:4277496
 5:32 pm on Mar 6, 2011 (gmt 0)

If I'm looking for art and I land on a page that's almost all text, with maybe one tiny image above the fold, I'm going to look elsewhere.


freejung, everyone has their own idea of what is good and what is bad. Human beings are not all the same, and neither are cultures.

I always prefer stories that flow naturally. Images should be in the right places within the overall flow.

G should focus more on whether the page has what the user was looking for, rather than focusing on things that some may consider bad. If they only listen to stories from a few people, they will fail to strike a balance with other cultures.

My feeling is that G is being confused by outsiders to a great extent these days. They shouldn't be attempting to do everything; they should define a boundary at some point.

[edited by: indyank at 5:38 pm (utc) on Mar 6, 2011]

freejung




msg:4277497
 5:37 pm on Mar 6, 2011 (gmt 0)

freejung, perceptions will differ from one person to another

Of course, but in a large dataset you can still find distinct patterns of behavior, despite individual variation.

The idea that it's about design is just a hypothesis, but it seems to fit with the data we have so far.

freejung




msg:4277500
 5:55 pm on Mar 6, 2011 (gmt 0)

The section that was hit (down 40-50%) had very different HTML and CSS than the section that was not

How would you characterize the difference? Was one section clearly better-designed than the other?

dickbaker




msg:4277551
 8:32 pm on Mar 6, 2011 (gmt 0)

I look at bounce rates for my site as being significant, even if Google doesn't. Google Analytics was showing a bounce rate of 45-48%, which I thought was high, given the nature of my content.

I found a bit of code to add to GA's tracking that changes the bounce rate to not include single-page visits that last ten seconds or more. If somebody spends more than ten seconds on a page, I figure he's found something he was looking for, even if he doesn't go further.

At any rate, my bounce rate with the 10 second filter is anywhere from 8%-18% for most pages. Any pages higher than that are ones I need to scrutinize.

tangor




msg:4277566
 10:31 pm on Mar 6, 2011 (gmt 0)

If I were a conspiracy nut, I'd be inclined to think these new changes are intended to rotate a number of (not big name) sites out of the top so that a new set of wannabe sites, ones that have been trying to break into AdSense earnings and failing because their sites just aren't quite good enough, get their turn... after all, Google is an advertising company, and if they're not getting new advertisers, they're losing money.

But, dear kiddies, not even my tin-foil hat is that rumpled! :)

zoltan




msg:4277569
 11:03 pm on Mar 6, 2011 (gmt 0)

"I found a bit of code to add to GA's tracking that changes the bounce rate to not include single-page visits that last ten seconds or more."
dickbaker, where did you find that code?

docbird




msg:4277576
 11:36 pm on Mar 6, 2011 (gmt 0)

freejung (and others who've mentioned above the fold): having to look or scroll down past a lot of advertising space is surely not ideal for a visitor. Yet there are such pages; at times they have decent info, but they aren't designed to present it without the user making some effort. A bit like a museum having its store, say, right on the way in.
Nor should it really require a top-notch designer to fix such a massive issue.

Content_ed




msg:4277585
 12:15 am on Mar 7, 2011 (gmt 0)

I've got a site with a bounce rate of 83%, and Google traffic is up since the algo change. The information on this site could have been spread across hundreds of pages, but instead it's on a dozen long pages with some graphical navigation aids at the top. If a visitor doesn't see what they want there, it's not going to be on any other page of the site, which the limited sidebar navigation makes obvious. It's not a major site; it draws something over 500 visits a day from Google.

Content_ed




msg:4277586
 12:20 am on Mar 7, 2011 (gmt 0)

I read through the entire list of sites reporting in on the Webmaster Tools thread. Yes, there were a number of MFAs, but what really stood out to me was all the 10-year-old sites that got slammed. One guy even had PR=8 for the homepage.

I don't have a great critical design eye, but some of the serious content sites (the basket mine falls into) looked awfully good to me. Many of them commented on getting ripped off all the time and suspected duplicate content issues.

And yes, I would have given them my credit card AND followed their medical advice.

dickbaker




msg:4277590
 12:49 am on Mar 7, 2011 (gmt 0)

Zoltan, here's the entire script:

<script type="text/javascript">

// Standard asynchronous Google Analytics (ga.js) snippet.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXXX-X']);  // placeholder: use your own account ID
_gaq.push(['_trackPageview']);

// After 10 seconds, fire an event. Because the event counts as a second
// interaction, GA stops treating the visit as a bounce, so single-page
// visits of 10+ seconds drop out of the bounce rate.
setTimeout('_gaq.push([\'_trackEvent\', \'NoBounce\', \'Over 10 seconds\'])', 10000);

// Load ga.js asynchronously.
(function() {
var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();

</script>
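
For anyone wondering how it works: in ga.js an event counts as an interaction, so once the timed event fires, that visit is no longer reported as a bounce. The events themselves show up in GA's Event Tracking reports, so you can also see how many "NoBounce" events each page generates.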

dibbern2




msg:4277597
 1:13 am on Mar 7, 2011 (gmt 0)

I see a site with a bounce rate of 60-65% being whacked by this algo.


Not at all. Not even with a higher bounce rate.

Perfection




msg:4277641
 3:30 am on Mar 7, 2011 (gmt 0)

but what really stood out to me was all the 10-year-old sites that got slammed


I don't have a great critical design eye, but some of the serious content sites (the basket mine falls into) looked awfully good to me. Many of them commented on getting ripped off all the time and suspected duplicate content issues.


Yes to both. The site of mine that got hit the hardest is literally 10 years old. It has also been constantly scraped and ripped off more times than I can even put into words.

mromero




msg:4277673
 5:28 am on Mar 7, 2011 (gmt 0)

Another anecdotal find. One of our pages is a photo gallery that gets lots of traffic, since all of the images are original, with fast-loading thumbnails, descriptive names, etc.

G downgraded this page by about 30%+.

The only recent changes we made to this page in the past 2 months:

1. Moved AdSense to the bottom - it did not look nice above the fold.

2. Added about 10 new images.

3. Got 1,000-plus likes from FB.

4. Got so weary of folks stealing our images (including erasing our watermark, uploading them to Wikipedia, then re-downloading them to their own websites, a rather creative technique) that we added a bold copyright warning and a note that we use Copyscape to monitor the page.

BAM! G downgraded impressions by 35%, but clicks went up 40%. What gives? Does the new algo not know what to do with quality photo galleries?

Jane_Doe




msg:4277674
 5:38 am on Mar 7, 2011 (gmt 0)

The site of mine that got hit the hardest is literally 10 years old. It has also been constantly scraped and ripped off more times than I can even put into words.


I think older sites have had more time to accumulate backlinks from the types of sites that got downgraded, so there has been a ripple effect impacting older sites perhaps a bit more often than others.

Tashi




msg:4277694
 7:59 am on Mar 7, 2011 (gmt 0)

@mromero

The one site of mine that got hit has, besides a lot of content, a photo gallery as well. Each thumbnail links to a separate HTML page (without ads) that shows a big version of the picture.

Could it be that G sees these as thin-content pages?
Would it be a good idea to remove them, even though my visitors like them (lots of comments)?

js2k9




msg:4277783
 2:15 pm on Mar 7, 2011 (gmt 0)

A pro SEO earning at the six-figure level told me this update is about Google Maps. Yes, Google Maps. Go figure... I'm still trying to figure this one out myself, but I know he's a reputable source, so perhaps this helps.

Content_ed




msg:4277794
 2:44 pm on Mar 7, 2011 (gmt 0)

I read Vanessa's post. What I got out of it was:

#1 It's not us, it's you
#2 Don't complain in public
#3 Accuse yourself of blackhat SEO
#4 Accuse yourself of bad content
#5 Spend your time on SEO rather than content
#6 Don't offer suggestions, we're smarter than all of you put together

mromero




msg:4277833
 3:56 pm on Mar 7, 2011 (gmt 0)

Tashi - whatever happened to the "a picture is worth a thousand words" mantra?

We will write an additional 500 words of article text around the photo gallery on this page, widen the page, double the length of the descriptions, and see what happens. We live, work and play in our niche, so generating content is no problem.

Jane_Doe




msg:4277844
 4:14 pm on Mar 7, 2011 (gmt 0)

I read Vanessa's post. What I got out of it was.


She was just trying to be helpful. I got a lot out of her article. She was simply pointing out that if you want to survive long term, it helps not to take algorithm changes personally. Most site owners aren't going to stay at the top of Google or any other search engine year after year without doing some analysis of what types of pages the algorithm is favoring and what types are not doing as well. My older pages got hit because they were designed for the algorithm that existed 10 years ago.

Content_ed




msg:4277877
 5:32 pm on Mar 7, 2011 (gmt 0)

@Jane

I never designed for an algo; I designed for people. That's still Google's official advice - it simply may not be true anymore.

When you redesign your sites and regain your positions, you can thank Google if you think it's appropriate. I think you're getting a little ahead of the game thanking them for making work for you before you know the outcome.

I agree she was trying to be helpful, I just find it offensive how she went about it.

Bewenched




msg:4277881
 5:47 pm on Mar 7, 2011 (gmt 0)

@Content_ed

Ours is a 14-year-old ecommerce site that got slammed.

Jane_Doe




msg:4277882
 5:48 pm on Mar 7, 2011 (gmt 0)

When you redesign your sites and regain your positions, you can thank Google if you think it's appropriate. I think you're getting a little ahead of the game thanking them for making work for you before you know the outcome.


I wasn't thanking Google. I was thanking an ex-employee at Google for sharing her insights.

I never designed for an algo; I designed for people. That's still Google's official advice - it simply may not be true anymore.


In the competitive categories, there are often millions of search returns for a single search term. Good content alone usually isn't enough to get you in the top ten spots.

tedster




msg:4277885
 5:55 pm on Mar 7, 2011 (gmt 0)

If you want a good rule of thumb for where Google is aiming, think "measuring engagement". I know that "engagement" sounds like a social media metric, but more and more it will also be important for SEO. We could even understand the Panda update as a first venture into that territory.
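
If engagement is where this is headed, it may be worth measuring it on your own pages before worrying about how Google measures it. As a rough illustration (not anything Google has described), here's a sketch that reuses the same ga.js _trackEvent call dickbaker posted above to record how far down the page visitors actually scroll; the "Scroll Depth" category and the 25/50/75/100 milestones are just labels I made up.

<script type="text/javascript">
// Rough scroll-depth tracker. Assumes the standard ga.js snippet (and
// therefore the _gaq queue) is already on the page.
(function() {
  var marks = [25, 50, 75, 100];  // percentage milestones to report
  var sent = {};
  function checkDepth() {
    var doc = document.documentElement;
    var viewport = window.innerHeight || doc.clientHeight;
    var scrolled = (window.pageYOffset || doc.scrollTop) + viewport;
    var pct = Math.round(100 * scrolled / doc.scrollHeight);
    for (var i = 0; i < marks.length; i++) {
      if (pct >= marks[i] && !sent[marks[i]]) {
        sent[marks[i]] = true;
        _gaq.push(['_trackEvent', 'Scroll Depth', marks[i] + '%']);
      }
    }
  }
  if (window.addEventListener) {
    window.addEventListener('scroll', checkDepth, false);
  } else if (window.attachEvent) {
    window.attachEvent('onscroll', checkDepth);  // older IE
  }
})();
</script>

Same caveat as with dickbaker's snippet: because events count as interactions in ga.js, any visit that scrolls will stop being reported as a bounce, so expect your bounce rate numbers to shift once you add something like this.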

ken_b




msg:4277887
 6:04 pm on Mar 7, 2011 (gmt 0)

Panda update


Geeze, couldn't we settle on ONE name for this event and use it consistently, at least here on WW?

Panda, panda farm, farm, farmer, blah, blah blah ....

Google sends enough confusing signals without us adding to the mix.
