Google SEO News and Discussion Forum

Matt Cutts and Amit Singhal Share Insider Detail on Panda Update
tedster
msg:4276281
10:54 pm on Mar 3, 2011 (gmt 0)

Senior member g1smd pointed out this link in another thread - and it's a juicy one. The Panda That Hates Farms [wired.com]

Wired Magazine interviewed both Matt Cutts and Amit Singhal and in the process got some helpful insight into the Farm Update. I note that some of the speculation we've had at WebmasterWorld is confirmed:

Outside quality raters were involved at the beginning
...we used our standard evaluation system that we've developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: "Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?"


Excessive ads were part of the early definition
There was an engineer who came up with a rigorous set of questions, everything from: "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?"


The update is algorithmic, not manual
...we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons.

 

freejung
msg:4278415
5:08 pm on Mar 8, 2011 (gmt 0)

Something interesting I just noticed, on the subject of previews:

Generally, from what I've seen, AdSense blocks are left blank in the Google preview of a page. I noticed this and placed an image both in a <noscript> tag and as the CSS background of my AdSense unit, so that the image displays when AdSense doesn't -- this makes sense from a usability perspective too, as people with ad blockers or javascript turned off now see content instead of a big hole in the page.
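
For anyone who wants to try the same thing, the idea in markup is roughly this (the class name and image path are invented for the example; your real AdSense snippet goes where the comment is):

<style type="text/css">
/* The fallback image sits behind the ad slot; a rendered ad covers it. */
.ad-slot {
  width: 300px;
  height: 250px;
  background: url(/images/fallback-300x250.jpg) no-repeat;
}
</style>
<div class="ad-slot">
  <!-- AdSense script block goes here; when it renders, its iframe
       hides the background image -->
  <noscript>
    <!-- shown when javascript is off entirely -->
    <img src="/images/fallback-300x250.jpg" width="300" height="250" alt="">
  </noscript>
</div>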

Google picked up these images and has been displaying them in my previews.

About a day ago I added a couple of new pages. The previews for those pages (only) are displaying the AdSense units themselves, and even highlighting some of the ad text.

What have you seen with respect to adsense appearing in previews?

freejung
msg:4278416
5:09 pm on Mar 8, 2011 (gmt 0)

new update has a problem with pages with pictures and a small amount of text

Not in my case. Most of my pages are like this and are doing fine. I think the downgrading of such pages in some cases is a secondary effect.

Shaddows
msg:4278441
5:51 pm on Mar 8, 2011 (gmt 0)

I noticed [adsense blocks are left blank in Google preview] and placed an image in <noscript> and the CSS background of my adsense unit


Sounds like a great idea, but smells of cloaking. And obvious cloaking too - the bot can't see AdSense, but Google knows it's there!

freejung
msg:4278512
9:42 pm on Mar 8, 2011 (gmt 0)

the bot can't see AdSense, but Google knows it's there

Yeah, I was worried about that too -- but the bot can see the AdSense; that's what I'm getting at. Some of my previews display the AdSense units, while others show the background image.

I would argue that it's not cloaking, because I'm serving exactly the same page to the bot as I am to anyone else. If you turn off JS, you see the exact same image the bot sees. Also, I happen to know that the bot runs JS, because several other elements on my pages are shown as they render with JS, not as they render without it. So I'm giving it an image and some JS, and it gets to choose which it wants to display - that seems fair. Of course, I wouldn't want to have to try to argue that to Google _after_ being penalized for cloaking.

However, I just investigated further and discovered that for most sites, Google simply shifts the content up to fill in the blank space where AdSense would be. My layout is set up in such a way that that's not really possible. For these other sites the result looks like cloaking too: for example, a page with AdSense right in the middle of the content appears to be just a solid block of text.

The obvious solution, which I'll probably implement, is to redesign the layout so that it degrades gracefully when the AdSense unit takes up zero space, filling in the blank space with content instead of an extra image.
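
In CSS terms, something along these lines (the selector is hypothetical): stop reserving fixed dimensions for the slot, so an empty unit collapses and the content flows up around it.

/* Hypothetical wrapper around the AdSense unit. With no width or height
   reserved, the wrapper sizes itself to the ad iframe when one renders,
   and takes up (almost) no space when nothing does. */
.ad-wrapper {
  float: right;           /* text wraps around a rendered ad */
  margin: 0 0 10px 10px;  /* spacing around the slot */
}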

seoholic
msg:4278514
9:45 pm on Mar 8, 2011 (gmt 0)

Hi All,

I wanted to update this thread with some additional guidance for those who have sites that may be affected by this update.

Our recent update is designed to reduce rankings for low-quality sites, so the key thing for webmasters to do is make sure their sites are the highest quality possible. We looked at a variety of signals to detect low quality sites. Bear in mind that people searching on Google typically don't want to see shallow or poorly written content, content that's copied from other websites, or information that is just not that useful. In addition, it's important for webmasters to know that low quality content on part of a site can impact a site's ranking as a whole. For this reason, if you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.

We've been reading this thread within the Googleplex and appreciate both the concrete feedback as well as the more general suggestions. This is an algorithmic change and it doesn't have any manual exceptions applied to it, but this feedback will be useful as we work on future iterations of the algorithm.

Wysz

[google.com...]


walkman
msg:4278517
9:59 pm on Mar 8, 2011 (gmt 0)

the bot can't see AdSense, but Google knows it's there


Sure it can. Do a "view source" on your site.

econman
msg:4278524
10:27 pm on Mar 8, 2011 (gmt 0)

He has confirmed we are dealing with two significant paradigm shifts in a single update.

reduce rankings for low-quality sites,


This update is the first time Google has officially/explicitly focused on "quality" instead of "relevance".

It is also the first time Google is explicitly evaluating "sites" instead of just individual "documents" or pages.

This update has already caught a lot of attention, despite supposedly affecting just 12% of all queries.

I suspect the ultimate impact of these paradigm shifts will be far more significant than anything we've seen in the US SERPs so far.

For those who think we weren't given enough warning before this change was rolled out, it's time to think about the long-term implications of these paradigm shifts.

I think content publishers are finally being given an incentive to create high quality sites filled with high quality content, rather than just being given an incentive to publish as many pages of content as possible, in the hopes that some of the pages will rank, or that the sheer volume of pages will help our successful pages rank higher.

I haven't thought through the implications for ecommerce sites, but I suspect there will eventually be changes to how Google ranks those sites as well -- just as the Adwords Landing Page Quality Scores were (in a sense) a harbinger of this change affecting content publishers.

smithaa02
msg:4278573
12:13 am on Mar 9, 2011 (gmt 0)

Interesting PDF... So if bounce rate off of Google ads helps page rank, should I be spam-clicking my Google ads to go x pages deep to fake a 'low bounce rate'? I wonder if this isn't just for bounce rates off of the sponsored ads, but for the general SERPs too?

If so, that would be unfortunate, as bounce rates are so deceptive. Many people who want to go to amazon.com don't type in amazon.com but type "amazon" into Google each and every time... mostly newbie users who don't want to bother with bookmarks (to them Google is a big bookmark), nor do they want to bother with that scary 'http://'... I see this all the time in my logs. What this means, however, is that when Google is doubling as a domain-lookup service, it exaggerates how low a bounce rate a big name-brand site has. Lesser no-name sites don't have a known brand like Amazon or Wal-Mart, so they are not going to get these 'high quality' domain-lookup pseudo-searches that the big boys will. Because these 'domain searches' tend to lead to more click depth, such an algo would inflate the standing of large name-brand sites.

By rewarding low bounce rates, Google would also be punishing simple, straight-to-the-point websites that don't have a maze of clicks between you and what you want. If I search "yellow pages businessa citya" and the first directory site has exactly the phone number I need, I don't need to browse any more... but did I just hurt that site?

Then if you have the misfortune of having a one-page website (nothing wrong with this, as it has its advantages and niches), then bounce rate = serps = murder.

For one-pagers/low-pagers, if Google wants to go in this direction, my vote would be for them to use visitor loyalty and/or length of visit instead (they are probably mining GA data anyway).

aakk9999
msg:4278587
12:31 am on Mar 9, 2011 (gmt 0)

If I search "yellow pages businessa citya" and the first directory site has exactly the phone number I need, I don't need to browse any more...

From what I have read, the bounce rate was mentioned in the context of how quickly the visitor clicked back to the Google SERPs, not whether there was only one page view. In which case, not all bounces are seen as "true bounces".

E.g. it could be:
a) return to SERPs within 5 sec = bounce
b) return to SERPs after more than 5 sec = page answered the question; not a bounce despite only one page viewed

<added> 5 sec is arbitrary, for example only </added>
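
In pseudo-JS the rule would look something like this (the threshold and the rule itself are pure guesswork on my part, not anything Google has published):

// Illustrative only: classify a return-to-SERPs event by dwell time.
function classifyReturn(secondsOnPage) {
  var BOUNCE_THRESHOLD = 5; // arbitrary, per the example above
  if (secondsOnPage < BOUNCE_THRESHOLD) {
    return "bounce";   // came straight back - the result was no good
  }
  return "answered";   // stayed a while - the page likely answered the
                       // query, even with only one page viewed
}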

tangor
msg:4278595
12:42 am on Mar 9, 2011 (gmt 0)

mind that people searching on Google typically don't want to see shallow or poorly written content, content that's copied from other websites, or information that is


@seoholic

If you're in the Googleplex... can you let us know how YOU know who copied which content from which other site? Who had it first? Who copied it? Do we have to resort to DMCAs to supply that info? We all recognize that scraping is the problem, but why are so many ORIGINAL CONTENT CREATORS getting creamed?

That said, if the content was good enough the first time, it ought to be good enough, even copied. Throwing the baby out with the bathwater at the moment...

Venting, of course, nothing personal! But I do remain confused as to how Google identifies Who Came First?

AlyssaS
msg:4278599
12:53 am on Mar 9, 2011 (gmt 0)

Interesting PDF... So if bounce rate off of Google ads helps page rank, should I be spam-clicking my Google ads to go x pages deep to fake a 'low bounce rate'? I wonder if this isn't just for bounce rates off of the sponsored ads, but for the general SERPs too?


The PDF isn't about the bounce rate off Google ads on your site. The experiment it describes looks at the sponsored ads on the search pages - you know, the ones on the right-hand side. They were using bounce as a way of judging how good the landing pages are, and they appear to be extending this now to the organic results.

If I search "yellow pages businessa citya" and the first directory site has exactly the phone number I need, I don't need to browse any more... but did I just hurt that site?


No - because you wouldn't be searching any further - you'd be off phoning the number you just found.

As far as I can tell, they are comparing websites against others on the same search page. It's not a case of pitting the yellow pages site against Amazon.com, but against the other sites returned on the same search page when you searched for abc's phone number. So the type of page will be similar and the backlinks are probably of equal weight; the question then is which page delivers the best result from the user's point of view. And as aakk9999 said, there's probably a time element in play too. Perhaps a range of times, and your site will get plotted on a graph. It takes at least 10 seconds to read a sentence, so if someone is bouncing faster than that, is it because the site was loading slowly and they got fed up, or because they saw a whole bunch of ads and decided it wasn't what they wanted, and so on.

They are probably counting other things too - say, the number of people who click on the preview button but fail to click on the site, the length of time people stay on a site, and so on.

It's smart and hard to game. It's no good making a bot to perform searches and click your site, as they are probably only counting when a real browser is present, and in addition they probably know all the IP addresses doing these bot-type searches (and all their proxies). You could pay someone to do it, but as soon as you stopped, the learning machine would pick up that you'd stopped, conclude your site had deteriorated relative to everyone else, and down you'd go. It would be cheaper to just improve your site.

mcolom
msg:4278605
1:26 am on Mar 9, 2011 (gmt 0)

Lots of people search-tab. I mean, they look at the Google results, open the pages they think could have what they're searching for in tabs, and then read them one after another. How could G separate this user behaviour (which I think is quite common) from bouncing off all the pages?

aakk9999
msg:4278606
1:27 am on Mar 9, 2011 (gmt 0)

there's probably a time element in play too. Perhaps a range of times, and your site will get plotted on a graph.

It takes at least 10 seconds to read a sentence, so if someone is bouncing faster than that, is it because the site was loading slowly and they got fed up...

I would agree with this. And as AlyssaS said, they may also take into account the average page load time for the page; they certainly have lots of this info, as it shows in the WMT Labs section. Perhaps they even combine page load time with the surfing speed of the visitor.

This would stop a slow site from escaping bounce classification just because it took 8 seconds for the page to load. I am sure there is quite a bit of complexity involved here, but with the right amount of data it is possible.
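
Extending my earlier sketch, the adjustment might look like this (all numbers invented): measure dwell from when the page finished loading, not from when the result was clicked.

// Illustrative only: discount load time before applying the threshold.
function isBounce(secondsSinceClick, loadSeconds) {
  var effectiveDwell = secondsSinceClick - loadSeconds;
  return effectiveDwell < 5; // same arbitrary 5-sec threshold as before
}
// e.g. a 9-sec visit to a page that took 8 sec to load:
// isBounce(9, 8) -> true, even though the raw time looks fine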

aakk9999
msg:4278607
1:30 am on Mar 9, 2011 (gmt 0)

Lots of people search-tab. I mean, they look at the Google results, open the pages they think could have what they're searching for in tabs, and then read them one after another. How could G separate this user behaviour (which I think is quite common) from bouncing off all the pages?

Good point. I do the same - open pages in a new tab, with FF set up not to switch to it immediately. But I would still think that your AVERAGE searcher does not do this - it is more of a power-searcher type of behaviour. So perhaps it is not significant enough?

<added>Some time ago I was monitoring what happens when the user clicks "back" from a page to the SERPs, and saw that a /gen_204?atyp=i&ct=backbutton request is fired by Google when you return to the SERPs via the BACK button. I just checked: if you open many tabs from the SERPs but do not click BACK, this request is not fired. Hence pages opened in other tabs would not be recorded as quick returns to the SERPs, even though you might have clicked on another SERP entry. It may be that Google just discards these searches and does not take them into account.</added>
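
For illustration, a beacon like that can be fired with the classic image-ping pattern - the gen_204 URL below is the one I observed, but the surrounding detection logic is entirely my own guess at how it could be wired up on the results page:

// Guesswork sketch: ping when the results page is shown again from the
// back/forward cache, i.e. the user most likely pressed BACK.
window.addEventListener("pageshow", function (e) {
  if (e.persisted) {
    new Image().src = "/gen_204?atyp=i&ct=backbutton&zx=" +
                      new Date().getTime(); // cache-buster
  }
}, false);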

mcolom
msg:4278612
1:45 am on Mar 9, 2011 (gmt 0)

I'm not sure it's that marginal. Yes, it's true that this is a power-searcher thing, but there are other groups. For instance, those with slow connections, because it's faster: they don't have to go back and forth, they just open a new tab and, if it's not what they wanted, close it.

mcolom
msg:4278613
1:47 am on Mar 9, 2011 (gmt 0)

I think you found the solution to the problem, aakk9999 - nice point.

P.S.: Does anyone know if the algo change already affects Europe?

Leosghost
msg:4278622
2:33 am on Mar 9, 2011 (gmt 0)

Europe is not one country... neither as a market, nor as a single SERP, nor as a single "hosted here / targeted market is Europe" entity.

Each country has its own SERPs / hosting / targets (there is "overlap", but it is very small... and very much smaller than the .EU registrar would have you believe (I cede to JMcM for a real explanation of the why and how and the shenanigans associated with the .EU in all its shame ;-) hat tip to "himself")... suffice it to say that IMO ;-) there will be no simultaneous Europe-wide "update"... and if a modified version rolls (and modified it will be, depending on each country? browser language? OS language? site target? site host? and all and any combinations of the foregoing, etc.)... it will probably begin with the UK... and Ireland...

Both are predominantly Anglophone structures, and thus from the plex's point of view a "baby step" after the USA...

More interesting, IMO, is what the perceived differences in implementation are between the USA roll-out and that of Canada...?

The market and search demographic in the USA / Canada have way more overlap than in any other area (even taking into account the linguistic effect of the Québécois and the bilingual internal structure that imposes on Canada)... it would give interesting insights into the currently prevailing thinking at the plex if the Canadian English SERPs were very different (and in what way?) from the Canadian French SERPs.

kd454
msg:4278637
3:15 am on Mar 9, 2011 (gmt 0)

Here is a new article from CNN on the matter: [money.cnn.com...]


tangor
msg:4278638
3:15 am on Mar 9, 2011 (gmt 0)

Many sites I've seen dropping after this Google algo change have been EU sites as viewed from the USA... sites that I regularly visit (and have bookmarked) but which are not showing in the same positions as earlier. Not sure how this applies; just commenting, and curious whether any in the EU areas have seen less USA traffic in recent days...

Leosghost
msg:4278642
3:23 am on Mar 9, 2011 (gmt 0)

You mean .eu? or EU-hosted (but .com, .net, .org etc)? or .co.uk... or .fr or .de or .es? (and if any of the latter group, which specifically? hosted where?)...

For example, the (el)Reg (.co.uk) is probably mirrored / load-balanced (don't know, haven't checked... should have, I know... but guessing it would be standard/best practice for a high-traffic, multinational-audience site, especially "tech"), as it has a heavily split transatlantic audience... and then the rest of the world.

viggen
msg:4278647
3:37 am on Mar 9, 2011 (gmt 0)

@seoholic low quality content on part of a site can impact a site's ranking as a whole

That's rather a weird approach to search:

so if I have an authority site, say a discussion forum about green widgets, and all the experts write about it, talk about it and participate in it, but I also have an offtopic section where people talk a lot of rubbish about unrelated stuff, does that now mean the offtopic section devalues the whole forum? Do I have to nofollow the offtopic section? Do I have to shut it down? How far is this madness going?

minnapple
msg:4278649
3:48 am on Mar 9, 2011 (gmt 0)

Since this algo change, my phone has been ringing off the hook. I have been around for a while and have a decent reputation, hence the calls.

I have not been able to unlock a single site that was hit by the introduction of the new algo.

Not one of these sites has quality issues; all of them are credible sites that have been around for a decade.

I am beginning to question whether a secondary algo modification was rolled out with the much-lauded quality algo.

It might not be the case, but it is something to think about.


tangor
msg:4278651
3:50 am on Mar 9, 2011 (gmt 0)

EU-hosted... or .co.uk... or .fr or .de or .es? (and if any of the latter group, which specifically? hosted where?)...
This group, hosted NOT in the USA. Many of these bookmarks are 6-7 years old. The sites exist, but I searched for them again in G and most no longer appear in the first three pages of results, though they were on page one when I found them back then. As I said, just an observation.

Thank goodness for bookmarks (favorites, links, whatever you want to call them)...

nuthin
msg:4278672
5:16 am on Mar 9, 2011 (gmt 0)

From @seoholic's response, I take it they just want websites to stay on a particular theme: the further some of your content strays from the overall theme of the site, the more likely it is to be devalued or seen as 'low quality'.

@viggen raises a few good points.

If they are doing that, so that a few pages can drag down the overall authority of your entire website, I would be worried about the approach they are heading towards. Kill the pages, not the site.

SEOPTI
msg:4278673
5:25 am on Mar 9, 2011 (gmt 0)

SERPs have been locked since the Panda algo change, which means a new quality re-ranking has not taken place.

chrisv1963
msg:4278698
7:33 am on Mar 9, 2011 (gmt 0)

If you're in the Googleplex... can you let us know how YOU know who copied which content from which other site? Who had it first? Who copied it? Do we have to resort to DMCAs to supply that info? We all recognize that scraping is the problem, but why are so many ORIGINAL CONTENT CREATORS getting creamed?


I don't think they do know. Yesterday I checked a few articles that dropped in rankings. Most of the articles that dropped dramatically have been copied by websites in Vietnam, Indonesia, ... Now they rank above me, with my content. One article was copied by more than 20 websites. To my surprise I no longer rank for that article, but the content thieves do!

Globetrotter
msg:4278701
7:43 am on Mar 9, 2011 (gmt 0)

@tangor I'm seeing the exact same drop curve as Vanessa described on Search Engine Land, and I'm based in Europe. I do not have US-based traffic (nor have I ever had), and all content is written in my local language. I've lost about 30% of traffic since the 24th of February.

I do not depend on US links. What I do see here in the SERPs is a lot of foreign websites when you look for country-specific keywords (as you would for a travel site, for example). Before this update we would see sites in our native language; now we see sites from other countries, Wikipedia, and native sites with the keyword in the domain name.

zoltan
msg:4278702
7:56 am on Mar 9, 2011 (gmt 0)

Interesting theory regarding EU sites. My site is global, hosted on US servers, targeting a global audience. But in the footer I state on all pages: "Website is maintained and operated by My Company Ltd., a registered EU (European Union) company. Our VAT number is: XX01234567". Could that be an issue?
Also, I really wonder if and when this new algo will be released worldwide, not only in the US. I believe Google knows they screwed up something in the US, but putting back the old results would publicly admit: "Yes, we screwed up and put everything back..." So not releasing this yet in other countries would be a wise idea for them, IMO.

TheMadScientist
msg:4278703
8:03 am on Mar 9, 2011 (gmt 0)

I believe Google knows they screwed up something in the US, but putting back the old results would publicly admit: "Yes, we screwed up and put everything back..."

Actually, they've rolled back changes to the old results on numerous occasions in the past, so IMO if they really thought it was wrong they would do it again... They've rolled, rolled back, rinsed and repeated on more than one occasion when they've done updates over the years, so the fact that they're not rolling back these changes strongly suggests their internal data is telling them something a bit different from what the webmasters (mainly the ones who don't like specific sites or lost their positions) are saying.

There are always some false positives, and they'll work to correct those if history repeats itself. But again, in the past they have rolled back changes, and I think they would still have that ability built into the system, so my guess is this update is not nearly as bad from their side as you would believe if you spend much time reading webmaster forums...

zoltan
msg:4278707
8:12 am on Mar 9, 2011 (gmt 0)

TheMadScientist, I believe everyone knows their niche and makes their comments based on what they see for the keywords that are targeted. Yes, everyone tends to think their site is better than all other sites, but simply moving from position 14 to 130 is not something we can accept. Moving down to position 30 would probably be acceptable.
For my main term, I see a lot of junk, including garbage and 10 links with the same title posted over a period of 2 days to an unmoderated classifieds section - posted by the same member, who simply reposts the same ad 5-10 times every day and clearly abuses the site.

Jane_Doe
msg:4278709
8:16 am on Mar 9, 2011 (gmt 0)

I do not think they will roll this change back. They succeeded in getting rid of many of the content farm pages. I would expect some tweaks but I would be very surprised if this change got rolled back completely.
