
Matt Cutts on the Next Penguin Update

   
10:27 am on Aug 16, 2012 (gmt 0)



Matt Cutts appeared at SES San Francisco to talk on stage and answer questions...

When people asked Cutts about the next Penguin update, his response was: you don't want the next Penguin update; the engineers have been working hard.


For Penguin: the updates are going to be jarring and jolting for a while.


Webmasters who want to get as much visibility as possible should look at the spectrum of value they're adding.
11:37 pm on Aug 21, 2012 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I don't see how that approach scales, because a "site" cannot be reviewed in isolation. Its individual pages would need to be reviewed with reference to a particular query phrase.

At the scale of today's (and tomorrow's) web, it looks to me like algorithms are a necessity. Penguin is an early attempt at a new kind of algorithm (the ground was first broken by Panda) and it's only a beginning, I'm sure.
12:43 am on Aug 22, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



@tedster - of course I am kidding (notice the wink). However, I still wouldn't mind a second, "human" review beyond what the algo has determined, especially since even G acknowledges that there are false positives. That's an ugly place to be.
1:01 am on Aug 22, 2012 (gmt 0)

10+ Year Member



At the scale of today's (and tomorrow's) web, it looks to me like algorithms are a necessity. Penguin is an early attempt at a new kind of algorithm (the ground was first broken by Panda) and it's only a beginning, I'm sure.


Yes, they need algorithms - but they need people to step in where the algorithms are not working properly. Google has profits in the BILLIONS. They could stand to implement a team that, upon request, reviews the decisions made by their algorithms to see if they make sense.

There is no way a generic algorithm can determine which page, among several, is the "best". In the past it didn't matter as much because Google would show all of the results, and the users would ultimately pick the ones they liked by both clicking and linking to them. But Google is moving in the direction of suppressing pages when they decide that several present "essentially the same information" - and only serving the "best" page. In other words, instead of degrees of winning, there is going to be a single winner and everyone else loses.

Read Matt Cutts' "frog" example: [stonetemple.com ]. Cutts uses the phrase "While they’re not duplicates they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank."

Google thinks its generic algorithms can determine the nuanced content differences between pages. They can't.
8:49 am on Aug 22, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Google thinks its generic algorithms can determine the nuanced content differences between pages. They can't.


I think, more to the point, more than 10 sites regurgitate the same generic information. Probably more than 1,000. In such a situation, 990 of those 1,000 sites cannot possibly rank on the first page.

Google does not need any confidence in its ability to surface the correct page to make the entirely logical deduction that each of those sites (on my numbers) has a 1% chance of ranking, and thus that expecting to rank is prone to disappointment 99% of the time.

And that's before you even think about 10 sites giving qualitatively better information.
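
To put that deduction in concrete terms, here is a minimal Python sketch; the 1,000 competing sites and 10 first-page slots are the illustrative figures assumed above, not anything Google has published:

    # Rough arithmetic behind the "1% chance" deduction above.
    # Assumed figures: 1,000 sites carrying essentially the same
    # information, competing for 10 first-page slots.
    competing_sites = 1000
    first_page_slots = 10

    chance_of_ranking = first_page_slots / competing_sites   # 0.01
    print(f"Chance any one site ranks: {chance_of_ranking:.0%}")      # 1%
    print(f"Chance of disappointment: {1 - chance_of_ranking:.0%}")   # 99%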
9:01 am on Aug 22, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@Shaddows,

and now Google serves only one site, but 10 times, with the same information. I remember when Google changed their algo (about two years ago) to ensure that a domain appears only one or two times in the results. They said they wanted to push up the diversity and ensure that no single domain dominates the results. Those were great times compared to now.
9:42 am on Aug 22, 2012 (gmt 0)



Backdraft7, your site IS being human-reviewed every single day... by your visitors. Google uses that information to decide what to do with you. That's why Google says to focus on your users.

There's nothing as impartial or as up to date as the collective views of your visitors.
10:49 am on Aug 22, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@muzza64,

If Google does not send users, there is no review! Before this disaster, many sites got thousands of visitors every day. Panda and Penguin wiped them away. Are you telling me now that Google pre-Panda was wrong?
I think they are on a journey to beat spam/SEO that they can't win, and they know it, so it is easier to push brands to fill the results. Collateral damage is calculated. Only if a site gets public attention is it released from Panda/Penguin.
Cutts told us to put out more >> real, unique << content. That is a fact and I understand it. But tell me, what unique content does a price comparison engine give you?
I think we all did a lot of work (because it is our business), and we can only hope that this rank-modifying patent is the reason why most people see not improvements but worsening.
11:15 am on Aug 22, 2012 (gmt 0)



"if google does not send users there is no review"

Exactly. Why would Google want to rank a site with no visitors?
11:21 am on Aug 22, 2012 (gmt 0)



"if google does not send users there is no review"

Exactly. Why would Google want to rank a site with no visitors?

Words fail me. Are you being deliberately awkward? The sites had many visitors (AND PAYING CUSTOMERS) prior to this mess. They didn't all disappear overnight. Google has taken them away, plain & simple. If you believe otherwise then good luck.
3:41 pm on Aug 22, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Exactly. Why would Google want to rank a site with no visitors?


Because if a site has no visitors, as EVERY new site initially does, no new site would EVER get ranked. Bit of a Catch-22 situation if what you say were true.
4:29 pm on Aug 22, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Because if a site has no visitors, as EVERY new site initially does, no new site would EVER get ranked. Bit of a Catch-22 situation if what you say were true.

Only if you assume Google to be the sole source of traffic. In which case, you're probably going to get yourself into trouble anyway.
8:26 pm on Aug 23, 2012 (gmt 0)

WebmasterWorld Senior Member



There's nothing as impartial or as up to date as the collective views of your visitors.


Google isn't in a position to measure enough user metrics with enough accuracy to rely on them alone. Hence the use of links and so on.

Building for visitors is not enough. Being great isn't, either. Marketing is key, and it's got to go way beyond Google.
8:56 pm on Aug 23, 2012 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month



Google isn't in a position to measure enough user metrics with enough accuracy to rely on them alone. Hence the use of links and so on.

Perhaps not relying on user metrics alone, but I think they have a big influence. I have cases of pages with no links - pages with good on-page technical SEO and good content, pages that visitors like - climbing steadily up the SERPs. The result is not instantaneous (the climb is slower than via link building), but having watched a number of pages like that over a year or so, I have no doubt that user engagement is an important factor.

I also think that if you do build links, these must come hand in hand with user metrics. In other words, building links may bring your page up the SERPs sooner, but if user metrics do not "validate" the page's place, the ranking can deteriorate over time.

I personally think that with over 20% of usage via Chrome, complemented by usage of many other Google properties (Maps, other Google APIs, etc.), Google has more than enough information on user metrics.
9:06 pm on Aug 23, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Cutts told us to put out more >> real, unique << content. That is a fact and I understand it. But tell me, what unique content does a price comparison engine give you?


Surely Google's logic is that the world only needs one price comparison engine?

Out of interest, why all these Penguin iterations anyway... why doesn't Google just stop using links as a ranking factor altogether? They're clearly easy to manipulate. Is it simply because the algorithm isn't yet good enough to rely on other factors?
9:32 pm on Aug 23, 2012 (gmt 0)

5+ Year Member



Aakk999, that is also my experience. The sections of our site hit by Panda are also the parts that got a negative reaction from Google when we tried to overcome our demotion with quality links.

I think the point of Penguin is that links aren't the thing that drives rankings any more; they are just a signal. If the page doesn't justify ranking well for a phrase, it won't rank well regardless of how great your links are.

I've heard too many stories of successful sites that did no link building and focused purely on user engagement to believe links matter much any more.

There is no doubt in my mind that Google has sufficient access to user metrics for them to be a major component of the algo. Mobile is another matter, but Google is on the case with that one, and once they have sufficient data from mobile users I think we'll see mobile and non-mobile results starting to look very different from each other.

It just seems to me that Google wants the link building to stop, and Penguin, amongst other things, does seem to be having that effect.
10:45 pm on Sep 2, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



< moved from another location >

Has there been any news on when this new Penguin update is expected to take effect?

[edited by: Robert_Charlton at 4:27 am (utc) on Sep 3, 2012]

7:10 am on Sep 14, 2012 (gmt 0)



@ponyboy96...

I can give you just as many examples where the top SERP sites have useless, old content with nothing new added in a long time, and are still on top due to outstanding SEO...
I want to discuss that, but I suppose this thread is not the perfect place for it...
8:23 am on Sep 14, 2012 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



gagankkharbanda - yep, when I watch 10 different keywords, I also see no real content: a lot of link exchanges, 5 banners above the fold, and the list goes on. So I'm not sure whether I believe what Google says, or whether the drop of so many sites is just some shuffle to make SEOs/webmasters confused. It sure looks that way when I look at rankings.
10:39 am on Sep 14, 2012 (gmt 0)



Hi Zeus, I have seen some outstanding SEO (sometimes it feels like it's blackhat), but the results are awesome...

I'm trying to figure out exactly how it works, but it's really hard to understand...
10:57 am on Sep 14, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



gagankkharbanda, what do you mean by "outstanding SEO"?