Forum Moderators: Robert Charlton & goodroi

According to Google: Penguin 3.0 is continuing

         

Robert Charlton

12:16 am on Dec 2, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Search Engine Land reported this morning that it had received confirmation from Google that the SERP changes many were reporting over the Thanksgiving weekend are a continuation of the Penguin 3.0 rollout that began on Friday, October 17....

Google: Penguin 3.0 Rollout Still Ongoing
Google says the Thanksgiving ranking shuffle is related to the Penguin 3.0 release from six weeks ago.
Barry Schwartz on December 1, 2014 at 11:48 am
http://searchengineland.com/google-penguin-3-0-rollout-still-ongoing-209886 [searchengineland.com]

Google has confirmed with us that the shifts and changes reported throughout the industry on Thanksgiving day were a result of the Penguin 3.0 refresh that first began rolling out 6-weeks ago.

Google told us in response to what we saw on Thanksgiving day, "the Penguin rollout is ongoing, and this is just the effect of that."

There's been a lot of speculation in several discussions here about whether this was a Penguin algo update. As I interpret what Barry has reported, what we're now seeing is an ongoing rollout of Penguin version 3.0.

My own speculations here: I'm thinking that the algorithm may be highly "recursive"... with the same or related processes repeated on the results of the previous operations, giving us results that are increasingly refined. There's likely a pause to check results at every step, so Google can gauge whether the algorithm is working as anticipated and decide what to do next. Perhaps this will eventually lead to a procedure that can be maintained on a more continuous basis.

See...
Recursion Vs. Iteration
[www2.hawaii.edu...]

I'm not a mathematician, but I'm guessing that this is the kind of routine that might be involved.
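
If it helps to picture what repeated passes with a "pause to check results" might look like, here's a rough sketch in Python. To be clear, this is just an illustration with made-up scoring and thresholds, not anything Google has described:

# Purely illustrative: none of this reflects Google's actual Penguin code;
# the scoring function, tolerance and pass count are invented for the example.
def refine(scores, adjust_pass, tolerance=0.01, max_passes=10):
    """Re-run the same adjustment pass on its own output until results settle."""
    for n in range(max_passes):
        new_scores = adjust_pass(scores)  # one pass over the previous pass's output
        change = max(abs(new_scores[k] - scores[k]) for k in scores)
        # a "pause to check results" could sit here before the next pass is approved
        print(f"pass {n + 1}: largest score change = {change:.4f}")
        if change < tolerance:  # results have stopped shifting; refinement is done
            return new_scores
        scores = new_scores
    return scores

# Toy usage: damp low (suspicious) link scores a little further on each pass.
demo = {"siteA": 1.00, "siteB": 0.40}
refined = refine(demo, lambda s: {k: v * 0.9 if v < 0.5 else v for k, v in s.items()})

Whether Penguin works anything like this is pure guesswork, of course, but it's one way a rollout could keep producing visible shifts for weeks.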


PS: Mod's note: Added the link to Barry's SEL article that I accidentally omitted a few months back.

[edited by: Robert_Charlton at 7:24 pm (utc) on Mar 4, 2015]

samwest

2:37 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The facts don't lie. As EG points out, there are a LOT more searches every year. G has simply gotten the system down to the point where, if they can read our minds, they can redirect us and craft the SERP layout to position their own interests and assets in the premium spots for maximum monetization and profit. The numbers don't lie.

Like EG, I still buy from mom & pop stores for the obscure items. My process is to check Amazon first for deals, then I'll start checking M&P's.

BTW - I shop for a similar footwear item every two years. I'm gonna guess you wear "Old Pal" footwear...nice cozy moccasins that fall apart every two years as if on schedule. That's a whole 'nother topic.

RedBar

3:02 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The numbers don't lie.


Of course they can. They may be getting more searches, but why? Is it because their results are so bad that people have to search more within them?

Invariably, when I use Bing I get what I'm looking for first time; whenever I'm forced to use Google it usually takes several attempts to find the answer.

Sure, their search volume is up, along with searcher dissatisfaction.

EditorialGuy

3:10 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Sure, their search volume is up, along with searcher dissatisfaction.


Yep, that must be why their market share keeps plummeting.

TheMadScientist

4:38 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If market share is such a great indicator of quality, why did M$ emulate Apple's Mac OS with Windows when it already had something like 90% of the market, rather than sticking with what it had? Did M$ need to "tone down" how great its OS was, or is the more obvious conclusion that it had to make a change because it was totally beaten, even though it completely dominated the market?

McDonald's has a higher market share than Olive Garden, Chevy's, Red Robin, Applebee's, Outback Steakhouse, etc., but anyone arguing that McDonald's higher share means its food is better than the others' would get laughed at by anyone who's eaten at all the establishments I listed.

The market-share argument holds no water when it's thought about logically and reasonably.

EditorialGuy

5:26 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The market-share argument holds no water when it's thought about logically and reasonably.


It makes more sense than the "user dissatisfaction" argument does when the latter isn't backed with empirical evidence.

As for your Microsoft comparison, that's a red herring, because people (and organizations, for that matter) can switch search engines a lot more easily than they can switch operating systems and OS-specific applications. That's why Google was able to blow past AltaVista, Infoseek, Hotbot, etc. in the late 1990s and early 2000s.

Getting back to the original topic of this thread, I wonder if the "Penguin 3.0 is continuing" story isn't being overblown. When Penguin 3.0 rolled out, Google said it would affect only a tiny percentage of search queries, and the number of WebmasterWorld forum members who are reporting dramatic fluctuations appears to be fairly small.

What's more, there's a tendency for people to blame the most recently announced update (Panda, Penguin, Page Layout Algorithm, etc.) when their rankings slip or slide. Now that Google has suggested that Penguin will be an ongoing process (a la Panda and Google's search algorithm in general), it's that much more tempting for people to become obsessed with Penguin when their problems may lie elsewhere.

TheMadScientist

5:31 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It makes more sense than the "user dissatisfaction" argument does when the latter isn't backed with empirical evidence.

As for your Microsoft comparison, that's a red herring, because people (and organizations, for that matter) can switch search engines a lot more easily than they can switch operating systems and OS-specific applications.

People didn't need to switch to provide the empirical evidence you'd like, because once M$ realized it was getting totally outdone, it changed its system so people would keep using it rather than switching to a better one. The only evidence necessary is the fact that M$ changed its OS to emulate Apple rather than sticking with what it had, even though at the time of the change it had about 90% market share. Please, if you're going to argue and contradict what people say, at least make the argument you present a good, reasonable, logical, well-thought-out one. Thanks.

...people (and organizations, for that matter) can switch search engines a lot more easily than they can switch operating systems and OS-specific applications.

People are creatures of habit and convenience; hence McDonald's market share, even though it's not the best quality product by any stretch of the imagination. People can switch where they eat nearly as easily as they can switch a search engine: all they have to do is turn into a different parking lot or drive-thru. They don't, due to marketing, perceived convenience, and an inherent reluctance to change.

Market share is not necessarily indicative of the best product or offering available. It can simply mean something is the most "comfortable" option, the most "conveniently perceived", or the most entrenched habit people are reluctant to change. And Google didn't "blow past" everyone else simply on the strength of its results. A large part was due to negotiations that led to it supplying results to others [a cost saving for the companies Google supplied results to] and agreements with browser makers to be "coded in" as the default search engine, neither of which indicates "the best", except in terms of negotiating ability, much like what M$ did with DOS.

aristotle

7:54 pm on Dec 13, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google said it would affect only a tiny percentage of search queries

Google didn't tell the truth. It affects the results for virtually all search queries if you go far enough down the list. In other words, the percentage affected depends on how many results you include in the analysis. In order to get a low percentage, like the 1% or whatever Google announced, you have to restrict it to the first 2 or 3 results. Google never says how they define it, but they're obviously only using the very top of the results. Whether they're doing this to intentionally mislead people, I don't know. But a lot of people are being misled because they don't understand how Google determines it.
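
To put some (entirely invented) numbers behind that, here's a quick sketch. It isn't Google's methodology; it just shows how the "percentage of queries affected" swings wildly depending on how deep into the results you count:

# Invented model: movement is rare near the top of the results and common
# further down. The point is only that the counting depth drives the percentage.
import random
random.seed(1)

def moved_positions(num_results=100):
    # True where the result at that position changed (probability grows with depth)
    return [random.random() < (pos / 500) for pos in range(1, num_results + 1)]

queries = [moved_positions() for _ in range(10000)]

for depth in (3, 10, 50, 100):
    affected = sum(any(q[:depth]) for q in queries)
    print(f"counting the top {depth:3d} results: {affected / len(queries):.1%} of queries 'affected'")

With those made-up odds, roughly 1% of queries look "affected" if you only check the top 3 results, but nearly all of them do if you check the top 100.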

micklearn

5:22 am on Dec 14, 2014 (gmt 0)

10+ Year Member



What's the verdict on why Matt Cutts seems to have left the building? Penguin related? Panda related?

GreyBeard123

5:50 am on Dec 14, 2014 (gmt 0)

10+ Year Member Top Contributors Of The Month



I wonder if the "Penguin 3.0 is continuing" story isn't being overblown. When Penguin 3.0 rolled out, Google said it would affect only a tiny percentage of search queries, and the number of WebmasterWorld forum members who are reporting dramatic fluctuations appears to be fairly small.


Why would it be overblown?

What happened directly after P3, do you know?

And what is happening today, and what is going to happen tomorrow, or thereafter?

Penguin devalues links...
If Penguin devalues more, and more, and more, then more and more and more sites will be affected...

I’m sure you know the above, don’t you?

So why then will it affect “only a tiny percentage”?

flatfile

6:31 am on Dec 14, 2014 (gmt 0)

10+ Year Member Top Contributors Of The Month



When Penguin 3.0 rolled out, Google said it would affect only a tiny percentage of search queries


Google gets over 3 billion search queries a day, so even a mere 1% works out to over 30 million queries.
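
For what it's worth, the arithmetic on that checks out (taking the rough 3-billion-a-day figure above at face value):

queries_per_day = 3_000_000_000          # rough estimate, not an official figure
affected = queries_per_day * 0.01        # "fewer than 1% of queries"
print(f"{affected:,.0f} queries a day")  # 30,000,000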

samwest

1:42 pm on Dec 14, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Even after 15 years, my backlink profile is pretty weak. I never concentrated on building backlinks; whatever links I have occurred naturally. Some verticals are very conducive to backlinking, some are not. Mine is one of those NOT verticals.

I've been watching Penguin results in my vertical, and it does not seem to have made a dent in the sites with multimillion-link backlink profiles that suddenly bulldozed my rankings. For years I ranked 1-5 on a single four-letter-word term. Now that term is held only by sites with huge, quickly built, and presumably bought backlink profiles. Their content doesn't even match the query, but they gots lotsa back links so they must be great sites....right? Nope, nothing but ads.

The web appears different depending on your reference point. My observation may have nothing at all to do with what you see...but add all the stories up and you get the general picture of where this is all going....flush.

glakes

2:36 pm on Dec 14, 2014 (gmt 0)



Their profits are reported as fact but we only know what they tell us, and we all know they can't be trusted.

I think the same can be said of any multinational company. The only difference being that Google has much more user data to analyze.

Saturday I noticed a substantial increase in traffic. I wonder if that was Panda or Penguin? Something definitely changed - another possible rollback.

brotherhood of LAN

9:38 pm on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've removed some off-topic and personal posts. Feel free to continue posting if it's on-topic to the subject at hand.

seoskunk

9:59 pm on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Perhaps Google has now made Penguin a part of their algo. It's difficult to believe this is the same Penguin that ran back in October, which would take at most 3 days to run. I think we are in the midst of everflux, and that maybe Penguin is now incorporated with Panda into the mix.

I see a time shortly when Google will not index your whole site until it has reached a threshold of quality signals.

EditorialGuy

11:18 pm on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I see a time shortly when Google will not index your whole site until it has reached a threshold of quality signals.


Interesting thought. Google has always been like a giant vacuum cleaner, sucking up (and seemingly disgorging) everything in sight. Setting a threshold or baseline for indexing could be more useful, in some ways, than focusing on penalties.

Penalties obviously have their place, but there's an awful lot of dross that wouldn't be missed if it simply disappeared from the SERPs, regardless of whether the owners did or didn't practice shady linking, cloaking, etc.

trabis

11:25 pm on Dec 15, 2014 (gmt 0)

10+ Year Member Top Contributors Of The Month



Quoting Pierre Far:

On Friday last week, we started rolling out a Penguin refresh affecting fewer than 1% of queries in US English search results. This refresh helps sites that have already cleaned up the webspam signals discovered in the previous Penguin iteration, and demotes sites with newly-discovered spam.


Let's say that within the 1% of queries we can find the query "red widgets".
What are the webspam signals that can demote a page about red widgets?
Is it on-page factors, such as a high "red widgets" density in content, metadata and internal linking?
Is it off-page factors, such as "red widgets" inbound links?

Whatever the answer to the above questions, another, bigger question arises:
Why doesn't Penguin affect 100% of queries?

And btw, when they say 1% of queries do they mean 1% of unique queries?
I can imagine 1% of unique queries representing 99% of the search volume if I include popular brand names.
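
As a thought experiment on that last point, here's a sketch assuming a Zipf-style head-and-tail distribution of query popularity. The distribution is pure assumption; only Google knows the real shape:

# Hypothetical: the k-th most popular query gets volume proportional to 1/k.
num_unique = 1_000_000
volumes = [1.0 / k for k in range(1, num_unique + 1)]
total = sum(volumes)

top_one_percent = sum(volumes[: num_unique // 100])
print(f"top 1% of unique queries = {top_one_percent / total:.0%} of total search volume")

Under that assumption the top 1% of unique queries carries roughly two thirds of all searches, so "1% of queries" can mean very different things depending on which way it's counted.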

brotherhood of LAN

1:07 am on Dec 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



And btw, when they say 1% of queries do they mean 1% of unique queries?
I can imagine 1% of unique queries representing 99% of the search volume if I include popular brand names.


It could also be read as affecting the top 10 of every 1,000 search results, though that seems unlikely. I haven't seen anyone confirm which frame of reference is meant, but maybe someone can.

seoskunk

2:08 am on Dec 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@EditorialGuy, other search engines are already setting a threshold... Yandex, Bing. Google would just be following in this case.

micklearn

4:34 am on Dec 16, 2014 (gmt 0)

10+ Year Member



Why doesn't Penguin affect 100% of queries?


I've often wondered that about every update they've pushed out: where are the percentages they put out there coming from?

McMohan

11:58 am on Dec 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Wondering if my experience is consistent with others' here. In the recent Penguin 3.0 rollout/everflux, it seems that Google is not accounting for links generated in the last 2 months or so, but only applying algorithm tweaks to the link set that existed earlier.

aristotle

12:21 pm on Dec 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I tried to explain this earlier. The percentage affected depends on how you calculate it. The 1% number that Google gave is meaningless because they didn't define how they calculated it.
In fact, the results for virtually all queries are affected if you take all the facets into account. I don't have time to go into details right now, but it's not hard to understand if you're not too mentally lazy to take the time to analyze it.

Itanium

12:34 pm on Dec 16, 2014 (gmt 0)

10+ Year Member Top Contributors Of The Month



Maybe.

I'm pretty sure that happened with the first Penguin 3.0 update, since my competitor got very nice links a month before I got good links too. His rankings skyrocketed; mine didn't.

Not sure if it's still that way, since my rankings rose a little bit (but not as much as his).

RedBar

9:51 pm on Dec 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I see a time shortly when Google will not index your whole site until it has reached a threshold of quality signals.


Certainly Bing is already doing this and even posted about it last week:

[webmasterworld.com...]

Whatever Bing is doing, insofar as my widget niche is concerned, their results are far more accurate and much cleaner. One of the problems I could envisage with Google going down that route is that they really do not seem to have the same clarity of vision; they're too obsessed with chasing their own agendas.

FranticFish

8:33 am on Dec 19, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



^^ Simple idea but very effective, and perhaps also sidesteps the abuse issues that Google seem to find themselves mired in.

Years back, Google were very proud of having won the "whose index is bigger" wars, but whilst this might mean they could index more URLs for the long tail and be more comprehensive, it's also meant that they happily index auto-generated gumpf.

It was only after this that the emphasis on brand started, yet brand has caused relevancy issues, and spam built on huge networks of individually insignificant pages is still with us.

The irony is that Google seem less fond of the long tail now anyway, and frequently stem search queries to produce more generic results.

I can see issues with Bing's approach, in that some worthy content might get overlooked. But perhaps they're right in that this is the lesser of two evils.

MichaelP

2:48 am on Dec 20, 2014 (gmt 0)

10+ Year Member



Traffic on one of our sites was way up in December, but it has slowly fallen over the last few days. We still rank on our keywords, so I guess the shopping sprees are over? (Black Friday, Christmas shopping, etc.) Anyone seeing their traffic down in the last 2-3 days?

samwest

5:40 am on Dec 20, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The week before Christmas is always slow... the week after is the biggest of the year for my vertical... fingers crossed.

RedBar

2:44 pm on Dec 20, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Anyone seeing their traffic down in the last 2-3 days?


Actually, mine has held up better than I thought it would; it usually shows a very predictable decline towards Xmas. Yesterday, Friday, there was a very notable reduction in traffic, especially from the UK, since for many it was their last day at work until 5th January.

It was interesting to note that news channels were calling it Black Friday in reference to it being one of the busiest days of the year for A&E departments... yep, every amateur and his dog was out!

EditorialGuy

4:10 pm on Dec 20, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Anyone seeing their traffic down in the last 2-3 days?


Our traffic has been up slightly in recent days. But then, our low point in the year occurs somewhere around the first or second week in December, so by now we should be revving up slowly for the usual big jump in January.

Metrics that I find more useful, in terms of gauging how we're doing in Google Search, are Google Organic Traffic in Google Analytics and Search Queries (Av. position with change) in Webmaster Tools.
This 88-message thread spans 3 pages.