We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.
This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.
As much as Penguin is supposedly about link profiles and anchor text, could it be that other factors are used to determine whether a site qualifies for a Penguin penalty?
Google has never specified exactly what Penguin is, beyond this:
Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.
I'd suggest that most members here (me included) might easily fall under the above description. I know that I backed WAY off from the kind of "SEO" that I used to do.
After a year of Penguin, there doesn't even seem to be a consensus on whether it is on-page or off-page.
So now I am wondering if this update was less about backlinks and more about title weighting...
In fact, the top 5 results in the SERPs all have the exact search term as the first words of their titles!
Those who are using the maximum character limit and beyond in their titles are hitting the first page. It's pretty horrific if you ask me. I'm seeing the entire first page of the SERPs for some niches covered in old forum posts with broken English, while the major players are sitting on page two. It's actually funny, because many of these forum posts on page 1 are linking to those major players on page 2. :) Demote the major players and elevate old forum posts so users have to click twice to get to their destination. This is a major fail on Google's part.
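For what it's worth, that title pattern is easy to quantify. Here is a minimal sketch of the check, with hypothetical titles standing in for real SERP output:

```python
# Minimal sketch: count how many of the top N titles begin with the
# exact query. The titles below are hypothetical sample data, not
# real SERP output.

def exact_match_prefix_count(query, titles, top_n=5):
    """Count top-N titles that start with the exact query string."""
    q = query.lower().strip()
    return sum(1 for t in titles[:top_n] if t.lower().startswith(q))

titles = [
    "Blue Widgets - Buy Blue Widgets Online",
    "Blue Widgets For Sale | Example Store",
    "Blue Widgets: The Complete Guide",
    "Widget World - All Kinds of Widgets",
    "Blue Widgets and More",
]
print(exact_match_prefix_count("blue widgets", titles))  # -> 4
```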
A two-word term that I used to rank 3rd for (now 7th) has been overtaken by sites-guide / alexa.com/siteinfo / siteinfo.org.uk - all pages pointing to my own site! What benefit does that offer the surfer? NONE!
It's like Google's trying to decide who to include rather than who to exclude.
We are just a small accountancy business
Does anyone have clients that are just booming like mad?
Well, if that is true, then I would think they would be moving to structured data in a hurry, but that doesn't seem to be the case.
Structured data would allow things to be grouped into subsets relatively fast, thus achieving exactly what you describe.
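To make that concrete, here is a minimal sketch of the kind of structured data I mean, assuming schema.org Product markup (the product details are invented for illustration):

```python
import json

# A sketch of schema.org Product markup, the kind of structured data
# meant above. All product details here are invented examples.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Widget",
    "color": "Blue",   # explicit attributes let a crawler group items
    "size": "Large",   # into subsets without guessing from prose
    "offers": {
        "@type": "Offer",
        "price": "9.99",
        "priceCurrency": "USD",
    },
}

# Emit the JSON-LD block a page would embed in a
# <script type="application/ld+json"> tag.
print(json.dumps(product, indent=2))
```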
But what I am seeing is that Google is extremely slow these days to index new data, and they appear to pick up only small pieces of pages.
It's almost like they don't have the computing power to index correctly.
Maybe the searches are so huge that they are slowing things down?
Well, if that were the case then traffic would be up, but we don't see that, do we?
It's like Google's trying to decide who to include rather than who to exclude
We are just a small accountancy business, and we previously competed very well against the huge accountancy websites across the country due to our very good and regularly updated content. PR4, DA37 and PA47 are better stats than those of any other small-to-medium-sized accountancy. We have about 10,000 backlinks, which I have built over the last 3 years, with visitor numbers going up every single month.
For example, if I have a widget site and that site is split into different colours of widgets, and those different colours direct surfers to different sizes, how can that be a bad thing if you're helping the surfer? - example
For my ecommerce clients, I'm trying to convince them to stop making separate pages for each size or each colour, and to combine the choices onto one or two pages at most. So you have a widget, and you have one dropdown for colour and one dropdown for size, rather than a separate page for every permutation. Unfortunately, not every CMS or shopping cart (or internal SKU numbering system) wants to cooperate. But I'm pretty sure that in order to save a sinking ecommerce site, you need to reduce those near-duplicate pages as much as possible.
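As a rough sketch of the URL side of that consolidation (the URL patterns and query parameters here are hypothetical, and a real site would serve the mapping as a 301 redirect):

```python
import re

# Sketch: collapse per-variant URLs onto one combined product page.
# The /widgets/<color>-<size> pattern is hypothetical.
VARIANT_URL = re.compile(r"^/widgets/(?P<color>[a-z]+)-(?P<size>[a-z]+)$")

def canonicalize(path):
    """Map a per-variant URL to the single combined product page,
    keeping the variant as a query string so the page can pre-select
    the colour and size dropdowns."""
    m = VARIANT_URL.match(path)
    if not m:
        return path  # already canonical, or not a variant URL
    return "/widgets?color={}&size={}".format(m["color"], m["size"])

print(canonicalize("/widgets/red-large"))  # -> /widgets?color=red&size=large
print(canonicalize("/widgets"))            # -> /widgets
```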
Google has asserted that we'll never manage to reverse-engineer Penguin. Therefore, it HAS to be about more than backlinks, unless that statement was an elaborate bluff on their part.
One way (IMO certainly not the only way) that they "pre-sort" is to use (and "boost") sites which "pre-sort" for them. Pinterest is a good example of this (it made Google's image-sorting job much easier; now they know where the cute kittens and lace are likely to be), especially if their search history and personalisation data say you are likely to be female, spend time and money on fashion sites, and look up bios of soap stars. They ain't gonna send you to 4chan. :)
Perhaps Penguin 2 works in reverse, deciding authority based on the weight of positives as well as negatives. This would certainly account for websites that were affected despite not building links at all.
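As a purely speculative toy model of how a "weight of positives and negatives" score might behave (every signal name and weight below is invented; this is not a claim about Google's actual algorithm):

```python
# Toy model of the idea above only: the signals and weights are
# invented, and this is in no way Google's actual algorithm.
SIGNAL_WEIGHTS = {
    "quality_links": 2.0,     # positives
    "original_content": 1.5,
    "spammy_anchors": -3.0,   # negatives
    "link_networks": -4.0,
}

def authority_score(signals):
    """Weighted sum over whatever signals were observed for a site."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * value
               for name, value in signals.items())

# A site that never built a single link avoids all the negatives, but
# can still end up with a weak score if its positives are thin, which
# would explain demotions on sites with no link building at all.
print(authority_score({"quality_links": 0.0, "original_content": 0.4}))  # -> 0.6
```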
For my ecommerce clients, I'm trying to convince them to stop making separate pages for each size or each color, but to combine the choices on to one or two pages at most.
Now am I relaxed or am I relaxed?