
Google SEO News and Discussion Forum

Penguin 2.0 is upon us - May 22, 2013
viral
Msg#: 4576740 posted 12:52 am on May 23, 2013 (gmt 0)

Matt Cutts has announced Penguin 2.0 (the fourth Penguin launch). Either way, it is out there and affecting sites.

Is anyone noticing much movement in the SERPs? I personally haven't seen much flux, but MozCast seems to be registering something.

[mattcutts.com...]

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

[edited by: Brett_Tabke at 12:12 pm (utc) on May 23, 2013]
[edit reason] added quote [/edit]

 

Lorel
WebmasterWorld Senior Member, 10+ Year Member
Msg#: 4576740 posted 12:36 am on May 28, 2013 (gmt 0)

I was checking to see why a client's preferred keyword was ranking only on page 2 (this site was not affected by Penguin 2.0), so I checked the top-ranking indie site (#1 was Amazon), and almost all its backlinks are on a network of domains it owns (it even links them all with the same logo on every site). Is the next step to file a complaint through the Penguin 2.0 SERP feedback form that Matt Cutts provided?

[edited by: Lorel at 12:46 am (utc) on May 28, 2013]

rango
Msg#: 4576740 posted 12:39 am on May 28, 2013 (gmt 0)

@Whitey, as far as I can tell Penguin has not affected us. We never had any unnatural link warnings, and our link profile is quite clean on the whole, I would say.

Whitey
WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member
Msg#: 4576740 posted 1:44 am on May 28, 2013 (gmt 0)

Does anyone know of any site owners with "unnatural link notices" who were unaffected by Penguin 2.0?

I may have missed something, but I think this is important to get feedback on, as many folks were waiting to see "what happened".

tedster
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 4:26 am on May 28, 2013 (gmt 0)

As much as Penguin is supposedly about link profiles and anchor text, could it be that other factors are used to determine if a site qualifies for a penguin penalty?

Amen. A tweak to link profiles and anchor text would not require a team to take months to build a new algorithm.

Google has never specified what Penguin is

No, they never specified the exact factors. But they have given some suggestive comments. From Google's original Penguin announcement: [insidesearch.blogspot.com]

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.


I'd suggest that most members here (me included) might easily fall under the above description. I know that I backed WAY off from the kind of "SEO" that I used to do.

tigger
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 6:43 am on May 28, 2013 (gmt 0)

So if Penguin 2.0 is "maybe" about internal linking control, both on-site and off-site, how are you supposed to build a site for surfers if you can't direct them onto relevant pages?

Forgetting Wikipedia, as that's clearly untouchable: how can we build sites without making them look like we are trying to control link flow?

For example, if I have a widget site, and that site is split into different colours of widgets, and those different colours direct surfers to different sizes, how can that be a bad thing if you're helping the surfer? Example:

Widget Site > Blue widgets > Different sizes cross-linked

So the different-size widgets are all on-theme with the blue widget and help people choose which one they want. But if I'm understanding this correctly, then by showing surfers alternative sizes I am, in Google's eyes, effectively controlling link flow within a group of pages.

So does this mean I have to remove all these links or nofollow them? Surely this offers nothing to the surfer if they can't follow a site's natural flow of information... effectively what Wikipedia offers.


driller41
5+ Year Member
Msg#: 4576740 posted 9:39 am on May 28, 2013 (gmt 0)

I'd suggest that most members here (me included) might easily fall under the above description. I know that I backed WAY off from the kind of "SEO" that I used to do.

After a year of Penguin, there does not seem to be even a consensus on whether it is on-page or off-page.

So do we all give up and rebrand ourselves as internet marketers rather than SEOs?

ColourOfSpring
Msg#: 4576740 posted 10:05 am on May 28, 2013 (gmt 0)

After a year of Penguin, there does not seem to be even a consensus on whether it is on-page or off-page.


I suspect there's no consensus because there are a number of factors at play. If you were ranking #3 for a keyword, and after Penguin you are ranking #23 because you've been replaced by 20 big-brand results (not 20 individual big brands, but perhaps 7 or 8 big brands with host crowding), a very common situation for a lot of my clients, then it's likely that "Penguin" also includes positives (for big brands) as well as negatives for smaller sites. I've also noticed that Penguin updates have a "slide" effect: you lose 2 positions one day, then another 2 positions 5 days later, then 3 positions a week later, and so on. It's as if Google were transitioning a new SERP into place, which makes it all the more difficult to work out why things are happening the way they are.

Alex997
Msg#: 4576740 posted 10:39 am on May 28, 2013 (gmt 0)

As mentioned previously, Penguin 2.0 has resulted in a big drop for all of our home page keyword rankings. For example:

* accountants <townlocation> has gone from #1 to #9 (it never moved from #1 in the last 3 years)
* accountants <citylocation> has gone from #2 to #26

We are just a small accountancy business and previously competed very well against the huge accountancy websites across the country, thanks to our very good and regularly updated content. PR4, DA37, PA47 are better stats than any other small- to medium-sized accountancy. We have about 10,000 backlinks, which I have created over the last 3 years, with visitor numbers going up every single month.

Looking at the NEW #1 website for <townlocation>, it has PR2, DA15, PA29 and a total of 12 (yes, 12!) backlinks, only 2 of which use "accountants <townlocation>" in the anchors, and both are nofollow. Basically it's only ranking at all because the keywords are the first thing in its home page TITLE.

In fact, the top 5 SERP results all have the exact match as the first words in the title!

So now I am wondering if this update was less about backlinks and more about title weighting....
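
A toy Python sketch of that hypothesis may make it concrete. The function, its decay curve, and the example titles are all invented here for discussion; this is not a claim about Google's actual scoring:

```python
# Toy illustration of the "exact match at the start of the title" hypothesis.
# Purely speculative; the decay curve is invented for discussion only.
def title_match_score(title, query):
    """Score 1.0 when the title starts with the exact query, decaying as the
    match moves later in the title; 0.0 when there is no exact match."""
    pos = title.lower().find(query.lower())
    if pos == -1:
        return 0.0
    return 1.0 / (1.0 + pos)

print(title_match_score("Accountants Townville | Tax Advice", "accountants townville"))        # 1.0
print(title_match_score("Friendly Tax Advice by Accountants Townville", "accountants townville"))  # ~0.04
```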

tigger
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 10:53 am on May 28, 2013 (gmt 0)

So now I am wondering if this update was less about backlinks and more about title weighting....


I wondered the same, but I have pages where the main K1/K2 are first in the title and have held their first-place rankings, and other pages that have dropped to 10th-20th place. I also have pages where the keywords are the 3rd and 4th words, and some have held while others dropped. There appears to be no pattern in where the K1/K2 sit within the title.

Hence I was wondering about the use of navigation (Wikipedia-style), but are we to leave surfers confused about where to click just to please the almighty?

turbocharged
Msg#: 4576740 posted 11:10 am on May 28, 2013 (gmt 0)

In fact, the top 5 SERP results all have the exact match as the first words in the title!

For queries not owned by Amazon and other corporate giants, here is what I am seeing:

Those who use the maximum character limit and beyond in their titles are hitting the first page. It's pretty horrific if you ask me. I'm seeing the entire first page of the SERPs for some niches covered in old forum posts with broken English, while the major players sit on page two. It's actually funny, because many of these page-one forum posts link to those major players on page two. :) Demote the major players and elevate old forum posts so users have to click twice to reach their destination: a major fail on Google's part.

ColourOfSpring
Msg#: 4576740 posted 11:15 am on May 28, 2013 (gmt 0)

Those who use the maximum character limit and beyond in their titles are hitting the first page. It's pretty horrific if you ask me. I'm seeing the entire first page of the SERPs for some niches covered in old forum posts with broken English, while the major players sit on page two. It's actually funny, because many of these page-one forum posts link to those major players on page two. :) Demote the major players and elevate old forum posts so users have to click twice to reach their destination: a major fail on Google's part.


It would be funny if Google weren't driving 90% of commercial internet traffic. It makes me wonder if I shouldn't just move into offline work. I think I'd much rather work with the predictable laws of physics than deal with nonsensical issues every day, with the latent feeling that I'm doing something "wrong" all the time... :( At least with 99% of other jobs, you either do it right or you do it wrong, and if you do it wrong, you know exactly where you went wrong and you improve, a.k.a. learning a trade. My background is website development, but it's not even enough to just do that these days. Google takes over even our jobs, because online success is so dependent on Google... :(

tigger
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 11:23 am on May 28, 2013 (gmt 0)

This is one reason I wonder if the SERPs haven't still got some cooking to do.

A two-word term that I used to be 3rd for (now 7th) has my old spots taken by sites-guide / alexa.com/siteinfo / siteinfo.org.uk, all pointing to my site! What benefit does that offer the surfer? NONE!

ColourOfSpring
Msg#: 4576740 posted 11:50 am on May 28, 2013 (gmt 0)

A two-word term that I used to be 3rd for (now 7th) has my old spots taken by sites-guide / alexa.com/siteinfo / siteinfo.org.uk, all pointing to my site! What benefit does that offer the surfer? NONE!


There's a site that ranks two spots above me for a competitive keyword, and half of its content is "lorem ipsum" filler! The other half is scraped from various sites, with small smatterings of unique content (literally, unique intro sentences), and many links don't work: the links on their main menu literally go to "#", as if "#" were put in as a temporary placeholder. I wish I could link to this site to show you guys, but I understand the policy here...

tigger
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 11:56 am on May 28, 2013 (gmt 0)

This is why, right now, I'm not doing anything other than putting up new content. What is being offered to surfers is rubbish, and I fail to see how Google could be impressed with the mess Penguin 2.0 has left them.

Leosghost
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 12:16 pm on May 28, 2013 (gmt 0)

It's like Google's trying to decide who to include rather than who to exclude.

Been saying since way back that G were "pre-sorting" and that it would continue ever more "aggressively"..so the signs that they are doing so come as no surprise..

They have no choice..it is the only way that they can cope..the sheer numbers outpaced even their estimates of what they could sift and return SERPs for instantly in response to any query..

The number of queries exploded ( even G did not foresee the effect that being able to search from mobile phones would have upon their systems )..prior to the ubiquitous use of mobile phones and other mobile devices, searches were made only by those who were in a particular place in front of a particular machine..searching took "effort"..and chairs, desks, keyboards, monitors, towers or laptops, cables etc etc..

Now ..anyone with a mobile device can search ..and they can be anywhere when they do so..it is no longer a specific "hardware"( all of the above items, including desks, and chairs and cables..are hardware ) and "environment dependent" task..

The infrastructure of search engines..their underlying systems and operating systems ( G just changed theirs..again..the old one could not keep up..it "wobbled" under the data load and the required speed of the "I want the answer now" load )..and even their own "hardware" ( data centres etc ) could not keep up..

Now, with anyone being able to ask at any time from anywhere about anything ( scenario.."bar room discussion" about any subject..previously one person maybe with a laptop ( if there was such a person in the corner of the bar, with the space on a table for a laptop ) could "look up" the answer..now the entire bar reaches for their phones and searches )..people used to watch TV and wonder "where have I seen that actress before"..Now they have a phone or a tablet and they look it up..while still watching TV..no getting up and going to the computer..or thinking "I'll look it up later" and forgetting..

So..if you want to be a fast search engine ( and if you aren't fast..you die )..you have to "pre-sort"..so as to be searching a "subset" and not the entire database..<= which is itself in permanent "flux" as new pages come online <= millions every day..and their relationships change as fast as clouds change shape in the sky on a summer's day..( "look dad! a dragon is chasing that rabbit".."Where son?".."Oh, now it's a dog and a sailboat..you missed it dad..it changed while you were looking the other way".. )..and it is going to get faster..

One way ( IMO certainly not the only way ) that they "pre-sort" is to use ( and "boost" ) sites which "pre-sort" for them..pinterest is a good example of this ( made G's image sorting job much easier..now they know where the cute kittens and lace are likely to be ) ..especially if their search history and personalisation data for you knows that you are likely to be female and spend time and money on fashion sites, and looking up bios of soap stars..they ain't gonna send you to 4chan..:)

Amazon..also is "pre-sorted"..they, or the merchants on there, probably sell just about everything..and when it is "out of stock", that is just 3 words on a page with photos, descriptions, specifications, review comments and similar product links..and Amazon has "flow"..<= easy to use and compare and checkout..<= for "average surfers"..

Wikipedia?..pages on just about everything, and constantly under review by its own internal editors and outside editors, and the world and their dog..( forget about whether you think it is accurate or not..G doesn't really care..because the "wiki system" does some "pre-sorting" for G, so G doesn't have to worry about "the load"..and the answers will be right more often than they are wrong..at least right enough for anyone who didn't know the answers and thus had to ask the question )..the same applies to ehow..if you have to ask the question..chances are that the ehow answer, re-spun from 3 or 4 real sources, will be more than you knew..so, satisfied "average surfer"..

Remember..we ( and anyone who actually has a website ) are not the "average surfer"..

Larry was worried about losing eyeballs ( and all their surfing data ) behind the walled garden of facebook..thought he could invent G+ and get them to move..it would also have helped him in his "pre-sorting"..you'd have been "pre-sorting" yourselves into "sets" ( I have used set theory many times to demonstrate things on WebmasterWorld..look it up..and think about it )..but "group inertia" has defeated him..( he needs to study a little deeper how to "entice"/"influence" non-geek people to do things..but that would require empathy..not merely smiles for the camera )..

As to consensus here..?..

Consensus is not needed, in order to be able to see what is happening, what will be happening, and to be reactive..

Example..

G have been saying for a long, long time now that they prefer a "responsive site"..
Well of course they would prefer it..! "responsive" means they only have to crawl, index, categorise, filter and cache each page once.."responsive" lowers the load on G's systems..

Their problem has been that, with the number of sites running adsense ( G's way to make the average webmaster an evangelist )..until very recently no easy-to-understand, acceptable, G-approved ( won't get you "insta-banned" ) "responsive" way to serve adsense was available to webmasters/publishers..

So many did not make "responsive sites"..many made two or more versions of the same pages and sites..and G had to deal with that load..had to because mobile phone and tablet use was exploding..but desktop and laptop use are not dead ( and are unlikely to die )..But G tested "responsive adsense" in a few non English speaking areas..and apparently it did not get abused..nor did it result in so many fumble fingered clicks on ads that advertisers screamed..

So now everyone can use "responsive sites"..and G would like you to do so as soon as possible..because it lightens their load..

( you did take advantage of the last couple of years of them saying that they liked "responsive sites"..and their promotion of them in SERPs to learn how to make "responsive sites"..didn't you.. )

As to the detail of what to do ( apart from "go responsive"..if you can..and if it is actually possible with your "content" ) in various niches and verticals..the details of how to be included in the "set(s)" that G show, after the various "pre-sortings" have been run and before searchers see a SERP..these will vary with each niche and vertical..

IMO .. specific discussion, especially in public fora or venues, in the hope of finding a recipe of fixes that will work for all sites, is inadvisable..

Also, bear in mind that, to a degree, yes, G will be trying to increase spend on adwords..they merely need to make it so hard for the average webmaster to understand what gets one a good position, that the average webmaster will pay G to be on page one..

But that could be regarded as a sort of Darwinism as applied to website pages as species, and search as an environment ..

At any one time, both the effects of such search Darwinism.."pre-sorting"..and their nevertheless still being overwhelmed by "data volume" and "flux" in "page relationships" in certain areas, may be visible at once..which would explain many ongoing anomalies..and many new ones yet to be seen..

[edited by: tedster at 1:36 pm (utc) on May 28, 2013]

DXL
WebmasterWorld Senior Member, 10+ Year Member
Msg#: 4576740 posted 12:24 pm on May 28, 2013 (gmt 0)

We are just a small accountancy business


Granted, you're competing for a variety of cities, but your post made me consider how an accountant client of mine was ranking post-Penguin 2.0.

For at least eight years, that client was #1 for a search for accountants in their city (population of a million). As of this last week, he was bumped down to the local results. Now the top three spots (before you get to the local results) are Indeed, Yellowpages, and CareerBuilder.

That client is still #1 on Yahoo and Bing for the same search term.

tigger
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 12:27 pm on May 28, 2013 (gmt 0)


That client is still #1 on Yahoo and Bing for the same search term.


It's a good job he's not UK-based; here it's either G traffic or die.

Awarn
Msg#: 4576740 posted 12:37 pm on May 28, 2013 (gmt 0)

Well, if that is true, then I would think they would be moving to structured data in a hurry, but that doesn't seem to be the case. Structured data would allow things to be grouped into subsets relatively fast, thus achieving exactly what you describe. But what I am seeing is that Google is extremely slow these days to index new data, and they appear to pick up only small pieces of pages. It's almost as if they don't have the computing power to index correctly. Or are the searches so huge that it is slowing things down? Well, if that were the case then traffic would be up, but we don't see that, do we? Does anyone have clients that are just booming like mad?
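
For readers unfamiliar with the term: "structured data" here means machine-readable markup such as the schema.org vocabulary. A minimal sketch of what that looks like, emitted with Python (the product values are invented; schema.org/Product and the JSON-LD script type are real conventions):

```python
import json

# Invented example product. schema.org/Product is a real vocabulary; this
# snippet only sketches what "structured data" markup looks like in practice.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Widget",
    "color": "blue",
    "offers": {"@type": "Offer", "price": "19.99", "priceCurrency": "USD"},
}

# A page would embed this inside <script type="application/ld+json">...</script>,
# letting a crawler group the page into a subset without parsing the whole page.
print(json.dumps(product, indent=2))
```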

Awarn
Msg#: 4576740 posted 1:02 pm on May 28, 2013 (gmt 0)

DXL - It appears Google is getting more searches for "accountant" as a job-search term, thus Indeed and CareerBuilder. That means Google can't tell the difference between the business or service (a CPA in practice) and a person looking for a job as an accountant. With it being graduation time, I expect there are a lot of searches for accounting positions. So it appears you may need to find a way to play to that weakness in Google's algorithm.

globalecommerce
Msg#: 4576740 posted 1:03 pm on May 28, 2013 (gmt 0)

Hi,

Yes, my site took a beating with the Penguin update... The real reason I don't know; the rankings just dipped. However, I am not practicing any non-organic link schemes, yet the site still took a beating... I need to figure it out. Help me if you can...

Martin Ice Web
WebmasterWorld Senior Member, 5+ Year Member
Msg#: 4576740 posted 1:03 pm on May 28, 2013 (gmt 0)

Does anyone have clients that are just booming like mad?


I have. Over the years he gained a lot of links; most are paid. This last Penguin threw him to the top of the SERPs. He even outranks some big selling platforms.
We never removed any links after Panda/Penguin.

Our own site (95% natural links) has been suffering again since yesterday. But we removed a lot of links.

ecom, Germany

Leosghost
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 1:27 pm on May 28, 2013 (gmt 0)

Well, if that is true, then I would think they would be moving to structured data in a hurry, but that doesn't seem to be the case.

"In a hurry" is a relative term..and much happens behind the curtain..

Structured data would allow things to be grouped into subsets relatively fast, thus achieving exactly what you describe.

Which is why I think they'll go there ( in the interim, they are "pre-sorting", and "using" ( preferring, if you will ) "pre-sorted" sites )..it isn't like they can just say "hold everything, no searches for a month or so, and we'll be back with different structures"..They have to integrate changes into a live and evolving environment..

But what I am seeing is that Google is extremely slow these days to index new data, and they appear to pick up only small pieces of pages.

Which is what would happen if they were trying to reduce the amount of data that they were having to handle..

It's almost as if they don't have the computing power to index correctly.

That is what I said, both in the previous post ( and previous posts in other threads )..and just above in this post..we are in agreement on this aspect/interpretation :)


Or are the searches so huge that it is slowing things down?

Only if they were to try to search everything in real time ( as they did for a while in the past )..rather than "pre-sorting", as they have been doing, for the last two years or so..

Well, if that were the case then traffic would be up

Not necessarily..the likes of wikipedia and amazon do not report their traffic fluctuations here :) But look at the meteoric rise of Pinterest and ehow et al.: their increased traffic is where the traffic of many, whose traffic has fallen, has been redirected..

but we don't see that, do we?

See my paragraph immediately above..

[edited by: tedster at 1:32 pm (utc) on May 28, 2013]

seoskunk
Msg#: 4576740 posted 2:36 pm on May 28, 2013 (gmt 0)

I think that Google has always pre-sorted the results to some degree; otherwise you would have "how to build a nuclear bomb" online. There is a growing number of people who believe Google is the internet, and of course it isn't; maybe it's 30% of it, something like that. It doesn't display all the results it indexes (it never has), so it wouldn't surprise me at all if pre-sorting has been extended.

Is pre-sorting a prerequisite to categorisation of websites? I could certainly see how that would work, with a background colour showing each category (green for information sites, blue for ecommerce, etc.), and the heavier the background, the more emphasis on that category.

In order to do this you would certainly need to assign authority first, and that brings us back to Penguin 2.0.

It's like Google's trying to decide who to include rather than who to exclude


Perhaps Penguin 2.0 works in reverse order, deciding authority based on the weight of positives as well as negatives. This would certainly account for websites that were affected despite not building links at all, and it would explain Google's reference to "grey area" websites.

What we do know is that Google is looking to punish black-hat SEO here, but rather than simply penalizing, I think Google is balancing positives against negatives (see the toy sketch after the lists below).

So, factors that could count against you, IMO:

1. Links from bad neighbourhoods.

2. Unnatural link patterns, such as forum signature, comment and directory links.

3. Unnatural links from syndicated articles.

4. Unnatural diversity of links, indicating paid link schemes.

5. Unnaturally high keyword density in internal links.

Factors that could count for you, IMO:

1. High-quality inbound links.

2. Engagement and user retention.

3. Social media signals.

4. Strong internal structure and few errors.
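
A toy Python sketch of the "positives weighed against negatives" idea above. The signal names mirror the lists, but every weight and value is invented for illustration; nobody outside Google knows the real signals or weights:

```python
# Toy sketch of balancing positive and negative signals into one authority score.
# All names and weights are invented for discussion; not Google's algorithm.
NEGATIVE_WEIGHTS = {
    "bad_neighbourhood_links": -3.0,
    "forum_sig_comment_directory_links": -2.0,
    "syndicated_article_links": -1.5,
    "paid_link_diversity_pattern": -2.5,
    "keyword_heavy_internal_anchors": -1.0,
}
POSITIVE_WEIGHTS = {
    "high_quality_inbound_links": 3.0,
    "engagement_and_retention": 2.0,
    "social_signals": 1.0,
    "clean_internal_structure": 1.5,
}

def authority_balance(signals):
    """Weigh each 0..1 signal; a negative total reads as a demotion and a
    positive total as a boost, matching the "grey area" framing above."""
    weights = {**NEGATIVE_WEIGHTS, **POSITIVE_WEIGHTS}
    return sum(weights[name] * strength
               for name, strength in signals.items() if name in weights)

# A site with strong content signals can stay net-positive despite some bad links.
print(authority_balance({"high_quality_inbound_links": 0.7,
                         "engagement_and_retention": 0.8,
                         "forum_sig_comment_directory_links": 0.9}))  # 1.9
```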

Leosghost
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 2:47 pm on May 28, 2013 (gmt 0)

I think that Google has always pre-sorted the results to some degree; otherwise you would have "how to build a nuclear bomb" online


Your results for certain queries must be far more filtered than mine, and those of many, many others, ever have been :)

ColourOfSpring
Msg#: 4576740 posted 2:49 pm on May 28, 2013 (gmt 0)

Leosghost, are you talking about Google caching SERPs? So if someone searches "blue widgets", he/she gets (essentially) a cached result set that is then passed through a personalisation filter (personalised results), with the database untouched. Perhaps this is what you mean by "pre-sorting" (caching?). If so, I'm fairly sure I've heard MC say this is the case in one of his videos: Google caches results, hence why you get such fast results. Those caches probably update on a schedule, and that is where the true "heavy lifting" processing happens (the generation of caches), with the new caches then echoed across the data centers. Google gets new queries every day, of course, and I'm sure they try to find the nearest used query cache to retrieve for each new query (and may even keep a queue of "new queries" for which the algo needs to generate corresponding caches).

All of the above is speculation by me :)
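
To make the speculation concrete, here is a minimal Python sketch of a SERP cache refreshed on a schedule. Everything in it is invented for illustration (the TTL, the cache shape, the function names); nothing here reflects Google's real architecture:

```python
import time

# Speculative sketch of "cached SERPs refreshed on a schedule". All names and
# numbers are invented; this illustrates the idea, not any real system.
CACHE_TTL_SECONDS = 3600  # pretend result sets are rebuilt roughly hourly

_serp_cache = {}  # normalized query -> (built_at, ranked result list)

def normalize(query):
    """Collapse trivially different queries onto one cache key."""
    return " ".join(query.lower().split())

def search(query, rank_fn):
    """Serve a cached result set when fresh; run the expensive ranking otherwise."""
    key = normalize(query)
    now = time.time()
    hit = _serp_cache.get(key)
    if hit and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]               # fast path: pre-built SERP
    results = rank_fn(key)          # slow path: full ranking pass ("heavy lifting")
    _serp_cache[key] = (now, results)
    return results

# The second call is answered from cache without re-ranking.
print(search("Blue  Widgets", lambda q: [f"result for {q} #{i}" for i in range(3)]))
print(search("blue widgets", lambda q: ["never computed"]))
```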

Perhaps the index is, as you say, getting too unwieldy, and so they're making quite brutal decisions about separating the wheat from the chaff, with the results all too clear to a lot of us: some chaff remains and a fair amount of wheat is thrown away, but it's all expedient to Google's processing goals.

Ultimately, though, and this is what REALLY matters to us: is Google simply becoming an ever-losing proposition for small site owners? If so, there's still demand for small businesses and the services/products they provide. If Google won't supply that demand, then someone has to.

netmeg
WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributor of the Month
Msg#: 4576740 posted 3:01 pm on May 28, 2013 (gmt 0)

We are just a small accountancy business and previously competed very well against the huge accountancy websites across the country, thanks to our very good and regularly updated content. PR4, DA37, PA47 are better stats than any other small- to medium-sized accountancy. We have about 10,000 backlinks, which I have created over the last 3 years, with visitor numbers going up every single month.


I would probably start with those 10,000 backlinks that you created.

For example, if I have a widget site, and that site is split into different colours of widgets, and those different colours direct surfers to different sizes, how can that be a bad thing if you're helping the surfer?


For my ecommerce clients, I'm trying to convince them to stop making separate pages for each size or color and to combine the choices onto one or two pages at most. So you have a widget with one dropdown for color and one dropdown for size, rather than a separate page for every permutation. Unfortunately, not every CMS or shopping cart (or internal SKU numbering system) wants to cooperate. But I'm pretty sure that in order to save a sinking ecommerce site, you need to reduce those near-duplicate pages as much as possible.
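
As a rough illustration of that structure, here is a Python sketch of one product record carrying its variants as data rather than as separate URLs. The field names, URL scheme, and SKU format are all invented for this sketch:

```python
# Sketch of the "one page, dropdowns for colour and size" structure, instead of
# one URL per permutation. All names are invented for illustration.
from itertools import product

widget = {
    "url": "/widgets/acme-widget",          # the single indexable product page
    "name": "Acme Widget",
    "options": {"color": ["blue", "red", "tan"], "size": ["small", "large"]},
    "skus": {},                              # permutations live in data, not in URLs
}

for color, size in product(widget["options"]["color"], widget["options"]["size"]):
    widget["skus"][(color, size)] = f"ACME-{color[:3].upper()}-{size[0].upper()}"

print(f'{len(widget["skus"])} purchasable variants served from one URL: {widget["url"]}')
```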

fathom
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4576740 posted 3:34 pm on May 28, 2013 (gmt 0)

For my ecommerce clients, I'm trying to convince them to stop making separate pages for each size or color and to combine the choices onto one or two pages at most. So you have a widget with one dropdown for color and one dropdown for size, rather than a separate page for every permutation. Unfortunately, not every CMS or shopping cart (or internal SKU numbering system) wants to cooperate. But I'm pretty sure that in order to save a sinking ecommerce site, you need to reduce those near-duplicate pages as much as possible.


If you want some additional data to back up your suggestion and show that you are categorically correct, let me know.

I have been working with a multi-million-dollar domain that was brought to its knees by that form of content spinning/doorway-page development.

They hired $200/page wordsmiths to provide really top-quality product descriptions, but when the only difference between the core content of 7 pages is navy, black, tan, brown, red, large or small... how exactly do you prevent "spinning"?

If anyone wishes to see the aftermath, I'm sure I can forward the analytical details.

HuskyPup
Msg#: 4576740 posted 3:52 pm on May 28, 2013 (gmt 0)

Google has asserted that we'll never manage to backward engineer Penguin. Therefore, it HAS to be about more than backlinks, unless that statement was an elaborate bluff on their part.


Bluff? More likely an admission of the truth: they lost control of what they were attempting to do ages ago, and THEY don't know how to reverse-engineer it back to a set of sensible SERPs.

I'm in Germany at a big widget trade exhibition right now, and I am absolutely amazed at the number of people/companies that have lost all confidence in Google's ability to generate even a mediocre set of widget results.

Interestingly, it is more like going back 20+ years, when nobody bothered about computers except for pricing and design work; everyone's talking to each other sensibly ... and enjoying the superb local beers and wines :-)

Now am I relaxed or am I relaxed?

diberry
WebmasterWorld Senior Member
Msg#: 4576740 posted 4:04 pm on May 28, 2013 (gmt 0)

One way ( IMO certainly not the only way ) that they "pre-sort" is to use ( and "boost" ) sites which "pre-sort" for them..pinterest is a good example of this ( made G's image sorting job much easier..now they know where the cute kittens and lace are likely to be ) ..especially if their search history and personalisation data for you knows that you are likely to be female and spend time and money on fashion sites, and looking up bios of soap stars..they ain't gonna send you to 4chan..:)


This would definitely explain why Google is sending people to sites that have strong internal search already (so much so that even my Baby Boomer friends wonder "Why doesn't Google understand that when I want Amazon results, I'll go to Amazon? When I use Google, I'm looking for something else").

And it does make sense for all the reasons discussed.

Perhaps Penguin 2.0 works in reverse order, deciding authority based on the weight of positives as well as negatives. This would certainly account for websites that were affected despite not building links at all.


This is exactly what I've been thinking for a while. The site of mine that Penguin hit had no manipulated links or spammy stuff going on. But it WAS my weakest site: visitors didn't respond to it like they do to my better sites. I never understood why some of its pages soared to #1, and when they fell, that didn't seem wrong to me. It was only when I learned that what had dropped them was an update supposedly about backlinks and spamming that I panicked: if Google thought I was spamming when I knew I'd never had any such intention, I had to figure out what I'd done that "looked" wrong to them.

But if Penguin also had factors that boosted some sites, then of course some non-spamming sites would see lowered rankings just because of that. And in fact, that's what my Penguin demotion looked like: other pages were better than mine, so they bumped me down. (And I am seeing a small recovery post-Penguin 2.0, probably because I've worked on improving the site and am getting a better response from visitors now.)

For my ecommerce clients, I'm trying to convince them to stop making separate pages for each size or color and to combine the choices onto one or two pages at most.


I've been wondering why sites still do this. One niche I work in peripherally is product reviews. Some people in that niche set up a separate page for every single shade/shape/slight variation of a product they review. They say it's because three widgets from the exact same line may not perform the same way, but really they're afraid visitors won't find the exact widget review through Google unless it has its own page. I understand the dilemma, but I still think you can use headers to give Google a clue that your page covers multiple tightly related items, rather than setting up separate pages.

Now am I relaxed or am I relaxed?


LOL, glad to hear it!

Lorel
WebmasterWorld Senior Member, 10+ Year Member
Msg#: 4576740 posted 4:15 pm on May 28, 2013 (gmt 0)

Re setting up separate pages for each size and color:

Seems like setting up one page, with all sizes/colors listed, as the canonical, and pointing all the other size/color pages at it, would fix that problem.
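
A small Python sketch of that canonical setup: every variant page carries a <link rel="canonical"> element pointing at the one combined page. The URLs are invented for illustration; rel="canonical" itself is the standard mechanism Google documents for consolidating duplicates:

```python
# Sketch of Lorel's suggestion: variant pages declare one combined page as
# canonical. URLs are invented; rel="canonical" is the standard hint.
CANONICAL_URL = "https://example.com/widgets/acme-widget"
VARIANT_URLS = [
    "https://example.com/widgets/acme-widget-blue-small",
    "https://example.com/widgets/acme-widget-blue-large",
    "https://example.com/widgets/acme-widget-red-small",
]

def canonical_tag(url):
    """The element each variant page would emit in its <head>."""
    return f'<link rel="canonical" href="{url}">'

for page in VARIANT_URLS:
    print(f"{page}\n  {canonical_tag(CANONICAL_URL)}")
```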

netmeg
WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributor of the Month
Msg#: 4576740 posted 4:21 pm on May 28, 2013 (gmt 0)

I've been wondering why sites still do this.


As I say, not every CMS or cart makes this easy (especially if you have to pull in things like tiered pricing and so forth), and a lot of businesses can't afford to completely overhaul their online ecommerce site (especially if they've been dinged by Google). But ultimately, that is what has to happen.
