
Google SEO News and Discussion Forum

Penguin 2.0 is upon us - May 22, 2013
viral
msg:4576742
12:52 am on May 23, 2013 (gmt 0)

Matt has announced Penguin 2.0 (or Penguin 4, counting data refreshes). Either way, it is out there and affecting sites.

Is anyone noticing much movement in the serps? I personally haven't seen much flux but Mozcast seems to be feeling something.

[mattcutts.com...]

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

[edited by: Brett_Tabke at 12:12 pm (utc) on May 23, 2013]
[edit reason] added quote [/edit]

 

fathom
msg:4578295
6:28 pm on May 27, 2013 (gmt 0)

Talk about sitting on the fence.

You do understand, of course, that even if your experiences and anecdotal evidence covered hundreds, or thousands, or even ten thousand sites, it would still be statistically insignificant compared to the total number of sites or URLs out there, of every type and every topic.

It's really easy to fall into the trap of "because it's happening to me and these hundred other people, it must be universal" and I fall into that trap all the time myself. But it really gets in the way of trying to parse out what's going on, and how to navigate through it. And of course, to figure out what to do next if you can't.


Over the last 14 months, it probably falls into the "thousands" range. However, the aggregate knowledge/experience is from a wide range of sources, both related to me and also unrelated. Are you saying that all of these sources cannot be representative of a larger constituency of websites? Perhaps it's just a wild coincidence that all of these sources experienced the same types of demotions for sites that had zero SEO work done on them?


That would indeed be jarring & jolting news and would afford a huge class action lawsuit for unfair business practices.

I am positive wilburforce will seek legal counsel with that tidbit.


I agree, but I doubt it would ever be provable - the algo isn't open to any kind of scrutiny as we all know - how can you possibly "prove" something when the evidence isn't available to view?


First you suggest that all that stuff is evidence and cannot simply be wild coincidence, and then you say none of it proves anything.

What you mean to say is... you do not have the expertise to determine what is and what is not evidence, and the other references that are suggestive facts are not immediately available to you; that is why you cannot conclusively prove anything to a preponderance of the evidence, which is all that is needed to file a suit of this nature.

turbocharged
msg:4578296
6:34 pm on May 27, 2013 (gmt 0)

I am not simply taking Google's word...

You most certainly did when you stated:

The lack of a Google acknowledgement is the best evidence.

Your statement is not evidence at all and can be classified as an unsubstantiated opinion. Case studies do exist which support the theory that Wikipedia is whitelisted. Whether you choose to research these studies to determine their legitimacy is a matter of personal choice, or lack thereof.

According to your theory, I also must be whitelisted.

If it were subjected to the same spam links that were used in the case studies I noted, and did not witness a Penguin demotion, then it would provide evidence of whitelisting. But it has not been, and it is therefore inconsequential for the purpose of this discussion.

fathom
msg:4578297
6:38 pm on May 27, 2013 (gmt 0)

Then you really need to read up on random sampling - like I say, many sources whose experiences matched my own were unrelated to me, plus there's the added randomness of various developers having all kinds of clients who are not related to one another in any way (same with my own clients too). When you add it up, it's a large random mix of thousands of sites. BUT... "pretty much, yep", they are all somehow related to one another - granted, statistically this is possible in the same way you may win the lottery twice in a row, but I'd (understatedly) say it's unlikely that you are right.


And yet again, the random sample begins with a constant... PENGUIN 2.0 targeting link webspam.

I have yet to see a single website negatively impacted that did not have something to do with links.

If you have a domain that does not have something to do with links - point it out - it is a false positive and Google would love to fix your issue immediately.

fathom
msg:4578301
6:52 pm on May 27, 2013 (gmt 0)

What evidence would there be to suggest that Wikipedia is not whitelisted?


I am not simply taking Google's word...


You most certainly did when you stated:


The lack of a Google acknowledgement is the best evidence.


Your statement is not evidence at all and can be classified as an unsubstantiated opinion. Case studies do exist which support the theory that Wikipedia is whitelisted. Whether you choose to research these studies to determine their legitimacy is a matter of personal choice, or lack thereof.


According to your theory, I also must be whitelisted.


If it were subjected to the same spam links that were used in the case studies I noted, and did not witness a Penguin demotion, then it would provide evidence of whitelisting. But it has not been, and it is therefore inconsequential for the purpose of this discussion.


I cannot prove a null, which is the side topic here.

But I did already note that I have websites that emulate Wikipedia's architecture and they do perfectly well, which was the rationale for why I must be whitelisted too.

[edited by: fathom at 7:05 pm (utc) on May 27, 2013]

taberstruths
msg:4578305
7:03 pm on May 27, 2013 (gmt 0)

Honestly guys,

Why does Wikipedia interlink? Citations - which is exactly what Google says it is looking for. Why do other sites interlink? Many do so for SEO, which is what Google says it is going after. It all comes down to what is best for your users/visitors. If people come to your site and are click-happy because you are offering them what they are looking for, Google will love you. If they come to your site and see a bunch of interlinking that distracts from what they are looking for, then Google will hate you. Your user metrics will show the difference.

fathom
msg:4578306
7:03 pm on May 27, 2013 (gmt 0)

[searchengineland.com...]

Cutts:
What I think most people talk about [when they ask about whitelists] is, "Is there some type of overriding, you are golden, you can never be touched, either philosophy or list?" And to the best of my knowledge, there is nothing like that.


Sullivan: So there is no overall, let's call it the "Wikipedia list," there is no overall, "This site should always be fine for everything." But if you have a particular signal that you are implementing and you think, this signal is working well to deal with 99% of the sites out there, but "Wow, it is really hurting this other site for completely unrelated reasons, then let's exempt them from that" type of thing?

Cutts: Well, I think if you were in charge of Google - like, suppose we all got hit by a bus and you guys had to come into the Googleplex and run the search engine yourselves, right? After you ate all the free food, the next thing you would do is think about the philosophy: how do we make it work? And I think the right instinct is to try to do as much as you can algorithmically.

It is the same thing as spam. You try to solve hidden text or hacked sites algorithmically, and then the stuff you can't catch you are willing to take manual action on to remove, because that is abuse and a bad user experience. And then you use that data to try to do it the next round, so it is completely algorithmic, or it is algorithmic for more languages.

And I think that is the philosophy almost anybody would come to: you want to be as scalable as you can.

Sullivan: So that is yes?

Cutts: I think that is a yes.


I could not find your case studies; maybe you could post a thread on them.

ColourOfSpring
msg:4578307
7:04 pm on May 27, 2013 (gmt 0)

First you suggest that all that stuff is evidence and cannot simply be wild coincidence, and then you say none of it proves anything.

What you mean to say is... you do not have the expertise to determine what is and what is not evidence, and the other references that are suggestive facts are not immediately available to you; that is why you cannot conclusively prove anything to a preponderance of the evidence, which is all that is needed to file a suit of this nature.


Edit: I'm just commenting on what I - and many other website developers and clients of website developers - have observed.

[edited by: ColourOfSpring at 7:22 pm (utc) on May 27, 2013]

ColourOfSpring
msg:4578309
7:16 pm on May 27, 2013 (gmt 0)

And yet again, the random sample begins with a constant... PENGUIN 2.0 targeting link webspam.

I have yet to see a single website negatively impacted that did not have something to do with links.

If you have a domain that does not have something to do with links - point it out - it is a false positive and Google would love to fix your issue immediately.


Fair enough - you discount my anecdotal evidence by presenting your own anecdotal evidence. People tend to base their opinions on their own experiences. But if they want to hold their personal experiences up to scrutiny, they go deeper and get involved in some data collection and research. That's what I did. I'm a developer, and so I asked other developers - ones I knew personally (and trusted), and ones I didn't know, via forums. We all noticed commonalities between many sites that got Penguin demotions - i.e. sites that didn't engage in off-page SEO getting penalised. Very perplexing and disturbing. Clearly there's more to Penguin than purely off-page signals. That's my experience. I guess your only recourse to disagreeing with me is to flat-out call me a liar, but that's what my research found, no matter how counterintuitive it is to you.

The other telling observation was the ubiquity of big brands taking over the top ranking positions/pages, which is well documented. Would you disagree with this particular finding in regard to big brands dominating many commercial searches?

fathom
msg:4578310
7:28 pm on May 27, 2013 (gmt 0)

Fair enough - you discount my anecdotal evidence by presenting your own anecdotal evidence. People tend to base their opinions on their own experiences. But if they want to hold their personal experiences up to scrutiny, they go deeper and get involved in some data collection and research. That's what I did. I'm a developer, and so I asked other developers - ones I knew personally (and trusted), and ones I didn't know, via forums. We all noticed commonalities between many sites that got Penguin demotions - i.e. sites that didn't engage in off-page SEO getting penalised. Very perplexing and disturbing. Clearly there's more to Penguin than purely off-page signals. That's my experience. I guess your only recourse to disagreeing with me is to flat-out call me a liar, but that's what my research found, no matter how counterintuitive it is to you.

And then... when you look, after Penguin updates, at many of the niches that the affected sites were competing in, brands really took over (and even more so after Penguin 2.0).


I'm not calling anyone a liar.

Question to a developer: how would Google automatically program the algorithm to favor only brands, so that whatever isn't a brand can't get in on it?

Surely they don't do that manually.

They suggest (not me) it is primarily link oriented.

taberstruths
msg:4578313
7:33 pm on May 27, 2013 (gmt 0)

They suggest (not me) it is primarily link oriented.


Links may be part of it, but I believe the way it has happened is user metrics: searches for the brand name, time on site, bounce rate, pages per visit. Take a look at the brands, then take a look at your site. I would be surprised if you are beating the brand in these 4 areas.

ColourOfSpring
msg:4578315
7:42 pm on May 27, 2013 (gmt 0)

Question to a developer: how would Google automatically program the algorithm to favor only brands, so that whatever isn't a brand can't get in on it?

Surely they don't do that manually.

They suggest (not me) it is primarily link oriented.


You imply that white-listing is "surely" hard to do. I'm not saying they do that (though it may be the case), but it's one of the easier things to do if they wanted to go that route. There's a relatively small number of big brands in any given niche. Why is it hard for Google to identify them? Many fashion searches on google.co.uk reveal a who's who of typical high street brands in the UK. If you're talking about fashion in the UK, I'd say there are 15 to 20 major big brands. I could list them if you give me an hour of research based on turnover, number of high street branches, advertising spend (and Google would have a CLEAR insight into that via Adwords). In fact, Google could identify big brands by Adwords spend alone (or at least as a filter). Big brands by their very nature are easily identifiable from a "manual" point of view.
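To show how mechanical that identification could be, here is a rough sketch in Python - every signal, weight, threshold and domain below is invented for illustration, not anything Google is known to use:

    # Score candidate domains as "big brands" from public signals.
    # All weights and thresholds are made up for illustration.
    def brand_score(turnover_gbp, branches, ad_spend_gbp):
        t = min(turnover_gbp / 100_000_000, 1.0)   # cap at £100m turnover
        b = min(branches / 100, 1.0)               # cap at 100 high street branches
        a = min(ad_spend_gbp / 5_000_000, 1.0)     # cap at £5m ad spend
        return 0.4 * t + 0.3 * b + 0.3 * a

    candidates = {
        "bighighstreetbrand.example": (250_000_000, 300, 8_000_000),
        "smallboutique.example":      (400_000,     1,   5_000),
    }

    for domain, signals in candidates.items():
        score = brand_score(*signals)
        print(f"{domain}: {score:.2f} -> {'brand' if score >= 0.5 else 'non-brand'}")

An hour of research to fill in real numbers is exactly the kind of per-niche work a single analyst could do.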

But I hear your cynical laughter already - yes, Google do everything 100% automated, it's all algorithm... yes, yes, of course. I'm just presenting an alternative possibility.

My personal opinion is that Google use links - big brands naturally have a larger reach than a small business when it comes to links. They will win links from august media sites (NOT even advertorials, but things like financial articles that delve into a big brand's finances, via ft.com and the like) - "bigness" begets the kind of links small businesses can never hope to get. And so it's a case of "might is right" - very expedient for Google to elect that method of determination, but from the look of the SERPs, that's where we are.

fathom
msg:4578316
7:47 pm on May 27, 2013 (gmt 0)

Links may be part of it, but I believe the way it has happened is user metrics: searches for the brand name, time on site, bounce rate, pages per visit. Take a look at the brands, then take a look at your site. I would be surprised if you are beating the brand in these 4 areas.


I would agree that behavioral analysis is a useful and interesting tool, but ranking domains based on those metrics is problematic.

Searches for brand name - how do searches for "Google" impact searches for "search engine" or "PPC"?

Time on site - how does Google gather that information universally? It doesn't have access to my domains' data, especially if I use StatCounter or Hitslink or log-file analyzers. If I use Analytics - OK, maybe - but Google would need near-100% market share for it to be accurate. I'd be surprised if Analytics has 20% market share.

Bounce rate - how does Google gather that information universally? It doesn't have access to my domains' data for this.

Pages per visit - how does Google gather that information universally? It doesn't have access to my domains' data for this.

Wilburforce
msg:4578319
7:56 pm on May 27, 2013 (gmt 0)

They suggest (not me) it is primarily link oriented.


I personally think - and my own site's behaviour is consistent with, though not proof of nor my only evidence for, this - that links are a strong element, but certainly not an exclusive one.

As I have previously posted, a couple of sites in my sector have followed the same pattern for a couple of key terms: several weeks before Penguin, slipping from the top of page 1, then wavering between pages 1 and 2, then disappearing (i.e. a fall of 10-50 pages) in the first day or so after an iteration of Penguin. Three ex-#1 sites including mine have all gone the same way.

As I have also posted (make of it what you will), for several weeks three ex-page-1 sites for my main key term were all on the same results page (page 48, from memory) a month or two ago.

Somewhere in all this I suspect there is a poisonous Panda/Penguin cocktail: if some on-page issues are flagged AND linking issues are added, down to the bottom you go.

fathom
msg:4578322
7:59 pm on May 27, 2013 (gmt 0)

big brands naturally have a larger reach than a small business when it comes to links.


Sure and that is a competitive advantage... but link quality TRUMPS link quantity... and that evens the playing field again.

fathom
msg:4578325
8:10 pm on May 27, 2013 (gmt 0)

I personally think - and my own site's behavior is consistent with, though not proof of nor my only evidence for, this - that links are a strong element, but certainly not an exclusive one.


ABSOLUTELY!

That's why many domains are failing to compete: because "WE" have made link development into a completely segregated activity.

With Google's algorithm (with PENGUIN as a side dish at the moment), all factors are collectively modeled.

It makes sense that Google started PENGUIN 1.0 on the homepage... it was the easiest place to detect webspam. Ask 1,000,000 website owners to link to you, but don't tell them where to link or how to link, and you'll get brand-based links (company name or URL) to the homepage, what, about 90% of the time.

Tie that into their EMD algorithm and you can fathom why "brand based linking" has an enormous advantage... and why, when you don't emulate those values, you lose.

Now Google has taken another step deeper into the domain... and brands already had the advantage at the homepage, and the homepage tends to feed the rest of the website (unless of course it isn't where the root of your links is coming from).
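To make that 90% point concrete, here is a rough sketch of the anchor-text split being described - the brand name and sample anchors are hypothetical:

    # Bucket a link profile's anchors into brand/URL vs. keyword anchors.
    from collections import Counter

    BRAND = "examplewidgets"   # hypothetical company name

    def classify(anchor):
        a = anchor.lower().strip()
        if BRAND in a.replace(" ", ""):
            return "brand/URL"
        if a.startswith(("http", "www.")) or a.endswith((".com", ".co.uk")):
            return "brand/URL"
        return "keyword"

    anchors = ["Example Widgets", "www.examplewidgets.com", "examplewidgets.com",
               "Example Widgets Ltd", "cheap blue widgets", "best widgets uk"]

    counts = Counter(classify(a) for a in anchors)
    for bucket, n in counts.items():
        print(f"{bucket}: {n}/{len(anchors)} ({n / len(anchors):.0%})")

On this reasoning, a natural profile should skew heavily toward the brand/URL bucket; a profile dominated by keyword anchors is the kind of pattern Penguin is said to target.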

ColourOfSpring
msg:4578338
8:47 pm on May 27, 2013 (gmt 0)

Sure and that is a competitive advantage... but link quality TRUMPS link quantity... and that evens the playing field again.


I gave the example of how big brands will get the attention of august media outlets like ft.com discussing the financial issues of big brand X or Y - no way is a small boutique business with 1 to 3 employees going to gain the same attention. If you have £X,000,000 to spend on building branches in almost every high street in the UK, with £X,000,000 of ad spend, then you will get that kind of attention. Quantity of links is nothing, as we all know - it's just noise that will likely punish a site with a weak link profile anyway. The problem: the particular quality links that big brands have access to are out of reach of the small business site owner - specifically media links.

Let me put it like this - are you suggesting that a small business owner can usurp the truly big brands in Google? It seems highly unlikely given the way things are going (more and more weight given to big brands). If you think it IS possible for small businesses to "win back" their positions, then you seem to be suggesting that the big brands' dominance of many commercial SERPs may well be temporary, and that it's just a case of small businesses "figuring out" what to do to usurp the big brands. Or do you think, in general terms, that big brands have a natural advantage over small brands, and that therefore, generally speaking, big brands will continue to dominate?

[edited by: ColourOfSpring at 8:50 pm (utc) on May 27, 2013]

taberstruths
msg:4578340
8:49 pm on May 27, 2013 (gmt 0)

Fathom,

Searches for the brand would give a level of brand authority, or a level of user interest in the site. It would be a factor, not a catch-all.

Same with the other user metrics. You do not have to have 100% sampling to come up with a statistical average with a margin of error that is within acceptable limits. Think presidential polling data: 5% of likely voters. They get more than that amount of data from SERP CTR, Chrome, and sites with AdSense. Then take into consideration the data that is given by second-party providers such as Alexa, Compete, Quantcast, etc.

I would highly doubt they do not have these user metrics about your site unless your site is 100% blocked from Chrome users and the Alexa toolbar, has no AdSense, and appears in no SERPs.
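To put rough numbers on that sampling point, a quick back-of-envelope sketch - the traffic figures are made up:

    # Margin of error for a proportion measured on a sampled fraction of visits.
    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # 95% margin of error; worst case at p = 0.5
        return z * math.sqrt(p * (1 - p) / n)

    monthly_visits = 200_000
    observed = int(monthly_visits * 0.05)   # suppose Google sees only 5% of visits
    print(f"n={observed}, margin of error = +/-{margin_of_error(observed):.1%}")
    # -> +/-1.0%: tight enough to estimate something like bounce rate

So even a 5% sample of a site's traffic pins down a metric like bounce rate to within about a percentage point.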

ColourOfSpring
msg:4578348
8:53 pm on May 27, 2013 (gmt 0)

I would highly doubt they do not have these user metrics about your site unless your site is 100% blocked from Chrome users and the Alexa toolbar, has no AdSense, and appears in no SERPs.


Agree 100%. The reason Google developed Android and Chrome was to expand their data capture reach.

turbocharged
msg:4578351
8:58 pm on May 27, 2013 (gmt 0)

Once again, fathom, we return to you believing everything that Google says - quoting Google employee statements - even though you say you don't.

I could not find your case studies; maybe you could post a thread on them.

I don't think I am allowed to post a direct link to the case studies. But I will provide you some final stats for one of them and a snippet you can use to find it on your own, if you so choose.

The goal: test whether blasting hundreds of thousands of links each day can penalize a site or help it.

Site 1 - Wikipedia page: position 3 (starting position 4)
Site 2 - government site: position 258 (starting position 36)
Site 3 - small private site: not in the top 500 (starting position 42)

Every website not whitelisted is vulnerable to external forces outside of its control. I believe this because I have watched many of these case studies with interest, and they all end in some unknown website being driven into oblivion.

I won't discuss this any further with you, fathom. My opinion is formed by evidence, whereas yours appears to be subjectively based on biased interpretations of what Google employees say. Such statements by Google employees are not evidence but merely public statements that are in direct conflict with everything I have witnessed.

taberstruths
msg:4578352
9:04 pm on May 27, 2013 (gmt 0)

Turbo,

It would be interesting to look at the user metrics of the 3 sites and establish whether they had near-identical user metrics - the same with whether or not they had the same link base to start with, and whether the links slammed at the sites were in proportion to the links they started with.

Sgt_Kickaxe
msg:4578361
9:46 pm on May 27, 2013 (gmt 0)

I don't know what the argument above is about, and I don't have the time to figure it out, so forgive me if this angers anyone.

- All sites are not treated equally
- All topics are not treated equally

Those are the only two truths with Google at this time. There is a difference in how Google handles informational and transactional sites, and there are many layers of differences in how Google handles different stores/sources of information. In fact, Google wants to know where you prefer shopping so it can give you results you'll appreciate based on your habits, so every Google user is served potentially different results.

Penguin further divides the personal user experience in a way that makes "blanket" statements completely irrelevant.

Topics: If your site is in ANY way about a product that can currently be purchased widely in stores, then it is highly unlikely that you will find search results for new, obscure, affiliate or less well-known websites ranking in the top 10 (after an initial surge of traffic at launch while Google figures the site out).

Informational: More and more subjects are treated equally strongly by Google.

The end result is that if your site is not well established or does not "wow" people in some way, the odds of it being ranked well for mainstream news/shopping keywords are slim.

I just don't buy that Penguin is a punisher of websites anymore; it's more of a "does this site fit within our strict new expectations for this keyword" and "is this site targeting our shortlist of elite keywords" type of scenario.

You're not ranking top 10 for a product found in stores without some serious net clout, and likewise for trending news - end of story. Far too many webmasters keep trying to crack this now-impossible shortlist of Google-coveted keywords.

fathom
msg:4578362
9:46 pm on May 27, 2013 (gmt 0)

I don't think I am allowed to post a direct link to the case studies.


A legitimate case study... sure, why not?

But I will provide you some final stats for one of them and a snippet you can use to find it on your own, if you so choose.

The goal: test whether blasting hundreds of thousands of links each day can penalize a site or help it.

Site 1 - Wikipedia page: position 3 (starting position 4)
Site 2 - government site: position 258 (starting position 36)
Site 3 - small private site: not in the top 500 (starting position 42)

Every website not whitelisted is vulnerable to external forces outside of its control. I believe this because I have watched many of these case studies with interest, and they all end in some unknown website being driven into oblivion.


Did the government website (for example, NASA) and the small private site rank for millions of competitive phrases to start with, showing that the three domains were equal in the opening comparison apart from whitelist status?

I find it odd that your test case sort of suggests a negative-SEO test and puts forward a hypothesis that maybe Wikipedia is whitelisted, but you did not start with confirmation of that point, so I cannot see how you can conclude that whitelisting is the cause of the near-null impact.

I did not try the case study, but maybe Wikipedia's massive amount of quality trumped your inadequate quantity.

How did you resolve that?

You likely proved that most websites are vulnerable, but when the internal linking structure dwarfs your test parameters you do not get conclusive data; the lack of supporting data does not confirm anything other than inconclusive results.

You're saying "I don't have the ability to test this" - not "I tested this."

I won't discuss this any further with you, fathom. My opinion is formed by evidence, whereas yours appears to be subjectively based on biased interpretations of what Google employees say. Such statements by Google employees are not evidence but merely public statements that are in direct conflict with everything I have witnessed.


The problem with a 2 + 2 = 4 equation: you don't have that... you have _____________________ = 4, and you pretend the 2 + 2 is all that exists on the other side.

Wilburforce
msg:4578365
10:26 pm on May 27, 2013 (gmt 0)

You're not ranking top 10 for a product found in stores without some serious net clout


I don't supply a store-based product: mine is an over-supplied niche market in the service sector.

Until Penguin 1, the only pages in that sector returning good positions were well-optimised with good content (and that scenario persists on all other main search engines).

All the well-optimised good content has now gone (no page 1 result from 18 months ago is in the current top 10 pages), and the current top 10 - none above PR3 - really do make you wonder what they are doing there.

I appreciate that Google cannot be looking with human eyes at every search term in every sector, and if any Google employee looked at the main terms I track they would struggle to avoid embarrassment.

The problem is that really embarrassing results are not confined to niche service queries. I have said before that Google is the way people find things. I was one of those people. I now intermittently try Google for personal searches I have already conducted using Bing, and it only performs better than Bing for complex (long) search terms and music/videos: I can't find things on Google now.

The thing is, having an idea of how a search engine works and knowing how to search are part of the same process. When you make its operation completely incomprehensible, you make it useless.

I am now following (and posting in) these threads more out of curiosity than from any sense that there might be a solution. I really do think Google organic listings are kaput, both for webmasters and for users.

spina45
msg:4578368
10:41 pm on May 27, 2013 (gmt 0)

I don't mean to interrupt a passionate debate... but related to the Penguin update, I have a comment/question. I'm seeing something alarming in the SERPs and would like to get opinions from others.

Out of nowhere, a new site is at #1 for a specific key phrase. One word of the phrase is "DVD." When I Cmd-F and search his page, I see 81 occurrences of the word "DVD." I thought this was considered "keyword stuffing" and would result in a penalty... not a reward.

The reason there are 81 occurrences is that under his description he lists "You may also be interested in the following..." and he includes "DVD" as part of every title name, e.g. Title 1 DVD, Title 2 DVD, etc.

Hmm. This got me thinking. I use "add to cart" as a text link on all my category pages, so in one instance I have "add to cart" repeated 100 times on a certain category page. "Add to cart" is listed in GWT as one of my most-linked terms. I never liked knowing this, but figured: what else could I do?

After seeing these results I want to change “add to cart” to “Buy DVD.” Does anyone see any problems I may face by doing this? And if this does help me, so much for being shy about repeating keywords!
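Incidentally, that Cmd-F check can be scripted; a rough sketch, where the URL and keyword are placeholders rather than the actual site being discussed:

    # Count keyword occurrences in a page's visible text, plus a crude density.
    import re
    from urllib.request import urlopen

    def keyword_stats(url, keyword):
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<(script|style).*?</\1>", " ", html, flags=re.S | re.I)
        text = re.sub(r"<[^>]+>", " ", text)    # strip remaining tags
        words = re.findall(r"[a-z0-9']+", text.lower())
        hits = words.count(keyword.lower())
        return hits, hits / max(len(words), 1)

    hits, density = keyword_stats("http://www.example.com/", "dvd")
    print(f"'dvd' appears {hits} times ({density:.1%} of all words)")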

turbocharged
msg:4578370
10:48 pm on May 27, 2013 (gmt 0)

The problem is that really embarrassing results are not confined to niche service queries.

I concur, and have noted how Google is now displaying both Amazon USA and UK listings for some product searches originating from the United States. I'm not an Amazon guru, but the last time I heard, Amazon UK would not accept orders from shipping destinations in the USA. So much for relevancy. :)

I really do think Google organic listings are kaput, both for webmasters and for users.

Once again I agree. We are probably real close to paid listings pushing many organic listings off page 1 and beyond if there are enough AdWords advertisers. Google's trimming down of the organic results, sometimes to 7 or 8 listings, would indicate a desire to reduce the competition created by organic listings and create a more profitable user experience.

Wilburforce
msg:4578372
11:06 pm on May 27, 2013 (gmt 0)

Does anyone see any problems I may face by doing this?


I wouldn't change anything without waiting (for weeks, not hours) to see whether the other site maintains a good position.

Because "add to cart" is in such widespread use it may have immunity to keyword stuffing that other phrases do not have, so I wouldn't be confident that a block replacement will have

1. any positive, and
2. no negative

consequences.

Knee-jerk responses to Google are more likely to do harm than good. I would say "concentrate on the end-user", but I'm not sure that works any more.

spina45
msg:4578375
11:41 pm on May 27, 2013 (gmt 0)

> I would say "concentrate on the end-user", but I'm not sure that works any more.

Yeah, you got that right! I hear you regarding "knee jerk", but since sales have been way down for too long, I changed it "just to see." I'm getting a larger and larger share of traffic from StumbleUpon and Pinterest, and I don't even have accounts - it's other people sharing my "terrific content." I thought that's exactly what Google wanted and rewarded. So much for "do no evil."

rango
msg:4578384
11:55 pm on May 27, 2013 (gmt 0)

An observation from my stats before this update: a few days in advance, we had a traffic bump, up 10-15% (which for us is a significant number). It lasted about a day, and then the update happened.

This has happened in the past with updates also. The theory is that Google sends a bit more traffic to test whether users are satisfied - part of the quality-scoring algorithm that is more associated with Panda.

But two things perplex me about that.

1. Panda is supposedly continuous now, so why the need for a bump any more?
2. The update was Penguin and focussed on inbound links.

It's just not quite adding up for me. Is Penguin more than just links? Or is Panda still subject to refreshes? Or is something else causing the pre-update traffic bump?

Context: we are affected by Panda
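For what it's worth, that kind of bump is easy to spot mechanically: flag any day that runs 10%+ above its trailing 7-day average. A rough sketch with invented figures:

    daily_visits = [4100, 4050, 4200, 3980, 4150, 4120, 4080,   # baseline week
                    4700,                                        # the bump day
                    4090]                                        # back to normal

    WINDOW, THRESHOLD = 7, 1.10

    for day in range(WINDOW, len(daily_visits)):
        baseline = sum(daily_visits[day - WINDOW:day]) / WINDOW
        if daily_visits[day] >= baseline * THRESHOLD:
            print(f"day {day}: {daily_visits[day]} vs trailing avg "
                  f"{baseline:.0f} -> possible pre-update test traffic")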

diberry
msg:4578392
12:30 am on May 28, 2013 (gmt 0)

I'm not sure what you're considering commercial/transactional queries. The main keywords I'm referring to are used for people looking for free and/or to buy.


You're right, it's somewhat subjective. Basically, some queries are very obviously purely informational, like "What was Elvis' middle name" or "how to [just about anything]." Other queries, like "Canon T3i Rebel", could go either way unless the person adds "best price" or "buy" or "reviews" to clarify. And queries that use "shop", "buy" or "price" are almost certainly leading to a transaction. But there is a lot of overlap.
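A toy version of that split - the word lists are purely illustrative, not anything Google has published:

    # Classify queries by transactional modifiers vs. informational patterns.
    TRANSACTIONAL = {"buy", "price", "shop", "cheap", "deal"}
    INFO_PREFIXES = ("how to", "what is", "what was", "why ", "who ")

    def query_intent(q):
        q = q.lower()
        if any(word in q.split() for word in TRANSACTIONAL):
            return "transactional"
        if q.startswith(INFO_PREFIXES):
            return "informational"
        return "ambiguous"   # e.g. a bare product name

    for q in ["what was elvis' middle name", "canon t3i rebel",
              "canon t3i rebel best price", "buy dvd online"]:
        print(f"{q!r} -> {query_intent(q)}")

The interesting cases are exactly the "ambiguous" ones, which is where the overlap lives.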

As much as Penguin is supposedly about link profiles and anchor text, could it be that other factors are used to determine if a site qualifies for a penguin penalty?


Definitely, because Google has never specified what Penguin is, other than to basically say we'll never figure it out. And that suggests that it IS a more complex pattern like you describe.

You do understand, of course, that even if your experiences and anecdotal evidence covered hundreds, or thousands, or even ten thousand sites, it would still be statistically insignificant compared to the total number of sites or URLs out there, of every type and every topic.


He's not offering this as a theory to be established, but rather as a counter to fathom's assertions that these algorithms never have unintended consequences. Of course they do - algos, like juries or polls, do a good job on the whole but aren't perfect, especially with outliers. Of course not every site is going to rank precisely where the Google engineers would hope the algo would put it. To make the argument that Penguin is not perfect, you don't need a substantial dataset. You only need a substantial dataset to "prove" Google has misled us about the purpose of Penguin or is deliberately mistreating certain sites, which is a whole other topic.

I have yet to see a single website negatively impacted that did not have something to do with links.


Maybe that's because your mind is so closed that people doubt you would be objective if they DID show you their sites so you could verify that they weren't link spamming. I know that's how I feel.

I just don't buy that Penguin is a punisher of websites anymore; it's more of a "does this site fit within our strict new expectations for this keyword" and "is this site targeting our shortlist of elite keywords" type of scenario.


This is what I'm wondering too, with the way I'm seeing brands, brands, brands with NO indie sites mixed in on some queries. It's like Google's trying to decide who to include rather than who to exclude.

I'm not an Amazon guru, but the last time I heard, Amazon UK would not accept orders from shipping destinations in the USA.


That's not true anymore. I've been ordering Region 2 DVDs from them for years now. So in this case, what Google's doing MIGHT make sense for consumers (it would for me if I searched Google instead of just going straight to Amazon).

Is Penguin more than just links?


Google has asserted that we'll never manage to reverse-engineer Penguin. Therefore, it HAS to be about more than backlinks, unless that statement was an elaborate bluff on their part. But also consider: why would they spend a lot of money and energy developing a new algo that just looks at backlinks, something the old algo has done reasonably well with for years?

Whitey
msg:4578393
12:32 am on May 28, 2013 (gmt 0)

@rango - can you clarify whether you were also affected by the Penguin updates?

Just seen a network of over 100 websites involving small businesses demoted in Penguin 2.0 for off-topic footer interlinking. The sites were all managed by one SEO. Can't understand why Google had to wait until Penguin came along to adjust this network - these have been ABC penalty/filter tactics for over 8 years. I think the SEO might find it hard to stay in business, if ever they deserved to be in business, for implementing this across their client base. They are apparently telling clients that they are trying to figure out what went wrong. I wonder what they read - terrible.

The funny thing is the footer links provided no lift for the keyword anchor text anyway, so Google didn't need to penalize the sites. Still... this is not good SEO.

Lorel
msg:4578394
12:36 am on May 28, 2013 (gmt 0)

I was checking to see why a client's preferred keyword was only ranking on page 2 (this site was not affected by Penguin 2.0), so I checked the top-ranking indie site (#1 was Amazon), and almost all its backlinks are on a network of domains it owns (it even links them all with the same logo on all sites). The next step is to file a complaint about the Penguin 2.0 SERPs that MC provided?

[edited by: Lorel at 12:46 am (utc) on May 28, 2013]
