
Google SEO News and Discussion Forum

Penguin 2.0 is upon us - May 22, 2013
viral




msg:4576742
 12:52 am on May 23, 2013 (gmt 0)

Matt has announced Penguin 2.0 (the fourth Penguin launch). Either way, it is out there and affecting rankings.

Is anyone noticing much movement in the serps? I personally haven't seen much flux but Mozcast seems to be feeling something.

[mattcutts.com...]

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

[edited by: Brett_Tabke at 12:12 pm (utc) on May 23, 2013]
[edit reason] added quote [/edit]

 

netmeg




msg:4578638
 4:21 pm on May 28, 2013 (gmt 0)

I've been wondering why sites are still doing this.


As I say, not every CMS or cart makes this easy (especially if you have to pull in things like tiered pricing and so forth), and a lot of businesses can't afford to completely overhaul their online ecommerce site (especially if they've been dinged by Google). But ultimately that is what has to happen.

Leosghost




msg:4578642
 4:30 pm on May 28, 2013 (gmt 0)

Leosghost, are you talking about Google caching SERPs?

No..

Perhaps this is what you are saying when you mention "pre-sorting"

No.. what I'm talking about is "pre-sorting": discarding (or sufficiently "demoting" the "relevance score" of) some pages, and thus in some cases some sites, prior to the actual running of a query. So the database or "set" that they have to run live queries or searches against is smaller, and the "pre-sorting" happens outside of "real time search".

Hence "recovery" can take much longer than it used to, and in some cases might never happen for some pages.

Running search queries in real time against the full set of all pages that were ever crawled (and returning a SERP page or pages) is no longer feasible for them.

So it appears (all my observations, and the observations of many others, appear to bear this out) that they do not.

They continue to crawl all they can (and in considerably more depth than Bing), but they do so less frequently, and the frequency is determined by whether they "expect" a site's content (based upon their historical data about the site(s)) to change rapidly or to contain "fresh material" at short intervals (in simple terms, hence WebmasterWorld gets crawled more frequently than a mom-and-pop site). But they do not "interrogate" the full index for each query "live".

Even if they do not throw anything away, they no longer go looking through the entire index they have when you ask "how to" or "how much is" or "where can I buy". Frequently, when you make your search, they reinterpret what you said into what you maybe "meant to say" ("instant" is a good example of this in action), and then they give you back results that they think you'll like and find relevant, drawn from a "set" that they keep "pre-sorted" and "nearer and easier to put their hands on".

The more people that appear to G (via whatever metrics they are using) to be satisfied with what was nearest to hand, the more G keeps such pages and sites in a "nearer to hand" box or "set", and it is quicker to "go look" for them too. :)

Think along the lines of "seek time" on spinning-rust HDs: the bigger the HD capacity, the longer the seek time, even with a given quantity of RAM (and Windows, which stores files in a linear system, is slower than Linux and other Unix-type file systems which don't, one reason why they did not set up on winboxen). It is easier and faster to get "answers" / "find files that match criteria via an algo" out of a small set (HD) than out of a big set (HD), and from time to time you run filters (more complex algos) on the big set to decide what you want to include in the small set(s). Solid state has sped things up (although the "write cycles life" is shorter ATM), but hardware speeds still make for bottlenecks, and having to match across the entire index is also subject to hardware constraints, which are worse than matching across "subsets".

What Matt mentioned about "caching" (and in particular "local caching") is a closer fit to why, even with geo-targeting nullified as much as one can, you in the UK will not see the same SERP for the exact same query as I will in France, or as diberry or tedster will in the USA.

However, local caching at their DCs (even allowing for their DC balancing and "hand offs") will affect to a degree the subsets that we individually see, as will any other personalisations included by G.

purplekitty




msg:4578648
 4:53 pm on May 28, 2013 (gmt 0)

It would be funny if Google weren't driving 90% of commercial internet traffic.

This isn't true at all. It might have been back in its heyday, but as I, and others, have posted, many people are bypassing Google altogether, and social media is on the rise as a way people are finding products.

If you're talking about search traffic, I wouldn't even say Google has a 90% share of that either.

ColourOfSpring




msg:4578660
 5:51 pm on May 28, 2013 (gmt 0)

If you're talking about search traffic, I wouldn't even say Google has a 90% share of that either.


In the UK, it's above 90% search share on the recent surveys I've read. Sad to say, but search traffic is high-intent - sure you can win traffic from Pinterest and other sites, but I do wonder what kind of intent that traffic has...(of course dependent on the type of site you have)

ColourOfSpring




msg:4578669
 5:58 pm on May 28, 2013 (gmt 0)

Leosghost thanks for the explanation on "pre-sorting" - fully understood now. It makes me think of how Bing and other search engines deal with an ever-expanding internet.

simplo




msg:4578726
 8:17 pm on May 28, 2013 (gmt 0)

Although I have not been affected by Penguin 1.0 or 2.0 directly, I'm seeing favorable results in the SERPs after Penguin 2.0. Starting Friday, May 24th, a blog that popped up back in January with spammy content and outranked me on 10+ phrases has dropped from 1st to 5th or lower consistently, without recovery. My longtime competitor went from 3rd to 1st on some keywords. I'd rather share the traffic with them than with some spammy blog. So, from my perspective, in this example the update appears to be targeting the correct sites.

This blog did have a link farm of approximately 10 sites, which seemed small but some sites contained up to 30k indexed pages.

kellyman




msg:4578737
 8:43 pm on May 28, 2013 (gmt 0)

Something else is brewing. As mentioned prior, my home page keywords got hit; however, all my inner pages responded with slightly improved rankings.

Tonight some keywords that were homepage targets have moved to the inner pages, and keywords I have never ranked too well for have appeared on the home page.

It makes me think that anchor keyword density is playing a much bigger part than I first thought.

scottsonline




msg:4578739
 8:47 pm on May 28, 2013 (gmt 0)

They seem to be going backwards. We are seeing a ton of #1 spots filled by template manufacturer sites with zero back links. I guess now is a good time to start a new site?

bwnbwn




msg:4578746
 9:06 pm on May 28, 2013 (gmt 0)

They seem to be going backwards. We are seeing a ton of #1 spots filled by template manufacturer sites with zero back links. I guess now is a good time to start a new site?
I call this the Google Churn: testing different aspects or variables of the recent P2 launch. It won't stay that way, I am 99% sure. Still too early to do anything but wait till you can see some stability in the serps. Right now is not a good time to do anything.
Whitey




msg:4578798
 11:37 pm on May 28, 2013 (gmt 0)

I'm proposing some renewed perspective on this thread:
In reading and re-reading many of our threads, including this one, I think the problem is that people are assuming that Penguin is only about backlinks. There are many reasons NOT to assume that. [webmasterworld.com...]


Let's get some more data-focused points and encourage one another to think outside of just backlinks. (I posted an earlier question without response, so I reposted the question here for those unaffected by link notices: [webmasterworld.com...] )

Leosghost




msg:4578804
 11:53 pm on May 28, 2013 (gmt 0)

Who do you think is thinking about just backlinks..other theories have already been put forward..

"Encouragement" to do otherwise ( than to "encourage one another to think outside of just backlinks", which is not what is happening in this thread ) would appear to me to be superfluous, and conflating two separate threads would ( IMO ) serve no purpose..especially as this thread is already 10 pages in length...if people are not joining in and responding to your post in the other thread..it is perhaps because they either disagree with your premise ..or merely do not wish to comment in that thread..

Better to bump that thread..( and see if it "grows legs" ) than to attempt to make the "straw man argument" to divert / narrow / restrict / the focus of this one.. :)

* in fact your proposition does not narrow it..but does IMO attempt to quash discussion of "backlinks" in this thread..when all possibilities may well be relevant..in this thread..

Asking for "data"..without offering one's own ideas at to what might be going on or be influencing ..always sounds ( again IMO ) like the threads in "apache" where people ask for ideas ( ie; "solutions" ) without offering any of their own ..or attempting "shepherding" threads..for which we have designated admins and mods..:)

[edited by: Leosghost at 12:06 am (utc) on May 29, 2013]

seoskunk




msg:4578807
 12:03 am on May 29, 2013 (gmt 0)

Agree with Leosghost, and other theories have been put forward.

But links have historically been the primary divider and ranking factor on Google, so it's only natural to think of them first, and they are the fundamental basis of discussion.

Leosghost




msg:4578810
 12:17 am on May 29, 2013 (gmt 0)

I'm proposing some renewed perspective on this thread: In reading and re-reading many of our threads, including this one, I think the problem is that people are assuming that Penguin is only about backlinks. There are many reasons NOT to assume that. [webmasterworld.com...]


Let's get some more data-focused points and encourage one another to think outside of just backlinks. (I posted an earlier question without response, so I reposted the question here for those unaffected by link notices: [webmasterworld.com...] )

So ..you think it might all be down to what..precisely..

Speculate, rather than ask for "the data" from others and then "collate"..

One does not get to understand what makes people hold viewpoints, merely by asking for them and then collating them..one must occasionally take the plunge and make an analysis of one's own, and do so before "consensus" appears to have been reached..

Some of the best and most enlightened things ever posted here came "off the wall" ..and not as result of waiting to see which way admins, mods or senior members were leaning..

Many of us who fall into those three categories, may have been doing this for so long, that we may be unable to see the wood for the trees..

Especially those of us whose livelihoods depend upon search and the internet..

The story of "The Emperors clothes" ( or was it the king ? Danny Kay sang about "the king" it seems to me? but the original story was about "The Emperor" ) and "all that" springs to mind..:)

Whitey




msg:4578814
 12:47 am on May 29, 2013 (gmt 0)

links have historically been the primary divider and ranking factor

links have historically been seen as the primary divider and ranking factor

That's part of the problem, IMO. Search has shifted. I agree with Tedster's sentiment.

Links have been cleaned. Link notices have been ignored. No difference in the relatively small number of examples provided in the latter. So Penguin is not just about links.

So what stands out in this thread, and elsewhere around the net, as a major possibility to anyone? After 284 posts I might well have missed something. Forgive me - I understand the valuable efforts and contributions of all involved.

turbocharged




msg:4578815
 12:51 am on May 29, 2013 (gmt 0)

Seems like setting up one page with all sizes/colors listed as the canonical and pointing all those other sizes/colors to this one would fix that problem.

Agreed. Amazon does quite well with affiliate links using rel=canonical to point to the non-affiliate URL. Maybe quite well is an understatement these days. :)
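For anyone who wants to see what that looks like mechanically, here is a small hypothetical Python sketch (the URL layout and parameter names are invented; adjust to your own cart) that maps any size/colour/affiliate variant of a product URL back to one canonical URL and emits the tag the variant page would carry:

from urllib.parse import urlsplit, urlunsplit

VARIANT_PARAMS = {"size", "colour", "color", "aff", "ref"}  # assumed variant params

def canonical_url(request_url):
    """Strip variant query parameters and return the canonical product URL."""
    parts = urlsplit(request_url)
    kept = [p for p in parts.query.split("&")
            if p and p.split("=")[0] not in VARIANT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "&".join(kept), ""))

def canonical_tag(request_url):
    """The <link> element a variant page would carry in its <head>."""
    return '<link rel="canonical" href="%s">' % canonical_url(request_url)

print(canonical_tag("https://www.example.com/widgets/blue-widget/?size=xl&aff=123"))
# -> <link rel="canonical" href="https://www.example.com/widgets/blue-widget/">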

Leosghost




msg:4578817
 1:12 am on May 29, 2013 (gmt 0)

I, for one, would love to be doing as "quite well" as Amazon..and I'm doing "quite well"..

@Whitey..so what do you think is a major possibility..?
( apart from "agreeing with Ted", who hasn't actually said what he thinks is a possibility..only what he thinks is not as important a possibility, as he thinks some do/might ) ..we all have our own slightly ( or largely ) differing opinions..yours is..? ..why..?

Because if we all only speak about ( or collate ) what others are speaking about ( here and around the net ) ..we get nothing new ..only the lowest common denominator..and that gets you a camel ..instead of a horse..:) Great in the desert..crap when racing against other horses in the Arc de Triomphe..

[edited by: Leosghost at 1:30 am (utc) on May 29, 2013]

Leosghost




msg:4578821
 1:26 am on May 29, 2013 (gmt 0)

<slightly OT>
Amazed that anyone still does the "page per size" and "page per colour" thing..

Page or even site per garment ( think.. jacket .com and pants.com etc )..yes..works great..

But within those..drop downs ( or keep them on the page, in some way ) all the way..

Let them go to a "open new page" and you might just lose them to a bad connection, a "time out", a DNS fault, or any other glitch..or their browser might simply "crash"..

And when they come back to find you via..Google ( or whoever )..they might get your competitor ( instead of you ) at #1 slot..and decide to try them, instead of continuing on through your cart and buying..
</slightly OT>

taberstruths




msg:4578848
 3:51 am on May 29, 2013 (gmt 0)

Here are the theories I am working with now

Link velocity and traffic stats are tied together now.

Links with activity at originating source are given greater value than links with no activity. Activity = social signals and traffic including actual click throughs.

Bounce rate, time on site, and page views are used to give a site a quality rank.

Those are what I am doing some experimentation with at the moment.
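Since these are only working theories, here is nothing more than a toy Python sketch of how such signals could be combined into a score - every weight and threshold below is invented, and there is no claim that Google computes anything like it - just to make the hypothesis testable against one's own analytics data:

def toy_quality_score(new_links_per_week, visits_per_week,
                      active_link_ratio, bounce_rate,
                      avg_time_on_site_s, pages_per_visit):
    score = 0.0
    # links arriving far faster than traffic grows looks unnatural
    link_to_traffic = new_links_per_week / max(visits_per_week, 1)
    score -= min(link_to_traffic * 10, 3.0)
    # links from pages that actually get clicks/shares count for more
    score += 2.0 * active_link_ratio            # 0..1 share of "live" links
    # crude engagement component
    score += (1.0 - bounce_rate)                 # 0..1
    score += min(avg_time_on_site_s / 180.0, 1.0)
    score += min(pages_per_visit / 5.0, 1.0)
    return score

# lots of links, little traffic, dead link sources, poor engagement:
print(toy_quality_score(500, 200, 0.05, 0.85, 20, 1.2))   # low
# modest links, real traffic, active sources, decent engagement:
print(toy_quality_score(20, 2000, 0.60, 0.40, 150, 3.5))  # higher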

Martin Ice Web




msg:4578906
 7:27 am on May 29, 2013 (gmt 0)

@Leosghost,

It is now a little off-topic.
I put all my similar widgets on one page with one description.
From this I link to the single page - using the widget name as anchor - because there is a need for PDF downloads and other small factors I can't put on the summary page. The single page has a canonical tag. Google picked this up.
The summary page exists for all colors, linking to the widgets with the length and the one color.

So you mean I have to summarize all colors and lengths into one page by using dropdowns?
Should the dropdown then link to the single page in order to buy?

Has Google become this goony not to see that this is not spam but made for users?

AND! Is this user friendly? Because I can't put stuff like stock, a question button... into the dropdown. The user ->has<- to go to the single page to see more information.

I saw that eBay is using this, and some clothing sites.

Maybe it is time to use Ajax? But I don't know what Google thinks about a page with multiple datasets.

seoskunk




msg:4578960
 10:51 am on May 29, 2013 (gmt 0)

Do you think negative pagerank exists?

I have heard a number of people speculate on this: just as PageRank flows through "dofollow" links, what about negative PageRank flowing?

So in effect a penalty is passed on by outbound links. Where one site in a network is penalised, that dampening factor is passed through negative PageRank to other sites in the network.
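Negative PageRank remains pure speculation, but the mechanics are easy to picture. A toy Python sketch (graph, seed penalties and damping factor all invented, not a known Google mechanism) of a penalty that flows along outbound links the same way PageRank does:

def propagate_penalty(outlinks, seed_penalty, damping=0.5, iterations=20):
    """outlinks: {site: [sites it links to]}, seed_penalty: {site: initial penalty}."""
    penalty = {site: seed_penalty.get(site, 0.0) for site in outlinks}
    for _ in range(iterations):
        nxt = {site: seed_penalty.get(site, 0.0) for site in outlinks}
        for site, targets in outlinks.items():
            if not targets:
                continue
            share = damping * penalty[site] / len(targets)
            for t in targets:
                if t in nxt:
                    nxt[t] += share
        penalty = nxt
    return penalty

network = {
    "penalised-hub": ["site-a", "site-b"],
    "site-a": ["site-b"],
    "site-b": [],
    "clean-site": [],
}
print(propagate_penalty(network, {"penalised-hub": 1.0}))
# site-a and site-b inherit part of the hub's penalty; clean-site stays at zero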

Wilburforce




msg:4578997
 1:06 pm on May 29, 2013 (gmt 0)

I have heard a number of people speculate on this


A principal difficulty is that it isn't really possible to do much more than speculate on any of it.

What we are seeing is some previously well-placed pages/sites/key-term results lose position - sometimes massively - for no obvious reason, and others appear in good positions in an equally obscure manner.

Some elements we know about almost certainly form part of an overall destructive score, and from here the following look likely prime candidates:

1. Backlink profile;
2. Internal links;
3. On-page and backlink keyword concentration;
4. Duplicate or similar content;
5. High outward-link count;
6. Hidden page elements.

I have a suspicion, also, that Google doesn't like breadcrumb menus (or, at least, not in combination with other internal menus and links).

Positive scoring is much harder to guess. I have seen page 1 results in the last year (not always for any length of time) that do not have any apparent relationship with the search term, or any apparent authority at all. The current page 1 results for my own key term look like they wouldn't recognise SEO if it bit them (although the #1 has a PR4 backlink, which probably accounts for a lot). However, there have also been quite protracted periods where the top positions have been dominated by very black-hat spammy-looking pages.

It looks to me as if anything at all that could possibly be construed as gaming Google's algorithms - even if it is very white-hat SEO - can now prove permanently fatal. My best shot at best practice for current Google SERPS would be:

1. Do not create or seek high numbers of low-quality backlinks;
2. Use key terms very sparingly;
3. Use lots of synonyms for key terms;
4. Use minimal internal and external links;
5. Avoid page similarity.

On the last point, if you have to have similar content (e.g. blue widget, green widget...) try to vary on-page descriptions rather than having blue/green as the only difference, and add a rel="canonical" link to the <head> section of the non-canonical version of each page.
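On point 5, a rough way to check how close two variant pages really are is word-shingle overlap. A small Python sketch (the texts and the 0.5 threshold are invented rules of thumb, not anything Google has published):

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def similarity(text_a, text_b, n=3):
    a, b = shingles(text_a, n), shingles(text_b, n)
    return len(a & b) / len(a | b) if (a | b) else 0.0

blue = "A sturdy blue widget with a brushed steel finish, ideal for home use."
green = "A sturdy green widget with a brushed steel finish, ideal for home use."
score = similarity(blue, green)
print(round(score, 2))   # substantial overlap: only the colour word differs
if score > 0.5:
    print("Consider rewriting one description and adding rel=canonical to the variant.")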

My current view of future Google organic results, however, is pretty gloomy. If I had any Google shares I would sell them now, as they cannot hold their market share indefinitely with the current mess, which has now even attracted mainstream media comment.

tedster




msg:4578998
 1:06 pm on May 29, 2013 (gmt 0)

I saw that eBay is using this, and some clothing sites.

Sure, if the list is relatively small - not in the dozens or more.

Maybe it is time to use Ajax? But I don't know what Google thinks about a page with multiple datasets.

See Google's Webmaster Help for Ajax [support.google.com].
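For reference, the scheme that help page documented at the time maps "#!" URLs onto an _escaped_fragment_ query parameter which the crawler requests instead, expecting an HTML snapshot in return. A minimal Python sketch of just the URL mapping, as I understand the documentation (check the linked page for the exact escaping rules before relying on it):

from urllib.parse import quote

def crawler_url(pretty_url):
    """Map example.com/page#!state into the URL Googlebot would request."""
    if "#!" not in pretty_url:
        return pretty_url
    base, fragment = pretty_url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # the fragment is percent-escaped; see the linked docs for the full rules
    return "%s%s_escaped_fragment_=%s" % (base, sep, quote(fragment, safe="="))

print(crawler_url("https://www.example.com/widgets#!colour=blue"))
# -> https://www.example.com/widgets?_escaped_fragment_=colour=blue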

Leosghost




msg:4579000
 1:11 pm on May 29, 2013 (gmt 0)

@Martin Ice Web

Items and pages .."drop downs" etc..
A thread that is dealing with a similar issue to this right now..:)
[webmasterworld.com...]

SerpsGuy




msg:4579008
 1:19 pm on May 29, 2013 (gmt 0)

One of my keywords is way down as of yesterday. Could be Google churn, but I don't see too much more movement.

Lorel




msg:4579114
 5:10 pm on May 29, 2013 (gmt 0)

@Wilburforce

In addition to your list below I would add focus on social networking.



1. Do not create or seek high numbers of low-quality backlinks;
2. Use key terms very sparingly;
3. Use lots of synonyms for key terms;
4. Use minimal internal and external links;
5. Avoid page similarity.

netmeg




msg:4579129
 5:57 pm on May 29, 2013 (gmt 0)

I have a suspicion, also, that Google doesn't like breadcrumb menus


Maybe, but most (maybe all by now) of the sites under my purview have breadcrumbs and I haven't noticed it having a deleterious effect.

bwnbwn




msg:4579130
 5:59 pm on May 29, 2013 (gmt 0)

I think, from what I have seen, a lot of old domains got hit - the ones we would target with specific anchor text links. The good ole link trading days have come back and bit us. I suspect the ones that were ranking either got lazy, or have too many webs to work on and let their work slip; without fresh links the site looks too black-hat and got whacked.

In some of the serps I am very familiar with, I see domains that I have never seen in the serps before. I took some time and examined the top sites, and it is bad, real bad. To name a few issues: no redirect from non-www to www, URLs like ?=id_25, horrible keyword-stuffed titles, no description. I really hope this won't stick, because it is really, really bad. BTW, I have no interest in this term - I just worked in this area for 10 years until we sold the domains in it, so I have no personal interest or investment. I am just looking because I know this area and what websites ranked before P2. The webs I work on now are all OK for now, but jeeze Google, I know you didn't want the serps to have this crap floating in the top spots.

Dymero




msg:4579221
 9:48 pm on May 29, 2013 (gmt 0)

2. Internal links;
4. Use minimal internal and external links;


There is, of course, a difference between going overboard on internal links and having a strong internal linking structure. It's also possible to underdo it and get no additional benefit.

I've had pages rank well based on internal links alone.
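A quick way to sanity-check an internal linking structure is simply to count internal inlinks per page. A toy Python sketch with invented data (in practice you would feed it the outlinks from a crawl of your own site):

from collections import Counter

internal_outlinks = {
    "/": ["/widgets/", "/about/", "/blog/"],
    "/widgets/": ["/widgets/blue/", "/widgets/green/"],
    "/blog/": ["/widgets/blue/", "/"],
    "/about/": ["/"],
    "/widgets/blue/": [],
    "/widgets/green/": [],
}

inlinks = Counter(target for links in internal_outlinks.values() for target in links)
for page in internal_outlinks:
    print("%-18s %d internal inlinks" % (page, inlinks.get(page, 0)))
# pages you want to rank that show very few inlinks are the ones a stronger
# internal structure would help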

Awarn




msg:4579251
 11:27 pm on May 29, 2013 (gmt 0)

I think we might be making this too hard. The person in the top position of my niche has the same menu and links and breadcrumbs on 10K pages. Now here is the interesting part. There is a mirror site that links 10K pages to the main site. Then there are another 10K links that are something like pci or cart links (they 301 redirect right back to the main site). Then another 10K links that are paid links without a nofollow. Now those links are from a site on the same subject matter. But if you look at it generically, you see 30K links from sites on the same subject matter. So maybe you just need multiple URLs, all on the same subject, linking to one another. I know there are errors in the site (404s). No structured data, nothing huge on the social side. I think the whole nofollow thing is a gimmick.

One of the other sites on page one has multiple URLs too, and they link them to the main site to create their backlink profile. They don't use any nofollows either. Both are like single-company link wheels using multiple domains.
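A toy Python sketch of how that footprint could be spotted (domains and link data invented): check how heavily the domains linking to the money site also link to each other - a high interlink ratio means the "backlink profile" is really one network.

links = {  # linking domain -> set of domains it links to
    "mirror-site.example": {"money-site.example", "cart-site.example"},
    "cart-site.example":   {"money-site.example", "mirror-site.example"},
    "paid-links.example":  {"money-site.example", "mirror-site.example", "cart-site.example"},
    "independent.example": {"money-site.example"},
}

def interlink_ratio(linkers, links):
    """Share of possible linker-to-linker links that actually exist."""
    linkers = list(linkers)
    possible = len(linkers) * (len(linkers) - 1)
    if possible == 0:
        return 0.0
    actual = sum(1 for a in linkers for b in linkers
                 if a != b and b in links.get(a, set()))
    return actual / possible

backlinkers = [d for d, targets in links.items() if "money-site.example" in targets]
print(round(interlink_ratio(backlinkers, links), 2))
# the closer this gets to 1.0, the more the "30K links" look like one company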

Garya




msg:4579265
 11:56 pm on May 29, 2013 (gmt 0)

One of my sites got hit; it's been up since 1998. It has always been on the first page, sometimes top or bottom of the page, over the years.

Over the past 10 years I acquired quite a few directory links and other types of links, but I never paid for them.

As of last week my homepage is now somewhere 10+ pages back in the listings.

So I went back and looked at all my links, and I found I have tons of backlinks from directories using the same key phrase - around 200 domains. They are all old links.

So it looks like 2.0 went after all these old links, and instead of just ignoring them or devaluing them, it penalized me for them.

How do I know this? My other site has a similar link profile but does not have many directory links, if any. And it improved in the rankings.

So logically if I remove all of the directory links I should recover as most of my inner pages are OK.

There is no other reason I could find for the penalty.
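One quick way to put a number on that kind of profile is to tally anchor-text concentration from a backlink export. A hypothetical Python sketch - the "anchor" column and the sample rows are assumptions, adjust them to whatever your link tool actually exports:

import csv
import io
from collections import Counter

def anchor_distribution(csv_file):
    """Tally anchor texts from a backlink export with an 'anchor' column."""
    anchors = Counter()
    for row in csv.DictReader(csv_file):
        anchors[row["anchor"].strip().lower()] += 1
    total = sum(anchors.values()) or 1
    return [(a, n, n / total) for a, n in anchors.most_common(10)]

# stand-in for open("backlinks_export.csv") - made-up data
sample = io.StringIO(
    "domain,anchor\n"
    "dir1.example,blue widgets\n"
    "dir2.example,blue widgets\n"
    "dir3.example,blue widgets\n"
    "blog.example,Example Company\n"
)
for anchor, count, share in anchor_distribution(sample):
    print("%-20s %3d  %.0f%%" % (anchor, count, share * 100))
# if one commercial phrase accounts for most of ~200 linking domains, that is
# exactly the kind of footprint people suspect Penguin 2.0 targets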

turbocharged




msg:4579273
 12:45 am on May 30, 2013 (gmt 0)

So it looks like 2.0 went after all these old links, and instead of just ignoring them or devaluing them, it penalized me for them.

Maybe they were not devalued before and now are. This may explain your drop in ranks. Did you receive any warnings in Webmaster Tools?

The way most people used directories was to submit the same info to each one by copying and pasting. Those that did not do this are fine. All duplicate content gets devalued, whether it's on your site or linking to your site.
