Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 60 message thread spans 2 pages; this is page 2.
Panda Updates Now Processed In Real Time?
SEOPTI




msg:4349757
 1:30 am on Aug 10, 2011 (gmt 0)

I'm quite sure, since two of my sites have completely recovered: one last Friday and the second one today, which means +400% for each site.

 

Rasputin




msg:4350363
 2:09 pm on Aug 11, 2011 (gmt 0)

Londrum, I think they would look at several related factors - for example if a page has a high bounce rate, a low time on page, followed by searchers quickly looking at another result on the same search page that would suggest dissatisfaction with the first result.

They would probably also compare user metrics only with those for a similar search - so if my page about 'big red widgets' keeps people interested for 2 minutes, and people then look at another page on the site, perhaps occasionally sharing on Facebook etc. while they are there, that would give specific user metrics - but only to compare with other searches for 'big red widgets', not with searches for 'euro-usd exchange rate' etc.

And if your average result across quite a few pages is not good, you perhaps get another point towards getting hit by Panda.

To be honest, although I've been hit by Panda myself, it does seem like quite a plausible approach to assessing pages and sites, at least for sites getting a significant amount of traffic, where metrics can be measured.

(Although I believe the implementation is currently flawed for certain types of site, such as those that are high quality but cover a broad range of topics, or where bounce rate etc can vary naturally in the way you describe between different sections of the site).

MrFewkes




msg:4350369
 2:32 pm on Aug 11, 2011 (gmt 0)

Hi,

Can you guys tell me how you know there's been another Panda update?

It confuses me how you can be so precise with version numbering updates right down to a decimal place!

Surely you cannot do this just by looking at the SERPs - what is the source of info?

Thanks

AlexB77




msg:4350398
 3:16 pm on Aug 11, 2011 (gmt 0)

Let me make myself a little clearer.

I talked about two pages above that were both hit by Panda in February, but I forgot to mention that neither page was improved in any way, other than adding Facebook, Tweet This, and +1 buttons to page 2. As I have already explained the situation with the P/R on both pages, I won't go over that again, but I will add one thing that in my opinion plays a major role here: bounce rate. Page 1 had a pre-Panda bounce rate of 30%, and after Panda 37-42%. Page 2 had a pre-Panda bounce rate of 46-52%, and after 27th July, 36%. As I have already mentioned, the only improvement made to page 2 was the social buttons; nothing else, not even a dot.

I hope this at least sounds like a fact.

londrum




msg:4350467
 5:29 pm on Aug 11, 2011 (gmt 0)

Rasputin: I think they would look at several related factors - for example if a page has a high bounce rate, a low time on page, followed by searchers quickly looking at another result on the same search page that would suggest dissatisfaction with the first result.

They would probably also compare user metrics only with those for a similar search - so if my page about 'big red widgets' keeps people interested for 2 minutes, and people then look at another page on the site, perhaps occasionally sharing on facebook etc, while they are there that would give specific user metrics - but only to compare with other searches for 'big red widgets', not with searches for 'euro-usd exchange rate' etc.


That all sounds good, but none of that is new. It would be very surprising if stuff like that wasn't built into the algo already. They must have been doing this kind of thing for years.

That is why I don't think user metrics play a part in what makes Panda "new" - because there is nothing new about user metrics.

When people improve stats like bounce rate and say they've seen an improvement in the SERPs, that can be explained easily... because it has always been so. It hasn't got anything to do with Panda per se, I don't think.

Lapizuli




msg:4350468
 5:36 pm on Aug 11, 2011 (gmt 0)

I don't have any facts, but I do think that Google mysteriously gained more confidence in THEIR user metrics with Panda. It may be totally off base, but it's the only explanation that makes sense to me right now - it explains why Google says what they say, why few people recover, the time lags of Panda updates... oh, lots of things.

Is there some reason why people are thinking we should have access to all the user data they do?

triggerfinger




msg:4350561
 8:59 pm on Aug 11, 2011 (gmt 0)

I'm sympathetic to the skepticism regarding changing our perception of SEO best practices.
My question is: if it's not unique content, and if it's not user metrics, then what is Panda? Links? What is left?

serenoo




msg:4350569
 9:18 pm on Aug 11, 2011 (gmt 0)

I was hit by Panda 1.0 in February. Yesterday my home page was number one. But I recovered only the home page, and we don't know if it will last over the next days. Probably tomorrow I will be ranked on page 100 again.
I am going on vacation, so I will not be able to reply to your questions.
What worked for me, I think, was Facebook and YouTube.

johnhh




msg:4350572
 9:28 pm on Aug 11, 2011 (gmt 0)

I am in the @Lapizuli camp.

After extensive A/B testing, Panda appears to have taken control of the SERPs away from what is actually on the page - or, put another way, away from what webmasters control.

If machine learning is what it is about, it could take many runs to achieve the required result. Certainly not real time; it all depends how close to the edge you are.

Currently testing a last-gasp approach before going for a site rewrite.

<edit> It is possible that the current recoveries were the result of a whitelist push-out </edit>

[edited by: johnhh at 9:31 pm (utc) on Aug 11, 2011]

canuckseo




msg:4350573
 9:29 pm on Aug 11, 2011 (gmt 0)

Search Engine Land says they've talked to Google, and Google says there isn't a Panda update or any other update going on right now:

[searchengineland.com ]

I think they are misinformed - since Sunday I've seen traffic drop on some of my sites and increase on others, but the net effect is a 10% loss in Google traffic.

I don't know if it was a hiccup or what - but Monday I was down almost 15%; it seems to be recovering somewhat, because yesterday we were only down 7%.

freejung




msg:4350598
 10:57 pm on Aug 11, 2011 (gmt 0)

I've seen a very slight but distinct improvement. Nothing like a "recovery," just the sort of thing you would normally expect due to normal algo changes. It's worth remembering that Panda is not the only thing they're tweaking. My guess at this point, until we see evidence otherwise, is that a couple of days ago there was a normal algo tweak significant enough to produce noticeable effects, but not an actual Panda update.

I still find that encouraging, as it means that a Pandalized site can still benefit from other changes. I was beginning to think that once Pandalized, you would not see any improvement at all for any reason whatsoever unless you were rescored by Panda.

whatson




msg:4350630
 2:28 am on Aug 12, 2011 (gmt 0)

I think there is far too much speculation here on user metrics. It is an uneven playing field, as a lot of sites don't use Analytics, WMT, AdSense, etc.

High or low bounce rates do not indicate good or bad quality. What if someone looked up GOOG and just wanted to know the stock price, clicking back in a few seconds, whereas someone else might want to read all the news and financial data and stay for 10 minutes? Both visitors have found what they wanted, yet had different needs and therefore different behavior.

Whereas a site that is not well designed and is hard to navigate may cause the visitor to spend more time searching for what they are seeking.

There are a lot of different ways to design a website. Imagine someone looking for the year the MGM Grand in Las Vegas was built. One site might have it in the corner in an easy-to-find table, whereas another might have it embedded in content that takes some time to read through.

Make no mistake, people: this is a content algorithm.
In Google's Webmaster Guidelines, under Quality Guidelines, they clearly say "Don't create multiple pages, subdomains, or domains with substantially duplicate content."

Lapizuli




msg:4350640
 3:14 am on Aug 12, 2011 (gmt 0)

Of course bounce rate is a noisy signal, as the saying goes. But Google has access to way more than bounce rate. I can only imagine the patterns their data must show, and if they can, they'll learn how to interpret all that chaos and recognize what the data means about their users' satisfaction.

That's a major part of cognition - the detection of associations and patterns. A person doesn't have to smile and say "I'm happy" for another person to know he's happy. Google's trying to read indirect signals as well as a human being would, using just data, and they have confidence they're getting closer - I believe, anyway. As I said, I could be way off. But it makes sense to me.

I do believe we're losing control of SEO - a natural consequence of Google getting smarter and not needing the same signals they literally asked for previously. (I think they conveniently forgot they asked for them and now, as they are better at sussing the bad guys, they wave aside our efforts to optimize with contempt and suspicion, like a child of six going, "I'm not a little kid anymore" when you sing the alphabet song to him.)

Really, SEO is all about communicating with an algorithm. As the algorithm gets smarter, we naturally will be communicating with it with different signals than before.

I'm thinking that as long as we remain Google-search dependent, the area of greatest control we have is in determining whether we've pleased our customers.

That said, I don't think Google's even close to getting it right. But that's all I see we can do for now.

tedster




msg:4350645
 3:49 am on Aug 12, 2011 (gmt 0)

Not the "dumb" bounce rate percentage you see in most analytics program... no. Search engines (Google and Bing at least) want to see the "long click". That is, a click on the search result where the user stays engaged for a while. In other words, a click that doesn't result in an immediate return to the search results.

You can get a good sense of a healthy page by checking the bounce rate percentage, but ignoring it if the "average time on page" is good. That said, if the page is intended as the beginning of a sales funnel, then a bounce rate in the 80s or 90s could be a sign that the page isn't working well, even if people spend enough time to read every word.
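tedster's rule of thumb above - ignore a high bounce rate when engagement time is good, except on pages meant to start a funnel - can be written down as a toy heuristic. Everything here (the function, the 60-second and 80% thresholds) is invented for illustration; it is not any published Google formula.

```python
# Toy heuristic for the rule of thumb above. All thresholds are invented.

def page_health(bounce_rate, avg_time_on_page_s, is_sales_funnel_entry=False):
    """Rough verdict combining bounce rate with engagement time."""
    if is_sales_funnel_entry and bounce_rate >= 0.80:
        # A funnel entry page is supposed to lead somewhere: a bounce rate
        # in the 80s or 90s suggests it isn't working, however long people read.
        return "unhealthy"
    if avg_time_on_page_s >= 60:
        # Long engagement: ignore the raw bounce percentage.
        return "healthy"
    return "healthy" if bounce_rate < 0.50 else "unhealthy"

print(page_health(0.85, 120))                              # engaged readers who bounce
print(page_health(0.85, 120, is_sales_funnel_entry=True))  # same stats, wrong page type
```

The same numbers produce opposite verdicts depending on what the page is for, which is exactly why a single "dumb" bounce percentage cannot be read in isolation.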

SEOPTI




msg:4350675
 7:32 am on Aug 12, 2011 (gmt 0)

I really don't care what Google says, but the money helps with my cat rescue projects. Thanks to Bing and all the converting traffic.

Some people here should really try to focus on Bing, and forget about Google for at least a few months.

whatson




msg:4350677
 7:59 am on Aug 12, 2011 (gmt 0)

Whether you like it or not, Google makes or breaks the majority of websites. I don't care what these other people say about finding alternative traffic through social networking or other search engines. It's a lot of work, and it will still only amount to something in the region of 30% of what Google provides anyway. So there isn't much choice other than to play ball with Google - but you need to work out the rules of the ball game.

rowtc2




msg:4350755
 1:22 pm on Aug 12, 2011 (gmt 0)

Not the "dumb" bounce rate percentage you see in most analytics program... no. Search engines (Google and Bing at least) want to see the "long click". That is, a click on the search result where the user stays engaged for a while. In other words, a click that doesn't result in an immediate return to the search results.


I suspect this is the main ingredient of Panda.

A scenario:
A site has 5,000 pages.
Google tracks these click-backs for a month for each page, and each page gets a Panda score.
They update the Panda total for the site once a month (when Panda rolls out).
End
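The scenario above can be sketched in a few lines. This is purely hypothetical - the 30-second long-click cutoff, the function names, and the simple averaging are all invented to make the idea concrete, not a description of what Google actually computes.

```python
# Hypothetical sketch of the scenario above: score each page by its share of
# "long clicks" (visits that do NOT bounce straight back to the SERP), then
# roll the page scores up into one site-level score once a month.

LONG_CLICK_SECONDS = 30  # assumed cutoff between a quick click-back and a long click

def page_score(dwell_times):
    """Fraction of a page's visits that lasted at least LONG_CLICK_SECONDS."""
    if not dwell_times:
        return 0.0
    long_clicks = sum(1 for t in dwell_times if t >= LONG_CLICK_SECONDS)
    return long_clicks / len(dwell_times)

def site_score(visits_by_page):
    """Average the per-page scores into a single monthly site score."""
    scores = [page_score(times) for times in visits_by_page.values()]
    return sum(scores) / len(scores) if scores else 0.0

# Toy month of data for a 3-page site (dwell time in seconds per visit)
visits = {
    "/page-a": [120, 95, 4, 300],   # mostly long clicks
    "/page-b": [2, 3, 5, 40],       # mostly quick click-backs
    "/page-c": [60, 75],
}
print(round(site_score(visits), 2))  # prints 0.67
```

A monthly batch over scores like these would also explain why recoveries appear only at rollout time rather than continuously.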

How can Google tell whether a healthy page/site is useful or not?

whatson




msg:4350991
 10:00 pm on Aug 12, 2011 (gmt 0)

I suspect not at all. You are chasing a ghost.

canuckseo




msg:4350999
 10:28 pm on Aug 12, 2011 (gmt 0)

I suspect not at all. You are chasing a ghost.


I have to agree. Sure, there are many sites that use Google Analytics, so Google has access to their bounce rates, time on site, etc. But what about all the other sites that don't use Google Analytics? How does Google determine their bounce rates and time on site?

Simple: they can't with any accuracy. Therefore the whole premise that these are now ranking factors goes out the window.

rowtc2




msg:4351009
 10:57 pm on Aug 12, 2011 (gmt 0)

@whatson, I recovered one site a month ago; now I am working on others, and I will see what happens. What is your thought on the main cause of Panda?

@canuckseo, I do not think they use GA, but they can track cookies when people click on results. When a user clicks an AdSense ad, the AdWords service knows what the user does next, so I think it is possible.

whatson




msg:4351018
 12:09 am on Aug 13, 2011 (gmt 0)

Well, it's also known as the Farmer Update - targeted at content farms. That explains it all. There is clear evidence that this is a factor. I find it unlikely that, even if they did use other metrics as well, they would release them at the same time and somehow have them work with one another, when they are mutually exclusive factors.

Main cause: duplicate, copied, similar or thin content. End of story.
Any page that can be created in just a few minutes is likely an offender.

triggerfinger




msg:4351603
 2:29 pm on Aug 15, 2011 (gmt 0)

"Simple, they can't with any accuracy. Therefore the whole premise that these are now ranking factors goes out the window. " - That's not true, they can tell if a user returns to a SERP, and hence how long they stayed on the result, i.e. time on site.
Like I said before, I'm all for skepticism, but complete disregard is not healthy skepticism. Again, we have:
User metrics
Link metrics
Content Analysis

Panda is a combination of those; the question is what type of combination.
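The simplest possible "type of combination" of the three metric families listed above is a weighted sum. This sketch is pure speculation - the weights, metric names, and 0-to-1 scales are all invented - but it shows what even the most basic combination would look like.

```python
# Speculative sketch: Panda as a weighted sum of user, link, and content
# metrics. Every name and number here is invented for illustration.

WEIGHTS = {
    "user_metrics": 0.4,    # e.g. long-click rate (hypothetical)
    "link_metrics": 0.3,    # e.g. link profile quality (hypothetical)
    "content_score": 0.3,   # e.g. thin/duplicate content score (hypothetical)
}

def panda_score(metrics):
    """Combine per-family scores (each on a 0-1 scale) into one number."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

site = {"user_metrics": 0.7, "link_metrics": 0.5, "content_score": 0.2}
print(round(panda_score(site), 2))  # prints 0.49
```

Even in this toy form, a site strong on one family can still score poorly overall, which is consistent with the counterexamples people keep reporting for any single-factor theory.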

freejung




msg:4351649
 3:53 pm on Aug 15, 2011 (gmt 0)

Simple, they can't with any accuracy. Therefore the whole premise that these are now ranking factors goes out the window.

I think this is based on a misconception, which is popping up in other threads on this issue as well. There is a tendency to assume that because data is noisy, it must be useless. There is a whole branch of mathematics dedicated to extracting meaningful information from noisy data, and Google is very good at that particular branch of mathematics. If you have a lot of data and you run it through the right kind of statistical filtering algorithms, you can get accurate measurements of aggregate properties, even if in each individual case the measurements are extremely inaccurate.

When we talk about Google using user behavior metrics, we're not talking about watching one single user click through to a site and then click back to the SERPs and deciding that the site must be low quality because of that. We're talking about looking at millions of users clicking through to millions of sites and looking for patterns in that huge sea of data.

If you've ever looked at the analytics for a site with even a moderate amount of traffic you know this: individually people are quite unpredictable and their behavior seems almost random, but in the aggregate over a large numbers, their behavior is remarkably consistent and predictable.

The behavior of an individual human interacting with a system tells you more about that person than it does about the system. The behavior of a million humans interacting with the same system tells you about the system itself.
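The noisy-data point above is easy to demonstrate with a simulation. Nothing here is Google's method; it is just the statistics: each visit is an almost useless 0-or-1 signal, yet the average over many visits pins down the hidden rate tightly.

```python
import random

# Toy demonstration of extracting signal from noise: individual measurements
# are wildly noisy, but the aggregate over many users converges on the truth.
random.seed(42)

TRUE_SATISFACTION = 0.6  # hidden "true" rate we pretend a page has

def one_visit():
    """One user's binary signal: a long click (1) or a quick bounce (0)."""
    return 1 if random.random() < TRUE_SATISFACTION else 0

# Any single visit is 0 or 1 - useless on its own...
sample = one_visit()

# ...but 100,000 visits estimate the true rate to within about a tenth of a percent.
n = 100_000
estimate = sum(one_visit() for _ in range(n)) / n
print(abs(estimate - TRUE_SATISFACTION) < 0.01)
```

The standard error of the estimate shrinks with the square root of the sample size, so at Google's scale of traffic even a very noisy per-visit signal becomes a precise aggregate measurement.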

conroy




msg:4351654
 4:05 pm on Aug 15, 2011 (gmt 0)

Main cause: duplicate, copied, similar or thin content. End of story.


I'm coming to the same conclusion as whatson, and I have a lot of Panda-penalized sites to look at.

whatson




msg:4351789
 9:28 pm on Aug 15, 2011 (gmt 0)

freejung, even if you are right, then why would I have sites in the same industry with good bounce rates Panda'd, and others with worse bounce rates that are fine?

wheel




msg:4351816
 10:13 pm on Aug 15, 2011 (gmt 0)

There is a whole branch of mathematics dedicated to extracting meaningful information from noisy data,

Sure there is. Nevertheless, if I recall correctly, Google actually somehow inferred that the info was too noisy to use. Or maybe I'm not remembering.

The point of the exercise was thin content. Does bounce rate identify thin content? Too many overt ads/page layout? All the other things?

Either Google has an algorithm to directly read and detect thin content, or they are using some type of secondary signals like bounce rate etc. (or some combination). Frankly, using something that is not 'on page', rather than the direct content, would seem to be Google's standard way of detecting this stuff, a la links. But in this case I'd be very suspicious that they aren't simply looking at the on-page content and making a determination from that. It's not like Google cares about whacking a few false positives.

In any event, if it's content they're targeting, then it's content you should be looking at to fix - said content then either passes, or creates whatever secondary signals they're looking for.

Trying to figure out the secondary signals and only addressing those, well, that gives us the last few months of unsuccessful scurrying around that we've seen.

whatson




msg:4351834
 11:43 pm on Aug 15, 2011 (gmt 0)

Wheel, you are right: thin content likely does affect bounce rates. So although Google was trying to give sites with better bounce rates better rankings, they had to use factors like thin content to decide which sites had bad bounce rates, as opposed to using the actual bounce rate.

freejung




msg:4352047
 3:18 pm on Aug 16, 2011 (gmt 0)

freejung even if you are right, then why would I have sites in the same industry with good bounce rates panda'd and other ones with worse bounce rates that are fine?

Because it's not the only factor.

Let me clarify: I'm not trying to say that I know what Google is doing. I'm just trying to refute what I see as a spurious argument about what they can or can't do if they choose.

People keep saying that Google can't be using behavioral metrics such as bounce rate, time on site etc because those signals are too noisy or hard to measure or there are counterexamples or people might bounce because they found what they're looking for or because not everyone uses GA, etc. IMO none of those arguments are valid, for the reasons I mentioned above. That doesn't mean Google is using these signals. What I mean is that these arguments don't rule out the possibility.

Google actually somehow inferred that the info was too noisy to use

"When I say something, that's implying. How you take it, that's inferring." - (I believe that's a quote from Dennis Quaid in D.O.A. but I can't seem to find it in Google ;-). Sorry, that's one of my pet peeves...)

Just because they implied that at some point doesn't make it true.

it's content you should be looking at to fix

Yes, of course. The question is whether to focus exclusively on improving the quality of your text content, or to also spend time on other things that might improve user engagement. I'm inclined to spend some time on the latter regardless, as it improves my site. My pandalized site is an image site. I really don't think my users give a da** about the quality of my text content. They're there for the pics. I spent quite a bit of time in the past few weeks making significant improvements to my download process. That has no impact whatsoever on the quality of my text content, but it makes the site a lot better for my users and is therefore a good thing IMO regardless of what Google is or is not using. However, it would suck if this has no impact at all on Google's assessment of the site, because it definitely impacts the site's actual value to the user.

Anyway, my point is that this issue does matter, because improving the quality of your text content is not the only thing you can do to improve your site.

Main cause: duplicate, copied, similar or thin content. End of story.

I just find it hard to believe it's that simple. Every time someone has come out and said "it's all down to x" we find lots of counterexamples and contradicting evidence. I seriously doubt it's all down to any one particular factor. I think the most plausible explanation is that Panda is a machine learning algo which was allowed to range over a very large dataset, probably including most of the metrics we've thought of and quite a few that we haven't. That would include on-page content, layout, internal and external link structure, and user behavior metrics.

If you got pandalized, it's probably due to a combination of several of these factors, and you'll probably have to make improvements along several different dimensions to improve your ranking. I think it's a mistake to focus exclusively on one particular factor, as you might miss other major problems with your site.

In my case, I asked a lot of people questions about my site and learned that my download process pretty much sucked, so I fixed it. That was only a few days ago and I haven't seen significant impact from it, but of course that's to be expected... anyway, we'll see. I just don't think it's limited to any one particular thing.
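The "machine learning over a very large dataset" guess above can be made concrete with the smallest possible classifier: a perceptron trained on a handful of made-up feature vectors. The features, labels, and data are entirely invented; the point is only to show how an algorithm learns to weigh several signals jointly, so no single factor explains its verdicts.

```python
# Minimal perceptron sketch of "machine learning over many signals".
# Features and training data are entirely made up for illustration.

FEATURES = ["thin_content", "ad_density", "dupe_ratio", "long_click_rate"]

# Toy labeled examples: feature vector -> 1 if "low quality", 0 if fine.
DATA = [
    ([0.9, 0.8, 0.7, 0.1], 1),
    ([0.8, 0.9, 0.6, 0.2], 1),
    ([0.1, 0.2, 0.1, 0.9], 0),
    ([0.2, 0.1, 0.2, 0.8], 0),
]

def train(data, epochs=20, lr=0.1):
    """Classic perceptron update: nudge weights toward misclassified examples."""
    w = [0.0] * len(FEATURES)
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

w, b = train(DATA)
print(classify(w, b, [0.85, 0.7, 0.8, 0.15]))  # resembles the low-quality rows: prints 1
print(classify(w, b, [0.1, 0.1, 0.1, 0.9]))    # resembles the healthy rows: prints 0
```

A trained model like this has no single threshold you can reverse-engineer from outside, which would explain why every "it's all down to x" theory in this thread keeps running into counterexamples.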

SEOPTI




msg:4354095
 3:06 am on Aug 22, 2011 (gmt 0)

A few days ago my third long tail site completely recovered and no, it was not on a Panda date. Same as before. Time to save some more cats with the fresh money coming in.

SEOPTI




msg:4356791
 6:34 pm on Aug 30, 2011 (gmt 0)

I'm still sure there is at least one Panda calculation which works in real time, since the recoveries have been partially reverted today and a few days ago.

freejung




msg:4356818
 8:00 pm on Aug 30, 2011 (gmt 0)

Any idea why?

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
Home ¦ Free Tools ¦ Terms of Service ¦ Privacy Policy ¦ Report Problem ¦ About ¦ Library ¦ Newsletter
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved