When asked if an iteration of Panda was implemented this week, a Google spokesperson told us, “yes.” She also provided the following statement:
“We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the roughly 500 changes we make to our ranking algorithms each year.”
If you’ve followed the Google Panda update saga throughout the year, you may recall Dani Horowitz’s story. She runs an IT discussion community called DaniWeb, which was hit hard by the Panda update, but she made a lot of changes and gradually started to build back some Google cred.
People are in denial. Google can and will use real user data in the algo. Don't fight it; it is happening and will keep happening. It has been happening for 5+ years at least, and everyone knows it. Bing uses it extensively as well, but per keyword/page, which is the correct use. If Bing sends you a visitor for a keyword and they return right away, Bing will note that your page might not be the best match for that keyword. On the other hand, it will keep sending you referrals if the users are 'happy.' I get dozens of referrals for two of my pages for an extremely popular keyword from Bing.
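The heuristic described above (a quick return to the results page counts against a keyword/page pair, while a long visit counts for it) can be sketched in a few lines. This is purely an illustrative assumption of how such a signal *could* be tallied; the class name, the 10-second threshold, and the scoring are hypothetical, not Bing's or Google's actual implementation.

```python
from collections import defaultdict

QUICK_RETURN_SECONDS = 10  # assumed threshold for "returned right away"

class ClickSatisfactionTracker:
    """Hypothetical tally of search-click satisfaction per (keyword, page)."""

    def __init__(self):
        # (keyword, page) -> [satisfied_clicks, quick_returns]
        self.stats = defaultdict(lambda: [0, 0])

    def record_click(self, keyword, page, dwell_seconds):
        """Record one search referral and how long the visitor stayed."""
        if dwell_seconds < QUICK_RETURN_SECONDS:
            self.stats[(keyword, page)][1] += 1  # pogo-stick: likely a bad match
        else:
            self.stats[(keyword, page)][0] += 1  # visitor seemed happy

    def satisfaction(self, keyword, page):
        """Fraction of clicks that did not bounce straight back to the SERP."""
        good, bad = self.stats[(keyword, page)]
        total = good + bad
        return good / total if total else None

tracker = ClickSatisfactionTracker()
tracker.record_click("widget repair", "example.com/fix", dwell_seconds=120)
tracker.record_click("widget repair", "example.com/fix", dwell_seconds=3)
print(tracker.satisfaction("widget repair", "example.com/fix"))  # 0.5
```

Under this sketch, a page with a high satisfaction ratio keeps earning referrals for that keyword, while a page that searchers keep bouncing off would be demoted for it, which matches the behavior described in the post.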
Has Panda been applied to all languages?

Other than Chinese and Korean, I think.
at Google the left hand has no idea what the right hand does.
I don't even care about their traffic, and I suggest everyone do the same; tracking every move that Google makes is a waste of time.
It's hard not to care about Google when you lose 100k a year. I'm guessing you're not in that boat, but many of us are. Doing what Google likes definitely hasn't been a waste of time for me for the last 6 years, although at the moment anything I do seems futile. I've been in this boat for 7+ months. The trend is even more worrying as independent webmasters are being squeezed out little by little, and that's by design IMO. The only safe bet is to design a site that Google cannot penalize, Panda or no Panda, like Cultofmac, Daniweb, Android Police and the like. Otherwise the commercial side will be full of huge brands that paid their way there, either by bribing people for links/G+ or by paying to build the 'trust' Google loves so much.
I'm not in the commercial space; I'm an info site just like the examples you gave. I'm still hit, hard. Panda and their massive brand bias even resurrected ZDNet, CNET, PCMag and other once-left-for-dead tech brands. They are on page one for pretty much any tech-related search, so add them plus the sites manually exempted from the Panda rules, and your drop may be explained. Level playing field and all.
I guess the conclusion is that we've been trying too hard. Ditch the databases, Ajax, scripts and CMS, and return to simple, very quick-loading, hand-coded websites, even if the information is horribly inadequate and out of date.
(MikeNoLastName) - Therefore they can only evaluate what they CAN see and that is: whether a person RETURNS to G and SELECTS _+ANOTHER+_ SEARCH RESULT USING THE SAME SEARCH TERMS!
I actually lost some of the small boost I finally got on Sept 28th. It was Google's let's-get-DaniWeb-off-the-news update, so it was for a good cause. Naturally, as a good netizen, I gladly accepted it. Today it looks like I lost a bit more, but I don't know why, since I don't follow the Twitter messages exchanged among the cool crowd.
Speculation says Google didn't like their own update: [blog.searchmetrics.com...]
Has anyone hit by the latest iteration of Panda seen recovery lately?
Quite often old wine is better than new wine
ETA: Sorry I can't delete this. More research revealed that the bulk of the search spike came from traffic going to one particular page. One of the keyphrases had suddenly doubled in traffic (and I was already #1 for that phrase), so it must just be that suddenly a lot more people than usual are searching for that phrase. Must've been in the news or something.