| 1:56 am on Oct 9, 2012 (gmt 0)|
Can you tell me if you increased or decreased the keyword density and what type of anchor text does the links that you got use?
| 1:57 am on Oct 9, 2012 (gmt 0)|
@zeus - I don't have big hopes with regards to the metrics.
Firstly, Google isn't playing a straight hand by revealing what they are relying on. We only have half the story from them, and we don't know where the metrics come from or how they interpret them and weight each of them. In any case, crafting your engineering to match theirs seems impossible, as good metrics are in the eye of the beholder and will differ from site to site. Google may have planted some very narrow variables for all we know. Also, I've not heard of anyone cutting pages to improve metrics sufficiently to break Panda entirely. Someone may correct me if I'm wrong.
Just do what's good for you and your users - that's all you can do.
Secondly, I would bet strongly on quality elements in your design, workflow/navigation, and improved natural and engaging content to assist. This may matter more in some popular verticals and types of website. Again, just do your best and observe benchmarks that are not brands - brands are getting away with murder, or, to put it politely, brand is a stronger signal than any other impropriety in the absence of Google's other factors settling down :)
They probably need a couple of years to get this right, plus other reasons we do not know in terms of overall strategy and technicalities. Who knows.
By which time I doubt any SERP slots will remain for certain verticals. So surviving SEO-centric sites are kinda being forced to reinvent how they market to, and serve, users anyway.
Good luck, and well done for sticking it out through your ordeal. I hope we have a happy story soon - you deserve a good ROI for your effort.
| 2:32 am on Oct 9, 2012 (gmt 0)|
|Firstly Google isn't... revealing what they are relying on. We only have half the story from them and we don't know where the metrics come from or how they interpret them and weight each of them. |
From the beginnings of SEO back before Google, that was always the webmaster's challenge. No search engine ever spilled their algorithm, nor could they afford to.
In my memory, Google tells us more than any other search engine ever did. And that's a big part of our frustration - it's so tantalizing and enticing to know a lot "about" the complex logic involved, but it's immensely frustrating not to know the exact details.
| 4:16 am on Oct 9, 2012 (gmt 0)|
This is what I am seeing with this latest Panda update. It might just be coincidence, but I am noticing the same trend across many different, similar niches. Websites that scrape content and give credit to the source are doing very well. A lot of the sites that disappeared are pretty much identical to sites that were not harmed; the only difference is that the ones that did not get hit all mention a source for their content.
Anyone else seeing this?
| 8:51 am on Oct 9, 2012 (gmt 0)|
gouri - decreased keyword density - anchor text is without any keyword
Whitey - that's why I don't have big hopes, because we don't know anything about Panda. If I did what is good for my users, I would get my old site back, but
these days we don't build for users anymore, just for Google. Thanks - I hope something will work, because I only have 2 months left to survive this; then I have to give up.
| 10:06 pm on Oct 9, 2012 (gmt 0)|
With Panda you may need a bit more than 2 months, as refinements, re-evaluations and data refreshes take time to kick in. Stay with it, and temper your expectations if you can.
| 10:17 pm on Oct 9, 2012 (gmt 0)|
I would say the metrics have been the way they are now for 2-3 months.
| 12:41 am on Oct 10, 2012 (gmt 0)|
To my beloved Google, I'd respect you a bit more if you'd just go ahead and say that Google is the only entity permitted to earn money via the internet.
Everyone else if you want traffic just buy Adwords Ads.
Nothing in the results makes sense.
| 4:01 am on Oct 10, 2012 (gmt 0)|
Creamed. Thirteen years in, and now only a trickle of G traffic on a ten-year-old site that was 75% Google search outside of inbound links. Yahoo and Bing traffic is providing 24% of my original traffic. Bailed out of G+ today, I'm taking videos off YouTube and hosting them myself, and I switched my browser search to Bing, which has much better maps than Google. Left Picasa, and my Gmail account is closing. Not going to keep supporting this craziness. I provide websites; they bring them up in search, if they feel like it, then serve a complete screenshot of the page so people don't have to visit the site. We give them content to make money, and we get a kick in the backside. Fed up.
| 6:40 am on Oct 11, 2012 (gmt 0)|
I'm not sure if it was the EMD update or something else, but the network of local websites I'm running about an area of a country are totally wiped out from the SERPs.
Their ages range between 1-3 years, and considering that several of them are small places, they are the only sources of information you would find anywhere on the web. Every single one of them has pictures and videos shot on the spot, several have maps based on local observations etc.
What does mighty G serve instead? Brand sites with utterly useless coverage. Sometimes even whois sites showing my domain name for the location, as if that were content. And sometimes a website that took the liberty of stealing my photos and putting them on its own pages...
Shame on you G, you are such a DISSERVICE.
| 6:42 am on Oct 11, 2012 (gmt 0)|
Addendum: the domain names are the exact, pure geographic names of the towns.
| 11:06 am on Oct 11, 2012 (gmt 0)|
@radix - your issue sounds like EMD & Penguin rather than a Panda issue.
This thread really discusses this in detail
But I think you will find that because your sites are small and highly focussed they will attract very tightly focussed anchor text for the backlinks, which will trigger over-optimisation demotions.
Check the last page or two of the SERPs for your keywords - go to the second page of the SERPs, then set the &start=nn parameter to &start=950 or &start=990 to get to the last page.
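The &start=nn trick above can be scripted if you have many keywords to check. A minimal sketch, assuming Google's result paging still accepts plain `q` and `start` query parameters as it did at the time (the function name is mine, not from the thread):

```python
# Sketch: build Google SERP URLs that jump to a given result offset.
# The q/start query-string format is an assumption based on how
# Google's paging worked in 2012; adjust if the format changes.
from urllib.parse import urlencode

def serp_url(query, start=0):
    """Return a Google results URL beginning at result number `start`."""
    return "https://www.google.com/search?" + urlencode({"q": query, "start": start})

# Jump straight toward the last possible page (results 950+):
print(serp_url("example widgets", start=950))
# → https://www.google.com/search?q=example+widgets&start=950
```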
| 1:35 pm on Oct 18, 2012 (gmt 0)|
Hmm, the next Panda update should be on the way; my metrics have got a little better since my last post here.
2 min 05s average visit duration
40% visits have bounced (left the website after one page)
4.1 actions (page views, downloads and outlinks) per visit.
Remember, before, my bounce rate was in the 80s and time on site was under 1 min.
I did make a big change to get to 40%: I placed a script on my site so other sites cannot frame my pages/images, as Google and a lot of others do.
| 9:17 pm on Oct 18, 2012 (gmt 0)|
That is an interesting observation zeus - effectively you cut your bounce rate by 50% by using a deframing script for pages and images.
I can see how the page script would work - not too difficult to do that one - but how did you manage to get a script that deframes images?
| 9:41 pm on Oct 18, 2012 (gmt 0)|
IanTurner - I think I was at an 82% bounce rate in 2011, when I also got hit by Panda, but don't underestimate the site - people loved it because they got what they came for at once. I then changed the site totally and had a bounce rate of 49%; now, with the script installed, it's a 39-40% bounce rate.
[edited by: Robert_Charlton at 6:02 am (utc) on Oct 22, 2012]
| 10:56 am on Oct 19, 2012 (gmt 0)|
"The bombs keep falling from the sky. I know the intent of Panda. If I suck (and I have before) then I get drops after updates."
I suggest you don't panic, and start looking for the main inconsistencies according to the newest Panda update.
You should start with a unique-content check...
| 4:35 pm on Oct 19, 2012 (gmt 0)|
What about WebmasterWorld - has it suffered at all? You guys are a "webmaster forum", right? So I see a bunch of crap ranking higher than you, including an EMD that is parked and still listed above you.
| 6:11 am on Oct 22, 2012 (gmt 0)|
Mod's note: Please no public discussion regarding getting copies of Zeus's script. That kind of communication belongs in a stickymail, where it won't take the thread off topic.
| 9:19 pm on Oct 25, 2012 (gmt 0)|
Google, come on, make a Panda update - my site can take 2 months under the Panda virus. If my site doesn't recover, then I'm out of business, so it's pretty exciting.
| 10:04 pm on Oct 25, 2012 (gmt 0)|
It's probably not the intention Google has with all their animal poo-poo, but for everything I have tried to do I have got less and less traffic AND, as a result, I have also got LESS traffic from Bing and Yahoo. So, I wonder, has anybody else seen the same, or do you just sit around waiting to recover?
I haven't done anything that would trigger new penalties, but what do I know anymore. I have tried to remove anything that *I* thought might be viewed as spammy, removed some keywords here and there, asked them to remove links to a blog that was never used and also not to crawl the same blog at all..... all to no avail :( In fact, to the contrary.
| 10:41 pm on Oct 25, 2012 (gmt 0)|
gehrlekrona, could it be that all of your actions, taken together, send a signal to Google that you are trying to game them?
| 11:04 pm on Oct 25, 2012 (gmt 0)|
@atomic, I don't know anymore, but it could be. Most of the things I have done are what they recommend in WMT (duplicate title tags and such). I have also removed pages that are not very content rich. I think (?!?) one of my problems is that the pages are generated and some of them end up as duplicates, and I have to remove as many of them as possible. Not sure how you remove them from GOOG.
| 12:47 am on Oct 26, 2012 (gmt 0)|
|Not sure how you remove them from GOOG. |
I am working with a customer with horrible duplication issues right now. To eliminate duplication I am starting with:
1. A function that determines which pages are duplicates and adding <META NAME="ROBOTS" CONTENT="NOINDEX"> to the header.
2. WMT URL removal tool
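Step 1 above could be sketched like this, assuming pages are available as HTML strings keyed by URL. The hashing approach, normalisation step, and helper names are my assumptions, not the poster's actual function:

```python
# Sketch: flag duplicate pages by hashing their visible text, then
# tag every copy after the first for noindex. The tag-stripping and
# whitespace normalisation are assumptions about what "duplicate"
# means here; real HTML parsing would be more robust than regex.
import hashlib
import re

NOINDEX_TAG = '<META NAME="ROBOTS" CONTENT="NOINDEX">'

def content_key(html):
    """Hash the page text with markup and whitespace stripped."""
    text = re.sub(r"<[^>]+>", " ", html)            # drop tags (crude)
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode()).hexdigest()

def find_duplicates(pages):
    """pages: {url: html}. Return URLs whose text duplicates an earlier page."""
    seen = {}
    dupes = []
    for url, html in pages.items():
        key = content_key(html)
        if key in seen:
            dupes.append(url)
        else:
            seen[key] = url
    return dupes

def add_noindex(html):
    """Insert the noindex meta tag just inside the <head> section."""
    return html.replace("<head>", "<head>" + NOINDEX_TAG, 1)
```

Usage: run `find_duplicates` over the site's pages, then apply `add_noindex` to each flagged URL's template; the first copy of each page keeps its normal indexing.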
| 10:29 am on Oct 26, 2012 (gmt 0)|
In my experience, WMT URL removal tool doesn't help. It makes it so that the articles are not shown in the SERPs, but the SERPs still include them in their overall numbers. I'm talking specifically about a section of the website which actually was removed and does not exist anymore.
After Google finally actually dropped the pages (the number went down), I started seeing some improvement. It might all just be coincidence, however, since I made other changes to the site at the same time. But, whenever I see the number go down a certain amount, I also find the website improving in rank.
| 10:55 am on Oct 26, 2012 (gmt 0)|
Personally I don't believe internal duplication (multiple urls for the same content) is a Panda issue. Certainly good to tidy up from an SEO perspective, but my site has been squeaky clean on that front for over a year and it hasn't brought about any improvement.
I believe Panda is about things that people can actually see, things that affect how they use your site. It's those sorts of changes that have produced the results for me.
| 11:57 am on Oct 26, 2012 (gmt 0)|
| 12:13 pm on Oct 26, 2012 (gmt 0)|
@zeus Google Chrome sends back all kinds of data.
| 1:05 pm on Oct 26, 2012 (gmt 0)|
I think Google is only interested in the metrics produced by visitors THEY send to your site.
I would say a new site starts life Panda-free. Google starts sending traffic to deserving pages and assessing the quality based on how people respond. Then a plus or minus Panda rating can be assigned and traffic/rankings adjusted accordingly.
| 1:23 pm on Oct 26, 2012 (gmt 0)|
Hmm, I also think they use visits from Google. But take my site: today, not one hit from Google. What would happen if an old Panda-hit site doesn't get any visits? How would they judge whether the site is good or not then? Good question, right!
| 1:39 pm on Oct 26, 2012 (gmt 0)|
|Google Chrome sends back all kinds of data. |
G says no..
Wireshark says G are telling porky pies..;)..
Once you accept that G are being disingenuous about what Chrome is doing..and you install it on a machine that is set up to run only it..( and Wireshark )you learn many things about what and how G make decisions about pages and how they know what searchers do when they click through from SERPs or even when they click from one site to another when using Chrome..and especially what they do on or in pages..
And how they react to SERPs, carousel etc..
Beware of Geeks bearing gifts..be they Chrome or Analytics, or anything..
Of course you may not use Chrome..but a sizable number of all of our visitors do..and G is watching them ..not guessing their "intent"..but actually watching them, in real time..and crunching what they see..
| 2:53 pm on Oct 26, 2012 (gmt 0)|
Zeus, I think that the Panda rating of existing sites would be reset once all their Google traffic has totally gone for a long enough period, say 1-2 months, and the site would become eligible for traffic again, like a new site (assuming they were only affected by Panda). Like a kind of extreme version of the noindex solution many have claimed helped their sites recover partially while they improved the quality of those sections of their sites.
However, if the site quality hasn't improved when traffic returns then the site will just be demoted again.
Regarding where the metrics come from, I still haven't seen a quote from Google that says Chrome data isn't used for Panda (I've seen a quote that says data from Chrome is not used in the main algo, but that's not the same as saying it's not used for Panda).
It's definitely not Google Analytics though. The data in there is too inaccurate (even Google have said GA data should be used as a guide, not relied on 100%). Chrome data though - that would be very accurate.