I appreciate that we all have conjectures, opinions and philosophies about what Google might be up to. However, that kind of discussion will drive this thread off-topic and undermine its present and future value.
Let's keep the focus on what we see happening in the SERPs, please.
Something happened yesterday to one of my already-Pandalized sites that had been showing a recovery: lots of terms that were floating around pages 4-5 have fallen off the face of the earth, a lot of them now down in the 300s/400s.
gyppo - I assume we're talking long tail Y/N?
What's that in terms of post-Panda recovery rise and subsequent % traffic fall?
Do you have a hunch about what might be causing it, amid the usual reported algo turbulence?
Not just long tail: short-tail queries as well as long tail.
90% Panda fall since Feb, 20% recovery in the last update. We've taken the site from 150k indexed pages to just 450 in the last month (to try & shake Panda once & for all); however, the terms I monitor cover just those 450 pages (everything else is pretty much deindexed now). I'd also say the quality of our site has increased drastically compared to the sites competing in the same SERPs.
Could just be normal Google flux & the terms could bounce back in a few days; I've seen it happen before. But I posted just in case anyone else was seeing anything similar.
... so is your traffic drop related to blocked content, and to the effect on internal linking from that deindexing?
Looks like you may need to wait 1 or 2 iterations to get it right.
i've finally managed to get a bit of traffic back that i lost last April. the thing that helped me is a bit obvious, but it passed me by for six months.
i just stuck a load more keywords onto the page (and i mean 'loads') -- more header titles, bold text, image filenames -- real old-school stuff like that. the kind of stuff that i thought went out years ago.
my pages have never been very heavy in that kind of stuff, just normal amounts, and they always managed to rank okay. but when panda hit in april i took a dive.
i tried loads of stuff to get my traffic back (speeding up page load times... moving to dedicated hosting... reducing bounce rate...) and i got bits and pieces back, but not all of it.
after looking carefully at all my competitors this month, i noticed that most of them are spamming keywords on the page. so i stuck a load more on mine, thinking that i've tried everything else, so i may as well give it a go, and lo and behold it worked!
i wouldn't go so far as to say that i'm spamming keywords now, but i've definitely beefed it up a lot. it's easily the one thing that's helped me the most.
i'm not sure that's a very good advert for panda... but i'm not complaining.
@londrum, what keyword density percentage did you try? Is it over 5% for your competitive keywords?
What I've tried over the past year or so is adjusting keyword density on some pages: when I increased it, traffic and page positioning initially improved, but then dropped. So I can't tell whether it's the content-freshness factor or the keyword density change itself.
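For what it's worth, when people quote a figure like "5% density" they usually just mean occurrences of the phrase divided by total words on the page. A minimal sketch of that calculation (the function name is my own, not from any tool mentioned here):

```python
import re

def keyword_density(text, phrase):
    """Rough keyword density: phrase occurrences / total words, as a percent."""
    words = re.findall(r"[a-z0-9'-]+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the full phrase appears as consecutive words
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    # Each occurrence contributes n words toward the total word count
    return 100.0 * hits * n / len(words)
```

By this measure, "over 5%" means the phrase's words make up more than 1 in 20 words on the page, which is already quite heavy for a competitive term.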
I also see that if the keywords in the title and meta tags are relevant but broad compared to the main content, and the latter is too specific, you may get the -50 or -100 penalty. For example, a product listing on a category page:
title and tags about the category page: general widgets
main content listing:
specific blue widget brand-X item
specific red widget brand-Y item
The page may be considered to have very thin content and little relevancy. At least under normal circumstances it doesn't seem to affect other pages or the entire domain, from what I see.
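One rough way to sanity-check that kind of title/content mismatch (purely my own heuristic, not anything Google has published) is to measure how much of the title vocabulary actually appears in the body:

```python
def title_body_overlap(title, body):
    """Fraction of distinct (non-stopword) title words that also appear in the body."""
    stop = {"the", "a", "an", "and", "or", "of", "in", "for"}
    title_words = {w for w in title.lower().split() if w not in stop}
    body_words = set(body.lower().split())
    if not title_words:
        return 0.0
    return len(title_words & body_words) / len(title_words)
```

In the category-page example above, "general widgets" against a body full of "blue widget brand-X item" scores zero, since neither "general" nor the plural "widgets" appears in the listing text: exactly the broad-title/specific-content gap being described.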
@londrum, funny, because I was just testing that theory... but in the opposite direction: taking out ANY signs of traditional SEO keyword positioning, i.e. getting rid of bolded text, getting rid of obnoxious, unnecessary repetition of the key term, etc. And while I'm still testing, it appears that this approach is doing better than the alternative.
I guess it's another sign of how complex a problem Panda is...
Could it be that what you are doing is increasing the uniqueness of the page relative to others on your domain?
On my commerce site, I had five sections covering one brand (e.g. brand widgets, brand wodgets, brand wotsits, brand gizmos, etc). Each subsection was entirely plausible, covering a different sub-type. They all sank in the April Panda (UK site).
I have been gradually deleting the sections, and with each delete comes a rise in the rank of the overall/umbrella section, which is now back in the top 5 places.
This would fit with Google's aim to kill content farms, which tend to have many overlapping articles.
You shouldn't just test for uniqueness against the wider web, but also within your own site.
BTW, having unique content for each is not enough. I tried that. You just cannot get away with having many pages on same topic, even if technically unique.
who knows what the actual thinking behind it is. all i know is i added a load more keywords and my rankings went up.
londrum, did you add more occurrences of the exact same keyword, more variations of the important keyword, or more and different keyword vocabulary?