I think the non-English Google Panda found my website, but I'm not sure why. Yes, I might have some pages that could be considered thin, but the majority of the content on the website is unique, quality content.
I do have two TLDs pointing to the same content. Could this be the problem? In the past it wasn't a problem to have a .com site and a .co.uk pointing to the same content; could it be a problem now?
|One of the quality signals I definitely thought Panda would take into account is spelling and grammar, based upon my past training in tests and measurements. Amazingly, after two months of ongoing study, it seemingly has little impact. In fact, upon close inspection I saw sites in the top results with grammar so horrific it's jaw-dropping. The first few sentences of a page could fool anybody, but the rest could be total mush. My conclusion was that what I thought would be major factors in discerning quality were not. In fact, the prevalence of so many faulty pages practically eliminated the idea that the "bottom of the barrel" pages had been removed. In other words, Panda seems very forgiving in areas related to grammar and spelling. The only thing that did emerge, which I wasn't testing for, was that some page templates seemed almost immune to Panda. In that case it could have been a high number of similar templates catching my eye. |
You may be interested to know that Matt Cutts recently said in a video that Google has been doing testing on grammar and spelling. Early test results, he said, indicated that the sites with the better grammar and spelling were the better sites. So even though Google's post-Panda guidelines mentioned grammar and spelling, and the first versions of Panda evidently didn't consider them, the day may be coming when they become important for ranking.
|Shatner said: |
How do you beat Panda? Do nothing and wait it out.
I agree with this approach.
I treat Panda like a virus that can't be removed or beaten.
Live and let live.
I do nothing with 'Panda in mind' but do many things in favor of the visitors.
|azn romeo 4u|
@zivush I agree. I've been concentrating on my visitors, and I've been able to double my visits since Panda started last year.
Also, about the duplicate ratio: according to that site I have 100% duplicate content within my own pages =( but I seem to be unaffected by the dup penalty. I know my content is unique, though.
Maybe the Panda effect is never going away. In simple terms, Google has more and more quality sites, so it can divide the available traffic into smaller, less manageable slices. If you want the cream, you're going to have to pay for it, end of story. Time to adapt or die, as they say; I'm not holding my breath waiting for recovery.
|I do nothing with 'Panda in mind' but do many things in favor of the visitors. |
That's the answer.
And look at sites that are performing well, and focus on commentary that is positive. Forget about the imperfections and complaints involving poor-quality SERPs, as Panda isn't mature yet.
If folks keep convincing themselves that the end is in sight, it will surely come quickly. Instead, focus on the great opportunities Panda is forcing on you to rethink the user experience, and respond to them. And do this with visitors in mind - Google will follow, instead of folks chasing the algo around in ever-decreasing circles.
Without doubt, some are finding this a very costly and difficult experience to navigate - it may be taking a lot of time, financial investment and experimentation without a full understanding of the scope of recovery, with all the social consequences that flow from that - so I don't want to downplay it.
But concentrate on the positive posts, and hopefully you will see better days and have a stronger site for the experience.
PG, appreciate the update.
There was a session at Pubcon yesterday where a fellow from HubPages talked. They got hammered in Panda, but are now back to pre-Panda levels. Because they had 50 million pageviews a month and measured everything, they had the data to check things.
Anyway, he had a very specific checklist of things to do that would aid recovery. I didn't follow along as quickly as I should've, but one thing he said to do was split good content from bad so the bad doesn't affect the good. That's why the subdomain thing helped them - not because of subdomains but because the bad authors didn't kill the good authors.
There were other specific tips, sorry, I didn't keep notes. You should've been there :).
|Anyway, he had a very specific checklist of things to do that would aid recovery. I didn't follow along as quickly as I should've, but one thing he said to do was split good content from bad so the bad doesn't affect the good. That's why the subdomain thing helped them - not because of subdomains but because the bad authors didn't kill the good authors. |
The algo should be smart enough to do this; this is one of the main reasons sites are getting hammered so badly. The algo runs at a domain/subdomain level, not at a page level.
It makes sense to me. If you have a page with:
left navigation that appears on each page.
top navigation that appears on each page.
a thin bit of content, say a hotel with a 2-line description and all data coming from a feed that everyone else uses.
3 AdSense blocks...
Then Google is going to go, "Made-for-AdSense site - not useful for our viewers - Google Panda penalty."
But if you expand that description to 6 paragraphs and add some nifty stuff like the distance from the hotel to the airport, theme parks, restaurants and shopping?
Then it's a useful page, and having AdSense on it is fine.
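To make the thin-page idea concrete, here is a toy sketch of the heuristic described above. This is purely my own illustration, not Google's actual algorithm - the signal names and thresholds are all invented, and the real Panda classifier is unknown:

```python
# Toy sketch of the "thin page + heavy ads" heuristic from the discussion above.
# NOT Google's algorithm: every signal and threshold here is invented for illustration.

def looks_thin(description_paragraphs: int,
               content_is_syndicated_feed: bool,
               ad_blocks: int,
               has_unique_extras: bool) -> bool:
    """Flag a page that is mostly boilerplate plus ads.

    A 2-line feed description with 3 ad blocks trips the check;
    6 original paragraphs plus unique extras (distances to the
    airport, nearby restaurants, etc.) do not.
    """
    thin_copy = description_paragraphs < 3 and content_is_syndicated_feed
    ad_heavy = ad_blocks >= 3
    return thin_copy and ad_heavy and not has_unique_extras

# The hotel page from the example: short feed description, 3 ad blocks.
print(looks_thin(1, True, 3, False))   # True  -> would look like made-for-AdSense
# The improved page: 6 paragraphs plus unique extras.
print(looks_thin(6, False, 3, True))   # False -> ads on a useful page are fine
```

The point of the sketch is only that the same ad layout reads completely differently depending on whether the surrounding content is unique and substantial.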
I don't think the AdSense and Panda teams are at odds at all. We are told "don't make sites for AdSense" by the AdSense team. We are also told this by the webspam team.
If you are putting 3 blocks of AdSense on a thin-content page, then you are making sites for AdSense and deserve a Panda penalty.