When Panda came along, I got dropped to anywhere from #50 to #110 for these phrases. The retail store's site is still in the same place.
Actually, I've done more work on writing original content for those widgets on my site than I did for the retail store's site.
4. The retail store's site is clearly designed for the store. My site is a mix of information, ecommerce, and advertising for other retail stores in the same niche.
How's that for an example to ponder?
After Panda it seems that most of this site's Google traffic is for the "core" subjects (the ones you would, at first thought, expect to find on this site). It's like Google has decided what topics I have a right to be an authority on.
There will be no high rankings given on the happenstance of some text matching, which is what we used to see and also learned to leverage.
And that's a sharp observation too. It's looking more & more to me that Panda/Google decides what each site is about; determines where in a ranking hierarchy that site belongs (vis-a-vis other contenders in the same niche); then in a complicated query match-to-SERP analysis, assigns the position, with only slight variation up or down. Those variations are impacted by local availability vs online only; surfing history profile; what Google "thinks" we want (as opposed to what we request), etc.
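The pipeline speculated above (classify the site's topic, fix its rank in a niche hierarchy, then nudge the slot per query and user) can be sketched roughly like this. To be clear, this is the poster's theory, not Google's documented algorithm; every function name, site name, and number below is invented for illustration.

```python
# Hypothetical sketch of the speculated Panda ranking pipeline.
# All names and values are made up -- nothing here is a known Google mechanism.

def classify_topic(site):
    # Step 1: the engine decides what the site is "about".
    return site["declared_topic"]

def personalization_offset(user_profile):
    # Step 3b: surfing history / local-availability signals nudge the
    # slot slightly up or down, never far from the baseline.
    return -1 if user_profile.get("prefers_local") else 0

def serp_position(site, niche_hierarchy, user_profile):
    topic = classify_topic(site)
    # Step 2: the site holds a fixed rank among contenders in its niche.
    base = niche_hierarchy[topic].index(site["name"]) + 1
    # Step 3: query match assigns the position, with only slight variation.
    return max(1, base + personalization_offset(user_profile))

niche = {"widgets": ["bigstore.example", "mysite.example", "scraper.example"]}
site = {"name": "mysite.example", "declared_topic": "widgets"}
print(serp_position(site, niche, {}))                       # baseline slot: 2
print(serp_position(site, niche, {"prefers_local": True}))  # nudged up: 1
```

The point of the sketch is that position is mostly fixed by the hierarchy step; the per-query adjustment only moves it a little, which matches the "slight variation up or down" described above.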
It's a hodge-podge of forums, dealer directory, classifieds, and other things.
These suggest user engagement with the site, which is something that Google likes. Again, no single factor wins, but some combinations may outweigh other combinations. Here, user-engagement (to the degree it's happening), may be more important to Google than organization.
In the case of this particular site, there's literally just a handful of posts in the forum (meaning almost no user engagement)...
In my niche there are so many anomalies that I'm now wondering if only a portion of the web is being rated by Panda; that could explain why we see so many scrapers outranking "real" sites.
What I'm seeing ranking fine:
--a site with 90% (at least) thin content (image and one or two sentences per page). Site has thousands of pages.
--sites with 10+ ads per page
--sites with only the title above the fold (the rest are ads and theme elements)
--scrapers (no original content)
My main, biggest site was hit by Panda 2.0, my other smaller sites (most super thin, even thin affiliate sites) have either risen or remained flat throughout the pandas. This doesn't make sense (my main site has the primo backlinks, completely organic, social activity, yada yada).
What I'm wondering is whether, once a site reaches a certain threshold, Panda is applied, and otherwise the site stays under the radar. I don't think site size has anything to do with it; it's more about traffic volume. Something along the lines of:
--Once Google sends a specific amount of traffic to a site, Panda evaluation occurs.
--Or, if more than x% of a site's overall traffic comes from Google, Panda evaluation occurs.
--Or, if a certain percentage of a site's content is ranking for high-volume keywords, Panda evaluation occurs.
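The three speculated triggers can be sketched as a simple check, where a site is evaluated as soon as any one condition fires. This is purely an illustration of the hypothesis; the threshold constants are invented and are not known Google values.

```python
# Sketch of the speculated "threshold trigger" -- each condition mirrors
# one bullet above. All numbers are made up for illustration.

GOOGLE_VISITS_THRESHOLD = 10_000    # bullet 1: absolute Google-referred traffic
GOOGLE_SHARE_THRESHOLD = 0.60       # bullet 2: share of total traffic from Google
HIGH_VOLUME_SHARE_THRESHOLD = 0.25  # bullet 3: share of pages on big keywords

def panda_evaluated(google_visits, total_visits, high_volume_pages, total_pages):
    """Return True if any speculated trigger fires; otherwise the site
    stays 'under the radar' and is never scored by Panda."""
    if google_visits >= GOOGLE_VISITS_THRESHOLD:
        return True
    if total_visits and google_visits / total_visits >= GOOGLE_SHARE_THRESHOLD:
        return True
    if total_pages and high_volume_pages / total_pages >= HIGH_VOLUME_SHARE_THRESHOLD:
        return True
    return False

print(panda_evaluated(500, 5000, 1, 100))     # small quiet site: False
print(panda_evaluated(12000, 15000, 1, 100))  # heavy Google traffic: True
```

Under this model, thin sites that never cross any threshold would keep ranking untouched, which would fit the anomalies described above.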
What's the difference between our two sites?
Your theory may enter into the equation, on the basis that the hammer hits the nails sticking up the highest. But just to be clear, have you analyzed some of the other differences? You say the content is similar, but presumably you are using different hosting services? Dedicated or shared? What about page download speeds? Backlink profiles? And so on. We know there are hundreds of factors, so there's more to the picture than just content. Even if content is the primary one, all the other stuff adds up, so it will inevitably make a difference in the outcome.