Over what time period was each of the parts of your experiment performed?
I never understood why they included that % high PR thing. My sites always have a handful of pages with high PR, a larger segment with medium PR, and 95% of the site with low PR.
I interpret your results in a completely different way.
Assume for a moment that every site has its traffic throttled to some degree. (I am personally convinced of this, but I know others will dispute it.)
What it sounds like to me is that your blocked content is being "virtually" placed in SERPs, consuming your exposure allocation for the day. However, since it's not actually shown, that exposure allocation does not convert into traffic.
To me, this is strongly suggestive that Google throttles exposure, not traffic.
I've always thought that traffic was throttled, and exposure was the tool they used to do that. If exposure is the metric and traffic is incidental, that would be a fascinating result.
Since you are mentioning GWT and Panda, I have noticed that although traffic has greatly diminished, GWT still shows almost the same average ranking positions for most of the traffic-pulling pages.
Is anyone else experiencing this?
|I've always thought that traffic was throttled, and exposure was the tool they used to do that. If exposure is the metric and traffic is incidental, that would be a fascinating result. |
From the point of view of monitoring user behaviour, it would make more sense to throttle the exposure rather than the traffic.
If they throttle traffic, they will have to do it by throttling exposure. How else can they throttle traffic?
I always thought (and I see a few examples too) that they do it by ranking the page differently in different parts of the world.
One of my pages from a pandalized site is on page 1 in the US, page 2 (#20) in Australia, and page 3 in the UK.
|If they throttle traffic, they will have to do it by throttling exposure. How else can they throttle traffic? |
Throttling traffic: G. keeps your exposure up longer until you reach your allocated traffic limit.
Throttling exposure: G. puts a limit on the number of impressions (and wishes you good luck with the CTR).
If they do throttle exposure, would a good CTR (and a low clickback rate) result in an increase of the exposure limit?
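The difference between the two models can be sketched in a few lines. This is purely a toy illustration of the distinction being debated here, not anything Google has confirmed; all the numbers and caps are invented.

```python
# Toy sketch of the two hypothetical throttling models discussed above.
# All caps and CTR values are invented for illustration.

def traffic_throttled(demand_impressions, ctr, traffic_cap):
    """Traffic throttle: exposure continues until the click cap is hit."""
    return min(demand_impressions * ctr, traffic_cap)

def exposure_throttled(demand_impressions, ctr, impression_cap):
    """Exposure throttle: impressions are capped; clicks follow from CTR."""
    impressions = min(demand_impressions, impression_cap)
    return impressions * ctr

# Under a traffic throttle, doubling CTR changes nothing once the cap binds:
print(traffic_throttled(10_000, 0.02, 150))   # 150
print(traffic_throttled(10_000, 0.04, 150))   # 150

# Under an exposure throttle, better CTR still earns more clicks:
print(exposure_throttled(10_000, 0.02, 5_000))  # 100.0
print(exposure_throttled(10_000, 0.04, 5_000))  # 200.0
```

In the second model, work on titles and descriptions still pays off even while the limit is in force, which is exactly the practical difference argued later in the thread.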
[edited by: aakk9999 at 5:08 pm (utc) on May 27, 2011]
|They could keep your exposure longer until you reach the traffic limit. OR - they could throttle the number of impressions and wish you good luck with the CTR. |
How are they different from exposure?
When you near a traffic threshold, they throttle the exposure and this could also affect the number of impressions (if the page doesn't show up at all for the query).
@indyank, edited my post, read the above, perhaps this answers your question?
I think we are actually in agreement. My point was that in either case, control of traffic (a traffic throttle) can only be achieved by varying or limiting exposure.
Yes, however the difference is that if the traffic is throttled - tough luck, there is a limit and once you reach it, no more traffic.
If the exposure is throttled, then working on a better CTR (listing title, meta description, increasing brand recognition via other means so that you get more clicks) will still get you more traffic despite the exposure being throttled (limited to some number).
This thread has been hijacked by throttling :D
At any rate, perhaps the two are connected. Maybe your site has a set amount of horsepower and Google has a restrictor plate on it. It makes sense that if 30% of your pages are unusable (i.e., locked up in a removal request), your traffic is slower, and adding more horsepower parts (pages) serves no purpose since the restriction keeps output low.
In either case, opening up the throttle (PR) and letting the site breathe is in order. I was just proposing that instead of adding more pages, it might be best to add more content to existing pages - beef them all up, so to speak.
It adds up very well with my experience. If I add more topic A pages to my site, topic B pages get less traffic/exposure(?) than before and the overall traffic level stays the same. I was always convinced that a page "can't make it on its own" and that the "it's all about pages, not sites" mantra was wrong years ago.
Like Pagerank there seems to be some kind of "Exposurerank".
1. Onsite/offsite factors:page-->Exposurerank
2. Sum of all Exposurerank on site-->max. exposure allowed:site
3. Max. Exposure allowed-->traffic/CTR measurement:site and pages
4. Traffic/CTR measurement:pages-->redistribution of Exposurerank
5. Traffic/CTR measurement:site-->adjustment of sidewide Exposurerank
6. Changed onsite/offsite factors+redistribution(4.)+adjustment(5.)-->Exposurerank
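Steps 1-6 above can be sketched as a simple feedback loop. Everything below is hypothetical - the function name, the budget constant, and the redistribution rule are all invented to illustrate the shape of the model, not a known Google mechanism:

```python
# Toy sketch of the "Exposurerank" feedback loop outlined in steps 1-6.
# All names and numbers are hypothetical illustration only.

def exposurerank_cycle(pages, impressions_per_point=100.0):
    """One iteration: scores -> site exposure -> measured CTR -> redistributed scores."""
    # Steps 1-2: per-page scores sum to the site's exposure budget.
    total_score = sum(p["score"] for p in pages)
    site_budget = total_score * impressions_per_point

    # Step 3: exposure is handed out in proportion to each page's score,
    # and CTR is measured on the resulting impressions.
    for p in pages:
        p["impressions"] = site_budget * p["score"] / total_score
        p["clicks"] = p["impressions"] * p["ctr"]

    # Steps 4-6: scores are redistributed toward pages that convert
    # exposure into clicks, normalised so the site total stays fixed.
    total_clicks = sum(p["clicks"] for p in pages) or 1.0
    for p in pages:
        p["score"] = total_score * p["clicks"] / total_clicks
    return pages

pages = [
    {"name": "strong page", "score": 1.0, "ctr": 0.05},
    {"name": "shallow page", "score": 1.0, "ctr": 0.01},
]
for p in exposurerank_cycle(pages):
    print(p["name"], round(p["score"], 2))  # strong page 1.67 / shallow page 0.33
```

Note that in this toy version, adding a page raises the site budget in step 2 before any CTR has been measured - which is precisely the exploit step 6 would have to guard against.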
The tricky part would be step 6: if you allow webmasters to add shallow pages without negative consequences, the whole calculation becomes worthless.
So Google needs stable rankings, and webmasters who are afraid that adding shallow content will disturb/exploit the calculation.
This doesn't mean that Google isn't after shallow content, excessive ads or whatever.
But maybe many of the things often seen as cause are in reality only effects or much more effect than cause.
Had the same thoughts as Shaddows before I read his post.
One thing I've noticed a few times now - when I've accidentally placed noindex on some crucial pages, traffic dropped off... I spotted the drop, figured out the mistake, and BANG! A whole lot of traffic, more than usual, that then dies down to normal levels.
Shaping is real. I have sites with 40 or so popular pages, and I get an almost flat line (same each day) of traffic to the entire site.
Humans are not that predictable!
bunch of hits...no matter what...
even more hits...no matter what... (pervs)
Kind of tough to compete. :)
As regards the OP's original post, what came to my mind is that the BLOCKED URL (folder) is okay, though G will scan it anyway (they lie), and it notices that NEW CONTENT is going into the BLOCKED folder (well golly gee, must be doing something), so... down a point, or -60.
One test to confirm this is to .htaccess (refuse) that folder so they (whatever bot) really can't see it.
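For anyone wanting to run that test: a minimal sketch, assuming Apache 2.4+ with mod_authz_core, and assuming the .htaccess file is placed inside the blocked folder itself. Unlike a robots.txt block, this returns 403 Forbidden, so the content genuinely cannot be fetched by any bot:

```apache
# Hypothetical .htaccess inside the folder you want refused.
# Requires Apache 2.4+ (mod_authz_core). Every request to this
# folder, from any client or bot, gets 403 Forbidden.
Require all denied
```

That separates the two cases cleanly: robots.txt asks crawlers not to index, while a server-level deny makes the content unreachable outright.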
In WebmasterTools you can see the data for SERP "impressions" - that's another way of saying "exposure", and it's been in the reports for a long time, as well as being mentioned in many patents. Seems clear that Google does measure SERP impressions for the domain name as well as for individual URLs.
I'm convinced that getting a major jump in impressions after you make some kind of change can trigger a manual inspection. At the same time, Google may intentionally increase impressions to see what happens with click-throughs, as well as how that affects click-throughs to other sites in the rankings. For highly competitive query terms, the CTR threshold to beat may well be whatever the previous URL at that position was generating, rather than just some "absolute" percentage.
"SER-P-i-m-p-ressionpossibilityrank" would also be fine. It won`t be the only factor, so I used "possibility". It triggers the badword filter by the way^^
I used "exposure" because it has been used in this thread before and to avoid a confusion with "pageimpressions".
I used "rank" to highlight the dynamic/liquid nature people already know from pagerank. Therefore I chose a similar name.