
Simple mistake causes traffic loss, and is repeatable

     
7:12 am on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member

joined:Apr 14, 2010
posts:3169
votes: 0


GWT offers the option to remove single URLs and entire sections of a site from Google's search results. Here's a simple experiment that sheds a little light on how Google works.

- Request removal of an entire directory; traffic will fall accordingly.
- Later, restore the content but don't undo the completed removal; traffic will fall some more.
- Undo the removal in GWT and traffic will bounce back with a vengeance.

That second part is what I find interesting. Adding content to a section of your site that is blocked via GWT somehow causes a loss of rankings: the more content you add to the blocked directory, the more traffic the entire site loses.

It makes sense in a way: you're bleeding PageRank into parts of a site Google can no longer follow. Still, I can't help but think this might help some Panda victims who are busy trashing large sections of their sites right now.

It's apparently not enough to continuously add fresh content; you can also improve rankings by pumping up existing content. I think that's something to think about when setting up a new website. You might be wise to choose a list of keywords you need to rank for, create a page for each, and then add content to those pages on an ongoing basis. Turn "a post a day" into "a post's worth of data added to existing articles" and you may be rewarded. There isn't strength in numbers with Google; there's strength in rankable quality.

Of course I may also be reading too much into Google behavior, it happens :D

Note: I noticed GWT no longer tells you what % of your site has low, medium and high PR; it did once upon a time, I think.
9:55 am on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 25, 2003
posts:2527
votes: 0


Over what time period was each of the parts of your experiment performed?
11:23 am on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 7, 2003
posts: 750
votes: 0


I never understood why they included that % high PR thing. My sites always have a handful of pages with high PR, a larger segment with medium PR, and 95% of the site with low PR.
3:58 pm on May 27, 2011 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1269
votes: 44


I interpret your results in a completely different way.

Assume for a moment that every site has its traffic throttled to some degree. (I am personally convinced of this, but know others will dispute it.)

What it sounds like to me is that your blocked content is being "virtually" placed in SERPs, using up your exposure allocation for the day. However, it's not actually being shown, so the exposure allocation does not convert into a traffic allocation.

To me, this is strongly suggestive that Google throttles exposure, not traffic.

I've always thought that traffic was throttled, and exposure was the tool they used to do that. If exposure is the metric, and traffic is incidental, that would be a fascinating result.
4:08 pm on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
posts:1806
votes: 9


Since you are mentioning GWT and Panda: I have noticed that although traffic has greatly diminished, GWT is still showing almost the same average ranking positions for most of the traffic-pulling pages.

Is anyone else experiencing this?
4:44 pm on May 27, 2011 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2511
votes: 142


I've always thought that traffic was throttled, and exposure was the tool they used to do that. If exposure is the metric, and traffic is incidental, that would be a fascinating result.

From the point of view of monitoring user behaviour, it would make more sense to throttle the exposure rather than the traffic.
4:58 pm on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
posts:1806
votes: 9


If they throttle traffic, they will have to do it by throttling exposure. How else can they throttle traffic?

I always thought (and I see a few examples too) that they do it by ranking the page differently in different parts of the world.

One of my pages from a pandalized site is on page 1 in US, page 2 (#20) in Aus and page 3 in UK.
5:03 pm on May 27, 2011 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2511
votes: 142


If they throttle traffic, they will have to do it by throttling exposure. How else can they throttle traffic?


Throttling traffic: G. keeps your exposure up longer, until you reach your allocated traffic limit.

Throttling exposure: G. puts a limit on the number of impressions (and wishes you good luck with the CTR).

If they do throttle exposure, would a good CTR (and a low clickback rate) result in an increase of the exposure limit?

[edited by: aakk9999 at 5:08 pm (utc) on May 27, 2011]

5:06 pm on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
posts:1806
votes: 9


They could keep your exposure longer until you reach a traffic limit, OR they could throttle the number of impressions and wish you good luck with the CTR.


How are they different from exposure?
5:08 pm on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
posts:1806
votes: 9


When you near a traffic threshold, they throttle the exposure and this could also affect the number of impressions (if the page doesn't show up at all for the query).
5:09 pm on May 27, 2011 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2511
votes: 142


@indyank, edited my post, read the above, perhaps this answers your question?
5:14 pm on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
posts:1806
votes: 9


I think we are actually in agreement. My point was that, in any case, control of traffic (a traffic throttle) can only be achieved by varying or limiting the exposure.
5:18 pm on May 27, 2011 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2511
votes: 142


Yes, however the difference is that if the traffic is throttled, tough luck: there is a limit, and once you reach it, no more traffic.

If the exposure is throttled, then working on better CTR (listing title, meta description, increasing brand recognition via other means so that you get more clicks) will still get you more traffic despite the exposure being throttled (limited to some number).
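
To put rough numbers on that distinction, here is a tiny sketch. All figures and function names are invented for illustration; nothing here is published Google behaviour.

```python
# Toy comparison of the two throttling models discussed above.
# All numbers are hypothetical.

def clicks_traffic_cap(impressions_available, ctr, traffic_cap):
    # Traffic throttle: the listing keeps being shown until the click limit is hit,
    # so once demand exceeds the cap, a better CTR changes nothing.
    return min(impressions_available * ctr, traffic_cap)

def clicks_exposure_cap(impression_cap, ctr):
    # Exposure throttle: only impressions are capped,
    # so every CTR improvement still converts into extra clicks.
    return impression_cap * ctr

for ctr in (0.02, 0.05):
    print(f"CTR {ctr:.0%}: "
          f"traffic cap -> {clicks_traffic_cap(5000, ctr, 100):.0f} clicks, "
          f"exposure cap -> {clicks_exposure_cap(2000, ctr):.0f} clicks")
# CTR 2%: traffic cap -> 100 clicks, exposure cap -> 40 clicks
# CTR 5%: traffic cap -> 100 clicks, exposure cap -> 100 clicks
```

Under the exposure model, the jump from 2% to 5% CTR more than doubles the clicks; under the traffic model it buys nothing, which is exactly the practical difference described above.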
7:01 pm on May 27, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member

joined:Apr 14, 2010
posts:3169
votes: 0


This thread has been hijacked by throttling :D

At any rate, perhaps the two are connected. Maybe your site has a set amount of horsepower and Google puts a restrictor plate on it. It makes sense that if 30% of your pages are unusable (i.e., locked up in a removal request), your traffic is slower, and adding more horsepower parts (pages) serves no purpose, since the restriction keeps output low.

In either case, opening up the throttle (PR) and letting the site breathe is in order. I was just proposing that instead of adding more pages, it might be best to add more content to existing pages and beef them all up, so to speak.
5:01 am on May 28, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:June 29, 2010
posts: 87
votes: 0


It adds up very well with my experience. If I add more topic A pages to my site, topic B pages get less traffic/exposure(?) than before and the overall traffic level stays the same. I was always convinced that a page "can't make it on its own" and that the "it's all about pages, not sites" mantra was wrong years ago.

Like PageRank, there seems to be some kind of "Exposurerank".

My theory:

1. Onsite/offsite factors (page) --> Exposurerank
2. Sum of all Exposurerank on the site --> max. exposure allowed (site)
3. Max. exposure allowed --> traffic/CTR measurement (site and pages)
4. Traffic/CTR measurement (pages) --> redistribution of Exposurerank
5. Traffic/CTR measurement (site) --> adjustment of sitewide Exposurerank
6. Changed onsite/offsite factors + redistribution (4.) + adjustment (5.) --> Exposurerank
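
As a rough illustration only, here is a minimal sketch of that feedback loop. Every name, weight, baseline and update rule below is an assumption made up for the example, not anything Google has documented.

```python
# Hypothetical sketch of the "Exposurerank" feedback loop (steps 1-6 above).
# Weights, baselines and update rules are invented for illustration.

def exposure_rank(onsite_score, offsite_score):
    # Step 1: combine on-site/off-site factors into a per-page Exposurerank.
    return 0.5 * onsite_score + 0.5 * offsite_score

def run_cycle(pages, site_multiplier=1.0):
    # Step 2: the site-wide exposure budget is the sum of page Exposureranks,
    # scaled by a site-level multiplier.
    total_rank = sum(p["rank"] for p in pages.values())
    budget = site_multiplier * total_rank

    # Step 3: hand out impressions in proportion to each page's share,
    # then measure the clicks each page actually earns.
    for p in pages.values():
        p["impressions"] = budget * p["rank"] / total_rank
        p["clicks"] = p["impressions"] * p["ctr"]

    site_avg_ctr = sum(p["ctr"] for p in pages.values()) / len(pages)

    # Step 4: redistribute Exposurerank towards pages that beat the site average.
    for p in pages.values():
        p["rank"] *= 1.0 + 0.1 * (p["ctr"] - site_avg_ctr)

    # Step 5: adjust the sitewide allowance from overall performance
    # (5% is an arbitrary baseline CTR for the example).
    site_multiplier *= 1.0 + 0.05 * (site_avg_ctr - 0.05)

    # Step 6: the adjusted values feed the next cycle.
    return site_multiplier

pages = {
    "/widgets/": {"rank": exposure_rank(0.8, 0.6), "ctr": 0.07},
    "/blue-widgets/": {"rank": exposure_rank(0.4, 0.3), "ctr": 0.02},
}
multiplier = run_cycle(pages)  # adding shallow pages inflates the budget in step 2
```

Note how the weakness discussed next falls out of step 2: every extra page, however thin, adds something to the budget unless steps 4-6 claw it back.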

The tricky part would be step 6: if you allow webmasters to add shallow pages without negative consequences, the whole calculation becomes worthless.

So Google needs stable rankings, and webmasters in fear of adding shallow content and thereby disturbing/exploiting the calculation.

This doesn't mean that Google isn't after shallow content, excessive ads or whatever.

But maybe many of the things often seen as causes are in reality only effects, or much more effect than cause.
5:29 am on May 28, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:July 28, 2010
posts: 66
votes: 0


I had the same thoughts as Shaddows before I read his post.

One thing I've noticed a few times now: when I've accidentally placed noindex on some crucial pages, traffic dropped off... I spotted the drop, figured out the mistake, and BANG! A whole lot of traffic, more than usual, which then dies down to normal levels.

Shaping is real. I have sites with 40 or so popular pages, and I get an almost flat line (the same each day) of traffic to the entire site.

Humans are not that predictable!
6:15 am on May 28, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:6160
votes: 284


Brad Pitt

bunch of hits...no matter what...

Angelina Jolie

even more hits...no matter what... (pervs)

............Widgets.............

Kind of tough to compete. :)

As regards the OP's original post, what came to my mind is that the BLOCKED URL (folder) is okay, though G will scan it anyway (they lie) and notice that NEW CONTENT is going into the BLOCKED folder (well golly gee, must be doing something), so... down a point, or -60.

One test to confirm this would be to block (refuse) that folder via .htaccess so they (whichever bot) really can't see it.
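
For anyone trying that test, a minimal sketch of what the .htaccess inside the blocked folder might look like (Apache 2.2 syntax; the user-agent match is just an example):

```apache
# Hypothetical .htaccess placed inside the blocked folder:
# return 403 to Googlebot so the crawler truly cannot fetch the content,
# while normal visitors still get through.
SetEnvIfNoCase User-Agent "Googlebot" blocked_bot
Order allow,deny
Allow from all
Deny from env=blocked_bot
```

To refuse everyone, bots and humans alike, the last three lines could simply be replaced with "Order allow,deny" followed by "Deny from all".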
6:31 pm on May 28, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Exposurerank

In WebmasterTools you can see the data for SERP "impressions" - that's another way of saying "exposure", and it's been in the reports for a long time, as well as being mentioned in many patents. Seems clear that Google does measure SERP impressions for the domain name as well as for individual URLs.

I'm convinced that getting a major jump in impressions after you make some kind of change can trigger a manual inspection. At the same time, Google may intentionally increase impressions to see what happens with click-throughs, as well as how that affects click-throughs to other sites in the rankings. For highly competitive query terms, the CTR threshold to beat may well be whatever the previous URL at that position was generating, rather than just some "absolute" percentage.
6:52 pm on May 28, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:June 29, 2010
posts: 87
votes: 0


@tedster

"SER-P-i-m-p-ressionpossibilityrank" would also be fine. It won`t be the only factor, so I used "possibility". It triggers the badword filter by the way^^

I used "exposure" because it has been used in this thread before and to avoid a confusion with "pageimpressions".

I used "rank" to highlight the dynamic/liquid nature people already know from pagerank. Therefore I chose a similar name.