
Google SEO News and Discussion Forum

    
Simple mistake causes traffic loss, and is repeatable
Sgt_Kickaxe - msg:4318427 - 7:12 am on May 27, 2011 (gmt 0)

GWT offers the option to remove single URLs, or entire sections of a site, from Google's index. Here's a simple experiment that sheds a little light on how Google works.

- request removal of an entire directory; traffic will fall accordingly.
- later restore the content but don't undo the completed removal; traffic will fall some more.
- undo the removal in GWT, and traffic will bounce back with a vengeance.

That second part is what I find interesting. Adding content to a section of your site that is blocked via GWT somehow causes a loss of rankings; the more content you add to the blocked directory, the more traffic the entire site loses.

It makes sense in a way: you're bleeding PageRank into parts of the site that Google can no longer follow. Still, I can't help thinking this might help some Panda victims who are busy trashing large sections of their sites right now.

Apparently it's not just about continuously adding fresh content; you can also improve rankings by pumping up existing content. I think that's something to consider when setting up a new website. You might be wise to choose a list of keywords you need to rank for, create a page for each, and then add content to those pages on an ongoing basis. Turn "a post a day" into "a post's worth of data added to existing articles" and you may be rewarded. There isn't strength in numbers with Google; there's strength in rankable quality.

Of course, I may also be reading too much into Google's behavior - it happens :D

Note: I noticed GWT no longer tells you what % of your site has low, medium, and high PR; it did once upon a time, I think.
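For anyone repeating the experiment, here's a rough spot-check (every URL and filename in it is a made-up example) to confirm, at each step, whether the directory is still disallowed for Googlebot in robots.txt and whether the restored pages actually resolve - so you know exactly what state the removed section is in before and after undoing the removal:

```python
# Rough spot-check, not part of the experiment itself; every URL/path below is
# a made-up example. For a few sample pages in the removed directory, it reports
# whether robots.txt currently disallows them for Googlebot and what HTTP status
# the restored pages return.
import urllib.error
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"               # hypothetical site
DIRECTORY = "/removed-section/"                # hypothetical removed directory
SAMPLE_PAGES = ["page-1.html", "page-2.html"]  # a few restored pages to check

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

for page in SAMPLE_PAGES:
    url = SITE + DIRECTORY + page
    allowed = rp.can_fetch("Googlebot", url)
    try:
        status = urllib.request.urlopen(url, timeout=10).status
    except urllib.error.HTTPError as err:
        status = err.code
    print(f"{url}: allowed for Googlebot={allowed}, HTTP status={status}")
```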

 

internetheaven - msg:4318467 - 9:55 am on May 27, 2011 (gmt 0)

Over what time period was each part of your experiment performed?

deadsea - msg:4318502 - 11:23 am on May 27, 2011 (gmt 0)

I never understood why they included that % high PR thing. My sites always have a handful of pages with high PR, a larger segment with medium PR, and 95% of the site with low PR.

Shaddows - msg:4318657 - 3:58 pm on May 27, 2011 (gmt 0)

I interpret your results in a completely different way.

Assume for a moment that every site has its traffic throttled to some degree. (I am personally convinced of this, but I know others will dispute it.)

What it sounds like to me is that your blocked content is being "virtually" placed in SERPs, taking up your exposure-allocation for the day. However, it's not actually being shown, so the exposure-allocation does not convert to traffic-allocation.

To me, this is strongly suggestive that Google throttles exposure, not traffic.

I've always thought that traffic was throttled, and exposure was the tool they used to do that. If exposure is the metric, and traffic is incidental, that would be a fascinating result.

indyank - msg:4318660 - 4:08 pm on May 27, 2011 (gmt 0)

Since you mention GWT and Panda: I have noticed that though traffic has greatly diminished, GWT still shows almost the same average ranking positions for most of the traffic-pulling pages.

Is anyone else experiencing this?

aakk9999 - msg:4318676 - 4:44 pm on May 27, 2011 (gmt 0)

I've always thought that traffic was throttled, and exposure was the tool they used to do that. If exposure is the metric, and traffic is incidental, that would be a fascinating result.

From the point of view of monitoring user behaviour, it would make more sense to throttle the exposure rather than the traffic.

indyank - msg:4318686 - 4:58 pm on May 27, 2011 (gmt 0)

If they throttle traffic, they will have to do it by throttling exposure. How else can they throttle traffic?

I always thought (and I see a few examples too) that they do it by ranking the page differently in different parts of the world.

One of my pages from a pandalized site is on page 1 in the US, page 2 (#20) in Australia, and page 3 in the UK.

aakk9999 - msg:4318687 - 5:03 pm on May 27, 2011 (gmt 0)

If they throttle traffic, they will have to do it by throttling exposure. How else can they throttle traffic?


Throttling traffic: G. keeps your exposure up until you reach your allocated traffic limit.

Throttling exposure: G. puts a limit on the number of impressions (and wishes you good luck with the CTR).

If they do throttle exposure, would a good CTR (and a low clickback rate) result in an increase of the exposure limit?

[edited by: aakk9999 at 5:08 pm (utc) on May 27, 2011]

indyank - msg:4318693 - 5:06 pm on May 27, 2011 (gmt 0)

They could keep your exposure longer until you reach the traffic limit. OR - they could throttle the number of impressions and wish you good luck with the CTR.


How are they different from exposure?

indyank - msg:4318697 - 5:08 pm on May 27, 2011 (gmt 0)

When you near a traffic threshold, they throttle the exposure and this could also affect the number of impressions (if the page doesn't show up at all for the query).

aakk9999 - msg:4318698 - 5:09 pm on May 27, 2011 (gmt 0)

@indyank, I edited my post - read the above; perhaps this answers your question?

indyank - msg:4318701 - 5:14 pm on May 27, 2011 (gmt 0)

I think we are actually in agreement. My point was that, in any case, control over traffic (a traffic throttle) can only be achieved by varying or limiting exposure.

aakk9999 - msg:4318702 - 5:18 pm on May 27, 2011 (gmt 0)

Yes, however the difference is that if the traffic is throttled, tough luck - there is a limit, and once you reach it, no more traffic.

If the exposure is throttled, then working on a better CTR (listing title, meta description, increasing brand recognition via other means so that you get more clicks) will still get you more traffic despite the exposure being capped (limited to some number).
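A toy way to picture that difference (all numbers invented - this says nothing about how Google actually computes anything): with a fixed click cap, improving CTR changes nothing; with a fixed impression cap, improving CTR buys more traffic.

```python
# Toy model only - invented numbers, not Google's actual mechanism.
# Compares clicks under a "traffic throttle" (fixed click cap) vs an
# "exposure throttle" (fixed impression cap) when CTR improves from 3% to 6%.

def clicks_if_traffic_throttled(demand_impressions, ctr, click_cap):
    # Google keeps showing the listing until the click allowance is used up.
    return min(demand_impressions * ctr, click_cap)

def clicks_if_exposure_throttled(impression_cap, ctr):
    # Google caps impressions; clicks are whatever CTR earns from them.
    return impression_cap * ctr

for ctr in (0.03, 0.06):
    traffic = clicks_if_traffic_throttled(50_000, ctr, click_cap=600)
    exposure = clicks_if_exposure_throttled(20_000, ctr)
    print(f"CTR {ctr:.0%}: traffic-throttled={traffic:.0f} clicks, "
          f"exposure-throttled={exposure:.0f} clicks")
```

With these made-up numbers, doubling CTR does nothing under the traffic throttle (600 clicks either way) but doubles the clicks under the exposure throttle (600 to 1,200) - which is exactly the difference described above.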

Sgt_Kickaxe - msg:4318761 - 7:01 pm on May 27, 2011 (gmt 0)

This thread has been hijacked by throttling :D

At any rate, perhaps the two are connected. Maybe your site has a set amount of horsepower and Google has put a restrictor plate on it. It makes sense that if 30% of your pages are unusable (i.e. locked up in a removal request), your traffic is slower, and adding more horsepower parts (pages) serves no use since the restriction is keeping output low.

In either case, opening up the throttle (PR) and letting the site breathe is in order. I was just proposing that instead of adding more pages, it might be best to add more content to existing pages - beef them all up, so to speak.

seoholic - msg:4318909 - 5:01 am on May 28, 2011 (gmt 0)

This adds up very well with my experience. If I add more topic A pages to my site, topic B pages get less traffic/exposure(?) than before and the overall traffic level stays the same. I was convinced years ago that a page "can't make it on its own" and that the "it's all about pages, not sites" mantra was wrong.

Like PageRank, there seems to be some kind of "Exposurerank".

My theory:

1. Onsite/offsite factors (page) --> Exposurerank
2. Sum of all Exposurerank on the site --> max. exposure allowed (site)
3. Max. exposure allowed --> traffic/CTR measurement (site and pages)
4. Traffic/CTR measurement (pages) --> redistribution of Exposurerank
5. Traffic/CTR measurement (site) --> adjustment of sitewide Exposurerank
6. Changed onsite/offsite factors + redistribution (4.) + adjustment (5.) --> Exposurerank

The tricky part would be step 6: if you allow webmasters to add shallow pages without negative consequences, the whole calculation becomes worthless.
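Spelled out as a toy loop (every number and weight below is invented; this is only the shape of the idea, not a claim about Google's implementation):

```python
# Purely speculative sketch of the six steps above - all weights invented.
def exposure_loop(pages, rounds=3):
    # pages: {name: {"quality": onsite/offsite score, "ctr": observed CTR}}
    exposure = {p: d["quality"] for p, d in pages.items()}             # step 1
    for _ in range(rounds):
        site_cap = sum(exposure.values())                              # step 2
        site_ctr = sum(d["ctr"] for d in pages.values()) / len(pages)  # step 3
        for p, d in pages.items():                                     # step 4
            # pages beating the site-average CTR gain exposure, laggards lose it
            exposure[p] *= 1 + 0.5 * (d["ctr"] - site_ctr)
        # steps 5-6: sitewide adjustment - rescale so the total stays at the
        # site cap; the adjusted Exposurerank feeds the next round
        scale = site_cap / sum(exposure.values())
        exposure = {p: v * scale for p, v in exposure.items()}
    return exposure

pages = {
    "topic-a": {"quality": 10, "ctr": 0.08},
    "topic-b": {"quality": 10, "ctr": 0.03},
    "shallow": {"quality": 2,  "ctr": 0.01},
}
print(exposure_loop(pages))
```

With these made-up inputs the total exposure stays fixed, so a high-CTR topic A page can only gain at the expense of topic B and the shallow page - the zero-sum behaviour described at the top of this post.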

So Google needs stable rankings, and it needs webmasters to be afraid of adding shallow content and thereby disturbing/exploiting the calculation.

This doesn't mean that Google isn't after shallow content, excessive ads or whatever.

But maybe many of the things often seen as causes are in reality only effects, or much more effect than cause.

anteck - msg:4318910 - 5:29 am on May 28, 2011 (gmt 0)

I had the same thoughts as Shaddows before I read his post.

One thing I've noticed a few times now: when I've accidentally placed noindex on some crucial pages, traffic dropped off... I spotted the drop, figured out the mistake, and BANG! A whole lot of traffic, more than usual, which then dies down to normal levels.

Shaping is real. I have sites with 40 or so popular pages, and I get an almost flat line (same each day) of traffic to the entire site.

Humans are not that predictable!

tangor - msg:4318916 - 6:15 am on May 28, 2011 (gmt 0)

Brad Pitt

bunch of hits...no matter what...

Angelina Jolie

even more hits...no matter what... (pervs)

............Widgets.............

Kind of tough to compete. :)

As regards the OP's original post, what came to my mind is this: the URL (folder) is BLOCKED - okay - though G will scan it anyway (they lie) and notices that NEW CONTENT is going into the BLOCKED folder (well golly gee, it must be doing something), so... down a point, or -60.

One test to confirm this would be to refuse that folder via .htaccess so they (whichever bot) really can't see it.
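If you go that route, here's a quick sketch of a check (the URL is made up) that the refusal is actually in place - the folder should come back with an error even for a Googlebot-style user agent:

```python
# Sketch only - the URL is made up. Checks that the server-level block is
# actually refusing the folder, even for a Googlebot-style user agent.
import urllib.error
import urllib.request

url = "https://www.example.com/blocked-folder/some-page.html"  # hypothetical
req = urllib.request.Request(url, headers={"User-Agent": "Googlebot/2.1"})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(f"Still reachable (HTTP {resp.status}) - the refusal is not in place")
except urllib.error.HTTPError as err:
    print(f"Refused with HTTP {err.code} - the folder really is blocked")
```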

tedster - msg:4319056 - 6:31 pm on May 28, 2011 (gmt 0)

Exposurerank

In WebmasterTools you can see the data for SERP "impressions" - that's another way of saying "exposure", and it's been in the reports for a long time, as well as being mentioned in many patents. Seems clear that Google does measure SERP impressions for the domain name as well as for individual URLs.

I'm convinced that getting a major jump in impressions after you make some kind of change can trigger a manual inspection. At the same time, Google may intentionally increase impressions to see what happens with click-throughs, as well as how that affects click-throughs to other sites in the rankings. For highly competitive query terms, the CTR threshold to beat may well be whatever the previous URL at that position was generating, rather than just some "absolute" percentage.
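As a purely hypothetical illustration of that last point (invented numbers): the bar to clear would be whatever CTR the previous occupant of the position was earning, not a fixed percentage.

```python
# Invented numbers - just illustrating a relative, not absolute, CTR threshold.
previous_url_ctr = 250 / 10_000  # clicks/impressions the old result at this position earned
candidate_ctr = 310 / 10_000     # clicks/impressions the new URL earns during its test exposure

if candidate_ctr > previous_url_ctr:
    print(f"{candidate_ctr:.1%} beats the incumbent's {previous_url_ctr:.1%} - the extra exposure may stick")
else:
    print(f"{candidate_ctr:.1%} is below the incumbent's {previous_url_ctr:.1%} - the exposure may be rolled back")
```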

seoholic - msg:4319065 - 6:52 pm on May 28, 2011 (gmt 0)

@tedster

"SER-P-i-m-p-ressionpossibilityrank" would also be fine. It won`t be the only factor, so I used "possibility". It triggers the badword filter by the way^^

I used "exposure" because it has been used in this thread before and to avoid a confusion with "pageimpressions".

I used "rank" to highlight the dynamic/liquid nature people already know from pagerank. Therefore I chose a similar name.
