
Google SEO News and Discussion Forum

Google Traffic Throttling means we have to reduce user services
internetheaven

posted 9:28 am on Aug 5, 2009 (gmt 0)

Been working on this for most of this year and, sadly, the data shows that when we provide more services for users in our free sections, traffic to our commercial sections drops.

To clarify my interpretation of Google Traffic Throttling before continuing: we get almost exactly the same amount of traffic from Google each day, and have done since late last year (Yahoo, Bing, referral and direct traffic always vary). The Google traffic graph is so predictable I could draw next week's and not be far off.

My understanding of how Google is accomplishing this with my site (I'm sure some sites are throttled differently) is that they move us around in the rankings. We bounce from the ever-theorised number 4 position to nowhere, from No. 9 to No. 900 and so on throughout the day, but the jumps are not predictable. I never know which page is going to be tanked from day to day, but it is always a few decent-traffic pages that suddenly vanish.

Why does this mean I have to reduce services? Well, almost 90% of my site's pages are user-based - industry news and consumer comments. Ads do not convert well on those pages. Our commercial sections convert very well.

This means that if our "traffic allowance" is used up by people visiting the news/discussion pages, we see a dramatic drop in visitors from Google to our commercial pages. If you put the news pages graph over the commercial pages graph it is simply amazing. The drops in commercial traffic correspond with increases in news traffic. When news traffic is low (Saturday, Sunday and Monday in our industry) the commercial traffic is much higher.

But the totals never pass our allotted Google traffic.

What would you do? Would you stop providing free services altogether so that your whole traffic allowance goes toward commercial pages?

Someone said that links get you out of traffic throttling, but after almost a year and thousands of links we have seen no movement ... other than an increase in our daily Google allowance of 100 visitors a couple of months ago. Is that the rate? Every 12 months and 5,000 links we are allowed another 100 visitors per day?
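For anyone who wants to check their own logs for the same pattern, this is roughly the comparison I'm describing -- just a sketch in Python, assuming you've already exported daily Google referral counts to a CSV (the file name and column names here are made up):

# Sketch: do "news" and "commercial" Google referrals see-saw under a cap?
# Assumes a made-up CSV export: date,news_visits,commercial_visits
import csv

rows = []
with open("google_referrals_by_day.csv") as f:
    for row in csv.DictReader(f):
        rows.append((int(row["news_visits"]), int(row["commercial_visits"])))

totals = [n + c for n, c in rows]
print("min/max combined daily total:", min(totals), max(totals))
# If throttling is real, the combined total barely moves...

opposite = sum(
    1 for (n1, c1), (n2, c2) in zip(rows, rows[1:])
    if (n2 - n1) * (c2 - c1) < 0
)
print(f"{opposite} of {len(rows) - 1} day-to-day moves went in opposite directions")
# ...while the two sections individually move against each other.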

 

tedster

posted 5:53 pm on Aug 5, 2009 (gmt 0)

What kind of result is in position #4 when it's not your site - is it a Universal result of some kind, or is it a competing website?

[edited by: tedster at 9:36 pm (utc) on Aug. 6, 2009]

londrum

posted 6:02 pm on Aug 5, 2009 (gmt 0)

I suppose Google are constantly trying to test out which are the best pages, so they can improve their index.

The only way to do that is to shift sites around the listings every now and then. Otherwise the sites that remain on the first couple of pages are always going to get a big boost, while some brand-new site on page 5 might actually be ten times better.

Rather than wait around for that site to gather loads of inbound links, which might take ages, they can shift it up for a few days and see what the user behaviour is like when people click it. User behaviour is becoming more important than inbound links, I reckon, what with all these analytics and Google Toolbar programs.

And also... I am sure that Google shift sites around depending on what time of day it is. What might be good at 10 AM, when the east coast of America is awake, might not be so good when the east coast is fast asleep in bed. You might have a site which better caters for the west coast, for example, and it makes sense for it to temporarily jump up the listings.

tigertom

posted 7:18 pm on Aug 5, 2009 (gmt 0)

Two quick suggestions:

1) I think you answered your own question, though I'd hate to dump perfectly good pages I made just to please G00gle. (What a state of affairs to arrive at!)

2) Better: Make more sites.

2b) To set it up, use a 301 redirect in .htaccess to move the 'freebie' pages from the .com to the .net version of your domain, without moving any files at all. Just a thought.

Then your money pages are on the old domain and the freebie ones are on another. Your old domain passes some 'PR' to the new one. Then let Google throttle away.
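Something like this in the .com's .htaccess would do it -- just a sketch, assuming mod_rewrite is available and the freebie pages all live under a /news/ path (the domains and path are placeholders):

RewriteEngine On
# Permanently move the freebie section to the .net domain,
# keeping the rest of the URL path intact
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^news/(.*)$ http://www.example.net/news/$1 [R=301,L]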

Reno

posted 7:52 pm on Aug 5, 2009 (gmt 0)

I suppose Google are constantly trying to test out which are the best pages, so they can improve their index.

Or maybe they are moving pages around because...

[a] They've decided that variety is a good thing for them and for the user, when 2 sites are approximately equal in fulfilling a query;

[b] It goes along with their philosophy of "fairness" to give lots of people some decent traffic, rather than a few sites getting most of it and the rest getting considerably less.

All I know is that when I look at my monthly stats, like internetheaven said, I can predict my next Google level to within +/-5%, so there seems to be something to the "Google throttling" observations that webmasters are making. Or perhaps I should refer to it as "Google balancing", because in some cases they may not be taking from me but rather giving to me, for reasons [a] and [b] above. If that's the case, then I'm grateful.


trinorthlighting

posted 3:16 am on Aug 6, 2009 (gmt 0)

So, if Google does throttle traffic, what would the reasons be? My first thought is server speed (searchers do not like slow servers or long page load times).

So the real question is: what can you do on your server to improve speed? And besides the server itself, what else can you do? Maybe reduce some bloated code on your pages? I am sure there are other thoughts out there on this.

tedster

posted 3:41 am on Aug 6, 2009 (gmt 0)

Check out the current discussion on Google's 2007 backlink patent [webmasterworld.com]. There are many good insights from our members there -- about traffic throttling, the yo-yo, plus what the long term "fix" appears to be.

internetheaven

posted 8:58 pm on Aug 6, 2009 (gmt 0)

So, if Google does throttle traffic, what would the reasons be? My first thought is server speed

Not likely in our case. Our hosting provider is recommended by most top WebmasterWorld members. Our code is optimized, our databases are checked regularly for ways to increase speed, our images are optimized, etc. etc.

Google download speed average is 312 for this particular site. How does that compare? (I have plenty of sites not showing this throttling!)

trinorthlighting

posted 3:47 am on Aug 7, 2009 (gmt 0)

I tend to believe that it is not traffic throttling; I would venture to say it is more like the yo-yo effect or some filter.

If you think about it, why would Google slow down traffic to a really good website if the server can handle the traffic?

tedster

posted 5:56 am on Aug 7, 2009 (gmt 0)

I used to think that the yo-yo was the primary or root phenomenon in every case. That is, until I saw one site's analytics with Google Search traffic volumes graphed hour by hour, and then day by day. The daily totals graph was a nearly perfect plateau - I've called it a "buzz cut" in another thread. The day-to-day variation was well under 0.1%.

And the hour-by-hour graph was jiggling along every day, normal ups and downs, until that plateau number was hit. Then NO more Google traffic until the next day. It looked like a cut-off at a total traffic number, not at a certain time of day as I'd seen in other cases.
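The check itself is simple enough to sketch, if you can export your Google-referred visits with timestamps. A rough Python sketch (the file name and format are invented) that separates the two explanations:

# Sketch: is the daily cut-off at a fixed TIME or at a fixed VOLUME?
# Assumes one ISO timestamp per Google-referred visit, one per line,
# e.g. 2009-08-05T14:03:22 (invented file name and format).
from collections import defaultdict

visits = defaultdict(int)
last_hour = {}

with open("google_visits.txt") as f:
    for line in f:
        ts = line.strip()
        if not ts:
            continue
        day, _, time = ts.partition("T")
        hour = int(time[:2])
        visits[day] += 1
        last_hour[day] = max(last_hour.get(day, 0), hour)

for day in sorted(visits):
    print(f"{day}: {visits[day]:5d} visits, last referral in hour {last_hour[day]:02d}")
# A time cut-off shows the same last hour every day with varying totals;
# a volume cap shows near-identical totals with a drifting last hour.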

So now, I'm more of a believer. In some cases at least, it really looks like there is intentional traffic throttling as a primary phenomenon. Consider this quote from a Google patent:

[0102] ...Search engine 125 may take measures to prevent spam attempts by, for example, employing hysteresis to allow a rank to grow at a certain rate. In another implementation, the rank for a given document may be allowed a certain maximum threshold of growth over a predefined window of time.

DOCUMENT SCORING BASED ON LINK-BASED CRITERIA [appft1.uspto.gov] - 2007 patent application filing

I know that a patent mention doesn't prove that Google actually IS throttling, but it sure indicates that they've thought about it - and that makes the accounts of throttling from website owners a lot more credible for me.
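Just to make that patent language concrete, here's a toy model of the idea -- my own sketch, not anything from Google, with invented numbers:

# Toy sketch of the "capped growth" idea in the patent language above:
# a score may only rise by some maximum fraction per time window,
# no matter how fast the raw signal (e.g. new backlinks) grows.
MAX_GROWTH_PER_WINDOW = 0.10  # allow 10% growth per window (assumption)

def damped_rank(previous_rank: float, raw_rank: float) -> float:
    """Let rank fall freely, but cap how fast it may rise."""
    ceiling = previous_rank * (1 + MAX_GROWTH_PER_WINDOW)
    return min(raw_rank, ceiling)

rank = 10.0
for window, raw in enumerate([10, 30, 90, 90, 90], start=1):
    rank = damped_rank(rank, raw)
    print(f"window {window}: raw={raw}, allowed={rank:.1f}")
# The raw score jumps 9x almost immediately; the allowed score climbs
# only ~10% per window -- the links "count", just slowly.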

I also appreciate that few of us want to think that our own website is "spam" in Google's eyes. However, Google can be pretty free with the label "spam" - I've heard it applied to, for example, thin affiliate sites. So it doesn't surprise me to see the label "spam" used when a backlink profile is too far away from the "natural" picture that Google expects -- at least from the viewpoint of their patents.

fishfinger

posted 10:50 am on Aug 7, 2009 (gmt 0)

But surely if the backlink profile is not what Google expects then they would just give less weight to it?

With the scenario you describe they have to
(a) identify the profile
(b) allow the links to count fully
(c) THEN apply some control measure - throttling.

Why not instead
(a) identify the profile
(b) count the links in a different way

Added: If this was something that went on for a few months, it could be explained by Google trying out different link profiles for certain terms, measuring the user experience, then making a permanent algo adjustment to favour sites with the profiles Google users prefer. But INH says he's seen this for a year, which seems like an awfully long test time given the size of Google's data set.

tedster

posted 12:54 pm on Aug 7, 2009 (gmt 0)

I'm assuming that a trust issue is the main factor with unbalanced backlink profiles. Improved trust, through improved diversity in backlink types, is the factor that can remove throttling.

So just how do you do that? Well, "you" can't do it -- it needs to sort of "happen" because your site becomes so appreciated by others. At least that's the wavelength I'm getting from Google's public statements.

rros

posted 3:39 am on Aug 8, 2009 (gmt 0)

I think exactly like Reno (his point [b]).

I take their word of "do no evil". In this respect, and in the greater scheme of things, they may view the index as a way to distribute traffic as widely as possible and give a shot to as many site owners as possible. Thus, the yo-yoing.

Back in '05, I had a feeling that affiliate sites promoting only a few sponsors, if not just one, were demoted and replaced by sites that compiled listings and acted as distributors of traffic. Before that, directories were made prominent. Today, these approaches may not be enough to spread traffic to as many sites as possible, hence the need to rotate -- and, in so doing, to test which sites may make it to the bigger league to play with the big boys. There may be so many good sites by now for each keyword that Google could compile several different top-ranking results pages interchangeably.

To an outsider, the index may look confusing. But to them, it may only reflect their vision. *Resistance is futile*.

Aside from those sites that have excelled and will always show at the top no matter what, the rest of us may be mere mortals who have yet to prove ourselves.

tedster

posted 4:01 am on Aug 8, 2009 (gmt 0)

I take their word of "do no evil"

I recently read a comment from Google that clarified the "slogan". It's "Don't BE evil" -- it's truly impossible not to DO any evil at all. I think I get it - it's more about motives or intentions.

So I'm building an algo that will sort out what Google's intentions really are. I'll be doing sentiment analysis of all the published comments from Googlers. I'll also hire a few thousand human raters to assign an Evil Rank score to each statement. Then I'll combine all the data with some secret sauce (hey, it's my algo, I'm entitled) and the final Evil Rank score should be zero, right? But I won't tell Google what their score is - it will just be N/A. If they care, they'll have to guess ;)

----

But seriously, with regard to affiliate sites, I think it was John Mueller who said something like "You need to build a useful site first and then you can add some affiliate links and you'll be fine. Don't build an affiliate site first and then try to add some extra value."

Reno

posted 6:53 am on Aug 8, 2009 (gmt 0)

build a useful site first and then you can add some affiliate links and you'll be fine

Sage advice, tedster. The way I look at it, at this point in the evolution of online commerce, for any given query there are in all likelihood a lot of sites that are useful, well designed, and deserving. In 2009, I think we all know that. We work hard on our sites and want to prosper, but most of us can only do that with a decent position in the Google SERPs, so we want/need an edge to make that happen. In 2009, I think Google knows the position they hold in our lives.

So given that there are a lot of "worthy" websites that can fulfill a query, to me it makes sense that they would spread the traffic around. Again, I'm talking about quality sites -- not "made for affiliates" auto-generated templates with a ton of duplicate content.

If they're throttling my traffic at one site but pushing up one of my others -- to keep variety before the public -- then in the greater scheme of things, in regards to this particular issue, I have no complaint. In fact, I hope "throttling" is real, and continues -- if its inclusion in the algo is for the reasons I've expressed.


santapaws

posted 8:59 am on Aug 8, 2009 (gmt 0)

Hmm, you seem to be saying that the way forward is to have multiple sites on the same topic? Would this not then be self-defeating on Google's part? It recognises there are many good sites, so it spreads traffic/rankings around -- and thus, to maintain traffic, webmasters need multiple sites to balance the throttling across domains.

Reno

posted 3:09 pm on Aug 8, 2009 (gmt 0)

you seem to be saying that the way forward is to have multiple sites on the same topic

No no no -- I did not say "same topic". I'm saying that if you are dealing with multiple sites (multiple topics) and you lose a bit from one but gain at another, you're doing OK. And again (if there is anything to this), from Google's point of view, spreading traffic to various sites that are of approximately equal value does in its own way make the internet a healthier place, in the sense that numerous players may be doing pretty well, as opposed to just a very few doing great.


santapaws

posted 6:49 pm on Aug 8, 2009 (gmt 0)

It's just that when you say that as some go up, others go down, this would only make sense if the whole network or the whole sector were being throttled. Otherwise it seems amazing that there's such a balance across your sites under throttling -- that as much goes up as goes down.

tigertom

posted 8:03 pm on Aug 8, 2009 (gmt 0)

You can have one niche but multiple sub-topics.

Like 'web design' embraces 'hosting', 'software', 'freelancing', 'internet marketing', 'SEO' etc.

santapaws

posted 7:43 am on Aug 9, 2009 (gmt 0)

Well then I come back to my question: why would multiple sub-topics have balancing peaks and troughs? The way it's been described, it sounds more like network load balancing, which would actually make sense. If Google recognises a large network of sites, why treat them individually? Still treating them as a single entity would help discourage the breaking up of sites for SEO reasons.

tedster

posted 1:20 pm on Aug 9, 2009 (gmt 0)

That's an interesting direction of thought, santapaws. We know that Google treats links between "associated sites" differently than links between sites they see as independent. It could make sense, theoretically, for them to treat a whole batch of related sites more like one site in other ways, too.

signor_john

posted 2:01 pm on Aug 9, 2009 (gmt 0)

It could make sense, theoretically, for them to treat a whole batch of related sites more like one site in other ways, too.

Especially when there are so many cases where separate sites are being used purely for SEO: e.g., scintillating-santa-clara-hotels.com, scintillating-santa-cruz-hotels.com, scintillating-santa-domingo-hotels.com, and so on. Such networks clearly are single sites for all intents and purposes, so wouldn't it be logical for Google to treat them as such?

wingslevel

posted 3:50 pm on Aug 9, 2009 (gmt 0)

For those of you noticing throttling, any sense of whether the surges and interruptions in traffic flow are page-, keyword- or site-wide?

Adastra

posted 3:57 pm on Aug 9, 2009 (gmt 0)

Is it possible that it's just a replication issue?

Reno

posted 4:14 pm on Aug 9, 2009 (gmt 0)

For those of you noticing throttling, any sense of whether the surges and interruptions in traffic flow are page-, keyword- or site-wide?

That's a great question. It appears that the only person in this discussion who has seen a quantitative analysis of "throttling" is tedster (see his 08/07 post above). Because Google has not come out and confirmed this behavior, the thread remains conjecture, but with a lot of circumstantial evidence.

I cannot provide any hard data that says "there -- you see -- no doubt about it, Google is throttling traffic at some of my sites". What I see is a consistency in Google visitor referral levels that seems to be more than coincidence, because it's happening over a relatively long term (more than just a month or two).

So logic tells me that if Google is slowing traffic at some of my better sites because I've reached some click-through level that triggers a lower SERP (and thus a slowdown in traffic), then for others of my sites they are likely pushing me up at some point (because websites unrelated to me with better SERPs are being throttled). That's my point about it somewhat balancing out in its own way.

But again, all of this is conjecture based on what appears to be occurring. I would certainly welcome any feedback from webmasters who, like tedster, have studied this phenomenon with program analysis.


tangor

posted 7:05 pm on Aug 9, 2009 (gmt 0)

The number of "billboards" available is immense AND GROWING. The number of ads is finite (whatever is in the hopper at the moment...) and they want to make bucks from ALL OF THEM. Hence the shuffle/throttle. Site A (the authority site) makes X... but Site B, a newcomer, makes zilch. G knows B needs to make a little NOW so they get hooked. A has made enough for the day, so give some of that to B... and to a few other B sites as well.

There's a bit of a shell game going on (an opinion backed by observation of the woes reported here) to keep the money machine working. I freely admit G has it working... for now... but I also see that the continual parsing and futzing, and the introduction of "penalties and perks, etc. etc. etc." via arcane and obtuse algos, will eventually break it.

G made their mark initially by being the fastest with the BEST results... and since then that high point keeps slipping back a notch or two each year. I call that "messin' with perfection", and all you fiddlers out there futzing with what you know works... and breaking it... know what I mean.

tedster

posted 7:35 pm on Aug 9, 2009 (gmt 0)

For those of you noticing throttling, any sense of whether the surges and interruptions in traffic flow are page-, keyword- or site-wide?

What I've seen in practice has been related to one keyword, but that was the only keyword the page ranked for, so it could have been page-specific throttling. Somehow I doubt that, but I haven't proven it one way or the other. It's not easy to get traffic data for a big enough sample of sites!

Similarly, I've never seen a site-wide traffic throttling unless the site only ranked for one keyword. And that just means I haven't seen it, not that it doesn't happen.

rros

posted 8:20 pm on Aug 9, 2009 (gmt 0)

Not sure how much of a contribution this is, but in regard to Reno's inquiry, I have studied my logs to the minute for many months and this is what I noticed. For very strong, stable pages, this particular site shows up consistently for certain keyword/page pairs without fail -- searches mostly associated with the title of the site and similar. Google must be convinced there is a good match for the pair. But for weaker keyword/page pairs, the search results for this site come and go.

Many times, if I click on the Google referral link from the logs, I just cannot find in the index the page the click was referred from -- other pages from other sites make up the results instead. The referral is always a "keyword/page" pair, so if there really is any throttling, it might be associated with the pair.

But sometimes I am able to catch the referral in the index. This happens day in, day out, and one possible conclusion is that Google only shows that page, for that keyword, at that result position on certain occasions, in a discretionary way. Why? Who knows... Maybe it is just insufficient backlinks compared to other candidate pages from other sites. Or maybe it is indeed throttling, to send traffic to sites that otherwise might never get much, or to cap traffic in some way.

Something to note... when clicking on the Google referral link, I made sure to check five or more result pages up and down in case the listing had moved. I also made sure these weren't bots: the visitors referred by Google spent at least 10 minutes on the site, browsing other pages and downloading data. And as I said, if a page stopped showing up for a search on the keyword in question, it would most likely appear the next day, only to disappear again the day after, according to the logs and my clicking on the referral links.
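If anyone wants to repeat the exercise, the keyword/page pairs are easy to pull out of an access log. A rough Python sketch, assuming an Apache combined-format log and the old-style Google referrers that carry the search terms in a q= parameter (the log file name is made up):

# Sketch: extract (keyword, landing page) pairs from Google referrals
# found in an Apache combined-format access log.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# combined format: ... "GET /path HTTP/1.1" status bytes "referer" "agent"
LINE = re.compile(r'"GET (?P<path>\S+) [^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

pairs = Counter()
with open("access.log") as f:
    for line in f:
        m = LINE.search(line)
        if not m:
            continue
        ref = urlparse(m.group("ref"))
        if "google." not in ref.netloc:
            continue
        query = parse_qs(ref.query).get("q", [""])[0]
        if query:
            pairs[(query, m.group("path"))] += 1

for (keyword, page), hits in pairs.most_common(10):
    print(f"{hits:4d}  {keyword!r} -> {page}")
# Run this per day and watch which pairs appear, vanish, and reappear.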

icedowl

posted 9:06 pm on Aug 9, 2009 (gmt 0)

rros, I've been doing and seeing exactly the same things as you, if that helps. It is a constant, everyday shuffle. I have many pages that are consistently in the top 10, mostly at position 4. Other pages bounce around all over the place: from the first page, to the second page, to nowhere to be found, and back to the first.

internetheaven

posted 12:13 pm on Aug 10, 2009 (gmt 0)

That's an interesting direction of thought, santapaws. We know that Google treats links between "associated sites" differently than links between sites they see as independent. It could make sense, theoretically, for them to treat a whole batch of related sites more like one site in other ways, too.

So if several different people were to buy a bunch of domains from GoDaddy, use their privacy services (so all the whois records are the same) and use GoDaddy for hosting too (so the IPs are similar) -- all those people might be seen as one individual?
