
Forum Moderators: Robert Charlton & aakk9999 & andy langton & goodroi


Traffic Shaping / Throttling Prior to Deindexing

2:51 pm on Jul 16, 2012 (gmt 0)

New User

joined:July 13, 2012
votes: 0

I launched a few "test" sites back in April to see what would happen if I played everything as safe as possible. 3 sites in all. I hired well-known freelancers/experts/professors in various niches to produce great content for these sites. We worked the PR angle really aggressively and most of these sites ended up with between 10 and 15 backlinks each, pretty much 100% brand-name/URL anchors from great sources.. .edu/.gov resource pages, interviews on forbes.com/cnn.com, etc.

Anyway - the sites followed a very strange pattern overall.

At first everything was cool.. these were exact-match domains (EMDs) and ranked just fine right out of the gate after the first deep spider. Then things got a little weird. After the initial links started landing, the homepages started going away and random subpages started ranking in the 30-60 range for the homepage keywords.

Then everything went back to normal for about a week or 10 days, depending on the site. Then it got weird again :)

During the next couple of weeks there were lots of big drops.. keywords that were in the top 10-20 dropping down to 150 or past 200. It was really surprising the day-to-day drops of hundreds of spots.

Now, starting about 2 weeks ago I started seeing some weird traffic patterns.. the sites wouldn't get any traffic at all during the day. Say, 7am to 7pm, no traffic. Then overnight there'd be a ton of visitors for the target keywords from Google. US traffic too, not overseas. I was seeing a lot of &cd=2 (4, 12, etc.) strings appended to the Google referral URL (the cd parameter indicates the result position that was clicked), so that seems like a sure sign that testing is being conducted.
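As an aside, the cd values can be tallied straight from access-log referrers to see which SERP positions are actually sending the clicks. A minimal Python sketch; the referrer strings and the `serp_position` helper below are hypothetical examples, not anything from the sites in question:

```python
# Tally Google SERP click positions from the cd= parameter in referrer URLs.
from collections import Counter
from urllib.parse import parse_qs, urlparse

def serp_position(referrer):
    """Return the cd= (clicked result position) from a Google referrer, or None."""
    parts = urlparse(referrer)
    if "google." not in parts.netloc:
        return None
    cd = parse_qs(parts.query).get("cd")
    return int(cd[0]) if cd else None

# Hypothetical referrer strings as they might appear in an access log:
referrers = [
    "http://www.google.com/url?sa=t&q=blue+widgets&cd=2",
    "http://www.google.com/url?sa=t&q=blue+widgets&cd=12",
    "http://www.google.com/search?q=blue+widgets&cd=4",
    "http://example.com/some-page",
]

positions = Counter(p for p in map(serp_position, referrers) if p is not None)
print(positions.most_common())  # -> [(2, 1), (12, 1), (4, 1)]
```

Bucketing those hits by hour of day as well would make the day/night split described above easy to confirm from the raw logs.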

Over the weekend all 3 sites were completely dropped from the index. None of them pull up via [domain.com], only via site:domain.com. No alerts in WMT at all.

Anyone else had any experiences like this? I'm beyond frustrated with this.. I feel like we approached these by-the-book.
10:19 am on July 20, 2012 (gmt 0)

Junior Member

joined:Apr 27, 2012
votes: 1

@Robert Charlton ...ranking, because... (I might be posting a bit off topic here, sorry..)
Believe me, I'm seeing (and you probably are too) crap pages (IMHO) at the top positions for key words from "brand name" sites that pay millions in ads, ranking over pages dedicated to the subject in the organic results. How do you explain that? Not to mention the notorious "crowd hosting" that follows on the 2nd, 3rd, 4th, 5th, 6th..... pages of results from the same usual odd brands.....
6:35 pm on July 20, 2012 (gmt 0)

New User

5+ Year Member

joined:June 13, 2010
votes: 0

As others have implied, the "shuffling" could be a form of testing by Google of searchers' interest. Clicking on a result is one obvious signal. Google has shown that it is aware when people come back to the search results page, so it might be encouraging searchers to examine various sites... and using their return to the search results page as a negative signal. But to do such things, Google would need to shuffle the results so data can be gathered for a bunch of similarly-rated sites.
3:32 pm on July 21, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Jan 1, 2011
votes: 18

I'll trust tedster and Robert Charlton that the shuffling is statistical sampling. But I think Google's behavior in many areas has changed dramatically since the FTC began looking at them, as Web-Speed pointed out. I don't want to sound tin foil hat, but I think Google's getting extremely aggressive about trying to take over everything, and it may just be that they figure eventually there has to be an anti-trust suit, so they've got to take over what they can while they still can.

The shuffling could also then serve the purpose of making the algo so confusing that even if an anti-trust suit DOES involve making the algo open source (as Congress suggested), webmasters still won't know what to do with it.
4:28 pm on July 21, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Feb 17, 2004
votes: 0

dresden87 - Sorry, but this is how it's going to be. Over the last year all my new websites have exhibited what you have described, regardless of "quality", be it content quality, link quality or any other metric.

Is there logic to it? Sure, after all, it's a program that's doing it and as such - it must follow logic in order to operate.

My opinion is that while the algo will return a logical result, it has been so messed up by Google's tinkering in the past year or so that the result itself is not "logical" to a human being. For example: if you have a good website with good links, it's only logical to you that you will rank well. However, now you know that this is not the case.

I can only hope that Google will abandon the weekly "update affecting 1% of the websites" and start a new, more basic, algo from scratch. Until then I am taking on the spray-and-pray approach :)
