One of my sites got hit.
1. One year old website
2. Niche terms with low competition; been #1 for 2 terms for more than 6 months
In mid-December my #1 dropped to around position #6, fluctuating - sometimes back up, sometimes around #6 - and now it is stuck at #6
* I have keyword in the domain - e.g. www.keyword.net and that term got hit (+ some deep pages optimized for terms)
* The site is a misspelling site - it ranks on misspellings of very competitive words. On these misspellings there is very low competition, mostly forums and old sites that are not optimized for the misspelling at all.
* The site's rankings came entirely from SEO. No PPC budget and no brand recognition
* Site was still getting some backlinks, but the quality could be questionable - paid links, but relevant
* All 3 terms that I was ranking for had lots of links with the same anchor texts and only small variations were present
* All the traffic went down, not only for these 3 terms. Even my brand name - which is a generic name - ranks at #6
* I am using Google Analytics and other google products heavily. The site was interlinked with other of my sites but these have not been penalized.
* The homepage was changing constantly in recent months, and there have been relevant outgoing links to my other sites, which have not been hit.
* One of the deep pages that got hit had been redesigned about 2-3 weeks before it got hit, with new content and a new template
[edited by: tedster at 6:05 pm (utc) on Jan. 5, 2008]
Links are what I'm looking at.
The idea of newly "released" supplementals is interesting, but in the examples I've looked at, the new results at 1-5 were not previously supplemental. They've been on page 1 all along and just moved up.
Could there be a new influx of link value to these sites from the emerging supplementals? It could be the link profiles, or even the quality of links to the sites that haven't dropped, causing the flux.
Just trying to put some pieces together here...
We know that Matt is saying he didn't know about this (it was not intentional) and since he would probably know of a penalty, then we might be safe in assuming this is not a penalty of sorts.
Going back to the unintended consequence theory, it's not so much that the results were previously supplemental, it's the fact Google is treating everything as one big index now. So perhaps this tweak in supplementals twisted rankings elsewhere enough to pop some sites into higher positions (again, the unintended consequence theory).
I know we are seeing a lot more traffic (+45% on Google and +120% from AOL), especially on long-tail searches (4+ word phrases), than we were before, so something has changed for us.
Also, subdomains have recently been treated slightly differently. Are those of you who have dropped linked heavily from this section of the web, or maybe the sites that haven't dropped are linked heavily from there? Wasn't it the first week of December that Google mentioned the change?
For people who are experiencing #6, what are your internal links like?
this might be what Marcia is thinking about?
just thoughts i was getting while reading the thread.
As an aside, are the affected sites targeted by proxy pages? Home pages are normally where they start the crawl.
[edited by: Vimes at 12:35 pm (utc) on Jan. 3, 2008]
I'm not suggesting that the DMOZ listings are the whole story, but rather an important subset of a set of links that have been downgraded in value. I'm still of the opinion expressed earlier that position six is actually a "transitional relief" extended to sites badly affected by the changes, but I'm aware no one else seems to buy this.
This was an active site with fresh content 4 times per week that also added backlinks aggressively
My assumption is they are monitoring the backlinks... but how to verify this? There were maybe 30 added in the past 30 days, but with similar keywords.
Sent a reinclusion request today to see if they will disclose why this occurred
<disclaimer>i have no evidence that this may have happened, i just put it out as a scenario not yet explored and may be a logical step down the road anyway</disclaimer>
Some of this may be based on query analysis. For pages that have been in the first 4-5 results or so, above the fold, if those pages have received fewer clickthroughs when compared to other pages listed within that set, it can cause a bit of a drop in rankings. So a drop to #6 in those instances could be a very real possibility and wouldn't be a "penalty" as such. It would just mean scoring a bit less than other pages for that particular metric.
 According to an implementation consistent with the principles of the invention, one or more query-based factors may be used to generate (or alter) a score associated with a document. For example, one query-based factor may relate to the extent to which a document is selected over time when the document is included in a set of search results. In this case, search engine 125 might score documents selected relatively more often/increasingly by users higher than other documents.
[edited by: Marcia at 5:53 am (utc) on Jan. 5, 2008]
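As a thought experiment, the query-based factor in that patent excerpt might look something like the toy re-ranking pass below. Every name and number here (the `damping` factor, the CTR averaging, the example sites) is my own assumption for illustration - nothing about Google's actual scoring is known:

```python
# Hypothetical sketch: documents clicked less often than their neighbors
# in the same result set lose a little score, per the patent language
# above. All constants and site names are invented for illustration.

def ctr_adjusted_scores(results, damping=0.1):
    """results: list of (url, base_score, impressions, clicks).
    Returns (url, score) pairs re-ranked after nudging each base score
    toward the result set's average click-through rate."""
    ctrs = [clicks / impressions for (_, _, impressions, clicks) in results]
    avg_ctr = sum(ctrs) / len(ctrs)
    adjusted = []
    for (url, base, _, _), ctr in zip(results, ctrs):
        # Below-average CTR shaves a bit off the score; above-average adds a bit.
        adjusted.append((url, base * (1 + damping * (ctr - avg_ctr) / avg_ctr)))
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)

serp = [
    ("example-a.com", 1.00, 1000, 80),   # #1, but weak CTR for the position
    ("example-b.com", 0.98, 1000, 310),
    ("example-c.com", 0.97, 1000, 240),
]
print([url for url, _ in ctr_adjusted_scores(serp)])
# → ['example-b.com', 'example-c.com', 'example-a.com']
```

The point of the sketch: a #1 page can slide without any "penalty" flag at all, just by underperforming its neighbors on one metric.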
If those pages have received fewer clickthroughs when compared to other pages listed within that set, it can cause a bit of a drop in rankings.
That is one possibility I've been considering for a while. These are some of the positive and negative thoughts I have about that possible explanation:
-- The Positive Side --
1. We know from GWT that Google is measuring both impressions and clicks for the results.
2. As you said, it would not be a true penalty (part of the reason why I put quotes around the word in this thread title).
3. At almost the same time as the #6 reports, we saw a number of communications from Google about meta descriptions and snippet generation. This included new features in GWT, and also additional blog entries.
-- The Negative Side --
1. Why so many reports of going from #1 to exactly #6, all at the same time? There are almost no reports of going to #5 or #7 (except for clustered results at #6 and #7). Are there also such drops to other positions, but those folks have just not found this thread? Or perhaps they feel that their drop to a different position doesn't line up with what we're discussing?
2. I know of two examples where click-throughs to the #1 result seemed quite healthy, right up to the demotion to #6. The meta description seems accurate, unique and quite well written to my eye.
-- More Brainstorming --
Maybe the click-through factor interacts with some other factor - say the Universal Search infrastructure - and that creates a strong likelihood of going to #6?
A further complication might also come from backlink stagnation.
Or perhaps the click-through algo does have a "ceiling" at #5 that comes into play, protecting the top 5 from urls that perform poorly? There are some parallel reports about falling from #1 to #11 - page 2 - and that would be another logical "ceiling" number.
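That "ceiling" idea, if it exists at all, could be sketched as a post-ranking step: pages flagged as weak on some metric are kept out of the protected top slots and land just below the ceiling (#6 for a top-5 ceiling). This is pure speculation rendered as runnable code, with every name invented:

```python
# Speculative "ceiling" mechanic from the brainstorming above: a url
# flagged as poor-performing is pushed out of the protected top slots
# and lands at the first position below the ceiling. Not a known
# Google mechanism - just the thread's hypothesis made concrete.

def demote_below_ceiling(results, flagged, ceiling=5):
    """results: urls in natural rank order; flagged: set of urls to
    keep out of the protected top `ceiling` positions."""
    kept = [u for u in results if u not in flagged]
    out = kept[:ceiling]
    # Flagged urls slot in just below the ceiling, in natural order...
    out += [u for u in results if u in flagged]
    # ...and everything else follows.
    out += kept[ceiling:]
    return out

serp = ["victim.com"] + [f"site{i}.com" for i in range(2, 11)]
print(demote_below_ceiling(serp, {"victim.com"}).index("victim.com") + 1)  # → 6
```

The same step with `ceiling=10` would produce the parallel "#1 to #11" reports: a different protected zone, same mechanic.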
I know of two examples where click-throughs to the #1 result seemed quite healthy, right up to the demotion to #6
Here's exactly my issue with the "click-thru rate" theory.
A. SEO of all things is based on math. I don't subscribe to any chicken-throwing myths. Since I have absolutely no way to test this theory, nor break into the plex to study their data, it's effectively a non-theory.
Even if it is the cause of the "ceiling", I have no way to test it, therefore it falls into chicken-bone throwing.
B. Without being arrogant or naive, it seems improbable that our #1 + #2 indented listings would be "demoted" because of "low click-thru rates" in comparison to any other listing.
Study after study has shown a double indented listing gets far more click-thrus than any other individual listing on the same SERP page.
C. Without being arrogant or naive (lol again), many of our descriptions are beyond "click-worthy" and approved by top copywriters to precisely get higher click-thru rates than our competitor's Titles and Metas.
D. If indeed it's click thru rates, Goog again has missed the mark. Recently, Goog is extremely sensitive to Title changes. Goog can't be overly-sensitive with pages trying to "game them" by testing titles for SEO rankings and then also be penalizing sites that are testing -- or effectively not testing -- their titles for customer satisfaction purposes.
For the rare few who saw this "demotion" roll out across the DCs, Goog appeared to be testing a new algo "tweak" that had nothing to do with anything I could notice (except for several terms going from #1 to #6).
Now if someone could really get Mr. Cutts to explain recent "tweaks" to the algo, maybe we could gain some insight into this issue.
But his categorical denial of "awareness" is either extreme negligence of what goes on at the plex, or complete ignorance of how the tweak would affect the sites Goog found worthy of #1 listings for over a year.
[edited by: tedster at 6:13 pm (utc) on Jan. 5, 2008]
Here's another observation. I recently found several examples of a fall from #1 to #7, instead of #6. In each of those cases the top five positions included an indented (clustered) result.
So I'm thinking that the clustering filter in these cases is "promoting" a result from the bottom of the first page so that it joins its partner url from the same domain. The #7 result would naturally have also been a #6, except for the clustering filter applied to the rankings at the last minute.
I've seen this too, on one example page, with that page staying in the #7 spot for about three weeks.
Several days ago, though, I began to see some movement in the top results. For a short time, the clustering above disappeared and, simultaneously, a competing page dropped below the example page, which then moved into the #5 spot.
Now the clustering is back, and the example page is at #6, with the newly dropped competitor now in the #7 spot.
Hard to generalize from this one example, but it might be that they're rotating results out of the top five, perhaps based on whatever perceived weakness... perhaps just a test... and then trying the same with other pages.
To the earlier post suggesting a devaluation of dmoz links, I've suspected this since late October. That dial was probably turned way down. It used to spare you from a lot of grief but now it may do very little. Which is not an unreasonable adjustment.
After all, it's a vote from only one site, reviewed by one person, who has essentially been asked to add a link. Further, most dmoz links were added years ago... stale links... and typically don't get re-reviewed for current quality or continued relevance.
How many Google algo teams are there? How many men and women control or manipulate our online fate?
Yes, this is something I have noticed quite a bit. More recently it seems there has been a larger addition of clustered results throwing the pages to #7. A clustered result doesn't always seem to result in a drop down to #7 instead of #6 but when two clustered domains appear, it almost certainly results in a position #7 ranking for the keywords.
Recently I posted that a keyword keyword mysite.com would still result in a position #6 filter. This is no longer the case. I'll flatter myself and say Google read it, agreed that a keyword keyword yourdomain.com SHOULD show that a user was seeking a specific page on your site and subsequently shouldn't filter it to spot #6. :)
A clustered result doesn't always seem to result in a drop down to #7 instead of #6
This makes some sense to me. The difference would depend on where the natural or "pre-clustering" rank of the second result fell. If it naturally fell in the top 5, then clustering does not push the #6 result further down.
If I've got this mechanism right, there also could be cases where a #1 fell to #8 - if, for instance, there were two clustered results in the top 5, and both of them involved urls that were "promoted" from natural positions #7 to #10.
You can see this mechanism at work by changing your preferences to 50 or 100 results per page - that opens up the opportunity for more urls to be clustered.
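Here's a toy model of the clustering behavior described above: a domain's second url is pulled up to sit directly under its sibling (the "indented" result), sliding everything that was between them down a slot. Purely illustrative - the domain names are made up and this is in no way Google's actual clustering code:

```python
# Toy host-clustering filter: group results by domain, keeping domains
# in the order of their best (first) natural result, and place up to
# two urls per domain adjacently. A second url promoted from lower on
# the page pushes everything between the siblings down one position.

def cluster(results):
    """results: list of (domain, url) tuples in natural rank order."""
    order = []        # domains in order of their first appearance
    by_domain = {}
    for domain, url in results:
        if domain not in by_domain:
            order.append(domain)
            by_domain[domain] = []
        by_domain[domain].append((domain, url))
    clustered = []
    for domain in order:
        clustered.extend(by_domain[domain][:2])  # at most one indented sibling
    return clustered

natural = [("a.com", "/1"), ("b.com", "/1"), ("c.com", "/1"),
           ("d.com", "/1"), ("e.com", "/1"),
           ("demoted.com", "/1"),            # the page sitting at "#6"
           ("f.com", "/1"), ("g.com", "/1"),
           ("c.com", "/2")]                  # c.com's second url, naturally #9
positions = {d + u: i + 1 for i, (d, u) in enumerate(cluster(natural))}
print(positions["demoted.com/1"])  # → 7
```

In this run c.com's second url jumps from natural #9 to #4 beneath its sibling, and the "#6" page lands at #7 - exactly the pattern reported in the thread.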
Go to "advanced search" and select 20 results, then change the number of results in the address bar from 20 to 6.
[edited by: Robert_Charlton at 6:21 pm (utc) on Jan. 6, 2008]
[edit reason] removed specific search link [/edit]
Marcia - I'd like to hear your ideas on what you noticed with PR distribution/page segmentation.
Arctrust - Care to explain anything you did or did not do in your recovery process?
Sandbox - Of course, keep us updated on your reinclusion request (ignore those who mock you)
Donnajean - What exactly did you "change too much too quickly" on the site?
Content? Internal anchor text? New pages?
Cain - Glad to see you're noticing terms returning to #1. Any clues as to which terms are returning and why? i.e. less competitive vs. more competitive.
or more importantly, which ones are not bouncing back and why?
Re:MC - Of course MC doesn't know what happened. I don't expect him to. But he has been known to "ask around the 'Plex" and give hints as to what possibly, maybe, could be the problem.
With enough prodding by the "right people", he could easily give a clue to what is/was being tested recently, especially since it seems to not have anything to do with "beating the bad guys who corrupt Google's results"...
What did I change on my site? I edited groupings of related pages, adding additional unique content or splitting them up into smaller pages, then I added internal linking for the groups and back to the high-level category page (each page in the group links to the others and back up one level).
I also added a bunch of dynamic interim result pages by category of the product that we sell, based on different criteria sourced from the database. These result pages grew quicker than I expected as well and cannot be easily edited to have unique characteristics, so we "noindexed, noarchived" them until a long-term plan is established.
My rankings have only dropped for very competitive phrases.
I still believe this won't last more than 3 months. Nothing adds up and there is really no reasoning behind this, i.e. penalizing somewhat trusted sites. Hell, if G wants to mess with good sites I can always go back to spamming, because that is all Google is going to accomplish by this penalty.
[edited by: Timetraveler at 6:42 pm (utc) on Jan. 6, 2008]
Is anyone seeing this on pages that have been 950ed for other more competitive phrases?
Secondly, just to lay to rest the question of coincidence here... if we'd asked, say, "Is Google Using a Position #4 'Penalty'?", how many would say they've observed such a shift in ranking?
[added] - Now that I think about it, when the -950'd page came out of its penalty our entire site saw a large boost in overall rankings. Most all pages were ranking a few positions higher. There could possibly be some relation here.
In regards to your second question, I'm sure plenty of people think they have the #6 penalty (or presumably some -? penalty) when they really don't. But there is no way that a site can go from perfectly fine to having hundreds of pages sit exactly at position 6... and have it happen at the same time as other sites did. There is no coincidence here.
[edited by: Timetraveler at 6:56 pm (utc) on Jan. 6, 2008]