| 12:31 pm on Jan 3, 2008 (gmt 0)|
Here are some thoughts. Links are where I'm looking.
|The idea of newly "released" supplementals is interesting, but in the examples I've looked at, the new results at 1-5 were not previously supplemental. They've been on page 1 all along and just moved up. |
Could there be a new influx of link value flowing to these sites from the emerging supplementals? It could be the profiles, or even the quality, of the links to the sites that haven't dropped that is causing the flux.
|Just trying to put some pieces together here... |
We know Matt is saying he didn't know about this (it was not intentional), and since he would probably know of a penalty, we might be safe in assuming this is not a penalty as such.
Going back to the unintended consequence theory, it's not so much that the results were previously supplemental, it's the fact Google is treating everything as one big index now. So perhaps this tweak in supplementals twisted rankings elsewhere enough to pop some sites into higher positions (again, the unintended consequence theory).
I know we are seeing a lot more traffic (+45% on Google and +120% from AOL), especially on long-tail searches (4+ word phrases), than we were before, so something has changed for us.
Also, subdomains have recently been treated slightly differently. Are those of you who have dropped linked heavily from that section of the web? Or maybe the sites that haven't dropped are linked heavily from them? Was it the first week of December that Google mentioned the change?
|For people who are experiencing #6, what are your internal links like? |
this might be what Marcia is thinking about?
just thoughts i was getting while reading the thread.
As an aside, are the affected sites being targeted by proxy pages? Home pages are normally where the crawl starts.
[edited by: Vimes at 12:35 pm (utc) on Jan. 3, 2008]
| 1:26 pm on Jan 3, 2008 (gmt 0)|
On a lighter note, and please don't take this the wrong way if you've been affected (I'm sorry), I'd like to give this "penalty" a name.
Can we call this one the "The Prisoner Penalty"
"I am not a number, I am a free man!"
| 4:35 pm on Jan 3, 2008 (gmt 0)|
My affected site has always been presumed (by me) to be out-hitting its link popularity owing to its Open Directory listing. I'm certainly aware of affected sites that aren't dmoz listed; however, I also notice that dmoz listings aren't showing up in backlink queries I've made, where they used to appear prominently.
I'm not suggesting that the DMOZ listings are the whole story, but rather an important subset of a set of links that have been downgraded in value. I'm still of the opinion expressed earlier that position six is actually a "transitional relief" extended to sites badly affected by the changes, but I'm aware no one else seems to buy this.
| 4:37 pm on Jan 3, 2008 (gmt 0)|
The six penalty is across the board for me too. I have very little now ranking above six except two specific terms still at one that we focussed on heavily in anchor text.
| 5:46 pm on Jan 3, 2008 (gmt 0)|
What might be interesting is how anything would explain the recovery on two of my websites. I have changed nothing on any of my websites (which is why I believe this is a ceiling counter that is reset in some fashion):
No change in incoming links
No change in any aspect of the website
| 7:09 pm on Jan 3, 2008 (gmt 0)|
I have had a long-standing top-ranking site vanish on the 30th from hundreds of keywords.
This was an active site with fresh content 4 times per week that also added backlinks aggressively.
My assumption is they are monitoring the backlinks... but how to verify this? There were maybe 30 added in the past 30 days, but for similar keywords.
Sent a reinclusion request today to see if they will disclose why this occurred.
| 7:48 pm on Jan 3, 2008 (gmt 0)|
loannet - you say "vanish"? This thread is about the #6 rank. Where are you ranking?
| 7:56 pm on Jan 3, 2008 (gmt 0)|
|Sent a reinclusion request today to see if they will disclose why this occurred |
Good luck with that. I can't recall anyone ever getting a reply that actually disclosed why a site was hit. Anyone?
| 9:34 am on Jan 4, 2008 (gmt 0)|
Are you sure its a permanent across the board change?
My site moved back from #6 to #1 last week for two days, but I think those results were coming from some old datacenter, and then it went back to #6.
[edited by: tedster at 6:07 pm (utc) on Jan. 5, 2008]
| 10:55 am on Jan 4, 2008 (gmt 0)|
If Google were to increase its use of user clicks and actions within the algo, I wonder how it would treat sites that become demoted via algo relevance changes but have formerly proven themselves in the wild. Perhaps one solution would be to use safety-net levels, occupied by sites that users demonstrably selected when given the choice in the past, but which now don't rank well in an updated algo that has, say, re-weighted backlinks and anchor text penalties. Could it be, then, that a fall to a safety-net position (#6, #11, etc.) is not a penalty but a safe landing spot?
<disclaimer>I have no evidence that this has happened; I just put it out as a scenario not yet explored, and it may be a logical step down the road anyway</disclaimer>
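In pseudocode terms, the scenario might look something like this. To be clear, everything here (the slot numbers, the proven_by_users flag, the function itself) is invented purely to illustrate the idea; there's no evidence Google works this way.

```python
# Toy sketch of the "safety net" scenario described above. All names
# and thresholds are invented for illustration only.

SAFETY_SLOTS = [6, 11]  # the reported landing spots

def landing_position(new_rank, proven_by_users):
    """If a url that users historically favoured would fall past a
    safety slot under a re-weighted algo, park it at the first slot
    it hits on the way down instead of letting it fall freely."""
    if proven_by_users:
        for slot in SAFETY_SLOTS:
            if new_rank > slot:
                return slot
    return new_rank
```

Under this sketch, a user-proven page that the new algo would drop from #1 to #30 lands at #6, while an unproven page falls all the way.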
| 2:52 am on Jan 5, 2008 (gmt 0)|
A reinclusion request because you rank #6?
| 3:58 am on Jan 5, 2008 (gmt 0)|
It is now called a "reconsideration" request, indicating that it can also be used for urls that are indexed but may be unfairly penalized. At the same time, I appreciate that a complaint about an "unjust" #6 ranking is quite likely to fall on deaf ears at Google. If Matt Cutts "isn't aware" of any algo feature that would create this behavior, it's unlikely that the troops who monitor the requests will know more.
| 4:21 am on Jan 5, 2008 (gmt 0)|
The reason I'm asking about internal linking is because of something called visual page segmentation. What brought it to my attention just since this last TBPR update is the peculiarity of the PR distribution on a particular site of mine.
| 5:30 am on Jan 5, 2008 (gmt 0)|
Oh, another thing comes to mind that can't be overlooked.
Some of this may be based on query analysis. For pages that have been in the first 4-5 results or so, above the fold, if those pages have received fewer clickthroughs compared to other pages listed within that set, it can cause a bit of a drop in rankings. So a drop to #6 in those instances could be a very real possibility and wouldn't be a "penalty" as such. It would just mean scoring a bit less than other pages on that particular metric.
| According to an implementation consistent with the principles of the invention, one or more query-based factors may be used to generate (or alter) a score associated with a document. For example, one query-based factor may relate to the extent to which a document is selected over time when the document is included in a set of search results. In this case, search engine 125 might score documents selected relatively more often/increasingly by users higher than other documents. |
Kind of gives thought to how snippets are generated from pages, and/or how to improve them.
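For what it's worth, the patent language quoted above could be sketched roughly like this. The blend weight, the scaling, and the function name are all my own invention; the patent gives no formula.

```python
# Hypothetical sketch of a query-based click factor, loosely modeled
# on the patent language quoted above. Weights and scaling are
# invented; the patent describes the idea only in general terms.

def ctr_adjusted_score(base_score, clicks, impressions, weight=0.2):
    """Blend a document's base relevance score with its observed
    click-through rate for this query. Pages shown often but clicked
    rarely lose a little score; pages clicked more often gain."""
    if impressions == 0:
        return base_score  # no click data yet: leave the score alone
    ctr = clicks / impressions
    return base_score * (1 - weight) + base_score * weight * 2 * ctr

# A #1 page with a weak CTR can slip below a page with a slightly
# lower base score but a much stronger CTR:
weak = ctr_adjusted_score(10.0, clicks=30, impressions=1000)    # 3% CTR
strong = ctr_adjusted_score(9.5, clicks=250, impressions=1000)  # 25% CTR
# strong now outscores weak, despite the lower base score
```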
[edited by: Marcia at 5:53 am (utc) on Jan. 5, 2008]
| 2:00 pm on Jan 5, 2008 (gmt 0)|
Marcia - could you talk a little more about what you mean about internal linking? For me, on the pages that dropped, I do have a considerable amount of internal linking going on, but the internal linking makes sense, as these are my "top category" pages.
| 5:34 pm on Jan 5, 2008 (gmt 0)|
|If those pages have received fewer clickthroughs when compared to other pages listed within that set, it can cause a bit of a drop in rankings. |
That is one possibility I've been considering for a while. These are some of the positive and negative thoughts I have about that possible explanation:
-- The Positive Side --
1. We know from GWT that Google is measuring both impressions and clicks for the results.
2. As you said, it would not be a true penalty (part of the reason why I put quotes around the word in this thread title).
3. At almost the same time as the #6 reports, we saw a number of communications from Google about meta descriptions and snippet generation. This included new features in GWT, and also additional blog entries.
-- The Negative Side --
1. Why so many reports of going from #1 to exactly #6, and all at the same time? There are almost no reports of going to #5 or #7 (except for clustered results at #6 and #7). Are there also such drops to other positions, but those folks just haven't found this thread? Or perhaps they feel their drop to a different position doesn't line up with what we're discussing?
2. I know of two examples where click-throughs to the #1 result seemed quite healthy, right up to the demotion to #6. The meta description seems accurate, unique and quite well written to my eye.
-- More Brainstorming --
Maybe the click-through factor interacts with some other factor - say the Universal Search infrastructure - and that creates a strong likelihood of going to #6?
A further complication might also come from backlink stagnation.
Or perhaps the click-through algo does have a "ceiling" at #5 that comes into play, protecting the top 5 from urls that perform poorly? There are some parallel reports about falling from #1 to #11 - page 2 - and that would be another logical "ceiling" number.
| 6:06 pm on Jan 5, 2008 (gmt 0)|
|I know of two examples where click-throughs to the #1 result seemed quite healthy, right up to the demotion to #6 |
Here's exactly my issue with the "click-thru rate" theory.
A. SEO, of all things, is based on math. I don't subscribe to any chicken-throwing myths. Since I have absolutely no way to test this theory, nor break into the plex to study their data, it's effectively a non-theory.
Even if it is the cause of the "ceiling", I have no way to test it; therefore it falls into chicken-bone throwing.
B. Without being arrogant or naive, it seems improbable that our #1 listings with #2 indented results would be "demoted" because of "low click-thru rates" in comparison to any other listing.
Study after study has shown a double indented listing gets far more click-thrus than any other individual listing on the same SERP page.
C. Without being arrogant or naive (lol again), many of our descriptions are beyond "click-worthy" and approved by top copywriters to precisely get higher click-thru rates than our competitor's Titles and Metas.
D. If it is indeed click-thru rates, Goog has again missed the mark. Recently, Goog has been extremely sensitive to Title changes. Goog can't be overly sensitive to pages trying to "game them" by testing titles for SEO rankings, and then also penalize sites that are testing -- or effectively not testing -- their titles for customer satisfaction purposes.
For the rare few who saw this "demotion" roll out across the DCs, Goog appeared to be testing a new algo "tweak" that had nothing to do with anything I could notice (except for several terms going from #1 to #6).
Now if someone could really get Mr. Cutts to explain recent "tweaks" to the algo, maybe we could gain some insight into this issue.
But his categorical denial of "awareness" is either extreme negligence of what goes on at the plex, or complete ignorance of how the tweak would affect the sites Goog found worthy of #1 listings for over a year.
[edited by: tedster at 6:13 pm (utc) on Jan. 5, 2008]
| 9:03 pm on Jan 5, 2008 (gmt 0)|
Here's another observation. I recently found several examples of a fall from #1 to #7, instead of #6. In each of those cases the top five positions included an indented (clustered) result.
So I'm thinking that the clustering filter in these cases is "promoting" a result from the bottom of the first page so that it joins its partner url from the same domain. The #7 result would naturally have also been a #6, except for the clustering filter applied to the rankings at the last minute.
| 11:21 pm on Jan 5, 2008 (gmt 0)|
|Here's another observation. I recently found several examples of a fall from #1 to #7, instead of #6. In each of those cases the top five positions included an indented (clustered) result. |
I've seen this too, on one example page, with that page staying in the #7 spot for about three weeks.
Several days ago, though, I began to see some movement in the top results. For a short time, the clustering above disappeared and, simultaneously, a competing page dropped below the example page, which then moved into the #5 spot.
Now the clustering is back, and the example page is at #6, with the newly dropped competitor now in the #7 spot.
Hard to generalize from this one example, but it might be that they're rotating results out of the top five, perhaps based on whatever perceived weakness... perhaps just a test... and then trying the same with other pages.
| 11:31 pm on Jan 5, 2008 (gmt 0)|
Is the assumption that Matt Cutts initiates, approves, or knows about every algo change valid? He's head of spam: "I'm currently the head of Google's Webspam team." Google isn't only about spam.
To the earlier post suggesting a devaluation of dmoz links, I've suspected this since late October. That dial was probably turned way down. It used to spare you from a lot of grief but now it may do very little. Which is not an unreasonable adjustment.
After all, it's a vote from only one site, reviewed by one person, who has essentially been asked to add a link. Further, most dmoz links were added years ago... stale links... and typically don't get re-reviewed for current quality or continued relevance.
How many Google algo teams are there? How many men and women control or manipulate our online fate?
| 4:53 am on Jan 6, 2008 (gmt 0)|
- In reply to Robert and Tedster.
Yes, this is something I have noticed quite a bit. More recently it seems there has been a larger addition of clustered results throwing the pages to #7. A clustered result doesn't always seem to result in a drop down to #7 instead of #6 but when two clustered domains appear, it almost certainly results in a position #7 ranking for the keywords.
Recently I posted that a search for keyword keyword mysite.com would still trigger the position #6 filter. This is no longer the case. I'll flatter myself and say Google read it and agreed that keyword keyword yourdomain.com SHOULD show that a user is seeking a specific page on your site, and subsequently shouldn't be filtered to spot #6. :)
| 6:20 am on Jan 6, 2008 (gmt 0)|
|A clustered result doesn't always seem to result in a drop down to #7 instead of #6 |
This makes some sense to me. The difference would depend on where the natural or "pre-clustering" rank of the second result fell. If it naturally fell in the top 5, then clustering does not push the #6 result further down.
If I've got this mechanism right, there could also be cases where a #1 fell to #8: for instance, if there were two clustered results in the top 5, and both of them involved urls "promoted" from natural positions #7 to #10.
You can see this mechanism at work by changing your preferences to 50 or 100 results per page - that opens up the opportunity for more urls to be clustered.
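A toy model of that clustering/indenting mechanism, as I understand the description above. This is a speculative reconstruction of behavior observed from the outside, not Google's actual code, and the example domains are made up.

```python
# Toy model of the indenting/clustering filter discussed above:
# within one results page, the next url from a domain already listed
# is pulled up to sit directly beneath its partner, pushing everything
# below it down a slot. Speculative sketch only.

def apply_clustering(ranked_urls, page_size=10):
    page = ranked_urls[:page_size]
    out, used = [], set()
    for i, url in enumerate(page):
        if i in used:
            continue
        out.append(url)
        used.add(i)
        domain = url.split("/")[0]
        # promote the next same-domain url on this page, if any
        for j in range(i + 1, len(page)):
            if j not in used and page[j].split("/")[0] == domain:
                out.append(page[j])  # the "indented" second result
                used.add(j)
                break
    return out

# If widgets.example's second url sits naturally at #8, promoting it
# under the #1 result pushes the natural #6 (f.com) down to #7:
serp = ["widgets.example/a", "b.com", "c.com", "d.com", "e.com",
        "f.com", "g.com", "widgets.example/b", "h.com", "i.com"]
```

In this sketch, apply_clustering(serp) puts widgets.example/b at #2 and f.com, the natural #6, at #7; if the partner url already sat in the top five, nothing below would move.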
| 5:03 pm on Jan 6, 2008 (gmt 0)|
You can also set the number of search results to 6; then you will see the true effect of the "Position 5 Ceiling".
Go to "advanced search" and select 20 results, then change the number of results in the address bar from 20 to 6.
[edited by: Robert_Charlton at 6:21 pm (utc) on Jan. 6, 2008]
[edit reason] removed specific search link [/edit]
| 5:30 pm on Jan 6, 2008 (gmt 0)|
I'm going to assume that all the "doctors" still analyzing the symptoms means everyone is still clueless as to the cause of the sickness?
Marcia - I'd like to hear your ideas on what you noticed with PR distribution/page segmentation.
Arctrust - Care to explain anything you did or did not do in your recovery process?
Sandbox - Of course, keep us updated on your reinclusion request (ignore those who mock you)
Donnajean - What exactly did you "change too much too quickly on site"?
Content? Internal anchor text? New pages?
Cain - Glad to see you're noticing terms returning to #1. Any clues as to which terms are returning and why? i.e., less competitive vs. more competitive.
Or more importantly, which ones are not bouncing back, and why?
Re:MC - Of course MC doesn't know what happened. I don't expect him to. But he has been known to "ask around the 'Plex" and give hints as to what possibly, maybe, could be the problem.
With enough prodding by the "right people", he could easily give a clue to what is/was being tested recently, especially since it seems to not have anything to do with "beating the bad guys who corrupt Google's results"...
| 5:40 pm on Jan 6, 2008 (gmt 0)|
One last obvious question that may have already been addressed.
Is everyone's site a site that "sells" something?
or are there any purely informational sites that have been hit with this?
Better yet, any terms that are not naturally associated with a product/service?
| 6:08 pm on Jan 6, 2008 (gmt 0)|
Yes, we "sell" something. I have not yet heard of a site that was hit that did not sell something. We also have "product" pages, which are dynamic and have grown quickly, possibly too quickly. Product pages did not initially have unique title and description tags, but this has been rectified. Many are probably in the supps.
What did I change on my site? I edited groupings of related pages, adding additional unique content or splitting them into smaller pages; then I added internal linking for the groups and back to the high-level category page (each page in the group links to the others and back up one level).
I also added a bunch of dynamic interim result pages, by category of the product we sell, based on different criteria sourced from the database. These result pages grew quicker than I expected as well, and they can't easily be edited to have unique characteristics, so we "noindexed, noarchived" them until a long-term plan is established.
My rankings have only dropped for very competitive phrases.
| 6:40 pm on Jan 6, 2008 (gmt 0)|
Affiliate related site here so I guess we "sell" something. But yes, so far I haven't seen anything that would lead to a solution. Some small clues but nothing too out of the ordinary. Just a bunch of dead ends.
I still believe this won't last more than 3 months. Nothing adds up, and there is really no reasoning behind this, i.e. penalizing somewhat trusted sites. Hell, if G wants to mess with good sites, I can always go back to spamming, because that is all Google is going to accomplish with this penalty.
[edited by: Timetraveler at 6:42 pm (utc) on Jan. 6, 2008]
| 6:41 pm on Jan 6, 2008 (gmt 0)|
To re-ask a question I'd posted in the first thread, looking for possible causes...
|Is anyone seeing this on pages that have been 950ed for other more competitive phrases? |
Secondly, just to lay to rest the question of coincidence here... if we'd asked, say, "Is Google Using a Position #4 'Penalty'?", how many would say they've observed such a shift in ranking?
| 6:48 pm on Jan 6, 2008 (gmt 0)|
Hi Robert. Yes, we did see it for a competitive phrase on our German subdomain that was previously -950'd. It now sits at #6 but was at 2-3 before the whole #6 penalty started.
[added] - Now that I think about it, when the -950'd page came out of its penalty, our entire site saw a large boost in overall rankings. Almost all pages were ranking a few positions higher. There could possibly be some relation here.
In regards to your second question, I'm sure plenty of people think they have the #6 penalty (or presumably a -? penalty) when they really don't. But there is no way a site can go from perfectly fine to having hundreds of pages sit exactly at position #6... and have it happen at the same time as other sites did. There is no coincidence here.
[edited by: Timetraveler at 6:56 pm (utc) on Jan. 6, 2008]