| This 193 message thread spans 7 pages: < < 193 ( 1 2 3 4  6 7 ) > > || |
|Is Google Using a Position #6 "Penalty"? - part 2|
< continued from: [webmasterworld.com...] >
One of my sites got hit.
1. One year old website
2. Ranked #1 for two niche, low-competition terms for more than 6 months
In mid-December my #1 rankings dropped to around position #6, fluctuating back up at times, and have now stuck at #6
* I have keyword in the domain - e.g. www.keyword.net and that term got hit (+ some deep pages optimized for terms)
* The site is a misspelling site - it ranks on misspellings of very competitive words. These misspellings have very low competition, mostly forums/old sites not optimized for the misspelling at all.
* The site's rankings came entirely from SEO. No PPC budget and no brand recognition
* The site was still getting some backlinks, but their quality could be questionable - paid links, though relevant
* All 3 terms I was ranking for had lots of links with the same anchor text; only small variations were present
* All the traffic went down, not only for these 3 terms. My brand name - which is a generic name - also ranks at #6
* I use Google Analytics and other Google products heavily. The site was interlinked with other sites of mine, but those have not been penalized.
* The homepage was changing constantly in recent months, and there have been relevant outgoing links to my other sites, which have not been hit.
* One of the deep pages that got hit had been redesigned about 2-3 weeks before it was hit, with new content and a new template
[edited by: tedster at 6:05 pm (utc) on Jan. 5, 2008]
As I know you test (actually test), I'm going to take your results as-is. Thanks for weighing in!
Others - Although it's near impossible to isolate any variable, I'm sure others, like me, have enough pages at #6 to run separate tests, e.g. trinorth's added-content suggestion, different unique anchor text to another page, whatever you think might be the issue, etc.
I've grown bored of this puzzle, let's figure it out and find a new one to play with. ;-)
I always suggest that people add content whenever a page slips. It is easier to do that than to try to pinpoint what went wrong and why. If you are worthy of a top 10 spot, you are already doing well, and those few extra paragraphs of wording can always shoot you up in the rankings.
In the extreme case that you have no more useful content to write, you can try adding photographs, a Flash movie, and maybe even an external link to a relevant page (we do this with manufacturers from time to time). If you cannot do those, then go get one good, relevant inbound off-site link. Keep in mind, you typically only need one good and relevant link, so do not overdo it.
My results have gone back to normal, at least for several hours now - #6 back to #1
Is there anything you have done that may have contributed to your escape from Position 6?
Interesting idea Tri, I will try this on one of my websites. This is a good test, because the content on this website is easy to change, and the site gets cached daily.
Will let you know...
The only activity on the site has been the loss of one link and a couple of new ones. I also removed the duplicate Meta Description Google showed in Webmaster Tools a couple of days ago.
The key is to add to, not change, just add!
Hello Tri. Sounds good! My home page has in total about 500 words of content currently in the content container area. Would you suggest adding about 250?
[edited by: CainIV at 4:50 am (utc) on Jan. 15, 2008]
500 words is a lot for people to read; you might want to consider writing a page of 250 words that supports a subtopic. Make sure you link them back and forth with some anchor text. Make sure it's unique and make sure it supports the main page (no repetition).
Good to hear you popped back so easily, and may have a clue on a solution.
Did you add content to some pages, see them cached and pop back to top position, and repeat the process? Or was it a one time thing where you added content to all pages and they all were cached and reappeared at same time?
We worked on the pages as soon as we identified them. We are still picking through various keywords, but as we work them and as they are cached they typically bump right up. Keep in mind, though, the algorithm does not look just at text or links, and here is an example:
We had one product we sell where we were forced by the manufacturer to use their description (Some manufacturers do that)
So, we are not the only ones who sell that particular product, but we were the second people to advertise the product on the internet. Typically in the SERPs for those keywords it came out like this:
3. Competition who posted after us
Well, our site slid down. It was not due to the text (all the competition has to use the same description); it was due to the fact that all of our competition posted photographs of the product with alt image tags and we did not have that.
What we did was to put up two photographs (Beating our competition) and we worded a paragraph of “customer testimony” describing what they used the product for and how it worked. It actually was informative and went into details of how to use the product effectively. In other words, it helped people understand a bit better and was useful. That paragraph did not contain the keywords for that product, but the wording supported the product. (Example Product Keyword: Automobiles. Example wording used in the support paragraph: Engine, Tires, and Steering Wheel) Page was cached, shot back up to the number 2 spot.
Remember, the algorithm takes into account hundreds of factors. Do not overlook the simple ones such as photographs, supporting wording for products, etc. Just make sure it is very useful to the end user. It is a lot easier to do something this simple than to bang your head against the wall trying to figure out what made you fall. If it is useful, Google will reward you. Keep in mind, you are already in the top 10 or 20, so Google will actually like the additional information in most cases.
Bottom line, I would not really call this a penalty like the -30 or, better yet, the -950, because Google really is not doing anything drastic. I would call it the "check your competition's pages and see where they are outdoing you" effect.
I'm seeing ongoing signs on one search affected by the "penalty" that Google is rotating different sites from the bottom of page-one into the top 5 and dropping the others to position 6 or below.
This could just be normal churn, but it's unusual for this search. It looks like a deliberate rotation and evaluation.
Anyone else observing such a pattern?
We have noticed churn in the past; typically it only lasts a few days or so. We all have to keep in mind that whenever new content gets cached, Google likes to churn, or everflux. They do look at human traffic patterns.
|We have noticed churn in the past; typically it only lasts a few days or so. We all have to keep in mind that whenever new content gets cached, Google likes to churn, or everflux. |
Just to note that this is not an accurate characterization of what I'm observing. I'm seeing what appears to be a systematic rotation, regardless of page changes.
|They do look at human traffic patterns. |
Well, yes and no. Using the Toolbar, Google certainly does look at traffic patterns. It's not clear whether they're using what they see to directly influence rankings.
Matt Cutts discussed this recently, and there's a lot of ambiguity, e.g., what a very brief visit to a page might mean. Does it mean that the visitor immediately saw what he wanted, got the information, and left the page (or bookmarked it)... or does it mean that the page was unsatisfactory?
Does Google reward pages that are more "sticky"? If so, how quickly would they do this after a content change that resulted in increased stickiness?
It may well be that if there is a systematic rotation, as I'm seeing in one example, that Google might be testing position vs Toolbar data, and correlating this with their ranking algo... perhaps doing so in many market areas... but I don't know that they'd immediately rerank pages based on this data.
|Remember, the algorithm takes into account hundreds of factors. Do not overlook the simple ones such as photographs, support wording for products, etc… Just make sure it is very useful to the end user. |
Supporting wording and text changes could possibly be very helpful. But I doubt whether the kinds of rewards produced by, say, photographs that are useful to the end user would have an immediate effect. These might result in more stickiness (and, ultimately, more inbound links), but those, I think, would be long-term factors in the algo. I don't think that, by themselves, they'd quickly move you off of position #6.
It may also be that significant page changes would produce a freshness factor that Google sometimes likes. I'm wondering how long-lived such shifts might be in this position #6 situation.
[edited by: Robert_Charlton at 9:23 pm (utc) on Jan. 15, 2008]
I know we have been following this thread, and our changes have stuck. As for things such as photos, it is more than likely the alt text of the photo that Google is looking at. We have very trusted sites, so when we make a change, Google tends to trust that change and treat it in the appropriate way. What works for us might not work for everyone, but I am more than happy to share information like that because it helps the great webmasters out there who have helped us in the past.
How often are you seeing the data shifts? We hardly monitor SERPs unless we see a thread like this pop up on WebmasterWorld. I do have a staff of 30 people who work on web stuff, though, and we could monitor something like that if we knew what the cycle is and see what we can come up with. If it is weekly or monthly, it might be something as simple as data center maintenance.
|Matt Cutts discussed this recently |
Robert, can we please have the URL to the discussion. Thanks.
|Anyone else observing such a pattern? |
I see a lot of new sites coming into the SERPs; most of these are really small sites (with very few pages). [webmasterworld.com...]
|Matt Cutts discussed this recently, and there's a lot of ambiguity, e.g., what a very brief visit to a page might mean. Does it mean that the visitor immediately saw what he wanted, got the information, and left the page (or bookmarked it)... |
Poring over raw logs recently, I noticed Google Toolbar bookmarking activity. I usually just see favicon requests. It looks to me as if Google is tracking which sites get bookmarked, which puts it only a small step from incorporating this data into its SERP algorithm.
The bookmark data is extremely relevant and arguably one of the most significant pieces of data Google acquires to determine website quality and relevance.
It's even more reliable than links from other sites, at scale. Link exchanges etc. are riddled with contrived webmastering and schemes. Bookmarks are more natural than anything else.
How is the vote of one webmaster more significant than the votes of 100 site visitors, for example, who vote for it (bookmark)?
Site A has top SERP but only one bookmark.
Site B has position #11 but 100 bookmarks.
Which site does Google think is more relevant and valuable?
Until now Google has only had the vote data from inbound links. But now it has bookmarks. This could and should lead to a paradigm shift in the Google Algorithm.
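The Site A vs. Site B comparison above can be made concrete with a toy scoring function. Everything here is invented for illustration - the weights, the link counts, and the function itself are assumptions, not anything Google has disclosed:

```python
def relevance_score(inbound_links, bookmarks,
                    link_weight=1.0, bookmark_weight=0.5):
    """Toy linear combination of 'webmaster votes' (inbound links)
    and 'visitor votes' (bookmarks). All weights are invented."""
    return inbound_links * link_weight + bookmarks * bookmark_weight

# Site A: top SERP, hypothetically 50 inbound links, 1 bookmark
site_a = relevance_score(50, 1)    # 50.5
# Site B: position #11, hypothetically 10 inbound links, 100 bookmarks
site_b = relevance_score(10, 100)  # 60.0
```

Even with a bookmark weighted at only half a link, Site B comes out ahead in this toy model - which is the point of the comparison, whatever weight Google might actually assign.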
Not to get off topic, but bookmarks are WAY too easy to manipulate to be a big factor in the algorithm.
As for adding text, I'll give that a shot, along with maybe throwing in a pic or two with some alt text, on one of the many affected pages. Couldn't hurt to try, and it would be nice if it were that simple. Our index page has seen no help from this type of change, though, but I'll give anything a shot right now.
|Until now Google has only had the vote data from inbound links. But now it has bookmarks. This could and should lead to a paradigm shift in the Google Algorithm. |
A paradigm shift? More likely, it would be just another addition to the 200 or more factors that Google claims to take into account.
If you run analytics you can see what Google can measure and has the capability of measuring. Now, where do you think that data comes from when you do not run analytics? From a variety of sources: Internet Service Providers are a very large source, and most have deals with Google. Web surfers with toolbars, AdWords, AdSense, Google Checkout, and probably many other sources we could brainstorm up.
If you do not think they are measuring human interaction on some level and including that in their algorithm you are a fool.
|I'm seeing what appears to be a systematic rotation |
FWIW, I'm not seeing any rotation. My #6 rankings have been stuck there pretty much unchanged since mid-Dec (flat lined at #6). My remaining 1-5 rankings have behaved as they previously did: pretty solid in their positions with a few fluctuations.
An observation worth noting is very few of my demoted #6 rankings were previously rock solid at #1. My keywords hit tended to be those that were not as strong and previously fluctuated in the top few spots.
Here's my best guess of what's happening based on my observations:
It looks like a filter that does final re-ranking of serps that have a reasonable amount of search volume. For each "penalized" domain in the top couple results they knock a few points off the URL's final ranking score.
If knocking those points off drops a URL down a position or more, then the filter demotes the result all the way down to #6. If a URL's ranking stays the same after knocking the points off then it stays in its original position.
Does that model fit what others are seeing?
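The guessed-at filter above can be sketched in code. This is purely a model of the hypothesis in this post, not Google's algorithm; the point deduction, the scores, and the demote-to-#6 rule are all assumptions:

```python
def position6_filter(results, penalized, deduction=2.0):
    """Model of the hypothesized re-ranking filter.

    results:   list of (url, score) pairs, already sorted best-first
    penalized: set of URLs the filter flags
    deduction: hypothetical points knocked off a flagged URL's score
    Returns the final ordering of URLs.
    """
    original = {url: rank for rank, (url, _) in enumerate(results)}
    # Knock points off penalized URLs and re-sort by adjusted score.
    adjusted = sorted(
        ((url, score - deduction if url in penalized else score)
         for url, score in results),
        key=lambda pair: pair[1], reverse=True)
    # A penalized URL that lost at least one position is demoted to #6.
    demoted = [url for rank, (url, _) in enumerate(adjusted)
               if url in penalized and rank > original[url]]
    kept = [url for url, _ in adjusted if url not in demoted]
    return kept[:5] + demoted + kept[5:]  # list index 5 == position #6
```

In this sketch, a #1 URL whose deduction drops it below #2 lands at position #6, while a penalized URL whose rank survives the deduction stays put - matching the "stays in its original position" case described above.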
|Site A has top SERP but only one bookmark. |
Site B has position #11 but 100 bookmarks.
How many people bother to bookmark the number one site, especially when it is a one or two word search? I know I never do. It is the #11 site you need to bookmark.
I'm not arguing that G couldn't use bookmark data, but agree with EFV and others that it would probably have to play a minor role.
|Matt Cutts discussed this recently. |
Robert, can we please have the URL to the discussion. Thanks
This wasn't online... it was in person. I've heard Matt discuss the ambiguity of Toolbar data several times. I believe, but am not certain, that the last time was at PubCon. Since he's the head of Google's spam team, he's pretty much aware of the "softness" of Toolbar data.
Again, my assumption is that Google correlates Toolbar data with other data it collects, as well as with new and traditional on and off-page factors, and looks for consistency, improved user satisfaction, etc. I don't think they'd trust the Toolbar by itself to make ranking changes.
Again... using stickiness as an example... what does just a brief visit to a page mean?
The rotation I mentioned, and I'm only seeing it on one search, appears to be happening on roughly a ten-day cycle, if I can call it that after only a few sightings.
PS... re bookmarks, I wasn't talking about Google Bookmarks. I just meant add to browser favorites or bookmarks and leave the page to come back later... something I often do when researching a topic.
I wasn't in any way suggesting that Google would use Google Bookmarks in this situation to change rankings. I was suggesting that I thought page changes that relate to stickiness in particular (like photographs) might perhaps affect ranking over the long term, by encouraging more inbound links, but not so quickly as trinorth might be assuming when he refers to "human traffic patterns."
I agree that the use of personalized search data as a ranking factor is a whole separate discussion.
[edited by: Robert_Charlton at 9:05 am (utc) on Jan. 16, 2008]
|They do look at human traffic patterns. |
From my observations this doesn't seem to be a huge factor. I initially suspected this might be a factor and dug into it.
I have a couple different categories of pages that have widely varying bounce rates (~10-60%). I'm seeing examples of all the different categories of pages impacted by the position #6 "penalty" regardless of bounce rate. I use GA, so google knows all about my bounce rates.
When it comes to bounce rates, most webmasters misunderstand what information Google could or might use for ranking. Here is an example of how Google could look at bounce rates to rank or rate a page.
1. Google knows the average person can read 250 words per minute.
2. Your page has 500 words, so it should take a person 2 minutes to read.
3. People come to your page and spend an average of 2.2 minutes (you have a good page and should be rewarded).
4. People come to your page and spend an average of 1 minute, reading only 50% of what you wrote (that's an average page).
5. People come to your page and spend an average of half a minute, reading only 25% of what you wrote (hmm, getting a bit worse here; maybe not worthy of the top 10).
6. People come to your page and spend an average of a quarter of a minute, reading only 12.5% of what you wrote (even worse; make the page supplemental).
Now, that is assuming Google is matching the appropriate keyword to the right page. Do you see how Google can measure bounce rates? There is more to that mathematical formula than meets the eye. Are they using that information? Maybe, but now everyone can understand how it could be used.
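The arithmetic in the list above boils down to a read ratio: time actually spent on the page divided by the time a 250-wpm reader would need for it. A minimal sketch of that calculation (the 250-wpm figure and the quality tiers are this post's assumptions, not a known Google formula):

```python
READING_SPEED_WPM = 250  # the assumed average reading speed

def read_ratio(word_count, avg_seconds_on_page):
    """Fraction of the page an average visitor had time to read."""
    expected_seconds = word_count / READING_SPEED_WPM * 60
    return avg_seconds_on_page / expected_seconds

# The 500-word page from the list (expected read time: 2 minutes):
read_ratio(500, 132)  # 2.2 min on page -> ratio 1.1  ("good page")
read_ratio(500, 60)   # 1.0 min -> 0.5   ("average page")
read_ratio(500, 30)   # 0.5 min -> 0.25  ("maybe not top-10 worthy")
read_ratio(500, 15)   # 0.25 min -> 0.125 ("make it supplemental")
```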
<speaking very softly as to not scare off the good luck>
I'm seeing some terms at #5 ..shhh
I'm seeing some terms back at #1 ...shh
disclaimer - It's American Idol season, which for the past 2 years has been the time of day when Goog does major testing that reverts back when the show stops airing (about 3-4 hours)
If it lasts more than a day, i'll explain what changes we've made.
ok actually i'm seeing it for several sites that i know have been hit with this "penalty"... but shhh.. let's wait a few hours. :)
( edit: should've waited for whitenight to stop *watching* american idol and post the secret info [j/k] )
...isn't this the other end of the phrase co-occurrence filter?
When Google finds everything all right, except that it can't call a page on 'widgets' complete without a mention of the word 'blue'? (Where all 'widgets' are evidently 'blue', but some are more 'blue' than others, and these are called 'red'.) Or without enough variations on the word itself... 'widgety widgetly widgets'. Or make that 'car', 'cars', 'vehicle'.
As the -950 filter penalized on-page content that wasn't supported by off-page signals (mostly absent nav or IBL anchor text / irrelevant sources for IBLs), this may look at some not-so-present on-page factors.
#6 as the LACK of co-occurrence? The phrase that's penalized is there, used properly, in IBLs, but other important, semantically relevant, on-topic words (or natural variants) are *not present*?
That'd explain the 'was #1, then -950, now it's #6' and the 'added content, bounced back' reports to me.
Apparently nothing else would.
To me at least.
That or perhaps a 'no ALT attribute featured the otherwise perfectly targeted phrase on this page' filter.
Heh, wow, this latter sounds wicked.
Wicked enough to actually be in use.
[edited by: Miamacs at 2:47 am (utc) on Jan. 17, 2008]
|That'd explain the 'was #1, then -950, now it's #6' and the 'added content, bounced back' reports to me. |
Not sure where you got the "then -950" part from (unless you're reading my posts) :P. Most people here were #1, down to #6.
But I still like your theory.
< Complete rant mode on.>
Once again, I will go into my "Don't ever listen to MC, EVER!" speech.
Are you kidding me?!
This is/was obviously some tweak, test, or modulation to the algo that has now been shown, "without-a-doubt", "I-will-shout-down-anyone-who-concludes-otherwise", to be exactly that.
Whether it sticks or not, I'd like someone who does think MC is upfront to ask him what "tweak, test, change" was conducted on, or about, 6:00 PST on January 16, 2008.
Can't wait to hear the "I'm personally unaware of anything."
< rant over, but never fully finished > :P
|I'm seeing some terms at #5 ..shhh |
I'm seeing some terms back at #1 ...shh
I observed this yesterday for some terms, then today it reverted back. I have actually seen this bouncing back and forth two or three times in the past week. I think it boils down to different data on different datacenters, yada yada yada