Forum Moderators: Robert Charlton & goodroi
I have two URLs that were 950'd on Friday. Unfortunately they are my two most requested URLs (other than the home page) - they'd been #1 in the SERPs for at least three years, and now only show up on the last page for the most common search phrase. They both showed PR previously, but TBPR has been greyed out since the last update. However, in certain permutations of the search phrase, they still rank #1. The search string usually comprises the city name and the event, and often includes the year.
Example:
city event - 950'd
city event 2007 - 950'd
event city - #1
city state event - #1
city state event 2007 - #1
As far as I can tell, it is ONLY two URLs, out of around 500, that fell into this (so far, anyway).
What it all means, I have no idea.
[edited by: tedster at 9:14 pm (utc) on Feb. 27, 2008]
They didn't all come back to #1, but they're all in the top five. The odd thing about this site has always been that the pages (up to this point) either rank #1 thru #3, or else they are completely buried. There's no in between for some reason.
I'm glad your site popped out, but it might not have been anything. Two of my "control" sites that were EOSed that I hadn't modified popped back in over the weekend. I too am hopeful that it sticks, but when I look at the current EOS sites it would appear as though some new quality sites got thrown in. The only thing interesting about the new batch is a higher percentage of them look like doorways at first glance and their inbound links are junkier (more sitewides from irrelevant sites).
In all, the current batch of EOS sites isn't as good as the recently released batch, so that could be a good sign that the re-rank filtering is getting better. There's still some quality stuff down there though.
Cygnus
I noticed a huge drop in traffic yesterday. Could it be because of the links? Should I add rel="nofollow" to those links?
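For anyone unsure what that change looks like: nofollow is applied per link via the rel attribute, so marking up a suspect link would look something like this (the URL and anchor text here are placeholders, not from the original post):

```html
<!-- A normal link, which passes link credit -->
<a href="http://example.com/widgets">Red Widgets</a>

<!-- The same link with rel="nofollow", asking search engines not to count it -->
<a href="http://example.com/widgets" rel="nofollow">Red Widgets</a>
```

Keep in mind nofollow only affects links on pages you control; it does nothing about junk links pointing at your site from elsewhere.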
God, I had the same problem in December. I lost most of my traffic. I repaired the duplicates I had and everything. Now that I've managed to get the traffic back, Google is giving me a hard time yet again.
Yes, by the search methods suggested, it's a 950-type penalty, maybe worse. For every keyword phrase I've tracked in the past, ranking anywhere from #1 to #150, we're now at the bottom of the last page. This includes a search for the company name. It also includes all the "top search queries" reported in Google Webmaster Tools, including obscure ones not remotely targeted by SEO. When "repeating the search with the omitted results included" for some keyword phrases, there are as many as 45 results returned from the site at the bottom of the last page. We have only 95 pages indexed by Google. The domain was created 2-1/2 years ago, but the site's been up less than 4 months. 14 incoming links recognized by Yahoo's backlink tool, no outgoing links. We have never purchased links.
Like trakkerguy reported in his case, we are using AdWords. We use six different destination URLs, trying to give a searcher the most relevant result. These destination URLs were formerly the ones returning the best results in the serps. We have 55 different ad groups running, most targeting one specific keyword phrase. Some are exact, some phrase, some broad. In some cases, mine is the only ad appearing for a keyword phrase, and that used to be coupled with a high organic listing. If I were a paranoid conspiracy guy, I might speculate that the 950 penalty is intentional, to drive more AdWords clicks. My Google organic traffic is now stopped dead. I am 950'd for every single AdWords keyword phrase being run.
One thing I have been lax about is meta tags: title, description, keywords. Although each page has some unique content, it’s all related to the theme of the widgets we make: big, blue widget; small, yellow widget, etc. Probably 80-90% of the words in the title, description and keywords are the same on all pages, because of the related theme. Regarding page content, in some cases the only thing unique is the model number, different images, and a slight twist on product description and specifications. Probably wouldn’t hurt to make these pages more unique, but that’s really stretching the hamburger helper. But would this trigger a 950 penalty? I have seen my more established competitors falling into the same lazy habits without incurring the 950 penalty.
Another thing I’ve done is to submit content to a few industry related PR (public relations) type sites. This content is now ranking more highly in the serps than my own pages. Is this a possible trigger for the 950 penalty?
Of the 95 pages indexed by Google, 50 are in the main index (45 supplemental). The content hasn’t changed substantially since the site went live about 4 months ago. But more pages are now included in the main index. Could the increase over time of indexed pages of “similar” content trigger the 950?
From what I’ve described, can anyone chip in with what they think might and might-not be triggering the 950? I'll change everything, but I'd like to avoid changing things needlessly if possible. Thanks.
Probably wouldn’t hurt to make these pages more unique, but that’s really stretching the hamburger helper. But would this trigger a 950 penalty? I have seen my more established competitors falling into the same lazy habits without incurring the 950 penalty.
Nobody knows, but what could it hurt to try? If you read the previous posts in this thread, I had some of my most key pages at 950 for about ten days. I made some changes to the meta description tags and some text on the page (actually removed some keywords that might have been repeated too often) and they went back up to the top, and they've stayed there so far.
One thing I have been lax about is meta tags: title, description, keywords.
Get them sorted. If there's nothing unique about them, how do you expect any search engine to feature them correctly?
Your title bar should match your on-page descriptions, H1 tags, etc., and ensure that you have complementary information in your description and keywords.
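As a sketch of what that advice means in practice (the product names and wording below are made up for illustration, not taken from the poster's site), each product page would carry its own title, description, and matching H1 rather than shared boilerplate:

```html
<head>
  <!-- Unique title per page, echoing the on-page heading -->
  <title>Big Blue Widget - Model BW-100 | Example Widgets</title>
  <!-- Description written for this page only, not copied site-wide -->
  <meta name="description"
        content="Specifications and pricing for the BW-100 big blue widget.">
  <meta name="keywords" content="big blue widget, BW-100">
</head>
<body>
  <!-- The H1 matches the title rather than a generic site-wide heading -->
  <h1>Big Blue Widget - Model BW-100</h1>
  ...
</body>
```

The point is consistency: title, description, and H1 all describing the same specific page, so the engines have something unique to feature.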
in some cases the only thing unique is the model number, different images, and a slight twist on product description and specifications.
Yep, read above. I have thousands of pages like this, all ranking #1. Give G, Y! and M$ some assistance, otherwise you'll only have yourself to blame.
Another thing I’ve done is to submit content to a few industry related PR (public relations) type sites. This content is now ranking more highly in the serps than my own pages. Is this a possible trigger for the 950 penalty?
Duplicate content is one possibility I've often considered: whether a site has been scraped or copied, or, as you have done, has submitted its content to sites that Google may mistakenly see as the originator.
Matt Cutts talks about discerning original content, and says it is difficult to use cache date, so they will also rely on the reputation of the site. They assume an authority site would not copy from a mom and pop. If your site is new, smaller, or has lost trust, you may come out on the wrong end of that decision. If your pages are deemed to be duplicate content copied from somewhere else, they have no value, and may end up in the "omitted results" section at the end of the serps. If a significant portion of your important URLs are deemed duplicate content, that is a problem.
Frazz - The most recent -950 victim I consulted on sounds much like yours, where the whole site is hit, even for the domain name, and sometimes you must look in the "omitted results" section to find pages. I posted earlier that it must be due to bad incoming links, or too many links too fast. But the one other possibility I didn't bother to mention was duplicate content. It is a very small site, and several have scraped or copied the index page.
Perhaps when evaluating pages, "lack of trust" issues prompt google to decide yours is the duplicate and you end up at -950?
Of course this behavior is very different than the phrase-based -950 filter, where only certain phrases are hit, so this discussion quite likely doesn't apply to the phrase-based -950.
However, more importantly, I also updated the main page using my main keyword (in context, I thought) twice more.
Last week I dropped to position 550, or thereabouts, for my main keyword on google.com.
Strangely(?) I'm still in position number 2 on google.co.uk though.
Currently de-optimising this 350+ page site.
If I ever get back into a decent position I will never update the site again for fear of disappearing into oblivion.
I also have several other sites that I'm too scared to update.
What a ridiculous situation!
I can't believe that 6 months ago I actually said it seemed okay that they threw top sites to the end of the results, because it was a clear sign there was something they didn't like. I naively thought it was something we could figure out and fix. But some pages never come back, and some come back with no changes made. Sometimes something we do SEEMS to help, and sometimes not. It doesn't make any sense.
Try not to overanalyze and "fiddle" with the page too much yet. Many come back with no changes made at all. And G definitely does not like changes. New content is OK, but little changes, no. If you're fairly positive something is overoptimized and the cause of the problem, it may be good to put it back how it was before, but you really need to let it sit whenever possible.
The duplicates were totally unintentional and were actually relevant; however, I can appreciate why an algo considered them duplicates. Subtle alterations in phraseology were used for similar but substantially different product lines.
I then checked as many of the major directories etc. as I could, and I found one specific glaring error on Dmoz.org, whereby someone had altered our company page from the .com to the .biz!
I immediately contacted Dmoz to amend it, which they have now done. However, as soon as I saw this I 301'd the .biz to the .com, and within days pages returned to the top, plus the company name reappeared after being MIA for ages and ages.
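For anyone wanting to do the same, a 301 from one domain to another is usually a small server configuration change. On Apache, a minimal sketch (assuming mod_rewrite is enabled and the .biz domain is served by the same server; the domain names below are examples, not the poster's) would be:

```apache
# In the .biz site's vhost config or .htaccess:
RewriteEngine On
# Only match requests arriving on the .biz hostname
RewriteCond %{HTTP_HOST} ^(www\.)?example\.biz$ [NC]
# Permanently redirect every path to the same path on the .com
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what signals a permanent move to search engines, so the link credit should consolidate on the .com rather than being split across two domains.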
No matter how infinitesimal you may think a link or duplication is, check it out.
From start to finish it took me 4 months to resolve this issue, and believe me, I read every single posting here, there and everywhere before I resolved it by going through every page on every site.
Good luck!
Another thing I did was remove a section on the site with about 10 links on the 950'd pages. My niche is very specific and I tried to broaden the site, by getting links to that section with new keyword phrases directed to that page. Not sure if this had anything to do with the return. I'm thinking the deoptimized pages with similar keyword phrases did the trick. Then again, it might not have had anything to do with what I did. We have been around for 7 years and have been scraped to death. Could be that Google reranked after the dropping MFA's from Adsense. It's all a guess.
The HTML has been static for months. Other than that, over the last week I received a total of 12 links pointing to my site (not paid links). The 12 links used three different anchor text combinations for my term (for example, "Widget World", "Widget Weekend", "Widget Farm"). For one of the terms, the page went into -950 land, while the other two terms dropped about 5-6 spots.
Is anyone else seeing issues today?
A control site of mine that had popped out got hit again today as well; it too naturally gained some links, but being on the first page it also gained a ton of funky .pl and .ru links that register as a link in the sitemaps tools, but are strangely absent on the actual page source. The junky links normally wouldn't be such a cause for concern, but when the quality balance is tipped, the trigger seems to be flipping.
When the site fell out again this time, I noticed an increase in wide authority sites throughout the top 10 pages... the filter threshold change may or may not be related to why these sites are now getting love (note: I find them largely irrelevant, but Google loves them more than they love their own news).
Cygnus
[edited by: JoeSinkwitz at 5:07 am (utc) on June 27, 2007]
You can never be 100% sure, but I'm thinking it was my changes --not just a Google reindex thing-- based on the fact that the three untouched pages are still 950'd.
The changes: removed ALL cross-page nav links except one back to the top-level menu (think of it as index). These were good on-subject links; no mixing of themes or topics.
It could have been the linking itself, or it could have been the keyword content I used in the link text. One more possibility: I had multiple use of the same nav menu. Once on the left, once in a footer, and, on long pages, in a div contained within the body. The multiple usage could have been the determining factor. I will never know.
I will never again use cross linking nav in any site. It will be back (up) to the index, and then down to whatever a user wants. No jumps from red widgets to red widget repair, or from red widgets to green widgets. Certainly not the best for users, but then what's the use if no-one can find you?
I think the 950 issue is probably a catch-all for several different "violation" schemes. My experience has perhaps uncovered one of them. But it would be a mistake to proceed in further discussion as though this was the whole thing.
[edited by: dibbern2 at 5:25 am (utc) on June 27, 2007]