Forum Moderators: Robert Charlton & goodroi
I don't think this is related to reciprocal linking. If it is overdone to the point that Google sees a big red flag, some other penalty might kick in, but not this 950 thing.
Phase based seems a lot more likely. And there may be something about the words or phrases used in internal linking involved as well. Or that could just be a part of the phrase based thing.
[edited by: tedster at 9:15 pm (utc) on Feb. 27, 2008]
Sorry to disappoint, but theming has nothing to do with it. I have dozens of sites with links from non-themed pages, with literally hundreds of thousands of links. Only one site I have has the -30.
This matches what I see looking at my competitors' sites. At the very least, theming does not exist as it is described in some posts here.
Miamacs has a very good list of factors, but he doesn't mention duping issues, and I'd have to throw that into the hopper.
To turn to some 950s I've been watching, and to use my own vocabulary about what I'm seeing (and forgive if some of this has been said before in different ways)....
I've been observing a bunch of 950s on a client site, and some have had what I consider dupe page issues. Some are direct dupe issues... and one is a top ranking page that is scraped to death in a very competitive area. It is ranking up at the top for its less competitive phrases, but for its most competitive phrase it's sitting in the 980s.
I could also call these dupe issues linking issues, because linking is inextricably related to which page is considered the dupe. For this highly scraped page, the most closely matching inbounds for its most competitive phrase simply aren't as strong as the links for the less specific terms. And for the most competitive phrases, the bar is higher.
What I think is interesting with regard to the scraping / duplication issue I feel I'm seeing on the client page is that if I search for dupes of long strings with &filter=0 appended, I sometimes won't find any... but if I search for strings of 4, 5, or 6 word exact matches, I'll find hundreds, on pages that have largely been scraped from mine. This is the way that scrapers are working now... they're breaking pages up into fragments... and it's likely that there's some collateral damage in Google's efforts to filter these out.
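The fragment checks described above can be tedious to do by hand. Here is a hypothetical sketch (the function name and sample text are my own, not from this thread) that breaks a page's text into quoted 5-word "shingles", each of which can be pasted into Google as an exact-match search to hunt for scraped copies:

```python
def shingles(text, n=5):
    """Return quoted n-word sliding-window phrases for exact-match searching."""
    words = text.split()
    return ['"' + " ".join(words[i:i + n]) + '"'
            for i in range(len(words) - n + 1)]

# Example: generate the exact-match queries for a short passage
for query in shingles("scrapers are breaking pages up into small fragments now"):
    print(query)
```

Appending `&filter=0` to the result URL, as mentioned above, then shows any near-duplicate results Google would otherwise suppress.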
I'm also seeing that pages which used to rank for searches including less heavily targeted synonyms are now down in the 950s for these "peripheral" searches only. If I wanted to blame Google, I could say that Google is seeing these synonyms as excessive variation, and that my phrases aren't in the Google lexicon.
Playing devil's advocate (with regard to my client pages), I could also say that the pages wouldn't read well without the synonyms... they would sound spammy. It may be that these synonyms and phrases indeed are in Google's lexicon, and that for spam evaluation purposes they are seen at a certain point to be equivalent.
If this latter is the case, then, by a new measure, the pages are simply over-optimized. There's too much optimized density ("density" referring not just to keywords, but to keywords and their synonyms).
I think I'm with Crush with regard to the theming. I don't think it's a factor, but that's just a superficial personal opinion at this point. Matt Cutts is currently ranking around #24 for [gadgets], and he's almost flaunting the idea of theming in demonstrating the variety of gadgets, etc, he can rank on. Not hugely competitive phrases, perhaps, but there's great variety.
(I think, btw, that Matt's site is very much an experimental and demo site, albeit he has a huge advantage of being a public figure, which makes his site atypical.)
As to why minus-950, I guess that beats throwing the pages out entirely. The sites I'm seeing here are in some way trusted in their niches, and it could be that Google knows that they're trying a work in progress. It's also clear that they're checking things over and over and over, cycling through the pages. Could be they're trying to be careful.
But in the other example... the .com, I can't explain it. Forget the thought that Google can't associate "auto" and "car" as synonyms. Google finds those two words to be synonyms without any problem. I'm in the car insurance niche, and my rankings have historically been nearly identical for the two terms regardless of which one I use on page or off. I just don't understand why one term is OK for that .com site, but the synonym is 950'd.
Tedster, I knew my post was likely a violation of the rules here, but if people would take the time to really investigate this example, I think we could learn a lot working from common fact. I appreciate you letting it pass this time.
What about the idea of sending a reinclusion request to Google, someone asked this before in this thread - yet unanswered?
Any experiences with that? Should that help Google in "being careful", since they get more data on different affected site types? Or is it just a waste of time?
If your site has been blacklisted/removed from the serps and you have cleaned it up, then it is certainly worth the effort.
As sites hit by the -950 penalty are not blacklisted/removed from the serps, but merely not showing for certain phrases, there would be no point.
If this latter is the case, then, by a new measure, the pages are simply over-optimized. There's too much optimized density ("density" referring not just to keywords, but to keywords and their synonyms).
This is quite the opposite of the "theming" theory mentioned previously, which would push your site even further into over-optimization by adding more phrases to titles and content to match your anchors.
So which way should I go now?
Or do I misunderstand things actually?
I don't think anyone's going to be able to say 100% what you need to do; the -950 phenomenon is still only vaguely understood. But I doubt that adding more traditional optimization will bring success.
Forget the thought that google can't associate auto and car as synonyms.
This is an interesting situation. Going back to some earlier work with semantics, the tilde search definitely shows these two words have been associated for a long time in Google.
I think the fact that using one word trips the -950 and using the synonym doesn't points once more to a different kind of semantic work -- one that is not tapping into Google's earlier art. That is, the phrase based indexing patents [webmasterworld.com] do seem to be in play, IMO.
In contrast to ours, this site consists mostly of article snippets, forum snippets, affiliate cr@p and a HUGE boilerplate template :\
page structure
boilerplate template 140 words
Amazon affiliate with amazon page content inclusion
boilerplate template 230 words
1/3 of a well known website duplicate
includes from his forum
a minimal left menu and a dynamic menu to his forum
I have a boilerplate template of 4 words (as per recent advice)
I don't get it... :\ Are that many duplicates back in fashion?
The minimal menu seems to be the most obvious thing. 4 items ..
[edited by: mattg3 at 2:14 pm (utc) on May 4, 2007]
But I doubt that adding more traditional optimization will bring success.
Me too. Thank you for your reply...
I decided in my case to keep going with business as usual, meaning adding new pages and updating the site like I always did, but keeping my fingers off any SEO activities for the moment, since things have started to get slightly better over the last two days.
Looking at competitors, I find that those still ranking well obviously didn't take the whole SEO thing too seriously. They're optimized in some manner, but more, let's say, "accidentally", which is quite typical for my niche.
If this doesn't bring my site back, I'll go over the whole thing to "de-optimize" it; in addition, I'll go hunting for some new quality IBLs and take care to avoid too sudden a growth in links. Making things look more "natural" seems to me to be the right way out of this mess.
Our site has been ranked number one for its two major keywords for the past 3 years. We rarely make changes or even check our rank, but when I checked yesterday I couldn't find either result. After reading about the 950 penalty, I looked through all of the results and our site was not found. I clicked on the "show omitted results" link and, boom, the result was back at number one. Our results didn't really drop in the rankings; they are just being weeded out by Google for some reason.
Can someone tell me if they think this is a "950" penalty or some other type of penalty so I can try to figure out some kind of response? Or at least direct me to a discussion where other people are facing this same problem?
Thanks in advance!
Has definite keyword loading in the KEYWORDS tag: 25 entries
Description is a partial grab of one of his duplicate content things..
I have adsense
content
adsenselink
4 words boilerplate content
Keywords ONE :) what's in the title and what the article is about.
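For what it's worth, the keyword-loading comparison above (25 entries vs. one) is easy to check with a script. This is a minimal sketch using Python's standard html.parser; the class name and sample tag are my own illustration, not anything from the site being discussed:

```python
from html.parser import HTMLParser

class KeywordCounter(HTMLParser):
    """Counts the comma-separated entries in a page's meta keywords tag."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            # Count non-empty entries between commas
            self.count = len([k for k in a.get("content", "").split(",")
                              if k.strip()])

parser = KeywordCounter()
parser.feed('<meta name="keywords" content="cars, auto loans, insurance">')
print(parser.count)  # 3
```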
So should we now all reduce the menu to <A HREF="pleasedearuserguessthenextpage.php">Google takes the mickey</A>
Can you explain duplicate content?
It can be complex - check into the various threads in our Hot Topics [webmasterworld.com] area, which stays pinned to the top of the Google Search forum's index page.
Using another name for the war in all links and the title got the pages out of the 950 region but it no longer ranked well on the phrase people would use in searching for information about the war. I gradually added the original name back in the title and in one link. Now it ranks back in the top 10.
Here's what I think might have happened. Google had the first word of the war marked as a suspect phrase. (Remember, according to the phrase-based patents, the definition of a phrase includes single words.) Eventually Google refined its filter to the point that the suspect word is acceptable if the word "war" comes after it.
I think that Google is still working on this kind of phrasing where adding one more word gives a completely different meaning unrelated to the spam phrase. This is probably why some pages have come back with no changes.
Meanwhile look at single words that could have a different meaning in spammy sites. That is what Google is targeting.
[edited by: annej at 4:01 pm (utc) on May 4, 2007]
I am looking for something like:
"Hit with the penalty, (did a,b,c, or did absolutely nothing), but saw the penalty automatically lifted in X amount of time."
I know it's not a cut and dry thing, I am just curious to know if anyone has been able to get out of it, was it automatic, and over what time period whether they did anything at all, or nothing at all.. thx!
So, today I go back to look into it a little more and... BOOM!... it's now ranking in the top 100 for "auto loans". Now, I'm not saying top 100 is anything to be proud of (it's around result 90, give or take), but I think Tedster would agree that less than 14 hours ago... before a point was made of it on this thread... it was sitting in 950 land. Today it is back... and what is even more interesting for me is that the cache date in Google is April 30th, which means that the improved rankings didn't come from a new cache or crawl of the page... but from something internal at Google.
BTW, the Capital One page is still 950'd.
Thoughts?
Say you are searching for a new car. One thing you are going to want to do is see reviews from the premier consumer rating service, namely, Consumer Reports. In fact, as a .org and with tons of articles across the internet linking to them on this subject, you might expect some decent results.
So search for that "new car" and try to find the main page for the topic on this gold standard of authority on the subject... Hint: this thread is called "Google's 950 Penalty."
Now, what is also interesting is that the homepage for Consumer Reports ranks around result 125... but that page is about all kinds of topics not just cars. In the past Google would have been picking up the "new cars" page and it would have been top 10. Now, they shuffle the topic page to the end of results, and give you the homepage much lower.
So, maybe this is a good one for people to put their theories to test on. Would love to hear your thoughts.
Another thing.
Cross-reference with Yahoo for stolen content. Searches in Google may not reveal the extent of the theft, or it may only appear sporadically. This seems to relate to the in-and-out nature of the supplemental index and disappearing pages. When I first spotted this I said there was simply no way Google was penalizing for this, but I filed a DMCA with AdSense anyway. Twelve hours after Google informed me they had removed the stolen content, my site was back. If the PR of the offender is greater than your site's, or the scrape is a couple of paragraphs or more, beware. Be particularly aware of any heisted content appearing on article sites.
1. Should I just wait and see what happens now that I'm back?
2. For those that came back and then got bounced again, how long did the process take, e.g. days, weeks etc.?
3. One out of two on-topic link partners I've had since 1998 also came back (albeit not as strong as me) after being in the 950s with us. The other one is still in the 950s. Any ideas whether this is a release from the penalty or just a temporary reorder by G?
Any comments or advice would be really appreciated. Thanks!
Also, earlier there was no result in the first 100 results from their domain.. only the homepage at around result 125. Now one of their other pages is ranking in the first 100.
I'm going to watch this one, because this would be two times that within hours of me posting a specific example of a 950 issue, the page is no longer 950'd. I'm about ready to sneak my own page in here as an example that has been 950'd since December, as that is starting to look like a good solution to the problem.
I'm going to watch both of my specific examples over the next hours and days if Tedster is kind enough to leave them up.
Sorry if that sounds like whining
I know exactly how this feels. If my site fails, this means almost no new contacts to customers - really not funny and it's starting to seriously threaten my income.
Concerning your questions, there are many points of view, mine is the following:
1. I would wait. Never change a running system. If you change things now, you'll never know what the reason for disappearing was.
2. My site is now on its third trip to 950-land (#1: 5 days; #2: two weeks; #3: two weeks so far, still away but between #30-#60 now). Between the trips I regained all my former positions, as if nothing had ever happened (for about ten days the first time, for two weeks the second time).
3. No idea. Anything is possible.
If so... and since there is no new cache of the page in the interim which might impact results... I'm starting to wonder whether the fact that a couple of people probably performed the search, went to the 950'd result, clicked on it, and maybe stayed a while investigating, with their Google Toolbar on, could be a reason for it suddenly breaking out of jail.
Only other explanation is that Google turned some knobs.
950s certainly could happen for different reasons, but theme is nothing to worry about, since Google has virtually no conception of it. Like the posters above, I've aged several years wishing and hoping Google would learn (or at least try to learn) something about theme and niche, and still there is almost no evidence they can see a theme/topic/niche.