Just saw one 950+ and it does my heart good to see it.
And another 950+, the last site in the pack. Flash only (not even nice) with some stuff in H1 and H2 elements with one outbound link. class="visible"
Another, way down at the bottom, is an interior site page that's 302'd to from the homepage and isn't at all relevant for the search term - it must have IBLs with the anchor text (not worth the time to check).
Yet another must also have anchor text IBLs (also not worth the time checking) and simply isn't anywhere near properly optimized for the phrase.
So that's four:
1. Unnatural linking
2. CSS spamming
3. Sloppy webmastering
4. Substandard SEO
No mysteries in those four, nothing cryptic or complicated like some of the other 950+ phenomena, but it's interesting to see that there are "ordinary" reasons for sites/pages to be 950+ that simple "good practices" and easy fixes could take care of.
The question does arise, though, whether the first two are hand penalties or if something's been picked up algorithmically on them - in one case unnatural linking, and in the other, CSS spamming.
Site 950d since beginning of this month I think.
As a first step, I removed a lot of links to deep pages from the homepage. They were links to stories inside; I removed 50% of them.
Pinged google using blogsearch.google.com/ping
Site back to earlier rankings in 24 hours flat!
This was only a first step to see what would happen and then start on checking internal linking and everything else. But this is all it took in my case. Will wait for a few days and see if it stays.
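The ping step mentioned above is just an HTTP GET against the blogsearch endpoint. A minimal sketch of the mechanism (the blogsearch ping service has since been retired, and the site name/URL here are made up):

```python
from urllib.parse import urlencode

def blogsearch_ping_url(site_name, site_url):
    # Build the ping URL for blogsearch.google.com/ping, as described
    # in the post above (the service has since been retired).
    base = "http://blogsearch.google.com/ping"
    return base + "?" + urlencode({"name": site_name, "url": site_url})

# Sending it would be a plain GET, e.g.:
#   import urllib.request
#   urllib.request.urlopen(blogsearch_ping_url("My Site", "http://example.com/"))
```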
This is an old site which was practically untouched for the last 2 months. Was slowly building up the content with the intention to turn it into a proper content site.
Maybe the age of the site helped tip it over to the good side easily... On another site I have - also 950'd - I have done the same and am waiting for the first crawl.
The site ranks last for every search including its domain name - the 950 filter many of you are familiar with. I wasn't expecting to get steady rankings from the beginning, I am prepared to wait for some months before getting any free G traffic. I just want to get out of this sandbox or filter as fast as possible. I am a little worried because the same thing happened to some of my new sites earlier and after several months of waiting they are still ranking last.
I am now actively marketing the site which will bring me natural backlinks. I am also adding links to directories etc. These links should increase my Trustrank but will they help me get out of the filter sooner? I hear horror stories of people waiting for a year to get out and then drop back after some weeks.
Trust needs to be combined with relevancy signals telling Google *what* the site should be trusted for ( eg. if it's a travel site, it's not selling cars ), and by whom ( eg. if it's a travel site, a car dealership's vote won't go all the way ). Collateral damage comes when accessibility/crawling/HTML/dupe content fallout problems, or all too creative ( irrelevant ) anchor text, put you in a different box at Google than the one you were aiming for.
Waiting by itself is the biggest mistake for any new website, unless it gets its traffic ( and natural links ) flowing in at a steady pace from somewhere else. But even then, make sure that you have something to at least suggest what anchor text they should use to link to you. A good title ( which is needed anyway ), a motto/slogan with your keywords, the URL with which they link to you including the phrases... etc. You could have more than one motto. And a catchy slogan for every section. Trademarked ( ... )
For myself I've renamed this the "market protection filter".
Or simply "Get more links". ( and make them good this time )
ps. if you didn't have the trust, you'd be out, not -950.
What you lack is trust that's relevant... or relevancy that's trusted.
I would only assume they have successfully escaped its grasp after, say, a month or more, as many people, myself included, have thought they'd fixed it for a few days or weeks only to fall back again.
Honesty is needed here! Do you or have you had paid links? Have you bought links? Do you use syndicated content? Any old duplicate content on site? Etc etc.
Do you or have you had paid links?
btw, the correct question is:
Did you have any paid links on your site that were clearer than daylight to have been paid links - because they were irrelevant, used the code of a broker, and/or even said that they were sponsored, partner, or featured sites... in a box... at the right or the bottom... etc.?
Have you bought links?
again: did you buy links from pages just because they were selling them or did you seek out sites, on topic, very informative, your dream come true referrals, and asked them if they'd link to you but they said it's gonna cost money?
Do you use syndicated content?
Any old duplicate content on site?
No... not on my sites but assisted with some that had issues.
Classic www, non-www, mirror sites, etc.
1.: New site had 45%+ of its otherwise very high quality backlinks using the same anchor. Great branding.
2.: Site had its strongest backlink(s) using anchor text that were *semantically* irrelevant ( for Google )
3.: Site was never referred to with the exact phrase ( ultra competitive term, while was relevant otherwise )
4.: Site was out of balance because of a link ( see 2. ) to an internal page ( much higher PR, not even the canonical URL, sometimes anchor text used the URL which wasn't packed with keywords )
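Point 1 above - a large share of backlinks all using the same anchor text - is easy to check mechanically. A sketch with a hypothetical backlink profile (all names made up):

```python
from collections import Counter

def anchor_concentration(anchors):
    # Return the most common anchor text and the share of all
    # backlinks that use it. A high share (the post cites 45%+)
    # can look unnatural to Google.
    counts = Counter(a.strip().lower() for a in anchors)
    anchor, n = counts.most_common(1)[0]
    return anchor, n / len(anchors)

# Hypothetical backlink profile: 9 of 19 links share one anchor.
anchors = ["blue widgets"] * 9 + ["example.com", "click here", "widgets guide",
           "blue widget shop", "Example Inc", "homepage", "widgets",
           "this site", "best widgets", "example"]
top, share = anchor_concentration(anchors)  # share is about 0.47
```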
Not mine, but assisted with:
- Too many phrases targeting too much marketing-sensitive stuff on a single page ( semantically unrelated )
- Unrecognized abbreviations ( site was thought to target a typo of another word instead of the short form of a phrase )
- Inconsistent navigation ( links to dead pages, links to pages that redirected to another page )
- Erroneous redirect breaking navigation, pages dropping out, leaving some "orphaned" in the Google index.
- Site had navigation using a word that was monitored for an entirely different theme
- Some approached me asking why their spam is -950. I'd like to quote what Matt Cutts is said to have said when asked about "the sandbox effect" ( new sites, low trust, not appearing in the index ):
" Ok, it works then. "
( paraphrased )
Spammers lurking, finding this page through an SE: using off-topic, low-trust, low-relevance, low-everything sites/pages in bulk to ram your sites into the index does not work anymore. Link buyers who still don't get it: using off-topic, at-the-side, in-the-footer, irrelevant links bought from someone selling links at a broker for PR does not work anymore. That's what this filter is for. Dear everyone else - like me - watching your sites fall:
The filter is 'not perfect'.
I've posted about 100k of text in the -950 forums in the past half year, the latest 10 or so being what I could call not just a test but proven.
First step is to analyze your site as if this was the first time you've ever seen it.
...aaand... *this* is where most people fail.
Once your site is in shape, and you definitely feel it *should* work already, well, look through your inbound links with a critical eye. But first, correct any HTML, accessibility, dupe content etc. issues.
Granted, the meaning of the phrase "paid links" is more than a little fuzzy, but I think Miamacs (above) has clarified the phrase very well.
I will keep scrutinising the site then and looking for any links that look like they are paid for.
I agree with Marcia and Miamacs that you need to take a long hard look at what you have in front of you and not just blame Google straight away. Do that, then blame Google...
So far I've fixed canonical issues, /index to / issues, multiple old re-directs, accidental paid links (put "nofollow" on them as they were genuine adverts) and quite a lot of dupe issues that have happened due to the site evolving and old pages getting left behind.
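The "nofollow on genuine adverts" fix mentioned above can be automated when the advertiser URLs are known. A regex sketch (the advertiser domain is hypothetical; a real site would patch its templates, or use a proper HTML parser rather than regex):

```python
import re

PAID_DOMAINS = {"advertiser.example"}  # hypothetical list of paid-link targets

def nofollow_paid_links(html):
    # Add rel="nofollow" to <a> tags whose href points at a known
    # advertiser domain, mirroring the fix described in the post.
    def fix(match):
        tag = match.group(0)   # the whole opening <a ...> tag
        href = match.group(1)  # the href value
        if any(d in href for d in PAID_DOMAINS) and 'rel=' not in tag:
            return tag[:-1] + ' rel="nofollow">'
        return tag
    return re.sub(r'<a\s+[^>]*href="([^"]+)"[^>]*>', fix, html)

page = '<a href="http://advertiser.example/offer">Sponsor</a>'
```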
That worked for a week and life was good; then it fell again. Last year our traffic was stable (couldn't budge it) all year. This year it has been mostly up but bouncing a lot. The only SEO we have done is based on Brett's classic, so perhaps old school.
To further blur the paid-links line, we have been approached by, presumably, SEOs, to show their clients' content on our site in return for cash (a few hundred £'s per month). The content would be unique to us and contain a couple of links to their clients' sites. The clients would be blue chip.
1) Relevant/useful to site users
2) Original content
3) Links are within content - hidden
4) We don't write it
5) They pay us for it
An interesting twist to the paid links issue?
FYI... this is not new, by any means. Brian White of Google posted about it on his blog back in May....
Paid Link Schemes Inside Original Content
From Brian's point of view (and he's on Matt's spam team), this violates Google's guidelines, no matter how you justify it.
Further discussion of this particular approach to paid links, though... whether it will work, whether it can be detected, etc... is really off topic on this thread.
To further blur the paid links line we have been approached by, presumably SEO's, to show their clients content on our site in return for cash (a few hundred £'s per month). The content would be unique to us and contain a couple of links to their clients sites. The clients would be blue chip....An interesting twist to the paid links issue?
I've had some feelers like that, too, from a couple of blue-chip companies. The discussion never got any further than "X is interested in buying sponsored editorial content" (an offer that I politely declined), so I don't know if the reason for the "sponsored editorial" was to provide a container for purchased links.
I have trouble seeing much value in Web advertorial, which is what "sponsored editorial content" is. In a magazine, an advertorial or "special advertising section" is flipped through by readers as they browse from page to page, so the advertorial pages get exposure. On the Web, where users have to select a page (via clicking) to see it, the point of advertorial is less clear--unless there isn't any point except the buying of links.
syndicated content:
--it's duplicated elsewhere
--it's usually advertorial nowadays
--in exchange for a link from your site you are given content
this is OK?
sponsored editorial content:
--in exchange for a link from your site you are given content AND money
this is not OK?
(BTW I'm not saying our site has taken up this offer of money, we didn't, but we looked out of curiosity)
i see the ethical difference but am struggling to see why one will probably be penalised and one won't?
i see the ethical difference but am struggling to see why one will probably be penalised and one won't?
Will either one be penalized? I think that's unlikely in most cases, since Google seems to err on the side of tolerance when pages or sites fall into grey areas. (If Google were as ruthless as some people like to think or pretend, marginal pages wouldn't show up in the SERPs and there wouldn't be any such thing as "reinclusion requests.")
Let us not try to say whether G is tolerant or not. After reading WW for years, I think there is enough reason to say both are true. But either way, that's a never-ending discussion...
Can we now get back to the -950 penalty?
Could you tell me what to check if one doesn't have paid links on the site and has only done basic SEO and natural one-way links - what else could be the cause of a -950 penalty?
Internal linking is the next potential culprit?
Let's say the site is about Doctors (it's not, but widgets won't work well enough). For several years various pages have bounced around the top results for "doctors". It used to be one of the internal pages, but it got 950'd in May. Since then the homepage replaced the internal page for the "doctors" search due to more external links pointing at it over time. Eventually I guess the internal page came out of 950, because today I notice it ranks somewhere around 200 (although I stopped paying attention to it... it is now the only "doctors" page that ranks above 950).
The site has a lot of info about the doctors niche, but most pages are part of a geo-targeted long tail directory of doctors. Thus, there are pages about doctors in each state that give some statistics, and those state pages link to city pages where various doctors are listed in a directory format (no links to doctors, just clinic name, specialty, addy, phone, etc).
In the past all of these pages ranked high for any search about doctors in various cities and states. The linking is fairly pyramidal, with "doctors" being in most of the anchor text (ex: "Chicago Illinois Doctors"), in the title, and in the bolded header on each page. However most of the traffic coming to the pages actually came from people searching for names... like "Dr. John Doe". I would rank #1 or #2 for those searches.
Today I woke up and the homepage is nowhere to be found for "doctors". Not 950'd... just gone. It still ranks for a search for the domain name. However, each and every internal page that in any way has to do with doctors is 950'd. What is kind of interesting is that they no longer rank for anything... even without the word "doctor" in it. Meaning, I don't rank for a search for "John Doe" anymore.
In my past 950 experience, it seemed like the penalty was more keyword specific where a page would rank for certain keywords and be 950'd for others.
I actually think that there is some connection between the homepage disappearing and the internals going 950. The best way I can describe the theory is that google ripped away the authority the site had for the term "doctors" and thus the authority in the niche no longer passes down to the internals... thus they go to 950 since google thinks that they should rank based on history as being a part of the result set, but can't find the basis of the authority any longer to actually rank them so they send them to 950.
Just a theory. With that background, I'll dive in and start working through it and let you know what works. Problem is that until I get it solved, revenue will be down about $1k/day. Oops.
Using my "doctors" example from above, lets say a site has a set of pages on doctors for each city and state. Lets assume they also have a set of pages for "hospitals" and another for "nurses" Now there are three pages for google to pick from for each geographical region. Then, if you start interlinking the pages, it get even worse. It doesn't have to be geo terms, could be individual pages with small product variations.
Can anyone think of any harm, except for the loss of authority from the main domain, in creating subdomains for each vertical category? In other words, have a subdomain for nurses, another for hospitals, and another for doctors. Would Google be able to keep them straight and separated after such a move?
If I take those verticals that don't generate much revenue and move them out to subdomains, and keep the vertical that pays the bills on the main domain, might I find some relief?
Again, I think my primary problem is too narrow anchor text from a recent influx of poor quality links to the homepage. However, others insist that this has a lot to do with internal linking structure and page topic overlap. Would a subdomain solve it?
The domain will come back when Google has crawled and indexed the URLs which are the reason for -950. At the moment Google is hardly indexing, so it might take longer.
Just to make it clear, I'm talking about the whole domain being in -950 land, not single directories or URLs.
Right now I'm just trying to figure out how to deoptimize links to pages about states and cities. Right now I'm only using the state name and city names as anchor text - I'm not even including the accompanying keyword like "Illinois Doctors" or "Find Chicago Illinois Doctors" - so it's pretty deoptimized already. (Again, doctors is not my niche.) I just think that having multiple pages where the inbounds use the same anchor text (like the links to my Chicago Doctors pages and my Chicago Hospitals pages both using the anchor text "Chicago") might be part of the cause. I'll deoptimize the rest of the links to other pages too... but these are the majority of the site and the most likely cause if it is anchor related.
These are especially harmful when they are repeated over and over in a navbar or some other element common to all or most of your pages.
A safer linking approach might be:
Find Widgets in <a>Chicago</a> - with the keyword outside the anchor text.
Or, even better and much safer:
<a>Find Widgets in other cities</a> which links to a single page that consolidates all the Chicago, Detroit, Atlanta, etc links. In this example, there is NO repetitive link list in the nav scheme. Or, perhaps better yet, simply place all the consolidated links on the site's index page, and let your 'home' link take care of it.
A variation of this advice got me out of 950. I'm not saying to do this exactly, but instead trying to illustrate how many of us fall into dumb, spammy linking methods as our sites grow over time. When you think about it, these long lists that repeat a kw over and over are truly stupid looking.
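The "repeated keyword in every nav anchor" pattern described above can be spotted with a trivial check. A sketch (the anchors and keyword are hypothetical):

```python
def repeated_keyword_share(nav_anchors, keyword):
    # Share of nav links whose anchor text repeats the keyword.
    # The "dumb, spammy" pattern from the post: "Chicago Widgets",
    # "Detroit Widgets", ... repeated in a navbar on every page.
    kw = keyword.lower()
    hits = sum(1 for a in nav_anchors if kw in a.lower())
    return hits / len(nav_anchors)

spammy = ["Chicago Widgets", "Detroit Widgets", "Atlanta Widgets"]
safer = ["Chicago", "Detroit", "Atlanta", "Other cities"]
# repeated_keyword_share(spammy, "Widgets") flags every link;
# the safer list repeats the keyword in none of them.
```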
This year, since April/May, we have been up and down. I've done things that have seen our site out of penalty for a few weeks but then fall back again for a few weeks.
Again, if you have been out of the penalty for a month or more, then it gives confidence that it is really fixed and the site is not just being re-evaluated after a fairly substantial internal/external change.
Many/most of you are a lot, lot better at SEO than I am (OK, a bit of an arse lick... sorry) and any specific examples of problems solved (like internal link over-optimisation) would be really helpful in examining my own site - I hope!
ah! thanks dibbern2
by dividing somewhat similar content and hierarchy to separate subdomains that it might keep the whole site from getting hit next time
kdobson99 - That's not a guarantee, and probably more trouble than it's worth. A site I had hit last December was -950 for the domain and all subdomains.
100% deoptimizing (No keywords on pages at all) didn't cure it.
I got rid of the unneeded subdomains, added a blog with lots of new content and strong links, and finally came out of 950 July 23.
Once I get a handle on it, I'm going to start finding these nice old sites that are 950'd and not making any money, buy 'em cheap, get them out of the penalty, then smile really big. Saw a blog where this strategy is suggested... sounds like a winner.
completely strip a site down to a core set of pages... Has anybody tried it?
Yes, that is essentially what I did. And still didn't come out of penalty for some time after.
The content from old pages that no longer existed still seemed to keep it in penalty. Why do I think this? The site still ranked #1 for some unique keywords that only ever existed on those old, long gone pages.
My site completely disappeared March 6th, came back April 10th, disappeared April 14th, then yo-yo'd in and out until July 8th. Since July 8th I have had almost record traffic, at least as good as last year. I noticed my traffic slowly dropping around December.
I don't know for sure what brought my site back, but I made two significant changes before my site came back.
I removed all interlinking between my web sites. Not sure this was a problem, but didn't want to chance it anymore.
I also changed the navigation on my site. I previously had the same navigation menu on my entire site (several thousand pages). I had a main menu and another menu broken down into sub-categories. I still have the same main menu on all pages, but I actually further optimized my site by adding more keywords! So I know I wasn't penalized because of overoptimization. For different categories on my site I have different sub-menus with keywords for that category. This is also why my rankings are better than they were before the penalty, because I now have more keywords that correspond with the content on the page.
It seems like my penalty was some kind of a "trust" issue. My site has been an authority in its niche for a number of years and it has returned to its previous positions and even better. There are still some fluctuations in the results from day to day, but only by a place or two now.
Hope this helps someone!