Forum Moderators: Robert Charlton & goodroi
Other search engines seem to be OK with rankings, including some Yahoo #1s.
Even now, with half the usual traffic, I get lots of bookmarks, which is a good indicator to me.
Regarding Google: some pages are still #1, but other pages are dozens or hundreds of pages down now for older keyphrases.
However, I noticed that if I pick just some words of my keyphrase, my page ranks in the first 10 or 30 results, while using the old two-word keyword phrase that used to rank #1 results in #300.
I haven't changed the site, so I wonder why Google went from love to hate with those pages...
So what I'm doing is "deoptimizing" pages, lowering keyword density and waiting, but I'm not sure about it.
I'm assuming a kind of -950 penalty [webmasterworld.com...]
Perhaps I just need to wait (in case my case matches this: [webmasterworld.com...] )
The site was banned on Wikipedia months ago, but it never had problems with Google over that, and I don't think Google uses Wikipedia to judge whether a site is good or not.
[edited by: Robert_Charlton at 6:00 pm (utc) on July 23, 2008]
[edit reason] fixed link [/edit]
I've seen that have a negative effect, and it helped when I tested removing it from some pages. If suspected, it's easy to see in the text cache.
1. unique meta descriptions, or one across the site?
2. plain vanilla HTML, or cms with (possibly) duplicate URLs
3. is it in any way an affiliate site - in particular, does it use any 'standard' text in any serious amounts?
And, because it's Google, have you checked ALL of your outgoing links lately, and deleted all suspect links, all non-related reciprocals, all links you cannot personally recommend, and any recips to directories or pseudo-directories (eg someone's 'link pages')?
Run xenu for good measure.
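The advice above boils down to auditing every outgoing link by hand or with a tool like Xenu. As a rough illustration (this is not Xenu, and the domain and HTML are made up), here is a minimal Python sketch that parses a page with the standard library and flags followed external links for manual review:

```python
# A rough sketch of auditing a page's outgoing links, using only the
# standard library. External links without rel="nofollow" are flagged
# so you can decide whether you'd personally vouch for them.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.findings = []  # (url, has_nofollow) for each external link

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        if host and host != self.own_domain:
            nofollow = "nofollow" in attrs.get("rel", "")
            self.findings.append((href, nofollow))

# Hypothetical page fragment for illustration
page = """
<a href="/about.html">About</a>
<a href="http://example.org/dir" rel="nofollow">a directory</a>
<a href="http://linkfarm.example.net/links.html">link page</a>
"""

auditor = LinkAuditor("example.com")
auditor.feed(page)
for url, nofollow in auditor.findings:
    print(url, "ok (nofollow)" if nofollow else "REVIEW: followed external link")
```

In practice you would feed it each fetched page of your site; internal links (no host, or your own host) are skipped, and everything else goes on the review list.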
Is there any kind of boilerplate text, especially containing keywords, repeated at the beginning of the main page content, that runs across a lot of pages?
Bingo!
That may be the cause. Instead of using an image slogan I use text on that site, and I see it's not helping at all.
About the questions:
1) Yes, unique meta descriptions
2) I don't find duplicate URLs
3) No
I'll check with Xenu.
have you checked ALL of your outgoing links lately, and deleted all suspect links, all non-related reciprocals, all links you cannot personally recommend, and any recips to directories or pseudo-directories (eg someone's 'link pages')
Don't you have any links to link pages or reciprocal link exchanges at all? I think I might find some of these, not on this particular site but on others (which rank better than this one).
Any links placed by 'third parties' (eg guestbooks, forums) have nofollow attached.
I check ALL my outgoing links regularly, with Xenu - and by visual checking.
With Google, bad links are by far the single biggest cause of problems. High risk behaviour on money-earning sites makes no sense to me.
Is there any kind of boilerplate text, especially containing keywords, repeated at the beginning of the main page content, that runs across a lot of pages? I've seen that have a negative effect, and it helped when I tested removing it from some pages. If suspected, it's easy to see in the text cache.
Please explain how one can see this in the text cache, Marcia... I have seen my traffic declining steadily for two months now - a slow, steady downward loss since May, and so far it's down 40%. I haven't lost any good links that I can see, and I've even gained a few good ones. I do have very similar lists of keywords repeated across many pages at the top, though. Thanks.
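One crude way to check for this yourself, without waiting on the text cache, is to strip each page to plain text and compare the opening words across pages. This is only an illustrative sketch (the pages, filenames, and word count are made up, and the tag stripper is deliberately naive):

```python
# A hypothetical sketch of spotting boilerplate repeated at the top of
# many pages: reduce each page to plain text, take the opening words,
# and count how many pages share the same opening run.
import re
from collections import Counter

def opening_words(html, n=6):
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripper, for illustration only
    return " ".join(text.split()[:n])

# Made-up pages standing in for a real crawl of the site
pages = {
    "a.html": "<h1>Cheap widgets, best widgets, widget deals</h1><p>Our red widget...</p>",
    "b.html": "<h1>Cheap widgets, best widgets, widget deals</h1><p>Our blue widget...</p>",
    "c.html": "<h1>Contact us</h1><p>Write to...</p>",
}

openings = Counter(opening_words(html) for html in pages.values())
for phrase, count in openings.items():
    if count > 1:
        print(f"{count} pages start with: {phrase!r}")
```

Any opening phrase shared by many pages is a candidate for the kind of keyword-bearing boilerplate being discussed; whether Google actually treats it that way is, of course, speculation.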
Searching
site:example.com + "example.com."
I see all my pages with their new titles updated, so that tells me Google crawled the pages.
I hope that means I will see them in the search results again soon. But that is not happening now for 90% of my pages.
My understanding is the site is -950'd and will probably take some weeks or months to come back to reasonable SERPs.
I've read that filing a reinclusion report is useless. So do you think sit and wait is the proper thing to do in this case?
I've read that filing a reinclusion report is useless
This forum also has accounts of success. When you don't know the specifics of the site involved, you can't take every complaint you read to heart. Your situation could be entirely different.
The Reconsideration Request - when you've fixed all the problems you know of, and you're not trying to sneak anything past - can be an effective step. That's especially true, I feel, if your particular penalty involved a manual intervention.
Just be brutally honest with yourself. Remembering that the top SERPs may be evaluated by a human editorial staff, ask if your site makes a good showing. This isn't just an automated contest - Google is in the business of giving their users the best resources that they can, especially on common query terms. It's our job to create something that measures up.
So, if a URL is ranking on the first page because of SEO, but the page is pretty hard to use or it offers "nothing special", then that's when the team of humans might all agree that it should be demoted. And one opinion won't do it - it takes agreement from several people who are all working independently.
Some terms still do well, but 80 percent don't anymore. Google must have made some type of change that sunk a lot of sites. Ours are small - 200 and 300 pages each...
-s-
To oversimplify... Google has been saying that it wants to return those pages that have the best content and attract the most good-quality link votes. Unless your site is genuinely useful, sooner or later Google is likely to demote it.
It's getting harder and harder to "trick" your way back to the top. Content, and promotion of that content, are what will ultimately win out.
I had several 950d sites which were adjusted and got their rankings back. The first one returned after many months because I didn't know what I'd done wrong. The last one in a few days because by then I did.
I didn't file any reinclusion requests. I just removed all anchor text of competitive words or phrases. Substituted the text with graphics (icons+links with no alt tag). It's easy to forget about alt tags when you're reviewing a penalized site because you don't necessarily see the text.
Incidentally, I'm starting to suspect Google is going to incorporate - or is already starting to incorporate - the idea behind the -950 anchor-text spam penalty into its SERPs algo, without all the drama of a -950 penalty. So you can lose position (even one position) because of spammy anchor text.
After I cleaned up my site with spammy anchor text, I moved to #1 (after being #2 for a long time). The former leader, I noticed recently, still has a fair bit of spammy anchor text all over his site.
p/g
If someone has a phrase-based "over-optimization penalty," I can see how removing a text menu might relieve that -950 pressure.
There's no doubt that Google has OCR capability. I even heard about them experimenting with it several years ago. But to what degree they use it for image links, if at all, is a topic I've never heard or seen anything official about, or even rumored.
That information is available in the Hot Topics [webmasterworld.com], which is always pinned to the top of this forum's index page.
Since there's no penalty, there is no need for a reconsideration request. When the unique version of the URL is spidered and indexed, then it will not be filtered out.
However, I note that you began this thread asking about a 50% traffic drop. That sounds like more than one duplicate article.