Forum Moderators: Robert Charlton & goodroi
My site now ranks #31 for its own domain name, and a bunch of keywords/phrases I usually watch were bumped from #1 to precisely #31. The ones that were #2 through #10 are now sort of all over the map, but generally within the first 60 results.
Does anyone have experience with this? What does the respected audience here think the most likely reason for such a penalty is? What do you suggest as the best strategy to fix it?
There has not been any major redesign recently, just routine addition of pages here and there. Some unique, some syndicated industry-related content.
Thanks for any idea or comment!
D~
Have you tried the same searches on other datacenters? Google may well be doing something experimental here and there.
And no, I don't think there's any magic numbers; serps are dynamic, but the amount of rise or fall will depend on many factors; it isn't a constant. :)
It's 8 years old and yes, #31 is a pretty stable number so far
@soapystar:
I'm not sure, but that is a possibility. I had a site map broken down by thematic section where the anchor text was essentially my most important keywords. Maybe that threw a wrench in it?
Thanks!
please keep your suggestions coming!
D~
>> anchor text over optimization
This gets stated a lot, but it can mean anything.
Do you think so? I thought Matt Cutts' warning about link selling, and the ease with which Google could spot identical anchor text, was fairly straightforward. Or has someone said otherwise?
[edited by: Quadrille at 12:03 pm (utc) on Oct. 13, 2006]
anchor text over optimization
I thought Matt Cutts' warning about link selling
Yup, there you go. This is why the people making the simple statement about ANCHOR TEXT would help us if they were specific. Alex said nothing about buying links, or even whether he was talking about internal or external links. There is no way a comment about buying links can be applied to a basic statement about anchor text without making guesses and assumptions. Many people use anchor text in a spammy way within their own site. Then others use it quite appropriately within the site. Personally, I have no idea what someone means if they say they had a problem with anchor text. Any text link involves anchor text. I would be interested in examples of legitimate use of keyworded internal anchors where this may have tripped a filter.
"Google could spot identical anchor text" - and therefore a site may get into trouble for having it; you'd be the first to admit that Google gets it wrong, surely
OK, back to basics. Of course it's the whole point of anchor text that Google records and remembers it. Yes, a site can get into trouble for it. But normally people talk about it in terms of inbounds. This is a no-brainer, since the watchword here is organic. Google has long been trying to profile just what an organic linking structure would look like. The trouble is that this is not an exact science, so how effective the filter is comes down to accepted collateral damage. And yes, for those caught in the crossfire, this would read as Google getting it wrong. Different types of sites grow at different rates. Some industries lend themselves to exact anchor text while others don't. Even worse, these patterns change over time as the internet grows and different types of users become the norm.
My own concern is the change in how internal linking structures are dealt with. I am wondering if you can penalise your own pages with your own internal navigation, because of optimising those pages and having that keyword pointing to them from the internal navigational structure. Even though these internal links are legitimate in that they describe the page to the user, because of other elements already in place they can be the straw that breaks the camel's back, so to speak.
There's also the case where you may be using different anchor text for the same page. It's a basic spam method to have several links at the bottom of a page, all linking to, say, the homepage, and all using different keyworded anchors. I would expect this to cause a penalty. But what happens when you legitimately have different links to the same page with different anchors? You'd have to assume there's a twilight zone where Google finds it hard to decipher what's spam and what's legitimate for some sites.
[edited by: soapystar at 2:26 pm (utc) on Oct. 13, 2006]
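The spam pattern soapystar describes (several links at the bottom of a page, all pointing to the same URL but each with a different keyworded anchor) is straightforward to detect mechanically, which is why a filter for it is plausible. Here is a minimal standard-library Python sketch of such a detector; the threshold of three distinct anchors is an arbitrary illustration, not anything Google has published:

```python
from collections import defaultdict
from html.parser import HTMLParser


class AnchorCollector(HTMLParser):
    """Collect the set of distinct anchor texts pointing at each href."""

    def __init__(self):
        super().__init__()
        self._href = None      # href of the <a> tag we are inside, if any
        self._text = []        # text fragments collected inside that tag
        self.anchors = defaultdict(set)  # href -> set of anchor texts

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            text = " ".join("".join(self._text).split()).lower()
            if text:
                self.anchors[self._href].add(text)
            self._href = None


def suspicious_targets(html, threshold=3):
    """Return hrefs that receive `threshold` or more distinct anchor texts."""
    parser = AnchorCollector()
    parser.feed(html)
    return {href: texts
            for href, texts in parser.anchors.items()
            if len(texts) >= threshold}
```

A page with three differently-worded links to "/" would be flagged, while a page that links to "/" once (or always with the same anchor) would not. The twilight zone soapystar mentions is exactly the tuning of `threshold`.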
My own concern is the change in how internal linking structures are dealt with. I am wondering if you can penalise your own pages with your own internal navigation, because of optimising those pages and having that keyword pointing to them from the internal navigational structure. Even though these internal links are legitimate in that they describe the page to the user, because of other elements already in place they can be the straw that breaks the camel's back, so to speak.
Wow, soapystar, that's a great suggestion but a frightening one. When talking about internal links, where do you draw the line on over-optimization? I have a rather large site about an industry where lots of standards names as well as part numbers are repeated in a great many anchors. Say you have a part, and then you have pages about installing it, testing it, troubleshooting it, etc., and you may have literally hundreds of pages with the part's name in the anchor. Actually, I think this used to be exactly what brought my site to #1 in the SERPs for that part's name. So now it turns around and bites my a..? Wow, man, I do see that possibly happening, but it would be a complete disaster 'cause the whole site's navigation is based on this sort of somewhat similar anchors.
And just to clarify things: I don't buy links and don't sell them either, and there are not too many outgoing external links from each individual page. There are thousands from the site as a whole, but only a handful from each individual page.
I'm not too sure I understand what my back-off strategy would be: the part isn't going to change its name just because Google doesn't like it mentioned too often, if you know what I mean.
I could try removing the part's name/number from the anchor, but that would mean an even greater disaster, because there would then be hundreds of links with just the word "installation" in the anchor, whereas before the anchors were different because the parts' names were different.
After all, I cannot NOT link to that page: the actual human visitors are not going to be able to find it!
Also, I have to say that by giving me the "-30" penalty, Google did not improve the quality of their search results a single bit. There is all kinds of crap between 1 and 30. I'm not saying it's all bad, but if the anchor text was helping, it probably gave me an edge over maybe 5 or 6 actually good pages, and the rest was junk worse than, or at least no better than, my pages. So what I think should have happened is that the little anchor-text helping factor should have been removed and I should have slid down to #7, not #31.
Anyways, if you can share some thoughts about a reasonable back-off strategy, I would greatly appreciate that.
Cheers!
D~
probably gave me an edge against maybe 5 or 6 actually good pages
I searched for another of my unique company names... and its subdirectory name... voilà... #31...
This BS all started when Google hired that batch of interns, or whatever they called them, about 2 years ago to review sites manually. I believe they place these penalty tags on your SITE, not pages, SITE, and then for the most part forgot about them, leaving your site dead in the water forever.
PROVE ME WRONG
This BS all started when Google hired that batch of interns, or whatever they called them, about 2 years ago to review sites manually.
That goes back to August 2003 [webmasterworld.com]. [see also Google's Secret Evaluation Lab [webmasterworld.com]] -- and the minus-thirty penalty was with us back in 2001. Not to say this new version of the minus-thirty couldn't be just as you suspect, because it certainly could be.
From my reading of Google's editorial opinion patent [webmasterworld.com] I would expect its results to be a lot less heavy handed than a clunky -30 effect. No, I can't prove that, but a dumb, flatfooted -30 just doesn't feel worth patenting. And indeed, the patent does describe a much more sophisticated effect.
I've seen urls get a -30 and then seen that penalty removed in stages over several weeks after some condition was fixed. It sure looked automated to me. No, I can't prove that, except to say that Matt Cutts talked about wanting to do some automated penalties with automated removal quite a while ago. And in general, Google always looks to automate wherever they can because "it scales".
But whether it is a hand-applied or automated penalty, the main thing is to fix the condition that Google doesn't want to see in a first-page result. One big factor the editorial crew looked for was many pages of mostly affiliate stuff with no "value added" for the visitor. Directory clones were another. Of course, by now almost any negative quality factor at all might be the magic slipper. Just pretend your domain is Cinderella and see what fits your footprint. You've looked and decided that anchor text doesn't fit in your case -- but something does.
[edited by: tedster at 3:35 am (utc) on Oct. 14, 2006]
This BS all started when Google hired that batch of interns, or whatever they called them, about 2 years ago to review sites manually. I believe they place these penalty tags on your SITE, not pages, SITE, and then for the most part forgot about them, leaving your site dead in the water forever.
As I recall, GoogleGuy suggested that the purpose of the evaluation teams was to obtain benchmark or seed data (my words, not his; I'm paraphrasing from long-term memory).
That stands to reason: it just wouldn't be practical to play Whack-a-Mole with millions of thin affiliate sites and the like. It makes far more sense to gather good and bad examples for profiling, feed them into computers, and let the computers identify common patterns for "good" and "bad" pages.
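The seed-data idea above can be illustrated with a toy sketch. Everything here is invented for illustration (the feature names, the cutoffs, the rule itself); nobody outside Google knows the real signals. The point is only the workflow: hand-labelled examples are reduced to features, and a rule distilled from them is then applied automatically at scale.

```python
def page_features(page):
    """Reduce one page to crude quality signals.

    `page` is a dict of raw counts; all feature names are hypothetical.
    """
    total_links = page["outbound_links"] or 1  # avoid division by zero
    return {
        # share of outbound links that are affiliate links
        "affiliate_ratio": page["affiliate_links"] / total_links,
        # rough proxy for original, "value added" text
        "original_words": page["unique_words"],
    }


def label_from_seed(features, affiliate_cutoff=0.8, min_unique_words=150):
    """Toy rule distilled from hand-labelled seed examples: a page that is
    mostly affiliate links with little original text looks 'bad'."""
    if (features["affiliate_ratio"] >= affiliate_cutoff
            and features["original_words"] < min_unique_words):
        return "bad"
    return "good"
```

A real system would learn the rule statistically from thousands of labelled examples rather than hard-coding two thresholds, but the scaling argument is the same: the humans label once, the computers classify forever.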
My site is 8 years old.
Basically, I implemented mod_rewrite, which meant Googlebot then crawled thousands of pages (e.g. 5,000 per day), mostly duplicate content.
About ten days later I noticed a drop of about 30 spots (~30 to ~60) in the rankings for a one-word search term. Other search terms weren't really affected.
I then excluded those pages via robots.txt.
Haven't really moved up in the rankings since then, maybe 1 or 2 spots, but I have learnt my lesson and am currently working to produce good, quality content on my site, not duplicate rubbish.
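For anyone fixing the same kind of mod_rewrite duplicate-content problem, the robots.txt exclusion described above looks something like this; the paths here are hypothetical examples and would need to match whatever duplicate URL pattern the rewrite rules actually produced:

```
# robots.txt at the site root -- Disallow paths are hypothetical examples
User-agent: *
Disallow: /print/
Disallow: /index.php?
```

One caveat: Disallow only stops crawling. URLs Google has already discovered can linger in the index, so a 301 redirect from each duplicate URL to its canonical version is often the more thorough fix.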
I've been at the -30 penalty and have lost 99% of my rankings since late April... I wish I could say that the penalty expired after 2-3 weeks... ha! My site is clean and was clean, so it's very hard to figure out what Google is penalizing.
For more of those that pop back: please list the significant changes that you made. I've already read/checked the loooooooong lists that people have posted in other threads of things that "brought" them back... with no luck.
Great thread.. keep it going.. it's a serious issue w/o resolution.
Where's GoogleGuy? ;)
I've already read/checked the loooooooong lists that people have posted on other threads that "brought" them back
Have you ever heard of a big-name site getting this penalty? They have their own affiliate sites (thin, duplicated, doorwayed), but they don't seem to be targeted by such a penalty... manual imposition, people...
It's more likely that those sites have redeeming factors that keep them from being hit by a profiling algorithm.