Forum Moderators: open
In FI, CW and DC, my site is in the index. Most search terms are performing the same as usual, EXCEPT the top keywords.
The top keywords are the ones used a lot in anchor text and contained in the title of my page.
The reason I am posting this is not to blame Google. I need to at least find out why this is happening.
Anyone here having the same experience? Maybe we can discuss it here and find out the reason. Maybe there is a new filter working against us.
Unfortunately, I tried that this morning and didn't see a consensus, but maybe someone else can get us more info this way?
spica,
We are torturing ourselves trying to figure out why we were hit by this putative semi-penalty. (BTW, I tend to believe that it is not a penalty, but the result of a change in the algo.) Instead of focusing so much on our own sites, wouldn't it be more useful to analyze what the sites that didn't lose ranking for the same keywords are doing or not doing?
I've noted elsewhere that we have some sites up and some down in this new go-round.
Common elements to the sites that are up: More pages, more diverse content across the body of the sites, more diverse internal nav patterns, less reliance on one- and two-word keyphrases....and fewer backlinks with one and two word keyphrases.
You gotta love the Web! ;-)
I think the penalty applies to exact anchor text matches, not partial matches. Google bombing that targets a specific phrase is what Google is trying to filter out.
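To make that distinction concrete, here is a tiny sketch of exact vs. partial anchor matching. The function, the threshold-free rule and the sample anchors are purely my own illustration; nothing is confirmed about Google's actual check:

```python
def is_exact_match(anchor: str, phrase: str) -> bool:
    # Hypothetical rule: the anchor text IS the phrase, not merely contains it.
    return anchor.strip().lower() == phrase.strip().lower()

anchors = ["blue widgets", "buy blue widgets here", "Blue Widgets", "homepage"]
exact = [a for a in anchors if is_exact_match(a, "blue widgets")]
partial = [a for a in anchors
           if "blue widgets" in a.lower() and not is_exact_match(a, "blue widgets")]
```

Under this theory, only the links in `exact` would count toward triggering the filter; the link in `partial` would not.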
We see some evidence that words which come *earlier* in the anchor text are somehow given more weight, but having said that, your example doesn't support the theory we're working on at my place regarding one- and/or two-word penalties...or perhaps it's better to call it 'de-emphasis.'
FYI, some of the discussion in here involves 'dampening' of homepages; not always total elimination. But I suspect that troi21's problem is the absence of the more recent data in the current SERP's.
I have taken some time observing other websites. And for sure, this new semi-penalty is really going on and involves a lot of websites.
Yes, I am quite sure this new add-on algo (the semi-penalty) is used to catch sites with too much SEO, especially anchor text (e.g. buying links from higher PR sites, lots of guestbook links...).
It is the anchor text theory which made Google the best search engine for years. But recently people have been manipulating it to get higher rankings (link exchanges, buying PR, signing guestbooks, signatures in forums, LinksManager, Zeus...). Someone mentioned that even KIDS can get a good ranking with Google as long as they get some backlinks with the correct anchor text. This is absolutely correct.
What Google is implementing in their new algo is a way to detect sites trying to manipulate anchor text to rank well with Google. And this is the so-called semi-penalty.
(OK, I know, I know, you saw some spammy sites which are still there, or some sites that buy links are standing strong... Come on, the algo is not perfect yet and it is not a catch-all solution; there are always survivors. We are talking about the majority. Google has always had filters to catch spam, but there are always survivors. Filters always have their weaknesses.)
I myself am buying links. But in my case, I didn't get the semi-penalty from that, because those backlinks are not showing up yet (I got the penalty from too many link exchanges). I've checked the sites buying links which have their backlinks already calculated. They NEVER rank well for the keywords they advertise, although they USED TO for the last few updates. Instead some inner pages pop up somewhere late in the results. This matches exactly what most of us are seeing.
So this new algo tries its best to detect sites which try to manipulate their rank via anchor text. Once detected, it activates the semi-penalty for THAT PAGE alone (not the entire site) and for only THE SPECIFIC KEYWORDS (so the page still ranks well for other keywords).
(OK, OK, don't ask me how the algo works, and don't tell me that the silly algo will penalize a lot of innocent sites. YES, it already does penalize a lot of innocent sites, and the algo is not perfect (far from perfect, actually), and that's why we are all complaining here.)
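Nobody outside the GooglePlex knows the real mechanics, but the theory above can be sketched as a toy filter. Everything here, including the 80% threshold, the 10-link floor, and the function name, is invented for illustration; the only point is the shape of the thing: flag a (page, phrase) pair, not the whole site.

```python
from collections import Counter

DOMINANCE_THRESHOLD = 0.8  # made-up cutoff: 80%+ identical anchors looks manipulated
MIN_LINKS = 10             # made-up floor: ignore pages with very few backlinks

def find_semi_penalties(backlinks):
    """backlinks: list of (target_page, anchor_text) pairs.
    Returns a set of (page, phrase) pairs to semi-penalize: the page keeps
    ranking for everything else, but this phrase's anchor weight is ignored."""
    per_page = {}
    for page, anchor in backlinks:
        per_page.setdefault(page, []).append(anchor.lower())
    flagged = set()
    for page, anchors in per_page.items():
        phrase, count = Counter(anchors).most_common(1)[0]
        if len(anchors) >= MIN_LINKS and count / len(anchors) >= DOMINANCE_THRESHOLD:
            flagged.add((page, phrase))
    return flagged
```

A page with 9 out of 10 backlinks saying "widgets" would be flagged for "widgets" only, while a page with ten varied anchors would pass untouched.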
And remember the old saying: as long as Google provides good results to their visitors, no one will care if your site (and mine) vanishes. There are always other websites ready to take our positions.
If you want to research this, don't look at sites which still exist in the index; try to think of competitors who used to rank well and observe them. Or observe those who are buying backlinks (and whose links are already calculated into the index).
No one knows how this algo works (except those at the GooglePlex), but for now, we know the reason why it exists and why it is so important, as I have explained in this post. What I hope is that we can now work together to find out how to recover our sites, because we are on our own here.
MHes, AthlonInside, All -
Do you see a difference between one, two, three, four keyword links? Or do you think that it's more an issue with *perfect text* - i.e., *any* exact match, regardless of number of words in the anchor text...
FYI, at my place, we're observing that the one and two word phrases are suffering more...but I believe this is all closely tied with the theming discussions going on. Meaning that if Google is trying to put more emphasis on pages/sites with more than just one or two apparent keywords, then sites that are SEO'ed around a single keyword will fall, and will *also* be penalized for the appearance of too many backlinks with perfect anchor text.
Thoughts?
Everyone is different; we've heard of 1, 2 and 3 keyword phrases encountering the same problem. And I don't agree they are putting more weight on on-page factors. Anchor text is still the most important factor.
What this new algo does is: if it catches you, your anchor text no longer gives weight (for that keyword only). So in a competitive area, your page might be lost while some inner page shows up. The inner page shows up because it is now calculated as more relevant than your main page. Why? Because the main page has lost the link weight. For non-competitive keywords, people will see their listing drop down a few pages.
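A toy scoring model of that paragraph (the weights and the score formula are invented, nothing like Google's real formula; it just illustrates why an inner page pops up once the main page's anchor weight for one keyword is zeroed):

```python
def score(page, keyword, penalized):
    # Hypothetical relevance = on-page score + anchor-text score,
    # with the anchor score zeroed for a semi-penalized (page, keyword) pair.
    on_page = page["on_page"].get(keyword, 0.0)
    anchor = page["anchor"].get(keyword, 0.0)
    if (page["url"], keyword) in penalized:
        anchor = 0.0  # anchor text no longer gives weight, for this keyword only
    return on_page + anchor

main = {"url": "/", "on_page": {"widgets": 2.0}, "anchor": {"widgets": 10.0}}
inner = {"url": "/w.html", "on_page": {"widgets": 4.0}, "anchor": {"widgets": 1.0}}

no_filter = set()
filtered = {("/", "widgets")}
```

With no filter, the main page wins on anchor weight alone (12.0 vs 5.0); once the pair is flagged, the main page scores only 2.0 and the inner page's 5.0 outranks it, which is exactly the "inner page pops up" symptom.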
Aren't things getting clearer? We can only find out information on this new algo by ourselves (and we should), because Google will surely keep it to themselves. They can no longer afford to tell people so much.
I would say I am guilty of having my main keywords in the anchor text for all my incoming links. I have my link info on my links page for people to pick up and use.
But... how much is too much? 100, 200, 300 links, all with the same anchor text?
If you have some links that are not using the exact text, are you safe?
Any thoughts?
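No one outside Google can answer "how much is too much", but if a ratio-style trigger exists, the raw count may matter less than the fraction of identical anchors. A made-up illustration (the counts and anchors are arbitrary; they only show how mixing anchor texts dilutes the ratio):

```python
def identical_anchor_ratio(anchors, phrase):
    # Fraction of backlinks whose anchor text exactly equals `phrase`.
    matches = sum(1 for a in anchors if a.lower() == phrase.lower())
    return matches / len(anchors)

# 300 identical anchors on their own: ratio 1.0, presumably the worst case.
uniform = ["widget reviews"] * 300
# Mixing in varied anchors dilutes the ratio, whatever the real threshold may be.
diluted = uniform + ["my widget site", "click here", "www.example.com"] * 100
```

On this model the 300 identical links alone give a ratio of 1.0, while the diluted profile drops to 0.5, so varied link text would be what keeps you under any such threshold.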
But whether I'm affected by any new filter seems to depend on the datacenter. Right now I'm OK on sj, fi and aol. On www my index page is dropped and a sub-page appears on page two instead.
[edited by: Eleveeze_Preslee at 3:44 pm (utc) on May 22, 2003]
If Google were to start punishing sites for having lots of backlinks with the same anchor text, wouldn't it be quite easy to stuff other people's sites?
That is the only thing which makes me doubt this theory, but I have to admit it would explain what I am seeing.
I.e. index pages not appearing in SERPs, but sub-pages are.
One thing I would add, though, is that index pages DO appear when I search on google.com - only when searching on .co.uk for "UK only" sites do I see this issue - and not on all datacentres.
Let's suppose this theory is correct - how do we go about correcting it?
Right, but I see that only when the keyphrase in the anchor text is highly optimised on the page as well. I remember a long time ago GoogleGuy saying that it was possible to over-optimise your site. Perhaps they have upped the filter for it and the anchor text is the trigger. So you couldn't hurt someone just with tons of the perfect anchor text to their site unless Google already considers them over-optimised. Then again, maybe it really is to do with the incomplete update/reindex.
The semi-penalty is a new thing being applied. It requires a lot of testing and tweaking. Since they have 8 datacenters, why not tweak and test 8 different copies concurrently instead of applying the same algo to all datacenters? This can save a lot of time, and the testing would be more effective.
UK_Web_Guy,
What happened to the theory that links TO your site cannot harm its rankings?
Let's suppose this theory is correct - how do we go about correcting it?
IMHO, if you already rank well, the best thing is to stop exchanging links. If you continue, you might trigger the penalty. This is of course not the best way, but it is the safest way, for now. As time goes by, we can find more clues on this.
I would not spend the time doing this as you say, but if links to a site could harm it, this would open up a massive can of worms that people could and would exploit.
Let's say a company angers a group of people for whatever reason; they could all say, "Hey, let's all link to XYZ site," and that would harm its Google rankings?
It's like voting in reverse?
On the basis that this is part of the "new" algorithm - does anybody have any views on how you could combat it?
On page factors?
I think we should still encourage links in, for all the PR benefit. Even if you have 1000 identical links in saying "widgets", the spider will still come along with a bucketload of PR, muttering "widgets, eh? Let's see", and if it then sees loads of heavy optimisation for "widgets" it will laugh and ignore it... but if it sees a nice, sensible site with helpful optimisation without all the usual tricks, it will be very happy!
So links in can't hurt you..... heavy site optimisation can.
As a reminder... early days to change anything yet ;)
What happened to the theory that links TO your site cannot harm its rankings?
Oh, that theory. Well, since spammy Webmasters were going out and setting up all sorts of bogus backlinks, maybe G decided to have a go at putting a stop to it?
If they are doing this, the question of *how* they are filtering out the bad backlinks and still giving credit for the legit ones is way beyond my ability.
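One speculative way such a filter could still credit legitimate links while discounting bogus ones is to count each (linking domain, anchor) combination only once, so a guestbook spammed 500 times from one host carries no more weight than a single link. This is purely a guess on my part, sketched with invented names:

```python
from urllib.parse import urlparse

def per_domain_anchor_counts(backlinks):
    # backlinks: list of (source_url, anchor_text) pairs.
    # Each (domain, anchor) combination is counted once, so repeating the
    # same anchor hundreds of times from one host adds nothing extra.
    seen = set()
    counts = {}
    for source_url, anchor in backlinks:
        key = (urlparse(source_url).netloc, anchor.lower())
        if key not in seen:
            seen.add(key)
            counts[anchor.lower()] = counts.get(anchor.lower(), 0) + 1
    return counts
```

On this model, 500 "widgets" links from one spammy domain plus one from a second domain yield a count of just 2 for "widgets", while a single legit link from a fresh domain still gets full credit.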
I'm eager to hear other theories that explain much of what I and others are seeing regarding index pages dropping in the SERP's while sub-pages don't.
In my view, three possibilities:
1) More focus on theming is causing some of this.
2) A possible SEO-spam filter (as being discussed here) is causing some of this.
3) All of this is nonsense and will go away when the newer data, backlinks etc arrive and the bugs are worked out.
Unfortunately, I don't believe that #3 explains what we're seeing, because the relative showings of different pages and sites don't seem to support it in our view. That leaves #1 and #2.
Regarding #2, MHes has offered the best explanation so far that I've seen (earlier in this thread).
I think we should still encourage links in, for all the PR benefit. Even if you have 1000 identical links in saying "widgets", the spider will still come along with a bucketload of PR, muttering "widgets, eh? Let's see", and if it then sees loads of heavy optimisation for "widgets" it will laugh and ignore it... but if it sees a nice, sensible site with helpful optimisation without all the usual tricks, it will be very happy!
Yup, I think 'penalties' are rare (cue posts on penalties experienced!); instead, Google 'ignores' things it doesn't like, hence a drop in rankings. In this case, Google is ignoring the h1 tag if it matches various links in, the title, the file name etc., and thus no h1 benefit is being applied to your ranking.
But I suspect we are all in the habit of searching for 10 or so of our targeted keyword phrases and getting depressed about the rankings. How about all the other potential phrases? I think Joe Public is getting more sophisticated about searching and beginning to use longer search strings, and sites will still rank well for those. We may find a small drop in overall traffic, but I suspect quality traffic often comes from more specific search phrases. This traffic will remain, as the SEO algo won't kick in, and sales may still hold up.
I'm sure this may apply to some sectors more than others, but in the areas we cover our log files have a huge range of keywords. Also we log searches people do within our site.... it is amazing what is in there!
Are you ranking for "black widgets" or "black widgets with pink spots" or "black widgets with a green stripe" etc. etc. I suppose if you are in a sector which has limited variation of search phrases, then you are right, a few keyword phrases are very important.