Domains 4, 5, and 6 are gone. So are all the others we registered this year.
Note that all domains which were filtered were less than 2 years old on November 14th when the update started. Does anyone have any other data to add to this?
When I say the domains were filtered, I mean that the index page is still in the index somewhere but has been removed from the top 100 results. All sites were optimised, and so are the ones that remain in the index.
Since I posted this earlier, I checked plenty of top 10 sites, and barely any of them were less than 2 years old. If they were, they were not heavily optimised.
Could it really be this simple?
This would also explain all of the amazon/ebay/bizrate, etc. links now being so high, since these sites have been around for a long time.
I have built at least 7 ecommerce sites that are competing against each other (all for different customers). There is no doubt about it: the older ones are doing better. It's not just a matter of having more links because they are older. As a matter of fact, I have a couple of sites with lots of good links that are losing to sites with very few.
Surely, there is some truth to this.
Here is another example. I saw a domain that was not being used for many years jump to #1. It was registered in 1995, but it was never developed. Then someone came along and added some links, and presto!
Newer sites get loads of backlinks, reach their desired positions, and then forget about getting links. Perhaps a continuous stream of new backlink data will make a site rank higher. This would also imply the site is more relevant, zeitgeist-wise.
I don't think it is freshness of backlinks. My '97 site still has a lot of relevant info on the topic, but it is not one I update regularly. The topic is also on the wane, so I don't think new backlinks are being added. And my new sites are so new that the few backlinks I have are all very recent additions.
James_Dale, I'm seeing carnage industry-wide regardless of the fact that some sites have new links, some have old links, and some have a healthy mix of both.
No doubt it hasn't damaged any of the searches in the area that site represents. In fact, I would say the vast majority of them are greatly improved by the removal of the few troublesome sites that occasionally venture into that field.
I can think of some pages that should really be on the first page of some of those searches but seem to have been knocked off; still, no more damage was done to the SERPs (the sites knocked down are a different matter) than would be done by a normal update.
In fact, none of the sites, commercial or non-commercial, that I deal with has had any problems. Nor have I come across any bad results except when I attempted searches mentioned by others here. I don't gamble, I travel on the cheap, I don't understand why people pay for porn when there is so much available for free, and I just bought a house so I'm not looking for a realtor.
I don't know if I am ready to buy that there is a filter that has only been applied to some searches. It seems like all searches have been "cleaned up", it's just that some were cleaned with a toothbrush and some with a power sprayer.
I am interested in seeing what sensible conclusions people come up with. But the signal-to-noise ratio is very low in all these threads. Few seem to be interested in looking beyond their pet theory as to why they are having problems.
It occurred to me this morning that it may not be the anchor text per se, but the context of that anchor text that's causing the problem. The way the links are being implemented, so to speak.
SEs have a couple of fundamental problems with anchor text: nav menus and link farms with keyword-rich AT (anchor text). Basically, these are just long lists of links with keyword-rich AT that nobody ever looks at. They count as inbound links, but they should have no real influence, since everyone knows nobody ever looks at them, right? So... what if the engines have figured out a way to check the proximity of links, one to another on a page, and weigh the AT accordingly? Meaning, if you have 3 anchors on a page with keywords and they appear naturally on the page (spaced out), you're fine. But if you have 30 on a page and they're all in succession, you lose the benefit of the anchor text.
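For what it's worth, here's a rough sketch of how an engine might detect those runs of back-to-back links. Everything in it is my own invention for illustration (the parser, the thresholds, the scoring), not anything Google has confirmed:

```python
from html.parser import HTMLParser

class AnchorRunDetector(HTMLParser):
    """Records each anchor's text and how many characters of plain
    prose appeared between it and the previous anchor."""
    def __init__(self):
        super().__init__()
        self.anchors = []      # list of (anchor_text, gap_before_anchor)
        self._in_anchor = False
        self._current = []
        self._gap = 0          # plain-text chars seen since the last </a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_anchor = True
            self._current = []

    def handle_endtag(self, tag):
        if tag == "a" and self._in_anchor:
            self._in_anchor = False
            self.anchors.append(("".join(self._current).strip(), self._gap))
            self._gap = 0

    def handle_data(self, data):
        if self._in_anchor:
            self._current.append(data)
        else:
            self._gap += len(data.strip())

def discounted_anchors(html, min_gap=40, max_run=3):
    """Return anchor texts that would lose their AT benefit because
    they sit in a run of more than `max_run` links with fewer than
    `min_gap` characters of prose between them.  Both thresholds are
    made-up numbers, purely for illustration."""
    parser = AnchorRunDetector()
    parser.feed(html)
    discounted, run = [], []
    for text, gap in parser.anchors:
        if gap < min_gap:
            run.append(text)            # link follows hard on the last one
        else:
            if len(run) > max_run:      # run ended: was it too long?
                discounted.extend(run)
            run = [text]
    if len(run) > max_run:              # don't forget a run at page end
        discounted.extend(run)
    return discounted
```

Run against a page with a four-link nav menu and one link embedded in a paragraph, the nav anchors get discounted and the in-prose anchor keeps its weight.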
I checked the SERPs in my industry for the popular keywords, and while most ARE in fact spammers, the ones still left at the top have one thing in common: they all use affiliate programs. Now, affiliate links are usually implemented MUCH differently than normal link exchanges. As there's usually an exchange of money involved, the links are usually given more prominent display on the page, away from other links that might detract from the money link. This would protect that AT from suffering these losses.
While it's only a theory, I have one last piece of evidence. I use a fairly popular (and to remain unnamed) piece of software to check keyword density and other factors. You know... to speed up the process and let me focus on creativity. Anyway, this program auto-updates every few days, and just yesterday it started to ignore my nav menu as anchor text. It used to report on-page links just fine. Now, after the most recent update, it doesn't report any on-page AT even though it's clearly present in my nav menu. I really don't care about this as... well... it's a nav menu after all. But it may be an indication that others are starting to catch on to something. Stranger things have happened.
I guess a quick way to check this is to have everyone sound off on HOW they implement their link exchanges and AT strategy. Is it affiliate? Or "link-list" based?
Meaning, if you have 3 anchors on a page with keywords and they appear naturally on the page (spaced out), you're fine. But if you have 30 on a page and they're all in succession you lose the benefit of the anchor text.
You might have something there... thinking in terms of a user, if I see a long list of links, the alarm bells go off... if I were Google, I'd look for that and call it spam. There's a chance that it's totally unfounded speculation, with nothing to do with reality, of course, but it's a great theory. Good one. No one else knows what's going on... maybe you stumbled across it.
When a query is carried out in Google, the top 1000 results are initially decided by on-page factors. After this, all are ordered according to PageRank scores.
Next, the interconnectivity of these top 1000 results is analysed, and the 'best connected' sites within this group are re-ordered. This last stage is what produces the current SERPs.
In order to get the best picture of sites that are interconnected in a themed way, and not just for the exact search term itself, Google uses stemming to find relevant word variations.
If your site's backlinks are mainly from sources that do not show up in the top 1000 for your search terms, then the chances are you've been kicked out or dropped dramatically.
On the other hand, if you are well connected within the top 1000 results for your search terms (and stemming, etc), you will have had a much better chance of maintaining your position.
What you have said matches what I see - in my industry it looks like this was the ONLY criterion used to rank sites. Government, institutional, and educational widget order forms and documents dominate. Most of them only have value and relevance to employees. I guess the question is whether the current disregard for on-page factors and PR is permanent or a glitch?
If LocalRank were the answer then a couple of our booted index pages would have survived.
Since posting my domain age data I have also found a few exceptions. Don't forget, guys, that just because your domain is new and doing well doesn't mean that age is not a factor. You would still have to trip the spam filter to get removed.
Let's add some pagerank figures and optimization data and see if they shed any light:
I=Internal site anchors to home page optimized for keywords
H=Headings contained keywords
B=Backlinks contained keywords
T=Page title contained keywords
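If people do sound off with their data, something like this could tally it up. The site names, PR values, and flags below are made up placeholders; the four labels follow the legend above:

```python
# Hypothetical survey rows: (site, pagerank, I, H, B, T)
# per the legend above.  Sites and values are invented examples.
sites = [
    ("example-widgets.com", 5, True,  True,  True,  True),
    ("old-widget-shop.com", 4, False, True,  False, True),
]

def summarize(rows):
    """Count how many surveyed sites carry each optimization factor."""
    labels = ("I", "H", "B", "T")
    counts = dict.fromkeys(labels, 0)
    for _site, _pr, *flags in rows:
        for label, flag in zip(labels, flags):
            counts[label] += bool(flag)
    return counts
```

With enough rows from different posters, the factor counts for filtered vs. surviving sites could be compared side by side.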