Forum Moderators: Robert Charlton & goodroi
Here's what I've observed in the past 2 weeks:
Observation #1: Pre-Allegra, I had over 50 legitimate search terms that ranked in the top 10 on Google, bringing in 70% of my web traffic. These terms have all dropped 100+ slots after the Allegra update, and Google now counts for less than 30% of my traffic.
Observation #2: Before Allegra, I was #9 on Google for the term "widget". After Allegra, I am now #571 in the Google SERPs. (I am still in the top 40 on Yahoo, and the top 30 on MSN).
Observation #3: In Google, I used to be a strong #1 for "mycompanyname" (past 6 years!), but now I show up #4, right under an "AsSeenOnTV" web site that doesn't have a link to me or my name ANYWHERE on their 3-page web site. (But I bet they are on some of the same reciprocal link pages that I am on...)
Here's what I believe caused it: I have been working on an (outsourced) link exchange program for the last 6 months, using the term "widget" mostly for the incoming text. I thought I was being a good SEO.
My Theory is that Google is punishing webmasters who have engaged in link exchange programs to try and improve their Pagerank. At least that's the only correlation I can draw from what I seem to be experiencing!
Think how easy it would be for Google to spot a link exchange program - compare a list of someone's outgoing links with a list of someone's incoming links - hmmmm - he/she has 150 INCOMING links from (slightly) off-topic sites X,Y and Z, and it so happens that he/she has 150 OUTGOING links to the same X,Y and Z! This smells like a reciprocal link exchange program to me! (Sergey – give these guys two cardboard cookies and send them down the river!)
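The check described above really would be trivial to implement. Here's a rough sketch, purely illustrative: the site names and the 50% threshold are made up, and nobody outside Google knows whether they actually do anything like this.

```python
# Speculative sketch of the reciprocal-link check described above.
# All data and thresholds are invented for illustration.

def reciprocal_ratio(inbound, outbound):
    """Fraction of a site's inbound links that it also links back to."""
    inbound, outbound = set(inbound), set(outbound)
    if not inbound:
        return 0.0
    return len(inbound & outbound) / len(inbound)

inbound = ["siteX.com", "siteY.com", "siteZ.com", "blogA.com"]
outbound = ["siteX.com", "siteY.com", "siteZ.com", "dmoz.org"]

ratio = reciprocal_ratio(inbound, outbound)
if ratio > 0.5:  # hypothetical threshold
    print("Smells like a link exchange: %.0f%% reciprocal" % (ratio * 100))
```

With a full link graph already in hand, this intersection is cheap to compute for every site in the index.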
Can anyone provide similar results based on these observations?
.
[edited by: ciml at 4:10 pm (utc) on Feb. 12, 2005]
[edit reason] No specifics please. [/edit]
Link campaigns are okay - even reciprocal link campaigns - but have high standards. Link only to those quality sites that make sense to link to from a user standpoint. And put only as many links on the page as a user can reasonably tolerate. If you think about the user, you will probably be in very good stead with the search engines as well.
Another possibility is the relevancy issue. Or, if you link to only one or two pages on a big site, that could look unnatural as well.
I think the biggest factor is Google changing its mind about what is acceptable behavior and what is not. I don't think they want to state their position, because their position is whatever suits them best at the moment, and so the criteria constantly change.
In the SERPs I monitor, it seems like Google is moving away from PageRank in a big way at the moment, with PR3 sites now ranking above established PR7 and PR8 sites. I've never seen this before to this extent.
It's sad, but I'm really starting to look forward to Google losing its market share so businesses aren't so dependent on the whim of a single company. I hope some more new players enter the market at some point.
I'm currently taking my web profits and putting them into local brick-and-mortar businesses (for which websites are just glorified business cards) because no one can make them disappear.
They were always wrong. I mean, come on, we all knew what we were doing. Google did not intend PageRank to be manipulated that way. Well, we learn the hard way, I guess. Some of my sites tanked a couple of months ago. Oh well. New strategy: Make sites people actually like and link to naturally. About 10,000,000 times harder, but that's how it should have been from the start. I'll be happy when every crap site is removed from Google. I mean, 95% is garbage. Google still has it all wrong, but I think they are at least trying.
Say you are an accommodation provider in France. Why are sites going to link to you naturally, and at a greater rate than another accommodation provider that gives partners an incentive to do so?
End your links program at your own risk. Better to link smart, than not at all.
Without recip links, you cannot do SEO.
Sure, you can. You just can't manipulate Google PageRank.
I think that a link campaign in which you use the same anchor text could be a dead giveaway that you're "abusing the system"
What if your domain name is a keyword?
And everyone links towards that keyword? i.e. internet
I am monitoring many sites whose domain name is the same as their inbound links, and yes they are all at the top.
It's all ridiculous to assume that you are getting penalized because your site's URL was used in a spam directory site. You can choose who you link to, not who links to you.
secondly:
> Yes, I paid $X per link, but to a provider that used on-topic sites for the exchange
I don't see how anyone might seriously promise to do that for you. Maybe people do, but I doubt their criteria for on-topicness match Google's algos.
My theory is that Google has added a number of as-yet-undescribed linguistic (and other) filters to the relevance of inbound links. As I pointed out on various other occasions, my strongest "proof" for this idea is the progress Google has made by adding particles of localization like "near" or "to" to the logical operators in ordinary search queries, at least on the beta pages at maps.google.com. So I think Google is already a dozen steps beyond LSI, LSA and other "simpler" (LOL) explanations lazy spammers might be willing to do research on. I do not claim this will explain everything, but together with what eyezshine indicated in msg #16 we might find a partial explanation for Allegra within the next nine months or so.
But I also observed that stating this seems to kill threads (*g*).
You would only be killing this thread if your name was Googleguy!
--------------------------
Here's an addendum to my theory:
What if Google is employing the same type of strategy that credit card companies use to determine "unusual activity" on your account?
Ex 1: Growth Rate - Say the average accumulation of inbound links for authority sites on a topic is 20/month. However, your site has been accumulating 50/month, and they all say "Super Duper Widgets" - This is the web equivalent to someone using your credit card to buy $2,000 worth of software a day for 4 straight days - a red flag...
Ex 2: Location - Say 60% of your 50 new links/month are coming from pages with the words "Add your link here" or "Link partners" - This is the web equivalent of geolocation sampling - someone is using my credit card to buy leather underwear in Amsterdam (I live in Atlanta, GA) - another red flag...
In these cases, your credit card company simply disables your card until they can verify your charges, while Google assigns a dampening factor on your URL to push you down in the SERPs.
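The two flags above could be sketched roughly like this. Everything here is invented for illustration (the thresholds, the marker phrases, the example data); it's just the analogy expressed in code, not anything Google has confirmed doing.

```python
# Illustrative sketch of the "unusual activity" flags described above.
# Thresholds and link data are hypothetical.

def link_growth_flag(new_links_per_month, topic_average, factor=2.0):
    """Flag if a site gains links much faster than its topic's average."""
    return new_links_per_month > topic_average * factor

def link_location_flag(source_pages, markers=("add your link", "link partners")):
    """Flag if most new links come from obvious link-exchange pages."""
    hits = sum(1 for page_text in source_pages
               if any(m in page_text.lower() for m in markers))
    return hits / len(source_pages) > 0.5

flags = 0
flags += link_growth_flag(50, 20)  # 50 new links/month vs. topic average of 20
flags += link_location_flag([
    "Add your link here!",
    "Link partners page",
    "A genuine review of widgets",
])
print("red flags:", flags)  # enough flags -> dampening factor, in this theory
```

Each flag on its own proves nothing, which is exactly the credit-card parallel: one odd charge gets noted, several in a row get the card blocked pending review.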
Sound reasonable?
.......
Nice analogy. And an interesting hypothesis.
[Now I know what to tell my wife if she wants to know about the leather underwear that I bought in Amsterdam. :-)]
copied from my other post: "Suppose I have a blog and forum about foot fetish (I don't. Really ). I also link sitewide to your political website, because I like it. It's just yourwebsite.com and it's listed along with the other dozen or so of my favorite sites, totally unrelated to foot fetish. Will Google penalize them for "having sitewide links" on 20,000 pages? How many of these scenarios exist? Way too many to start essentially banning sites based on that"
[webmasterworld.com...]
in this case I can link to Yahoo, MSN, europeforvisitors, BigDave's site, GoogleGuy's blog, Mother Teresa's site etc. etc. Are they going to be bumped to page 54 because they got more inbound links than usual?
Credit card companies actually call to verify if you're the one buying, they don't just decline it.
Actually, I tried to buy 4 Dell servers the other day on my credit card, and it was declined. I called in, and they said there was a "block" on my card - several charges had to be verified before I could use the card again. This has happened to me more than once.
The point to my earlier analogy is that you receive "Flags" for further investigation by Google reps (Aren't they advertising for quality analysts?) and then you might get a manual demotion if you fail X number of tests PLUS a manual review.
I know that sounds improbable - who can police the web?
Your foot fetish example is a good one, by the way.
The site has about 300 pages, hence 300 links x 10 = 3,000. I believe that Google treats sub-domains and main domains as the SAME site, hence it considered that all 3,000 were to us: obvious SEO, let's put it back in the sandbox.
I'm no expert, but doesn't having 3,000 links from one site to another seem unnatural to you?
Most of my links come from message boards (from others finding something they like on my site and sharing it), so that might be a good place to go see what natural anchor text looks like (they range from "Check out this widget" to "Link" to just the URL as the anchor text).
Maybe too complex
I don't think so. I think it's pretty straightforward, but a lot of stuff is taken into consideration. I think the confusion comes when people on message boards are posting individual causes/effects, while algorithms use a conglomerate of factors. For example: too many repeated words in the title, penalty. Too many keywords on the page, penalty. Too many inbound links with the same anchor text, penalty. Too many reciprocal links, penalty. Too many of the same link from one site, penalty. It adds up.
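The "it adds up" idea is essentially a weighted score crossing a threshold. A toy version, with signals, weights, and threshold all invented for illustration:

```python
# Toy model of cumulative penalty signals. Each signal is weak on its
# own, but a combined score can cross a penalty threshold.
# All names, weights, and the threshold are hypothetical.

SIGNALS = {
    "repeated_title_words": 1,
    "keyword_stuffing": 2,
    "same_anchor_inbound": 2,
    "excess_reciprocal_links": 2,
    "sitewide_links_from_one_site": 1,
}

def penalty_score(observed):
    """Sum the weights of the signals a page trips."""
    return sum(SIGNALS[s] for s in observed)

tripped = ["same_anchor_inbound", "excess_reciprocal_links", "keyword_stuffing"]
score = penalty_score(tripped)
print("score:", score, "->", "penalized" if score >= 5 else "fine")
```

This is why isolating a single cause from the outside is so hard: no one signal is decisive, so two sites can share any given "cause" and only one gets hit.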
We don't have any tricks on our pages and were doing well in google until update. My competitors are still there, but not us, we're now on page 10 or worse.
I went through our link directory and found affiliate type of sites we have links to. These sites are now pr0. They had good pr when we linked to them.
Now that google is going to wipe us out because of this, I'm not sure what to do next. Start all over again (what a mess that will be) or wait and see what happens, if we don't go under first.
Yahoo had a similar problem for months and months. Google has it now. I'd bet 99% of disappeared sites have this problem.
I analyzed 12 structurally identical pages on my site. Only one of these pages disappeared from the SERPs completely, and I wondered why. There is no duplicate content or hijacking problem. There were also enough internal incoming links to this page for the Googlebot to find it. In fact, it was the only page with a link coming from another site, an on-topic directory listing with PR4. On another page on my site I have a link to this directory. After analyzing this drop I can only conclude that the following link scheme
mysite.com/a.html -> directory.com/b.html -> mysite.com/c.html
caused mysite.com/c.html to disappear. It was the only one of the 12 pages with an incoming link from a PR4 site, so from the PR point of view it had the best IBLs. Most internal pages on my site only have PR2 or PR3. Still, this page completely disappeared from Google.
I have now removed the outgoing link from my site to the external directory to see if my page comes back again. If it comes back, I will reinstall the outgoing link. If the page disappears for a second time from the SERPs, that will be proof that the A->B->C scheme is hurting C if A and C are on the same server. It might take some time before I can post the result of my testing.
The credit-card analogy is indeed a good one. Another one that came to my mind is the way intrusion-detection-systems work. What both have in common is: On a huge statistical basis - and google definitely has that - you can do quite a lot; Growth rate and Location may be good, but you can also take any mathematical abstraction you want. And it is almost impossible to analyze such filters by trial and error as we tried in the past.
What I'd also doubt is that Google primarily operates with penalties. It's a psychological question: their glass is not only half-full after the IPO, it's filled almost to bursting.
I do not believe Google has implemented any algo by which you may harm another site by linking to it.
Theory #3 - Google has really turned up the LSI (Latent Semantic Indexing) filter.
LSI means that a search engine tries to associate certain terms with concepts when indexing web pages. For example, Paris and Hilton are associated with a woman instead of a city and a hotel, Tiger and Woods are associated with golf. (Reference: IBP newsletter)
You can read more at: [javelina.cet.middlebury.edu...]
This might explain my earlier example of getting results about breast cancer, concrete and vitamin E when searching for "active server pages determine age"
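LSI proper factorizes a term-document matrix with singular value decomposition; as a much cruder stand-in, you can get the flavor of "terms become concepts" just by counting co-occurrence in documents. The tiny corpus here is invented:

```python
# Crude co-occurrence stand-in for the LSI idea described above:
# terms that keep appearing together get associated with the same
# "concept". The corpus is made up for illustration.

docs = [
    "paris hilton celebrity photos",
    "paris hilton party celebrity",
    "paris france hotel eiffel",
    "tiger woods golf major",
    "tiger woods golf swing",
]

def cooccurrence(term_a, term_b):
    """Count documents containing both terms."""
    return sum(1 for d in docs
               if term_a in d.split() and term_b in d.split())

print(cooccurrence("paris", "celebrity"))  # Paris + Hilton, the person
print(cooccurrence("paris", "hotel"))      # Paris, the city
print(cooccurrence("tiger", "golf"))       # Tiger Woods, not the animal
```

In this toy corpus "paris" associates more strongly with the celebrity sense than the city sense, which is the kind of disambiguation the theory says Google is now applying to queries and link text.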
..rehabguy
We just got slammed by google and the only thing I can come up with is linking to affiliate type of websites.
Are you using direct links with domain names like, "cj.com" etc? If so, have you considered using internal redirects, instead? Makes me wonder if the algorithm sees links or if it sees where it lands when it follows those links...
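For anyone wondering what an "internal redirect" looks like in practice: you link to a local /go/<partner> URL and answer it with an HTTP redirect, so your pages never carry the affiliate domain directly. This is a minimal sketch; the partner slugs and tracking URLs are hypothetical.

```python
# Minimal internal-redirect sketch: link to /go/<slug> on your own
# domain, and 302 the visitor to the affiliate URL.
# Partner slugs and tracking URLs below are made up.

from http.server import BaseHTTPRequestHandler, HTTPServer

PARTNERS = {
    "cj": "http://www.cj.com/track?pid=12345",
    "widgetshop": "http://affiliates.widgetshop.example/?ref=me",
}

def resolve(path):
    """Map a local /go/<slug> path to the real affiliate URL (or None)."""
    if not path.startswith("/go/"):
        return None
    return PARTNERS.get(path[len("/go/"):])

class GoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = resolve(self.path)
        if target:
            self.send_response(302)  # temporary redirect to the affiliate
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404)

# To run it: HTTPServer(("", 8080), GoHandler).serve_forever()
```

Whether this actually helps depends on the open question above: if the crawler follows the redirect and judges the destination, the redirect hides nothing.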
Doesn't sound like you have any penalty, or you would be GONE from the SERPs, even for your company name. Not in 4th place.
I'd speculate your problem is that you don't have external links coming into your internal product pages. As Google reduces the value of PageRank, your internal links on your site have less effect. To rank well these days you need other sites to link to your internal pages.
Since you have only requested links for "widget" in your anchor text, and only links to your home page, you've not given Google much opportunity to recognise your internal pages for what they are. PageRank is dead - you can't rely on it to get your internal pages into the SERPs any more.
I'd speculate your problem is that you don't have external links coming into your internal product pages.
That sounds reasonable to me. Unfortunately (or maybe fortunately...), I have 1,000+ pages, soon to be 5,000+ pages as I add two new product lines. It's hard to get links to those...
Maybe I need to work on some good content!
Its all ridiculous to assume that you are getting penalized because your site's url was used in a spam directory site. You can choose who you link to, not who links to you.
Not ridiculous at all. You can't control who links to you, I agree to that extent, but you can control your own linking campaigns and therefore some of the people that link to you. Maybe even a huge proportion of the people that link to you. No one's talking about a link or two, but hundreds or thousands of links with the same anchor text, coming from sites whose content isn't really related to yours.
It's a simple matter to use probabilities to determine the likelihood that people are linking to you because of a linking campaign or similarly orchestrated effort on their part. For example, if links were natural, I'd imagine you would not just get people linking to you as "Site Name" or "Keyword Combo" but as "click here" or "here" or "http://www.yourdomain.com" or "www.yourdomain.com" or "word A" or "word B" etc. Furthermore, if a heap of off-topic sites are linking to you, that could raise a red flag as well. If Google is at war with SEO, these are the main ways to manipulate Google's premier contribution to search, their concept of PR. Wouldn't it make sense that Google is aware and is paying attention to these things?
Even if your website name were a keyword, not all links to you would be your website name (if they occurred naturally).
As far as using your example of #1-ranking sites doing well, c'mon, you can't possibly know what the anchor text of all the sites linking to them says. In other cases, there are rich, untouchable sites that none of the rules that kill the rest of us seem to apply to.
I'm not saying that anchor text or the other things I mentioned are the cause, only that they are certainly something that could be pretty easily determined and so could be penalized in a systematic way.
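The anchor-text argument above is easy to quantify: natural links arrive with many different anchors, while a campaign produces one anchor over and over. A sketch, with all link counts and anchors invented:

```python
# Sketch of anchor-text diversity as a campaign detector.
# All anchor data below is made up for illustration.

from collections import Counter

def top_anchor_share(anchors):
    """Share of inbound links using the single most common anchor text."""
    counts = Counter(a.lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

natural = ["click here", "here", "http://example.com", "Example Co",
           "check out this widget", "example.com"]
campaign = ["super duper widgets"] * 140 + ["click here"] * 10

print("natural:  %.2f" % top_anchor_share(natural))   # anchors spread out
print("campaign: %.2f" % top_anchor_share(campaign))  # one anchor dominates
```

A profile where 90%+ of inbound anchors are a single keyword phrase is exactly the kind of statistical fingerprint that could be flagged automatically, which is all the post above is claiming.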
There are some articles on #*$! that discuss the issue. (Wow looks like Webmaster World won't let us mention them... the #*$ is their filter, not mine).
Personally I just think Google has lost their fool mind, but that's not very helpful.
As a second choice you still can optimise the internal anchor text on your site so you have a few links to each product page with relevant keywords.
Not ideal, but if your competitors do the same and have the same problems as you with the sheer number of pages, then you can still do well.