WebmasterWorld: Home / Forums Index / Google / Google News Archive

Google News Archive Forum

    
Dilution of anchor text in internal links
anchor text, internal links, keywords.
ppg

10+ Year Member



 
Msg#: 14400 posted 12:44 pm on Jun 17, 2003 (gmt 0)

News stories on the site I run get a good bit of google traffic in, my main pullers. At present I link to new stories from the home and news index pages with the whole headline in the anchor text like this:

"keyword keyword widget - the new widget software version adds new features which make your life complete!", or something.

I'm wondering if I'm diluting the anchor text benefit by doing it like this? Perhaps I should use only keywords in the anchor text and have the rest of the text delinked beneath?

Does anyone have any opinions on this?

Thanks.

 

wackmaster



 
Msg#: 14400 posted 3:27 pm on Jun 17, 2003 (gmt 0)

ppg, just an opinion...

In the past, I would have said yes, you are diluting your anchor text, and the answer would be to do just as you supposed.

However, in the world of Dominic and Esmeralda I'm less sure.

First, G seems to have de-emphasized *one word* and possibly *two word* keyphrases in favor of multi-word keyphrases. How, I don't know, but it seems to have to do with greater ability to interpret language on the page and draw linguistic conclusions about the page's content.

Second, some, including me, have hypothesized about the possibility of penalties associated with 'over-optimization'.

My best advice would be to avoid using *just* two or three keywords in the link text (which is a bit artificial anyway, from a user's pov), but to sharpen your link text (versus your example above) into shorter and very pithy headlines that contain the important keywords.

ppg

10+ Year Member



 
Msg#: 14400 posted 6:01 pm on Jun 17, 2003 (gmt 0)

Interesting reply wackmaster, thanks.

G seems to have de-emphasized *one word* and possibly *two word* keyphrases in favor of multi-word keyphrases

I'm not sure I follow here - if I'm targeting a particular 2 word phrase say, then any other pages that target this phrase, intentionally or not, would be equally devalued. Or are you saying that you suspect 1 and 2 word phrases will now be getting less traffic as compared to multi-word keyphrases? Or perhaps a page that targets 1 and 2 word phrases will be given less relevance than pages which come up for the same word/phrase but target it less obviously?

wackmaster



 
Msg#: 14400 posted 6:56 pm on Jun 17, 2003 (gmt 0)

There was a lot of discussion when Dominic hit, including from GG, that implied the following:

1--People might expect less traffic from one and possibly two word keyphrases.
Given that most people optimize their homepages for their most important word or phrase, that sort of implies less traffic to very optimized homepages. This seemed to be why some webmasters freaked...the phrases they had mainly focused on for optimization started doing less well in the SERPs with Dominic's arrival.

2--People might expect more traffic to come in thru their subpages.
Subpages are typically more narrowly focused (versus the broader terms optimized for on homepages); and often the subpages are found by inputting two, three or more keywords when searching (additional keywords are often needed to provide more detail).

Quite a few webmasters reported seeing exactly the sort of phenomenon that GG predicted, and we were among them. Our subpages are definitely getting more traffic on balance (since Dominic) and our homepages (we have a lot of sites) are getting less traffic on balance.

In our case the Dominic and Esmeralda updates have been good since we're getting more new traffic from the subpages than we lost on the homepages, again, just as GG predicted.

At the same time, we have seen a few of our high-level pages get murdered, and these tended to be those optimized for one or two keywords. Obviously someone else got that traffic, and what is less clear to me is what the key to success is for doing well on a homepage optimized for one or two keywords.

And this is where the issue of SEO comes in...some think that so-called 'over-optimization' is actually hurting their sites since Dominic.

Now, whether there's a penalty involved (I doubt it), or whether some pages that were highly optimized (and thus doing well pre-Dominic) have now had that optimization discounted by G is anybody's guess...my guess is that those pages have been discounted somehow.

That's my take on all this. Hope that helps a little, though I'm not sure it will :-¦

papabaer

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 14400 posted 7:27 pm on Jun 17, 2003 (gmt 0)

applied semantics > metasemantics > rich language > linguistic diversity > more traffic.

I've been following my own pet theories regarding 'optimized language' and 'language constructs.' It seems to be 'bearing fruit.'

As an example, one page is ranking in the top 3 listings for eight keyword phrases. Six are two-word phrases; two, three-word phrases. All are very competitive terms, all pull over a million results.

Mind you, this is a single page: three @ #1, three @ #2, and two @ #3... What is very telling is this: mine is the only page to appear on all eight results pages. Each results page shows different competitors, and most appear only once (for an alternate keyword combo). The 'competition' is clearly targeting specific, single keyword phrases. While they are ranking for their 'target,' my rich-language approach is garnering great results across the board.

I'm seeing very similar results on other pages of this sort. I do believe Google has become 'literate!' Time to put away those First Grade reading primers and break out the heavy stuff.

Get robust!

wackmaster



 
Msg#: 14400 posted 7:49 pm on Jun 17, 2003 (gmt 0)

papabaer,

fascinating, intriguing, compelling, notable (I'm practicing for the future).

Great post, thanks. Now I'm curious about something, since I believe what you're saying...

Do you believe that without the variety of phrases you would *not* have done as well on any particular phrase?

What I'm getting at is that on any given phrase (if I got what you said correctly), different sites came up. So at least those other sites were perhaps still doing well against a single two word phrase (one of your phrases)?

Or, do you perceive that those other sites also share your characteristic of "multiple phrases" - just perhaps different sets of phrases, some of which overlap with yours? In other words, they were also linguistically richer.

Bottom line, do you believe a homepage won't do well if it is very focused on *only* one keyphrase of, say, two words?

I'm asking because we seem to have some sites that support the hypothesis that focus on one two-word phrase is a mistake now, and others that don't support that hypothesis ... I'm trying to sort out what other factors might be involved.

I believe it's possible that the same line of thinking also applies to the anchor text in backlinks.

papabaer

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 14400 posted 8:41 pm on Jun 17, 2003 (gmt 0)

Do you believe that without the variety of phrases you would *not* have done as well on any particular phrase?

Perhaps. What I can say is that I am seeing excellent results from "rich-language" constructs. There are indications that "text-in-proximity" of targeted keyword phrases may be a contributing factor. This would be most likely if, in fact, Google has become more 'literate.' I do not see much evidence of 'others' following this approach... most I've noted are 'one-trick ponies.'

In any case, the rich-language, metasemantics approach can help craft robust, high-ranking pages that are resistant to single keyword/keyphrase catastrophes.

Sophisticated language for a sophisticated Google...

steveb

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 14400 posted 9:13 pm on Jun 17, 2003 (gmt 0)

"I'm wondering if I'm diluting the anchor text benefit by doing it like this?"

Definitely. Make the anchor text one or two words at most. From my experience, even three is suicide.

My anchor text is three words. My second word is #6 for allinanchor out of eleven million results. My third is #36 (!) for a search term with five million results. Those two cannot exist in the same universe. (My first word is #64 for a term with fifty million results.)

The third keyword is clearly being drastically devalued. In my opinion, this is supremely dorky search engine-ing, but on the other hand, Google has to diminish at some point how much anchor text it will weight; otherwise people would link a hundred words.

My third word is the only one that matters to me. I'm now going to go through the totally google-has-me-doing-something-silly process of de-linking my first two words on my own site, and linking to my home page only with my third word.

Link text should be two words at most.

wackmaster



 
Msg#: 14400 posted 10:10 pm on Jun 17, 2003 (gmt 0)

Link text should be two words at most.

steveb, our evidence pretty strongly suggests that the opposite is true, i.e., that going with only one or two keywords was right for the old Google, but is inconsistent with the way Dominic and Esmeralda are working.

Very sharp focus on just one or two keywords in backlinks seems to be the only clear difference between some of our sites that are doing worse and those that are doing better since the newer updates. The sites doing better have more diverse keyword combos in the backlinks.

Further, with that as a preliminary conclusion (after Dominic), we made some changes in backlink text, and have further improved our positions in the new -fi. As always, it could be something else we're not seeing...but if so, we're not seeing it! ;-)

At the very least, I can say that adding words to our backlink text did *not* hurt us.

There has been more discussion about this as it relates to the *words on the page* as opposed to anchor text, but in my view, the type of de-emphasis on one or two words that we're seeing in searches applies not only to page text, but also to backlinks.

But what I can agree with is that the original example in this string definitely had too many words in the link text.

I'd be very curious about others' opinions on this, since we practiced the "less is more" approach before, but not now...

truth_speak

10+ Year Member



 
Msg#: 14400 posted 10:34 pm on Jun 17, 2003 (gmt 0)

Hello wackmaster, steveb, and other gurus of SEO :)

I will only comment on the narrow slice of this discussion that I've observed with my own site:

For a couple of pages on my site I had "X Widget - Y Widget" in the title. Throughout my site I had a navigation menu with "X Widget - Y Widget" as the anchor text pointing to the main "X Widget - Y Widget" page.

In the old index my page came up #38 for X Widget, and #8 for Y Widget. Now the page comes up #58 and #12, respectively. I've been upping the content a lot, while other related sites have been asleep, so I don't believe there is any other explanation than some kind of duplicate word penalty in either anchor text or titles.

Interestingly, when you change the above queries to "big X Widget" or "big Y Widget" my rankings have improved from #4 to #3 for one search, and #3 to #2 for another.

steveb

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 14400 posted 10:41 pm on Jun 17, 2003 (gmt 0)

"our evidence pretty strongly suggests that the opposite is true"

Vive la différence.

But I see zero evidence of what you suggest. Third and beyond words in link text seem to have been drastically degraded in the later stages of Dominic and especially Esmeralda. Laser-beaming of anchor text is what rules.

I don't know if having four words will hurt the first word, but it is clear to me that the fourth word might as well be "welcome" for all the good it will do.

Kirby

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 14400 posted 11:30 pm on Jun 17, 2003 (gmt 0)

Link text should be two words at most.

I disagree. Let's say GoogleGuy is looking for 'Silicon Valley real estate' to buy when G hits the IPO jackpot. What anchor text would you suggest for these websites?

wackmaster



 
Msg#: 14400 posted 11:47 pm on Jun 17, 2003 (gmt 0)

truth_speak, your example, while not precisely what I was referring to, is consistent with what we feel we've learned...

steveb, actually I'm not certain that our conclusions are mutually exclusive...

Try this on for size:

Google Objectives
1) Google is sick of SEO'd sites dominating the SERPs. They attempt to tackle that problem with Dominic and later... (example: give less credit to webmasters who go to town on hot phrases like 'travel Hawaii' (mods, just an obvious real-world example; I don't work in this area specifically)).

2) Dovetailing with the above, Google also wants to return more precise results, especially with the more specific searches...

Problems
1) Ultra SEO'ers and spammers are creating lots of perfect text backlinks ("keyword1"; or, "keyword1 keyword2"; or, "keyword1 keyword2 keyword3"; etc.). Not much of "buy great Keyword1 products here" because they know there are too many waste words in there.

2) SEOers are all over the basics like titles and backlinks, i.e., using "X Widgets, Y Widgets, Z Widgets" in the title, backlinks, etc. ... which sort of spams Widgets. (I use that word spam loosely here; take no offense anyone; lots of webmasters were doing this).

So what happens? In the new updates we see things like:
In the old index my page came up #38 for X Widget, and #8 for Y Widget. Now the page comes up #58 and #12, respectively... Interestingly, when you change the above queries to "big X Widget" or "big Y Widget" my rankings have improved from #4 to #3 for one search, and #3 to #2 for another.

...it is clear to me that the fourth word might as well be "welcome" for all the good it will do.

Possible Explanation
1) Google decides that longer strings of keywords in titles and backlink text are overdoing it, so to combat this, the first word gets full credit as a single keyword, the second a bit less as a single keyword, the third a lot less as a single keyword, and so on (steveb, this is your point and we see it too)...

2) However, there's a twist...if the searcher types in "keyword1 keyword2" or "keyword2 keyword3", or even better, "keyword1 keyword2 keyword3"...Boom...the phrases get good credit *as phrases*. This puts more emphasis on making sure that a given result in the SERPs for multiple-keyword phrases comes up strong, but takes a step towards combating simple but very common SEO tactics...

3) Now G adds in some of their newly acquired AS technology to further enforce and refine searches so that multi word searches sync up best with pages that have not only the several keywords searched on, but also more related terms in the body text...

Anyway, that's one theory that might encompass all of what we're seeing.
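The decayed-single-word-credit-plus-phrase-bonus theory in points 1 and 2 above can be sketched in a few lines of Python. To be clear, this is purely illustrative: the decay factor, the phrase bonus, and indeed the existence of any such mechanism are this thread's guesses, not anything Google has confirmed.

```python
# Hypothetical sketch of the theory above: single-word anchor credit decays
# by word position, but a contiguous multi-word phrase match earns full
# credit as a phrase. Every number here is invented for illustration.

def anchor_score(anchor_words, query_words, decay=0.5):
    """Score anchor text against a search query under the two assumed rules."""
    anchor = [w.lower() for w in anchor_words]
    query = [w.lower() for w in query_words]

    # Rule 1: each anchor word matching the query earns single-word credit
    # that halves with each position (1.0, 0.5, 0.25, ...).
    score = sum(decay ** i for i, word in enumerate(anchor) if word in query)

    # Rule 2: a multi-word query appearing contiguously in the anchor earns
    # full credit as a phrase, bypassing the positional decay.
    n = len(query)
    if n >= 2 and any(anchor[i:i + n] == query for i in range(len(anchor) - n + 1)):
        score += float(n)

    return score

# The fourth anchor word searched alone gets little credit...
print(anchor_score(["silicon", "valley", "real", "estate"], ["estate"]))  # 0.125
# ...but the full phrase query scores strongly.
print(anchor_score(["silicon", "valley", "real", "estate"],
                   ["silicon", "valley", "real", "estate"]))              # 5.875
```

Under these invented weights, a fourth word on its own is nearly worthless (0.125 versus 1.0 for a first word), yet the anchor scores strongly when the searcher types the whole phrase, which would be consistent with both steveb's allinanchor observations and the multi-word traffic gains reported earlier in the thread.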

[edited by: wackmaster at 2:33 am (utc) on June 18, 2003]

steveb

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 14400 posted 12:02 am on Jun 18, 2003 (gmt 0)

"What anchor text would you suggest for these websites?"

The answer is obvious, but different depending on the circumstances. With "silicon valley real estate", no word is more important than the others. Certainly this company couldn't care less how it ranks for "estate" or for "real". Having the value of "real" degraded because it is the third word doesn't matter. Having it there is better than not. The only question is whether having "silicon valley" first is better than having "real estate" first.

Perhaps a better way to say it: if you can say your link text in two words, do it. If you need more than two words, put the main one first if possible. If you can't do anything but use four words in a specific order, do that. It won't kill you, but don't be surprised if you rank higher for allinanchor for the first word than for the fourth.

Kirby

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 14400 posted 12:21 am on Jun 18, 2003 (gmt 0)


don't be surprised if you rank higher for allinanchor for the first word than for the fourth.

steveb, why is that important?

GrinninGordon



 
Msg#: 14400 posted 12:58 am on Jun 18, 2003 (gmt 0)

steveb and wackmaster

I am interested in the link text aspect too. However, I would like to throw in a wobbler: I reckon Google use several ranking algos, so search result #1 will have gotten there by a different algo than #2. Which makes side-by-side comparison less easy. I can point you to a set of search results that illustrates this perfectly. The top page shows sites that have actively gone for reciprocal linking with anyone and everyone, using long (around 10 words) link / anchor text. Another has used the simpler two-keyword approach in reciprocal link exchanges (and has fewer backlinks).

There are other factors, of course. Such as: does the anchor / link text match the domain name? If it does, it would be logical for Google to accept that at face value without giving a "penalty".

I believe Google have now factored in a link page weighting (not a penalty), and I am pretty sure I know how they have done it. What surprises me most is that they did not pick up on the most obvious signal (to me): every backlink containing the exact same anchor text - a sure sign that it was engineered.

Dolemite

10+ Year Member



 
Msg#: 14400 posted 1:02 am on Jun 18, 2003 (gmt 0)

Problems
1) Ultra SEO'ers and spammers are creating lots of perfect text backlinks ("keyword1"; or, "keyword1 keyword2"; or, "keyword1 keyword2 keyword3"; etc.). Not much of "buy great Keyword1 products here" because they know there are too many waste words in there.

2) SEOers are all over the basics like titles and backlinks, i.e., using "X Widgets, Y Widgets, Z Widgets" in the title, backlinks, etc. ... which sort of spams Widgets. (I use that word spam loosely here; take no offense anyone; lots of webmasters were doing this).

Yes, these are big problems, both for Google and, by extension, for us as webmasters/SEOs. I think it comes down to what GG was saying about discovering webmaster intent, even though I don't think this is what he meant by it. If your intent is to be #1, perhaps you shouldn't get there, but if your intent is to create the best site for users, maybe you should. Granted, no content-focused site will ever get there in a crowded field without some effort put into SEO, but it needs to be a somewhat level playing field against those who will try every trick in the book. The problem is, sites that do genuine, honest SEO shouldn't (but might) lose out in the hunt for ultra SEO spammers. That's where determining intent is very difficult.

I do think we're all personifying Google a bit much by giving it goals, its own intentions, and intelligence. It can only get those attributes from those running it, who must be as judicious as possible and have very limited processor time and storage space to spend on analyzing each web page and its context/relationship to the rest of the internet. Doing anything like applied semantics is some serious processing...same thing in determining intent. Already I'm sure lots of code is simply ignored since it can't be intelligently processed.

As far as the long anchor text phrases...I can't say that I've seen this lose any points with google. I have a site that uses an anchor text phrase "blah Blue Widgets blah blah" quite often, so unless they have the applied semantics to weed out the blah's from the widgets, they'd have to penalize it as well.

I think in considering how Google works, you have to think about what calculations are both feasible in terms of processing time and can be universally applied. One thing that's relatively easy to calculate when you have a good picture of a web site is the quantity of something. Google can see the number of words in an anchor text phrase (again, I don't believe they'd do much with this, or at least they wouldn't cut you off after 2), the number of times that link/anchor text is used, and to some degree, how often it appears in a highly templated manner. Remember, Google already has to detect duplicate content, so seeing a template is just an extension of that. These are also things that should be universally applicable to 90% of websites. So given these factors, it wouldn't surprise me to see Google come up with a set of rules like:

X number of pages that are Y% similarly-formatted containing Z number of the same link/anchor text = Spam

I don't think that would lead to a real penalty...just a little bit of a drop in SERPs for what you appear to be heavily targeting as seen through this sort of formula. Yes, lots of good, honest sites that use a common navigation bar might be seen as spam by this sort of algorithm, and that's why it would only be 1 of those 100 different factors and a tiny penalty at most.

Think of it as just partially correcting the advantage this sort of arrangement creates, and hopefully catching those who exploit it intentionally.
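The "X pages, Y% similarly formatted, Z copies of the same anchor" rule above can be made concrete with a toy implementation. Everything here is an assumption for illustration: the thresholds are invented, and difflib's SequenceMatcher is only a crude stand-in for whatever duplicate/template detection Google actually runs.

```python
# Toy version of the rule: flag an anchor text that appears on many
# near-identically formatted pages as likely engineered linking.
# Thresholds and the similarity measure are invented for illustration.
from difflib import SequenceMatcher

def looks_engineered(pages, anchor, min_pages=20, min_similarity=0.9):
    """pages: list of (page_html, anchors_on_page) tuples.

    Flags `anchor` when it appears on at least `min_pages` pages whose
    markup is near-identical - i.e. X similarly-formatted pages each
    carrying the same link/anchor text.
    """
    matching = [html for html, anchors in pages if anchor in anchors]
    if len(matching) < min_pages:  # the X threshold
        return False

    # The Y threshold: compare each page's markup against the first match
    # as a rough template-similarity measure.
    template = matching[0]
    similar = sum(
        1 for html in matching[1:]
        if SequenceMatcher(None, template, html).ratio() >= min_similarity
    )
    return similar >= (len(matching) - 1) * 0.9
```

As the post argues, a signal like this would misfire on honest sites that use a common navigation bar, so it could only ever be one small factor among many rather than a hard penalty.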

kiril

10+ Year Member



 
Msg#: 14400 posted 1:17 am on Jun 18, 2003 (gmt 0)

Hi folks,

I need to bring the level of the conversation down a notch and ask a very basic question. I'm pretty new to this, and there's an implicit assumption in the initial question of this thread that I don't follow:

"I'm wondering if I'm diluting the anchor text benefit by doing it like this?"

I thought that the "anchor text benefit" was something bestowed upon the site being linked to rather than the site giving the link. Yet this thread seems to emphasize the benefit of anchor text to one's own site.

Or is this thread about internal links within one's own site? Still, I thought it was only backlinks from external sites that improve my own site's SERP rankings & PR.

Thanks in advance for any clarification.

wackmaster



 
Msg#: 14400 posted 1:52 am on Jun 18, 2003 (gmt 0)

Dolemite, yes, clearly different sites in the SERPs can get there any number of ways; some more from backlinks, some more from # of pages, titles, etc. ...each one spit out in a logical way that lines up with, and is somehow ordered by, G's latest algos.

What I'm most trying to understand is that in our case, we have a few sites where the homepage was optimized for a primary two-word keyphrase and several secondary phrases. Like many in here, we're down on the two-word phrases for those sites. But I'm still not certain why. Much of the rest of what's going on makes more sense to me. But for the most part, the other sites that have moved up seem no less focused on the same phrase. So, either we had overdone something, which is possible but we try to avoid...or there is simply something that we are doing that Google *now* doesn't like (yes, I mean Google's brainless, highly complex, 'let-em-fall-where-they-fall' algo).

Currently, my best guess is that our sites that seem to have suffered have an unusually high number of "perfect text" backlinks. I can understand for all the reasons noted above why Google might want to go after that characteristic with its algo. But what kills me is that the backlink text in these cases more or less just reflects the sites' names, which contain three words, two of which are keywords (not hyphenated). Unfortunately, there's not much I can do about that. :-(

Note to kiril... you have some say in how other sites link to you, especially if you have requested the link. Then there are those who run mini networks and link to themselves, and the spammers who set up link farms and link to themselves, etc....

steveb

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 14400 posted 2:18 am on Jun 18, 2003 (gmt 0)

"steveb, why is that important?"

Because anchor text rankings very closely parallel the search result rankings. In the example you gave, Kirby, the four words are all basically meaningless individually; they only mean something strung together. But if one word is more important - "large bright red roses" - and you get more anchor text boost for "large" than you do for "roses", it is bad webmastering to structure your link text that way.

Much more often it is true that one word is more important than the others. Seldom is it the case that none of the individual words in a phrase means anything to you in itself.

Even in the silicon valley real estate example, if your site is a real estate one and you are pointing links at the silicon valley page, then I'd rather direct more pop to "real estate"; if you have a large silicon valley website and you are pointing a link to your real estate page, I would rather get more benefit from "silicon valley".

Dolemite

10+ Year Member



 
Msg#: 14400 posted 3:17 am on Jun 18, 2003 (gmt 0)

What I'm most trying to understand is that in our case, we have a few sites where the homepage was optimized for a primary two-word keyphrase and several secondary phrases. Like many in here, we're down on the two-word phrases for those sites. But I'm still not certain why. Much of the rest of what's going on makes more sense to me. But for the most part, the other sites that have moved up seem no less focused on the same phrase. So, either we had overdone something, which is possible but we try to avoid...or there is simply something that we are doing that Google *now* doesn't like (yes, I mean Google's brainless, highly complex, 'let-em-fall-where-they-fall' algo).

This is interesting...the inconsistency makes it hard to generalize, though. Meanwhile, algorithms are the ultimate in generalization. If you're seeing this in a few sites, it's something to consider and look deeper into.

I have a site where I recently changed all the site navigation home links' anchor text from "home" to "keyword1 keyword2" (really more of a keyphrase). This site has gone from almost nonexistent on that keyphrase to a top 30 result in the Esmeralda index. Mind you, this site was new as of mid-March, so it was really suffering in the Dominic confusion. But this added optimization seems to have helped rather than hurt. I would certainly not characterize it as "overoptimized", and the sites above it tend to be more "optimized" as far as having this keyphrase in more links, and also more keyphrase-stuffed.

OTOH, this particular site is top 10 for allinanchor:keyphrase but only top 30 for the straight keyphrase search. I guess I'm just not complaining since it didn't even really register before.

I would look deeper into what you might have done recently on any of those sites. I'm sure you've considered that, but since changes from 2+ months ago are just now having effects, "recently" means different things.

It also seems that the current index is by no means final. Many people have complained that their index pages are ranking lower than internal pages for primary keywords/phrases...something that gradually seems to be improving. It's possible that this is happening for others to varying degrees and just being interpreted differently.

I'm trying not to make definitive conclusions based on current results, but it's time to start looking at the newer trends.

mil2k

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 14400 posted 6:08 am on Jun 18, 2003 (gmt 0)

steveb, you have made some extremely significant points. I find myself agreeing with what you said based on my own observations. :)

projectphp

10+ Year Member



 
Msg#: 14400 posted 7:15 am on Jun 18, 2003 (gmt 0)

Google Objectives
1) Google is sick of SEO'd sites dominating the SERPs. They attempt to tackle that problem with Dominic and later... (example: give less credit to webmasters who go to town on hot phrases like 'travel Hawaii' (mods, just an obvious real-world example; I don't work in this area specifically)).

Are they? Why and where? Vague, proofless conjecture.

2) Dovetailing with the above, Google also wants to return more precise results, especially with the more specific searches...

Hey? What the hell does that mean? When has ANY "SEO'er" created BAD results? Never! SPAMMERS do, SEO'ers DO NOT!

Problems
1) Ultra SEO'ers and spammers are creating lots of perfect text backlinks ...

Are they? What SERPs is this DIRECTLY affecting?

2) SEOers are all over the basics like titles and backlinks, i.e., using "X Widgets, Y Widgets, Z Widgets" in the title, backlinks, etc. ... which sort of spams Widgets. (I use that word spam loosely here; take no offense anyone; lots of webmasters were doing this).

Hey? Another "point" I don't get. Even if this is true, any model is essentially "crackable" in the long run, and the recent chaos would be a BAD way to solve this.

That is a WHOLE LOT of presumption on your part, Wackmaster, without any real proof or example. Many people are speculating as what Dominic and Esmar..Esma..this one are to achieve. Seems to me, all this can be put down to a "New data Model" that Google is rolling out.

VIRTUALLY ANY enhancement, such as the timestamping of links, would require a database change, and this fits the chaos seen.

Speculation as to what problems Google are attempting to solve, and with what methods, is pure conjecture unless a specific case can be pointed to, or evidence found.

So far, in ALL these post-Dominic threads, all I hear is a lot of noise and not a lot of fact. Proofless analysis is INTERESTING, fun even, but nonetheless pointless.

<edit>typo</edit>

truth_speak

10+ Year Member



 
Msg#: 14400 posted 7:56 am on Jun 18, 2003 (gmt 0)

projectphp,

I think we are having a valuable discussion here. Much of what I have read here is "real life" based on "real SERPs" (such as my own) and is therefore not idle conjecture.

Most of us here are just trying to determine the best methods by which to optimise our sites for high placement in new algo-driven SERPs.

Consistent changes in the SERPs of www-fi reflect real algo changes. It is important for us to develop the best explanatory model that we can for these changes.

I think the above discussion is a good attempt at deducing what some of those particular algo changes were.

Something is definitely going on with the algo's weighting of anchor text and page titles. I believe that Steveb and wackmaster are onto something here.

I'd rather have their informed "conjectures" than nothing at all. I need information to better deal with this new algo.

In my case, to recap in a better fashion what I said earlier, I had an important page whose title and navigational anchor text were: "Furry Widgets, Soft Widgets: Big and Small." My placement in the SERPs dropped for both "furry widgets" and "soft widgets" (two-keyword phrases) in the -fi index, BUT placement in SERPs on -fi went UP for three-keyword strings such as "Big Furry Widgets" and "Small Soft Widgets."

I am not a web genius, but I think this supports the above arguments.
