
Google SEO News and Discussion Forum

Signs of Fundamental Change at Google Search
tedster

Msg#: 3305282 posted 12:07 am on Apr 5, 2007 (gmt 0)

In the 950 penalty thread [webmasterworld.com], randle recently posted that "...there is a broader picture here, fundamental changes are afoot."

I agree, and I'd like to collect people's observations about what "signs of fundamental change" they may have observed in recent months.

One big one for me is that no matter how common the search term -- it could generate billions of results -- you always seem to bang into an "omitted results" link before you reach #1,000. In fact, I just checked a search on the word "the", which Google says generates 5,300,000,000 results. And even on this monster, #928 is the "omitted results" link. Hmmmm....

Now 5,300,000,000 also seems like a low number to me - unless it does not include any Supplemental Results. So my current assumption is that by fattening up the Supplemental Index, Google has pared down the main index to somewhere in the vicinity of 5-6 billion URLs.

A related sign of fundamental change, I feel, is the problems Google currently has generating understandable results for the site: operator or the Webmaster Tools reports. It looks to me like the total web data they've collected is now broken up into far-flung areas of their huge server farm -- making it very difficult to pull comprehensive site-wide information together again.

 

bouncybunny

Msg#: 3305282 posted 2:11 am on Apr 9, 2007 (gmt 0)

When you think about what is natural, it is natural to have the anchor text in your navigation menu stay the same throughout the site. It is NOT natural that everyone who links to you would use the same anchor text.

But it is also natural to vary the anchor text in internal links. I often link to the same story on my site when referencing it. Depending on the reasons for referencing it, I often use different anchor text. It is the most natural thing in the world.

Sometimes I might link to the "red widgets story"; at other times I might link to "that great article on widgets". Are Google going to give me a hard time about that? Seems odd.

Marcia

Msg#: 3305282 posted 2:28 am on Apr 9, 2007 (gmt 0)

There's a difference between global (or section) site navigation and in-text linking. Site navigation, even on smaller sites, is often done using includes - I use PHP includes even on an 8-10 page site, and so do others - so of course it's normal for that to be the same everywhere. That isn't the same as linking to an article from within text on other pages of the site, using anchors relevant to both the page linked to and the page the link is on.

If there's a page about shoes and socks, and a page about shoes links to it, it would be expected that shoes would be in the anchor, since the link is in the context of a page about shoes. If the link to that shoes/socks page is coming from text on a page about socks, there's nothing abnormal about linking to the shoes/socks page with socks in the anchor - it's an on-topic link to an on-topic page.

There's a difference between sites' common navigation and linking in body text within the context of what the page is about.

There's also a big difference between navigation ON and WITHIN a site and anchor text in external, inbound links. Unfortunately, this issue got muddied several posts back.

Thank you mfishy, for the clarity of your post.

jd01

Msg#: 3305282 posted 3:11 am on Apr 9, 2007 (gmt 0)

LMAO: This was the post I started skimming earlier:

There was a time when it became evident that an excessively high percentage of identical anchor text in inbound links caused a penalty. This was not just casual observation (or guesswork) - it was tested.

AND

Exactly the point. Over a certain percentage of identical anchor text on inbound links cannot be done with independent decisions. Not 80-90% of the people linking will use the same exact wording of "money keywords" in the anchor of links, it has to be contrived. Natural development would be indicated by a certain amount of variation.

At the time the testing was done, there was a very specific figure - under nn% of identical anchor text = no penalty. Having nn% or above identical = penalty. I remember the exact figures.

Changing anchor text of existing links is a different matter, and so is internal anchor text in global navigation and otherwise.

…but the original post is about nav links.

Where is the original post, so I can keep up with the thread?

I just read back and it looks like this portion of the thread is about varying link text, both internally and externally, and the entire thread appears to be about "fundamental changes".
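As a minimal sketch of the "percentage of identical anchor text" idea quoted above - the actual "nn%" threshold isn't stated anywhere in the thread, so the anchors and the 50% cut-off below are invented, not tested figures:

from collections import Counter

# Hypothetical check: what share of inbound links uses the single most common anchor text?
def identical_anchor_share(anchors):
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

inbound_anchors = [
    "red widgets", "red widgets", "red widgets", "red widgets",
    "example.com", "that great article on widgets", "red widgets",
]
share = identical_anchor_share(inbound_anchors)
THRESHOLD = 0.50  # stand-in for the unstated "nn%"
print(f"identical-anchor share: {share:.0%} ({'over' if share >= THRESHOLD else 'under'} the threshold)")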

Justin

Happy Easter all.

jd01

Msg#: 3305282 posted 3:27 am on Apr 9, 2007 (gmt 0)

Vary link text externally, and 'focus' it internally - that would be the advice I would give.

Maybe even only suggest anchor text externally and allow most inbound links to occur naturally, but make sure your internal links are clear, concise indicators of what a linked resource will present to a visitor. (Personal opinion only.)

Justin

My apologies for the double posts, thought I should contribute a little. =)

lfgoal

Msg#: 3305282 posted 4:18 am on Apr 9, 2007 (gmt 0)

"Maybe Google has become smart enough to understand that 100 links with the anchor text "Flash Bang Wallop Cola" are likely to be manipulative unless they're pointing to flashbangwallopcola.com"

Or how about this: those 100 links are just fine as long as the page actually talks about "flash bang wallop cola".

"Varying internal linking in your navigation is obviously not natural and should be avoided."

I disagree, for this reason cited by bouncybunny:

"But it is also natural to vary the anchor text in internal links. I often link to the same story on my site when referencing it. Depending on the reasons for referencing it, I often use different anchor text. It is the most natural thing in the world."

Bloggers do it all the time.

[edited by: lfgoal at 4:18 am (utc) on April 9, 2007]

annej

Msg#: 3305282 posted 4:18 am on Apr 9, 2007 (gmt 0)

We have pretty much discussed what seems to be happening with anchor text, internal or inbound. There does seem to be a change there, but I think it's more than just a typical change in how the algo looks at anchor text density.

There is something more fundamental going on. It's not just density, but more a ratio of preselected phrases. (Remember, a single word can count as a phrase in this definition.)

That's why it's so much harder to know just what is happening. It also explains why different people here are seeing different things. It could be that if there are no flags on your phrases, you can be further off in the density department and still not be affected.

<added> Actually there are several people here who have spent a lot of time studying the 950 situation as well as the phrase-based patents. I saw the message asking someone to sum it up, but there is so much going on, and so many uncertainties, that it's difficult to put into a paragraph or two. Hopefully someone can; I can't, my head just spins. </added>

Robert Charlton

Msg#: 3305282 posted 6:05 am on Apr 9, 2007 (gmt 0)

Exactly the point. Over a certain percentage of identical anchor text on inbound links cannot be done with independent decisions. Not 80-90% of the people linking will use the same exact wording of "money keywords" in the anchor of links, it has to be contrived. Natural development would be indicated by a certain amount of variation.

There's a great quote that's often applied to Hollywood or the legal profession: "The secret of success is sincerity. Once you can fake that you've got it made." Are we talking about faking natural development, or about natural development? ;)

With that in mind, I'm wondering how much your page title influences natural linking patterns, and how much scrapers may have influenced what is seen as natural.

Also, getting back to the original topic... fundamental change... I'm thinking that many algo changes we're seeing are in fact driven by scraper spam and by purchased links, and it might be helpful to look at how those would be combatted to get a sense of what Google is doing... and why perhaps perfectly good pages might be suffering collateral damage.

jd01

Msg#: 3305282 posted 9:13 am on Apr 9, 2007 (gmt 0)

I think what could be happening to some "quality" sites might be attributed to the change in the way links / link text are applied and the better detection of "scraper spam". (I think someone noted there was a change earlier in this thread.)

I don't know if you can count link text as part of the document it refers to in a phrase-based system the way you could in a Boolean-based system, because it seems you could "throw off" the expected phrase calculations for the receiving document.

So, where certain types of links may have been helpful to some sites previously, it would seem reasonable that "removing" the keyword patterns from the receiving document could negatively impact rankings.

Also, I believe "scraper spam" that previously went undetected as spam would have counted as inbound links, so it's possible such links actually helped some sites; when the counting of links was adjusted, and/or the sites were detected as spam, some sites probably lost large numbers of "links" or "link text" and went into free fall.

Justin

glengara

Msg#: 3305282 posted 11:42 am on Apr 9, 2007 (gmt 0)

FWIW, and since there's some focusing on link text...

G is increasingly "localising" web search results, where once no Irish site would turn up in G.ie for a generic term like "web design", now they do.

I recently looked at allinanchor:web design on the different English-language Gs; results were similar but not identical, and all had some regional sites turning up - "local" anchor text being weighted, perhaps?

I then compared the normal search to the allinanchor results. There were a lot more discrepancies, but when looked at, most of them were tops for allinanchor variations such as "website design" or "web design UK" - a factoring-in of closely related anchor text, perhaps?

Strikes me that if G wants to display more regional results for generic terms while keeping allinanchor as the "quality" metric, they would need to handle it differently than they "traditionally" have.
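To make that comparison concrete, here is a rough sketch of measuring how much two regional result sets overlap. The URL lists are invented placeholders, not real SERPs, and a simple set overlap obviously ignores ranking order:

# Jaccard overlap between two lists of result URLs (hypothetical data).
def overlap(results_a, results_b):
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

google_com = ["example-web-design.com", "acme-design.co.uk", "designhub.ie", "bigagency.com"]
google_ie = ["designhub.ie", "dublin-web-design.ie", "acme-design.co.uk", "example-web-design.com"]
print(f"overlap between .com and .ie allinanchor results: {overlap(google_com, google_ie):.0%}")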

[edited by: glengara at 11:59 am (utc) on April 9, 2007]

JudgeJeffries

Msg#: 3305282 posted 11:57 am on Apr 9, 2007 (gmt 0)

Now that is interesting. Are you effectively saying that to rank in our home country we need to ensure that our links are local, or have I misunderstood? It would certainly make sense for many products.

glengara

Msg#: 3305282 posted 12:24 pm on Apr 9, 2007 (gmt 0)

I got distracted and didn't investigate whether the local allinanchor boost was coming from the sites being local or from the anchor text coming from local links :-(

Either way allinanchor results appear to be a bit of a movable feast...

Miamacs

Msg#: 3305282 posted 1:00 pm on Apr 9, 2007 (gmt 0)

Fundamental?

I don't know.

To me, fundamental would mean something bigger than what is happening. After all, everything I do still works. Every site I ever optimized still ranks well. The third project I ever did sits at top 1, 2, 3 for everything I targeted; the newest site became top 1-10 within a few months. My approach has always been a lazy one: get the highest quality of everything and don't bother with the rest. Well-researched content, well-laid-out pages, consistent navigation, and only the best sites linking to mine. Meaning, apart from a single site, all of them have inbound counts of around a thousand links, and no more. But those are on actual pages. Not blogs, not forums, not community sites, not directories, not anything that would come and go at the turn of a knob at the plex. Their rankings have been in line with the amount of work involved. I'm lazy; I want to work as little as possible.

The change I see is how Google's different "save the day" hacks are slowly being integrated into their algo, and are starting to interact with each other on a much deeper level.

- Calculating relevancy scores by Trust. No more "general" offset.
- Devaluing links by lack of relevancy or pagerank ( supplementals )
- Ranking multiplied or decimated by internal anchor text relevancy
- Linking patterns observed by growth rate and anchor text variation
- Trust multiplied by link age
- Relevancy scores broadened or narrowed down by phrase-matching
- Sites filtered out because of current trends in spam detection

...and so on.

Most of this is not new; it's just that you must consider their effects on each other.

Ranks are still calculated based on the source-->anchor-->target relevancy, multiplied by PageRank and thematic TrustRank, extended to other phrases based on what the source is Trusted for.
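Read as a toy formula, that might look something like the snippet below. All the values and the scoring function are invented illustrations of the "source --> anchor --> target relevancy, multiplied by PageRank and thematic TrustRank" idea, not Google's actual computation:

# Score contributed by one link: anchor/target relevancy scaled by the source's
# PageRank and topical trust (toy values).
def link_score(anchor_target_relevancy, source_pagerank, source_topical_trust):
    return anchor_target_relevancy * source_pagerank * source_topical_trust

# On-topic link from a strong, trusted source vs. off-topic link from a weak one.
print(link_score(0.9, 0.7, 0.8))  # roughly 0.504
print(link_score(0.2, 0.3, 0.4))  # roughly 0.024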

...

On the percentage of inbound link anchor text variation

To me it seems that mystic percentage is different for each term.
Same as with TrustRank, not everything is monitored or judged by the same strict rules.

I just did a test.
The keyphrase was a top money two-word combination in the travel sector.
The percentage was 50% if I look at the use of the EXACT same anchor,
66% if the anchor had to include the two words as the first two.
The penalty seems automatic; the site is popping in and out as I add new links to shake off its effect.
Also, TrustRank can get it out of the penalty, but only for google.com ( US ) searches.
It'd be fun to match this up to other sites but I don't have anything to spare right now.

Natural vs. artificial
Just a few observations

Natural links tend to split roughly 50%-40%-10% across the following three types of anchors (a rough bucketing sketch follows the list):
- The d@mn URL of the page. So... yes, in the end a hyphenated domain or keywords in the URL will do you a lot of good. These are usually the lowest-quality links as far as their sources go, which are forums, blogs, etc.
- The exact title of the page. Typed in, or copied. Some will have a few healthy typos to keep you at the top even for those. Medium-quality links: bookmarking and community sites, quality editorial directories... etc.
- A paraphrased anchor created from somewhere in the first 10 words of your own site description. This usually suggests at least some interest in your site, so it's probably the highest quality of the three, regarding the links' source. Define your mission on your site clearly, and pack it with the proper terminology so they'll have something relevant to go by.
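As a rough sketch of that bucketing - the sample anchors, page URL and page title are invented, and the classification rules are deliberately simplistic:

from collections import Counter

# Classify an inbound anchor as the page URL, the exact page title, or a paraphrase.
def classify_anchor(anchor, page_url, page_title):
    a = anchor.strip().lower()
    if page_url.lower() in a or a.startswith(("http://", "https://", "www.")):
        return "url"
    if a == page_title.strip().lower():
        return "exact title"
    return "paraphrase / other"

anchors = ["www.example.com/red-widgets", "Red Widgets Guide", "a great guide to red widgets",
           "http://www.example.com/red-widgets", "Red Widgets Guide"]
shares = Counter(classify_anchor(a, "example.com/red-widgets", "Red Widgets Guide") for a in anchors)
for bucket, n in shares.items():
    print(f"{bucket}: {n / len(anchors):.0%}")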

...

I could add another paragraph's worth of my views on the phrase-based penalty, but I'm not sure it's needed in this thread. We've discussed it to the bone on the other one, and every time we sum it up, three days later we get another clueless comment, followed by another debate, and finally another half-satisfactory conclusion with the same cure.

Edited to add:
glengara, you're right that regional links boost regional rankings.

[edited by: Miamacs at 1:03 pm (utc) on April 9, 2007]

jd01

Msg#: 3305282 posted 1:52 pm on Apr 9, 2007 (gmt 0)

Direct "Boolean" matching of query terms has well known limitations, and in particular does not identify documents that do not have the query terms, but have related words.

I think switching from a "boolean" (term=term) base to a contextual, associative (phrase A "relates to" phrase B, phrase C, phrase D) base is a change in the fundamental system of indexing documents, whether there is a "large shake up" in the rankings or not.

Could there be a fundamental change in the heuristic, without a fundamental change in the rankings? Sure.

Or, the fundamental change in the rankings has been taking place slowly over an extended period of time, resulting in "penalty speculation", which may not be penalization at all, but rather a different method of "scoring" documents for relevance, which caused some documents previously considered "highly relevant" to currently be considered "not so relevant".
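As an illustrative contrast between strict "Boolean" term matching and a phrase-association view of relevance - the tiny related-phrase map and documents below are invented, a sketch of the idea rather than the patent's actual method:

# Hypothetical related-phrase map; a real system would learn these associations from data.
RELATED_PHRASES = {
    "red widgets": {"widget reviews", "widget prices", "blue widgets"},
}

def boolean_match(query, document):
    # True only if every query term literally appears in the document.
    doc = document.lower()
    return all(term in doc for term in query.lower().split())

def phrase_score(query, document):
    # Count the query phrase plus any of its related phrases found in the document.
    doc = document.lower()
    phrases = {query.lower()} | RELATED_PHRASES.get(query.lower(), set())
    return sum(1 for p in phrases if p in doc)

doc = "Our widget reviews cover widget prices across every model."
print(boolean_match("red widgets", doc))  # False - the literal terms are missing
print(phrase_score("red widgets", doc))   # 2 - related phrases still count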

Justin

BTW: (Tongue In Cheek) Do they assess the Phrase Based Penalty using a minus sign? I can't remember.

<added>
I forgot: the above quote is from the patent app., which has something to do with detecting spam, but also seems to relate to the indexing of documents:

An information retrieval system uses phrases to index, retrieve, organize and describe documents. Phrases are identified that predict the presence of other phrases in documents. Documents are the indexed according to their included phrases…

I believe there is a typo in the quote above from the same app., which should read: "Documents are then indexed…"
</added>

Crush

Msg#: 3305282 posted 4:07 pm on Apr 9, 2007 (gmt 0)

We have taken a mega-trusted domain that we had other plans for and uploaded all the content that had been banned. The site is being crawled and we are re-ranking for phrases that we had been splatted for.

If you have a spare few grand to play with on a trusted domain, maybe it is a short-term route out. When we're losing 1000's per day, it is not an issue for us. The only problem is dupe content if we ever regain rankings with the original domain. I think a 301 will be the ticket then.

julinho

Msg#: 3305282 posted 10:28 pm on Apr 9, 2007 (gmt 0)

Re: Fundamental change at google search:

I think that this patent [appft1.uspto.gov] has not been given due attention by most webmasters.

From what I understood, this patent attempts to come up with new techniques other than term-based and link-based methods to determine the importance of documents.

How do I know that? Google is saying so.

[0008] Each of these conventional methods has shortcomings, however. Term-based methods are biased towards pages whose content or display is carefully chosen towards the given term-based method. Thus, they can be easily manipulated by the designers of the web page. Link-based methods have the problem that relatively new pages have usually fewer hyperlinks pointing to them than older pages, which tends to give a lower score to newer pages.

[0009] There exists, therefore, a need to develop other techniques for determining the importance of documents.


callivert

Msg#: 3305282 posted 2:03 am on Apr 10, 2007 (gmt 0)

There exists, therefore, a need to develop other techniques for determining the importance of documents.

That patent is about ranking pages based on the amount of traffic they get in a one-month period, and the type of traffic they get (e.g., excluding people who are "affiliated with the content").
But implementing that wouldn't cause a fundamental change. In fact, it would be a stabilizing factor, because it's a sort of "rich get richer" system.

Instead, having read the whole thread, the most convincing answer to me (for what it's worth) was the theory that they're taking a much more aggressive stand on supplementals.
For one thing, this would explain the huge difference in responses around here, responses from "I don't see any change. What's everyone talking about?" to "help! my site has been trashed!"
It makes sense too. The web is expanding, in part driven by rampant spamming practices. Every increase in the web is an increase in resource expenditure for the search engines.
They can't index it all. They don't want to index it all. So they're actively making decisions about what they will and won't index.
And, again to combat spam, they may have even disenfranchised supplemental pages (i.e. they no longer get to vote).

And here's a wild conjecture: That could have a knock-on effect, where pages that had a lot of backlinks from supplementals lose their backlinks, and therefore themselves become supplemental- causing pages that they link to to lose links and so on. If that were true, you would see massive upheaval in some parts, and in other parts blue skies, a light breeze, and plain sailing.
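A minimal sketch of that knock-on effect - the link graph and the two-link cutoff below are invented, purely to show how such a cascade could propagate:

# Pages supported mainly by supplemental pages lose their counted links and may
# themselves go supplemental, which can cascade through the graph.
def propagate_supplemental(inlinks, initially_supplemental, min_counted_links=2):
    supplemental = set(initially_supplemental)
    changed = True
    while changed:
        changed = False
        for page, sources in inlinks.items():
            if page in supplemental:
                continue
            counted = [s for s in sources if s not in supplemental]
            if len(counted) < min_counted_links:
                supplemental.add(page)
                changed = True
    return supplemental

inlinks = {
    "A": {"scraper1", "scraper2", "B"},
    "B": {"A", "scraper3"},
    "C": {"A", "trusted-site", "other-trusted"},
}
print(propagate_supplemental(inlinks, {"scraper1", "scraper2", "scraper3"}))  # A and B cascade; C survives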

<edit:> I just thought of a possible sting in the tail of the "ranking by existing traffic" algorithm. That is, Google could look at a site and say "if we didn't exist, how much traffic would this site get?" and if the answer is "none", then down into the bottomless pit you go...

skyewalk

Msg#: 3305282 posted 7:39 am on Apr 10, 2007 (gmt 0)

Do you know what, I really think I've figured out the 950 thing.

I have two sites, both with the same design and structure and pretty similar link profiles (though totally different content).

One has been fine on Google all the time.
The other was doing well, then went 950, then really good rankings, then 950...
This site has been in and out of the 950 thing four times in the last week, despite no real changes to what is going on on the domain.
Whereas the other site stays solid as a rock. Both sites are pretty comprehensively indexed, with very few supplementals.

The explanation - I'm serious - Google has got some kind of data issue at the moment, and all will be back to normal in a month or so.

I actually think this is more likely than all this speculation about similar and different anchor text etc - which couldn't ever explain the performance of my sites.

My only other theory is that the good site links quite a bit to the site that does badly (for very good reason, for users), which could be hurting me. But I haven't heard anyone else saying anything like that, so I'm going with the data issues.

julinho

Msg#: 3305282 posted 9:47 am on Apr 10, 2007 (gmt 0)

That patent is about ranking pages based on the amount of traffic they get in a one-month period, and the type of traffic they get (e.g., excluding people who are "affiliated with the content").
But implementing that wouldn't cause a fundamental change. In fact, it would be a stabilizing factor, because it's a sort of "rich get richer" system.

If it were based on traffic alone, it would indeed be "rich get richer".

1. A computer implemented method of organizing a collection of documents by employing usage information, comprising: receiving a search query; identifying a plurality of documents responsive to the search query; assigning a score to each document based on at least the usage information; and organizing the documents based on the assigned scores.

2. The method of claim 1, wherein the documents are hyperlinked pages from the world wide web.

3. The method of claim 1, wherein the usage information for a document comprises the number of users who have visited the document.

The method starts with "receiving a search query". Only later on does it mention visits to the document.

I may be reading it wrong, but I think that this makes a big difference.
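Reading claims 1-3 that way, a toy sketch might look like this. The mini "index", the query and the visit counts are invented; this is not the patent's actual formula, just the query-then-usage-score ordering it describes:

# Documents responsive to a query are scored by usage information (visit counts here)
# and then organized by that score.
INDEX = {
    "red widgets": [
        {"url": "widgetmaker.example.com/red", "visits": 1200},
        {"url": "blog.example.com/red-widget-story", "visits": 90},
        {"url": "forum.example.com/thread-123", "visits": 15},
    ],
}

def rank_by_usage(query):
    docs = INDEX.get(query, [])                       # identify documents responsive to the query
    scored = [(d["visits"], d["url"]) for d in docs]  # assign a score based on the usage information
    return [url for _, url in sorted(scored, reverse=True)]  # organize documents by assigned score

print(rank_by_usage("red widgets"))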

rekitty

Msg#: 3305282 posted 2:21 pm on Apr 10, 2007 (gmt 0)

And here's a wild conjecture: That could have a knock-on effect, where pages that had a lot of backlinks from supplementals lose their backlinks, and therefore themselves become supplemental- causing pages that they link to to lose links and so on. If that were true, you would see massive upheaval in some parts, and in other parts blue skies, a light breeze, and plain sailing.

That's not wild conjecture Callivert; what you describe is exactly what I'm seeing. Months and months of slow boiling. Google has been cooking the "unimportant" pages out of their main index into the supplemental index.

This is proving very effective in taking out entire categories and neighborhoods of links Google doesn't want influencing their results: huge scraper sites, forum and blog comment spam, reciprocal links, crappy directories, etc. Most all have been boiled off supplemental. The result is a fundamental shift in Google's distribution of link popularity across the web.

Unfortunately this has also sent many important pages for obscure queries supplemental. Users can no longer count on Google consistently returning good results for obscure queries.

A friend of mine was extremely frustrated when she couldn't find a page a second time for the query "STATENAME fingerprint card supply hours". It gave her the exact page with the hours of operation as the number one result the first time she tried. Magic! Two weeks later she needed another card after her prints for her bar application were smudged. The page was nowhere to be found and she was very frustrated. I did a bit of investigation, and the result she was looking for was on page 3 and had gone supplemental. Yahoo and MSN couldn't find it either, but it was a step back for Google.

Bentler

Msg#: 3305282 posted 2:39 pm on Apr 10, 2007 (gmt 0)

If it was based on traffic alone, indeed it would be a "rich get richer".

From what I've seen, G will periodically drop a page -- some say penalize it in response to a spam trigger, but the end result is the same. I think this is when they sample the traffic in earnest; if the dropped page continues to get traffic because of its natural importance, it will bounce back in the SERPs. (Speculation, based on personal observations.) No traffic? Supplemental.

So even the rich can get fluxed if they hit a bad traffic cycle, and such a sifting design would tend to open up opportunities for new pages to get traffic for the algorithm to use for comparison. I think of it as active experimentation on their part. I think the other search engines are sensitive to traffic too, though, which would tend to dampen the sifting and shore up the rich pages; so it bears out that the pages that perform well in other search engines, and that buy advertising, will likely maintain position.

timchuma

Msg#: 3305282 posted 6:57 am on Apr 11, 2007 (gmt 0)

How time flies - I've been using the internet, and the web in particular, for 12 years now. People forget that once a search engine stops returning decent search results, they will go and find a new one. It happened with Yahoo, AltaVista and countless other search engines, and now Google's time has come.

I did have a quick look at Live Search and was surprised that it actually listed my sub-domain correctly, something which Google still hasn't managed to do even though I've had it on my site for three years.

Marcia

Msg#: 3305282 posted 8:11 am on Apr 11, 2007 (gmt 0)

Thread here at WebmasterWorld from 2005 on the Usage Statistics Patent [webmasterworld.com]

This is something else entirely, though usage stats could come into play with using phrase-based indexing for personalization.

Exactly the point. Over a certain percentage of identical anchor text on inbound links cannot be done with independent decisions. Not 80-90% of the people linking will use the same exact wording of "money keywords" in the anchor of links, it has to be contrived. Natural development would be indicated by a certain amount of variation.

There's a great quote that's often applied to Hollywood or the legal profession: "The secret of success is sincerity. Once you can fake that you've got it made." Are we talking about faking natural development, or about natural development? ;)

Yessir, I believe that's exactly what some very sincere people are talking about. ;)

Hey, does anyone remember where in those phrase-based patents it talks about taxonomies being updated on the fly? I wonder how that would affect co-occurrence figures for phrases that could impact any given page kind of "on the fly", if there's that much flux - and how much in the way of unexplained ups and downs would occur as a result of it.

It can't get more "on the fly" than what some people are seeing and experiencing in the SERPs, it's like a yo-yo - up and down, down and up, up and down, down and up.

If you swung a shiny pocket watch on a chain back and forth, back and forth, like the SERPs move back and forth, you could hypnotize a cat.

[edited by: Marcia at 8:25 am (utc) on April 11, 2007]

decaff

Msg#: 3305282 posted 8:30 am on Apr 11, 2007 (gmt 0)

There's a great quote that's often applied to Hollywood or the legal profession: "The secret of success is sincerity. Once you can fake that you've got it made." Are we talking about faking natural development, or about natural development? :)

Nice one Marcia..

The reality is there is a ton of very sincere "insincerity" online... people trying to accelerate what is a very natural and organic process of putting up a web site (and associated web pages) and working to grow it over time... having the engines find your resource and it eventually finding its place in the search listings...

Google has, no doubt, delved deeply into the dynamics of this and has worked to include some checks and balances in the algorithms...

Marcia

Msg#: 3305282 posted 9:12 am on Apr 11, 2007 (gmt 0)

That's Robert Charlton's quote, decaff, and I think he's more spot on than a lot of people would like to admit.

jd01

Msg#: 3305282 posted 3:26 pm on Apr 11, 2007 (gmt 0)

At the risk of turning this into a 'natural development' thread, I wonder how 'natural development' would be defined at the inception of a website?

Assuming a website developer only has one website, which they have just completed building, is it 'natural' to 'promote' the website to other existing websites / website operators?

Because back before I was a WebmasterWorld member, I built my first website. The first thing I did was to let some other people with websites, who offered "directories" and "resources" in the same area as my website, know it existed, and ask them to help me "promote" it with a link. It seemed to make sense.

Would this be considered 'natural' and 'sincere', or would it be considered 'fake' development?

Justin

<added>
BTW I agree with Marcia, Robert Charlton and decaff in thinking there are many who go too far, but I think we should maybe give future readers some insight as to what might be considered natural if we are going to point out what might be considered unnatural (fake) promotion.
</added>

Edited for more appropriate wording; to include decaff; trying to avoid verbiage, but getting close.

glengara

Msg#: 3305282 posted 3:44 pm on Apr 11, 2007 (gmt 0)

*Would this be considered 'natural' and 'sincere', or would it be considered 'fake' development?*

Doesn't G even recommend doing this in their webmaster guidelines?

jd01

Msg#: 3305282 posted 4:07 pm on Apr 11, 2007 (gmt 0)

I believe so, but on reading the thread it appeared that requesting links could be considered "faking sincerity" in development, which I do not think is necessarily the case, and I thought it would be beneficial to clarify.

(Briefly, without going on a 10 page tangent.)

E.g., if I built a new site and "sincerely" requested two links with the title of the page in them, 100% of my inbound links would have the same text in them, which is greater than the 80-90% of inbound links with identical anchor text.

Justin

I guess my question is: is there some threshold regarding the stated 80-90% - "you must be a manipulative spammer if your identical link text % is above X" - or is the consensus that you will be penalized for following the guidelines and doing what seems natural?

fmimoso

Msg#: 3305282 posted 10:25 pm on Apr 11, 2007 (gmt 0)

The greatest mistake you can make in life is to be continually fearing you will make one.
— Elbert Hubbard (1856-1915), The Note Book

What do you think about this quote?

jd01

Msg#: 3305282 posted 1:34 am on Apr 12, 2007 (gmt 0)

Nice second post fmimoso!

I think it's a great quote, and I have been known to do some things which many would consider "counter SEO", because I think our job is to build better websites and let visitors find something unique, informative, easy to navigate, understandable, and well written by an authority on the subject they present - and then let search engines do their job.

I'm about sure there is an SEO site somewhere containing a page on SEO standards and spam which, in paraphrase, says "the job of an SEO is to improve the quality, quantity, clarity, and value of a website" - not to make sure you vary your inbound link text percentage by at least N or greater. (Which, if you do intentionally for the sole purpose of increasing (or not decreasing) your rankings in a given search engine, would be just as manipulative and "fake natural" as requesting identical link text for the same reason.)

I usually take what I read here with a grain of salt or two.
(and sometimes a lime with one of those 'shot' thingies.)

By your quote, if you mean "don't worry as much about mistakes, and concentrate on building better websites internally, so the external portion will take care of itself", then I would absolutely agree.

Justin

I probably shouldn't have gotten us all off topic. Sorry.
We should probably return to our fundamental change discussion now.
I promise no more tangents for at least 24 hours.

Marcia

Msg#: 3305282 posted 2:10 am on Apr 12, 2007 (gmt 0)

I'm about sure there is an SEO site somewhere containing a page on SEO standards and spam which in paraphrase says, "the job of an SEO is to improve the quality, quantity, clarity, and value of a website."

Clarity: I've got something written on a page somewhere that describes SEO as just that, or something similar, without mention of spam, though. Roughly paraphrased, it's defining SEO as constructing a site so that it's made clear to the search engines what the site (or page) is about.

not to make sure you vary inbound link text percentage by at least N or greater. (Which if you do intentionally, for the sole purpose of increasing (or not decreasing) your rankings in a given search engine...

How about if people who insist, or strongly "suggest", that links be done in a certain way stop insisting on controlling the process and let at least "some" happen naturally, as nature intended? ;)

...would be just as manipulative and 'fake natural' as requesting identical link text for the same reason.)

I've theorized that it isn't the fact that some sites have reciprocal links that's the problem, but that this very thing may contribute to a problem - insisting that certain link text and descriptive text be used for the recips back.

People

Msg#: 3305282 posted 12:11 pm on Apr 12, 2007 (gmt 0)

I read that Google recently released its corpus of words to the public.

I don't know if this adds to the relevancy of this post. It seems to me that if I had lots of computational power and a lot of data, I would use context-measurement methods to determine the relevancy of inbound links.

If a link said "link to red widgets", I would check: is this a text, is it a text about red widgets, and if not, how close does it come to texts about red widgets or similar subjects? This is just ordinary text-recognition stuff, which is easily stored and indexed.

The conclusion, I guess, is that unless you have spam or bad text on your web sites, inbound link variation or non-variation shouldn't make a difference.

Of course, everything is relative: if a lot of the text about red widgets is spam, then the measurer might start to believe that this is what texts about red widgets look like - but then again, so might you, if you hadn't been on the other side of the wall.
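A minimal sketch of that kind of context measurement: compare the words around an inbound link with the words on the target page and score their similarity. The texts are invented, and raw word overlap is only a stand-in for whatever text model would actually be used:

import math
from collections import Counter

# Cosine similarity between two bags of words (crude stand-in for "context measurement").
def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

link_context = "a helpful guide to red widgets with prices and reviews"
target_page = "red widgets guide covering widget prices reviews and buying tips"
unrelated_page = "holiday cottages on the west coast of ireland"
print(f"on-topic target:  {cosine_similarity(link_context, target_page):.2f}")
print(f"off-topic target: {cosine_similarity(link_context, unrelated_page):.2f}")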
