
Natural vs. Un-natural - in SEO and the Google Algorithm



6:27 pm on Nov 8, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I have been reading threads here for a few years and once in a while, I'll have the proverbial "lightbulb" go off in my head when looking at my sites and others. So I would like to discuss...

What is natural and what is not.

I run a very small web development company that develops and hosts sites for other companies. I have hunted down posts from some of the Google posters (i.e. Adam, Matt, GoogleGuy, etc.) here and on other forums. Every once in a while, some terms pop up that I really try to grasp. Things like "manipulation," "natural," "genuine," "trust," "contrived," "authority" and so on. I think my brain just had one of those light bulb moments...

I consider myself fairly good at development... kind of a hack in SEO, but I still get fairly good rankings for a lot of my web sites. Not sure how, but that's the truth. But there are some things I do when it comes to development that actually may help in the SEO process. These are not things that I have conceived, but part of a "natural" process that I really never thought of before.

For example, let's look at the meaning of the word "natural." Here are just the first 8 I found...

1. existing in or formed by nature (opposed to artificial): a natural bridge.
2. based on the state of things in nature; constituted by nature: Growth is a natural process.
3. of or pertaining to nature or the universe: natural beauty.
4. of, pertaining to, or occupied with the study of natural science: conducting natural experiments.
5. in a state of nature; uncultivated, as land.
6. growing spontaneously, without being planted or tended by human hand, as vegetation.
7. having undergone little or no processing and containing no chemical additives: natural food; natural ingredients.
8. having a real or physical existence, as opposed to one that is spiritual, intellectual, fictitious, etc.

Of course, all of these have to do with nature, but I bet if you read these closely, you will see what I am getting at. All of these definitions could be reworded slightly and made to fit a "natural" progression of a web site from its infancy to maturity...

1. existing in or formed naturally (opposed to artificial): a natural web site - When I get a call from a prospective client, I make it clear that I know nothing of their business: how to run it, how to market it, how to make their widget, etc. My clients are the ones who know their product and expertise best. This is why they actually have a business. If they are selling red widgets, I need them to supply me with information on red widgets. I know development, but nothing of red widgets, so I need to depend on them for the information, copy, images, or anything else that has to do with red widgets. I don't run out on the web and find information on red widgets for copy. I don't take images from other sites to use. I don't take anything that I think I may know about red widgets into consideration at all. They are the experts, and I need to rely on them and their expertise. They dictate what their information is and how that information is linked. They also tell me what they believe is the most important information, where that information should be, and how they would like their clients to find it. This is what I would call a natural website. It really has nothing to do with me; I just interpret what they want on their site. Sure, I will make suggestions and sometimes even fight for a keyword in a link, but ultimately, it is their business.

2. based on the state of things on the web; constituted by websites: Growth is a natural process. - Duh... I laugh when I read posts like "I added 10,000 pages to my site and it was dropped from the SERPs!" Did this guy actually sit down for a couple of years and write all those pages of genuinely unique content? I doubt it. At best, it was ripped off, or it is pages upon pages of slightly different database calls for multiple product descriptions. Let's face it folks, there is no need to have G index 10,000 pages of your site. My largest database is about 10,000 products and I don't care if they get indexed or not. I would rather have the user hit a category page before getting to the actual product... it gives them choice. All my sites are developed the same... they start with the home page with a menu on the left, and as I get information, I add links to the menu. These pages are the ones that I think are important in the SERPs. Kind of like a tree growing... it starts small and branches off into other parts. These parts get bigger and can branch into other parts. It's the large "trunks" that you need to pay attention to. Sure, you want them to view a product or some piece of information, but the first goal should be to get them there. I tend to break all products into a few main categories and go from there. It seems to work for me. And if some of these pages get indexed, I would consider it a bonus.

3. of or pertaining to sites or the web: natural beauty. - Things have to look good, folks. And information has to be readily available. A page of 10,000 links pointed to everywhere is not very pleasing to the eye. This may not have anything to do with your ranking, but I bet a pleasing, well thought out site will get you more repeat business. By this I mean the ability for a user to find useful information in a fast, easy and pleasant way. I have a rule, especially for e-commerce type sites... "Four clicks and you're out." What does this mean? It means that if a user cannot find information or order a product, find a phone number or address, find directions or anything related to the company in four clicks, you had better seriously consider a better way to arrange your site. I have been doing this for years and I find it very helpful to keep this in mind. Take this site, for instance... I type in "webmasterworld.com," click on "Google Search News," click on a subject, and then I post a response... simple, elegant and very easy to do. Think about it, folks.
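The four-click rule above can actually be checked mechanically. Here is a minimal sketch of the idea as a breadth-first search over a site's internal links; the site graph and page names are invented for illustration, not taken from any real site:

```python
from collections import deque

def click_depths(links, start):
    """Breadth-first search over an internal-link graph.
    Returns the minimum number of clicks from `start` to each reachable page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: home -> category -> product -> order form.
site = {
    "home": ["widgets", "contact"],
    "widgets": ["red-widgets"],
    "red-widgets": ["red-widget-42"],
    "red-widget-42": ["order"],
}

depths = click_depths(site, "home")
# Pages that break the "four clicks and you're out" rule:
too_deep = [page for page, d in depths.items() if d > 4]
```

If `too_deep` comes back non-empty, some page takes more than four clicks to reach and the navigation probably needs flattening.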

4. of, pertaining to, or occupied with the study of website science: conducting website experiments. This is something we, as SEO people, do all the time. You need to experiment to see what works and what doesn't. The problem I have found is that a bit of experimenting can be devastating, but you must experiment nonetheless. I have a site, a hobby site, that generates no income. It started out as an experiment to see how and why SEO works. Now, the site in question happens to be something that I am very interested in. I have a forum, informational pages, lots of links to other related sites, a very comprehensive directory of related sites (very much like Yahoo), and this site actually does very well in the SERPs. That is not the point, though. The point is it started out as something I could experiment with. Granted, I now really enjoy the site and have over 5000 members in the forum, but I will still tweak it once in a while to see how changes affect ranking. The bottom line is that if this site drops out or ends up on page 50 of the SERPs, it is not that big of a deal for me. I do not end up on welfare and hop on the forum here and slam G for doing me wrong. Let's face it folks, if you want to get into the SEO business, you need to experiment. I would recommend a throw-away site just to see what works and what doesn't.

5. in a state of uniqueness; unique, as in uncultivated information. This has been beaten to death on the forums here. YOUR CONTENT HAS TO BE UNIQUE! What exactly does this mean? Take a look at this directory structure...

lodging/
    north-america/
        us/
            minnesota/
                st-louis-county/
                    hibbing/

There are probably hundreds of thousands of sites out there for lodging. Optimizing for such a site is going to be tough. Not only are you competing with those thousands upon thousands of sites, you are probably going to be competing with some pretty savvy webmasters, especially in the SEO business. The likelihood of you getting any of your pages in the first ten is pretty close to impossible. That's just the way it is. If you are planning on starting a lodging site like this, you are pretty much adding to the Web something that it has way too much of. Lodging in North America gets a bit easier, but that is still going to be a tough nut to crack. Even narrowing it down to the US will be difficult. Now we get to Minnesota. It starts getting a bit easier. Then to St Louis County, easier still. We end up at Hibbing. This is going to be the easiest. Why? Well, it doesn't take a rocket scientist to figure this out. There probably are no web sites dedicated to lodging in Hibbing! You could probably develop this site and have it rank #1 for "lodging in Hibbing" within a month, if you know what you are doing. In fact, you could use this as your experimental site ;-) Uncultivated information is the key here. THAT is what it means to be unique. Not rewriting something that has been beaten to death, but writing something that has never been written about! I know this is getting tougher as time passes, but that is just the way it is. You need to be frank with your clients about this too. If they come to you and want to sell or advertise something that has been beaten to death, you need to tell them the likelihood of great rankings is pretty much nil. If they can narrow their focus to exactly what they want in the way of users, and how those users are going to find them, you are going to have a better chance of helping them.

6. growing spontaneously, without being manipulated or forced, as in an authority site. Okay, I am kind of stretching on this one. But this has to do with natural links. Let's go back to the guy who added 10,000 new pages. What if that guy added 10,000 new reciprocal links? I don't think that looks natural. Come on people... A good, well written site with useful information that is unique will garner links on its own. You need to kick-start it sometimes (adding links to some popular directories) but if the content is good and unique, they will come. This is what happened to my hobby site. I was interested in the subject, found useful information, added a forum for others who were interested, made outbound links to other sites that had useful information, finally created a directory to make the outbounds easier for users and to manage myself, and generally created a buzz. Now I have other forums from other sites linking to specific topics in my forums. Links coming in to specific pages to find information, etc. The site became an authority. What is really wacko is the fact that I really am not an authority on the subject, it just sort of happened on its own.

7. having undergone little or no processing and containing no manipulative additives: natural sites; natural code. I would refer here to black hat stuff. You are going to get caught! Here is a case in point. I took over a site that was being hosted on another server. When editing the site, I noticed that the old hosting company was linked at the bottom of the home page. You know what I mean... "Developed By..." I called the client and asked if I could change that link to my company, and was given permission to do so. I noticed that this link was on some other pages too, so I did the ol' "Replace All" and had my link added. I then started redoing pages and working along developing the site. For quite a while... no matter what I did, I could never get that site to rank better than the 3rd page of G. I was baffled. Unique content, very narrow subject, some unique products, but still massive problems with ranking. I had garnered some very good inbounds and outbounds, but was still having problems. I was at my wit's end, and after telling the client I thought I could do well with ranking, I had egg on my face. Well, after a couple of months of this, I was using link: on Yahoo for my company's link and noticed a hundred or so inbounds from the client's site. I started clicking on these links and, what the heck? I couldn't find any reference to our company! I did a "view source" and found the link, and guess what... most of the links were for a small gif file that was used on the bottom of each page. In other words, the small gif file was the tag line at the bottom of each page, and it linked to my company's web site. Now, that certainly looked bad to me, and I am not certain that the change I immediately made was why the site FINALLY got onto page 1, but 2 weeks after removing that link on every page, the site shot up to #2. So what does this mean? Be careful! Do not hide links or text, or try to manipulate code in any way to fool G. You will get caught.
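Sitewide image-only links like that footer gif are easy to detect from the HTML alone. Here is a minimal sketch using only Python's standard-library html.parser; the markup in the example is invented to mirror the situation described above:

```python
from html.parser import HTMLParser

class ImageOnlyLinkFinder(HTMLParser):
    """Flags <a> tags whose only content is an <img> -- the kind of
    sitewide footer-credit link described in the post above."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = None
        self.saw_text = False
        self.saw_img = False
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href")
            self.saw_text = False
            self.saw_img = False
        elif tag == "img" and self.in_link:
            self.saw_img = True

    def handle_data(self, data):
        # Any visible text inside the anchor means it is not image-only.
        if self.in_link and data.strip():
            self.saw_text = True

    def handle_endtag(self, tag):
        if tag == "a":
            if self.saw_img and not self.saw_text:
                self.flagged.append(self.href)
            self.in_link = False

# Hypothetical page: one normal text link, one image-only credit link.
page = ('<p>Text <a href="/about">About</a></p>'
        '<a href="http://example.com/"><img src="credit.gif"></a>')
finder = ImageOnlyLinkFinder()
finder.feed(page)
```

After feeding a page, `finder.flagged` lists the targets of image-only anchors, which you can then review by hand.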

8. having a real or physical existence, as opposed to one that is an affiliate, fictitious, etc. - I have found that it is much easier to rank companies or businesses that are real brick and mortar establishments, or companies that manufacture or warehouse their products. Why? Because they are a viable company. They have an address, offices, people working for just that business, etc. Now, there are a lot of sites out there that sell stuff, but they really don't exist in a brick and mortar sense. Their sole purpose is to rely on the SERPs for their business, and if they take a hit in their ranking, they starve. This is another point that has been beaten to death in the forums here. Folks, if you rely 100% on G for your revenue, things can and will get tough for you. I don't rely on G for my business - though it's nice to have - and I still find myself networking, calling prospective clients, advertising, etc., all without using the Web at all. If my ranking tanks, I'm still in business. The same is true of all my clients. You cannot rely on something as finicky as ranking as your sole source of revenue. I even got the boot from DMOZ once when I tried to run an affiliate site that had the same address as the company I work for. They claimed that the sideline site I had created wasn't the sole purpose of the business, so they just took it out. This was back when DMOZ played a significant role in ranking in Google. So, if you are a webmaster who has 50 sites that sell various products, all with the same business address and/or phone number, you may want to rethink your business plan. DMOZ looks to see if you have a contact page with your address and phone number; I bet it would be easy for G to do the same. Or do you have a contact page at all? Maybe you don't want people to find you... hmm?

Anyway, sorry this got so long-winded. I just wanted to try to contribute some very basic observations from reading posts and how I find things. Some of the posts here are laughable. It always seems to be a game... how many times can I use a keyword before I trip a filter, how many pages can I add a day, how many links can I add a day, how many links can I have on a page, etc., etc., etc. Don't you see? Google creates the algo to try to find what is natural, and we try to find ways to create what appears natural. Why not just create websites for the user? I think you will find everything else will fall into place.

Okay, I tackled the word "natural." Anyone want to try the others?

"manipulation," "geniune," "trust," "contrived," "authority"

Comments are welcome but remember, I am just a hack in SEO...

[edited by: tedster at 3:55 am (utc) on April 17, 2008]
[edit reason] spelling fixes - member requested [/edit]


2:00 am on Nov 14, 2006 (gmt 0)

Worried about bull#*$! domains not counting? No problem: just create a network of bull#*$! domains, all using keywords in the URL, and link them together. Google is copying MSN by regarding URLs as important, so you get in straight away with a keyword link to the main site, and the network creates the PR.
I am just observing not participating. This technique is pure spam.


3:44 am on Nov 14, 2006 (gmt 0)

10+ Year Member

jtara said:
Google should offer consumers the ability to permanently opt-out of seeing these kinds of results.

Wow! You have just presented one of the most significant statements about how G should handle this that I have ever seen! What a concept! Let the user decide if they really want to see only the sites of local service and/or product suppliers, or all. i.e.: &results=local or &results=national, or &results=anywhere, with anywhere being the default. This could also apply to AdWords results, as well as organic.

[edited by: RonnieG at 3:46 am (utc) on Nov. 14, 2006]


7:04 pm on Nov 14, 2006 (gmt 0)

10+ Year Member


You have just presented one of the most significant statements about how G should handle this that I have ever seen!


It's been there for a while - with maps and everything!

Just type in 'plumbers Tampa'. ...PAGES of them!

It doesn't get any more local than that - excluding the aggregators.

What you are getting at correctly implies that Google needs to educate USERS more on how to use Google. Did YOU know about Google Local?


7:37 pm on Nov 14, 2006 (gmt 0)

WebmasterWorld Senior Member jtara is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Google Local doesn't really solve the problem, nor do the plethora of other specialized Google search sites.

The problem with Google's approach is that it is overly simplistic. There is no way to combine the various specialized categorizations. Further, many of these specialized searches actually search a different index or database, which further restricts the opportunity for combining them.

What if I want to do a local academic search? I am back to keywords to do the localization.

Google does do categorization of sites in order to have these specialized searches. But they don't expose it to their users, who I guess they assume are too dumb to use the categories.


11:49 pm on Nov 14, 2006 (gmt 0)

Google Local, last time I looked, is powered by outside databases. So it's not really part of Google at all. It's drop-in data, sometimes powered by outside sources and sometimes by a paid feed from companies that specialise in this type of data collection. It's a very different creature from the main search results.


5:41 am on Nov 15, 2006 (gmt 0)

10+ Year Member

jtara: Can you give an example search? What's a 'local academic search'?

Pirates: Of course the results are 'two different creatures' - one is entirely local, the other is worldwide! Who cares if they're an amalgam of databases? The information is there, isn't it? It's fairly complete - for 90%+ of users, isn't it?

Do you want local B&M results, or do you want regular listings? How can you mix the two?

You can't - it would be 'unnatural'. (That's why there's Google Local!) If you tried to mix B&M listings with regular results, the regular results would be flooded with B&Ms, with no view of the 'wider world' part of the 'world wide web'.

You want local results, search local; you want global results, search general...

GLocal is another example of how PhDs see 'natural vs. unnatural' - it's unnatural to mix entirely local results with 'worldwide results'. You can't have all meals on one plate, yet. :)


2:50 am on Nov 16, 2006 (gmt 0)

Pirates: Of course the results are 'two different creatures' - one is entirely local, the other is worldwide! Who cares if they're an amalgam of databases? The information is there, isn't it? It's fairly complete - for 90%+ of users, isn't it?

No, I think the difference is a vital thing for an SEO to know, and by the way, there is no longer a worldwide result on Google. Google Search results rely on a database created by their spider. Local search results on Google come from an entirely different source. A really good search engine should be able to determine the location and relevance of a site with its own spider, in my opinion, and not rely on other people's databases to provide local results. I find Google Local results poor, I'm afraid, and this is probably due to the databases they rely on.

[edited by: Pirates at 2:51 am (utc) on Nov. 16, 2006]


9:49 pm on Jan 10, 2007 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

A thought-provoking and fascinating thread that happened to get overlooked because of its proximity to PubCon in November.

aka: *bump*


10:46 pm on Jan 10, 2007 (gmt 0)

10+ Year Member

What I have not seen in this thread is a reference to what G might be doing with the data they are collecting from the G toolbar. They know how long surfers stay on pages, and which pages are added to favorites. IMHO, in time they will use this data in deciding natural vs. un-natural, and in deciding which pages and sites are deserving of trust, and which sites are just in the way and should not be listed high in the SERPs.

Back to Watching


11:33 pm on Jan 10, 2007 (gmt 0)

10+ Year Member

Duh... I laugh when I read posts like "I added 10,000 pages to my site and it was dropped from the SERPs!"

This has to be the most insane statement that I have read time and time again in thread after thread.

With 10,000 pages, the subject matter of a domain is so diluted. Even though the Google algo is said to be page-based... it is my opinion the algo also takes the big picture (the domain as a whole) into account, in part.

After about 10 pages of content dealing with a subject... about all there is to be said has been said. Instead of 10,000 pages under one domain... a more "natural" approach would be 1000 domains with 10 pages each.

In the "good ole days" this was our approach to getting top listings for targeted keywords. It still works for me...author well written content, validated code and a couple of relevant inbound text links and I have no problem landing in the top 10 for targeted keywords within a couple of weeks...and remain there 90% of the time.

It's so simple, why do we make it so difficult? It's like we can't see the forest for the trees sometimes.


11:54 pm on Jan 10, 2007 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

Great post! One of the other natural things you could include is fixing bad and broken code. I think any time you do that, Google shows a site some love!


12:57 am on Jan 11, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

It seems unreasonable to penalise simply because they went online with it all at once

Something else to consider - is not indexing a url immediately the same thing as a "penalty"? I think not. I suggest that the quoted statement has a complementary version that also holds a lot of truth,

"It seems unreasonable to index and immediately rank a large number of new urls, simply because they went online all at once."


1:24 am on Jan 11, 2007 (gmt 0)

5+ Year Member

Very interesting thread…
While SEs can’t perceive the meaning of ‘natural’ as humans do, their algos do perform various mathematics (duhh) to try to get there. Zipf’s law provides one such formula. So what is the natural growth of a site, or rate of link growth, etc.? Zipf’s law, and/or other laws derived from it, can offer a clue (as was mentioned before, some SEs have a lot of historical data, so it’s not hard to extrapolate). Don’t get me wrong, I am not saying this is THE way, or the only variable, that SEs potentially use…
As always, Tedster’s answers are insightful, and I share the opinion that there is a comparison to a known-good profile (footprint) when looking into types of sites. A while back I came across a paper that actually did this type of analysis and presented some interesting data and graphs of inbound/outbound links for various types of sites (ecommerce, academic sites, etc.). If I find the info I’ll post it.
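For readers unfamiliar with Zipf's law: it says the frequency of the item at rank r is proportional to 1/r, so a "natural" distribution of, say, inbound links per page falls off in a characteristic way. Here is a minimal sketch of how one might compare observed counts against that expectation; the link counts are invented for illustration, and nothing here is claimed to be what any search engine actually does:

```python
def zipf_expected(total, n_items, rank, s=1.0):
    """Expected count of the item at `rank` under Zipf's law:
    frequency proportional to 1 / rank**s, normalized so the
    expected counts sum to `total`."""
    harmonic = sum(1.0 / k ** s for k in range(1, n_items + 1))
    return total * (1.0 / rank ** s) / harmonic

# Hypothetical inbound-link counts per page, sorted most to least popular.
observed = [100, 50, 33, 25, 20]
total = sum(observed)
expected = [zipf_expected(total, len(observed), r)
            for r in range(1, len(observed) + 1)]

# A crude "naturalness" score: mean relative deviation from the Zipf curve.
# Near zero means the distribution looks Zipf-like; large values suggest
# an artificial pattern (e.g. every page with identical link counts).
deviation = sum(abs(o - e) / e for o, e in zip(observed, expected)) / len(observed)
```

Since the invented counts roughly follow 100/rank, the deviation comes out small; a site where every page had exactly the same number of inbound links would score much higher.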


12:33 pm on Jan 14, 2007 (gmt 0)

10+ Year Member

Instead of 10,000 pages under one domain...a more "natural" approach would be 1000 domains with 10 pages each.

While I agree with the logic behind it, there are a few things that defy it.
Some domains can still be an authority on almost everything, e.g. Wikipedia.
Speaking about Google's view: owning anything above 5-6 (especially non-related) domains is, again, unnatural.

So, a site around a business is OK, but a business around sites looks somehow unethical in Google's eyes and algos.


3:33 pm on Jan 14, 2007 (gmt 0)

10+ Year Member

Thank you for your great contribution to WW as well as to the readers.


5:40 pm on Jan 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Hey Tastatura,

I would definitely be interested in that paper if you ever find it. Might be some good clues in there.

