
Google SEO News and Discussion Forum

Natural vs. Un-natural - in SEO and the Google Algorithm
webdude
msg:3150450 - 6:27 pm on Nov 8, 2006 (gmt 0)

I have been reading threads here for a few years and once in a while, I'll have the proverbial "lightbulb" go off in my head when looking at my sites and others. So I would like to discuss...

What is natural and what is not.

I run a very small web development company that develops and hosts sites for other companies. I have hunted down and found posts from some of the Google posters (i.e. Adam, Matt, GoogleGuy, etc.) here and on other forums. Every once in a while, some terms pop up that I really try to grasp. Things like "manipulation," "natural," "genuine," "trust," "contrived," "authority" and so on. I think my brain just had one of those lightbulbs...

I consider myself fairly good at development... kind of a hack in SEO, but I still get fairly good rankings for a lot of my web sites. Not sure how, but that's the truth. But there are some things I do when it comes to development that actually may help in the SEO process. These are not things that I have conceived, but part of a "natural" process that I really never thought of before.

For example, let's look at the meaning of the word "natural." Here are just the first 8 I found...

1. existing in or formed by nature (opposed to artificial): a natural bridge.
2. based on the state of things in nature; constituted by nature: Growth is a natural process.
3. of or pertaining to nature or the universe: natural beauty.
4. of, pertaining to, or occupied with the study of natural science: conducting natural experiments.
5. in a state of nature; uncultivated, as land.
6. growing spontaneously, without being planted or tended by human hand, as vegetation.
7. having undergone little or no processing and containing no chemical additives: natural food; natural ingredients.
8. having a real or physical existence, as opposed to one that is spiritual, intellectual, fictitious, etc.

Of course, all of these have to do with nature, but I bet if you read these closely, you will see what I am getting at. All of these definitions could be reworded slightly and made to fit a "natural" progression of a web site from its infancy to maturity...

1. existing in or formed naturally (opposed to artificial): a natural web site - When I get a call from a prospective client, I make it clear that I know nothing of their business: how to run it, how to market it, how to make their widget, etc. My clients are the ones who know their product and expertise best. This is why they actually have a business. If they are selling red widgets, I need them to supply me with information on red widgets. I know development, but nothing of red widgets, so I need to depend on them for the information, copy, images, or anything else that has to do with red widgets. I don't run out on the web and find information on red widgets for copy. I don't take images from other sites to use. I don't take anything that I think I may know about red widgets into consideration at all. They are the experts and I need to rely on them and their expertise. They dictate what their information is and how that information is linked. They also tell me what they believe is the most important information, where that information should be, and how they would like their clients to find it. This is what I would call a natural website. It really had nothing to do with me; I am just able to interpret what they want on their site. Sure, I will make suggestions and sometimes even fight for a keyword in a link, but ultimately, it is their business.

2. based on the state of things on the web; constituted by websites: Growth is a natural process. - Duh... I laugh when I read posts like "I added 10,000 pages to my site and it was dropped from the SERPs!" Did this guy actually sit down for a couple of years and write all those pages of genuinely unique content? I doubt it. More likely, it was ripped off, or it is pages upon pages of slightly different database calls for multiple product descriptions. Let's face it folks, there is no need to have G index 10,000 pages of your site. My largest database is about 10,000 products and I don't care if they get indexed or not. I would rather have the user hit a category page before getting to the actual product... it gives them choice. All my sites are developed the same way... they start with the home page with a menu on the left, and as I get information, I add links to the menu. These pages are the ones that I think are important in the SERPs. Kind of like a tree growing... it starts small and branches off into other parts. These parts get bigger and could branch into other parts. It's the large "trunks" that you need to pay attention to. Sure you want visitors to view a product or some piece of information, but the first goal should be to get them there. I tend to break all products into a few main categories and go from there. It seems to work for me. And if some of these pages get indexed, I would consider it a bonus.

3. of or pertaining to sites or the web: natural beauty. - Things have to look good, folks. And information has to be readily available. A page of 10,000 links pointed everywhere is not very pleasing to the eye. This may not have anything to do with your ranking, but I bet a pleasing, well thought out site will get you more repeat business. By this I mean the ability for a user to find useful information in a fast, easy and pleasant way. I have a rule, especially for e-commerce type sites... "Four clicks and you're out." What does this mean? It means that if a user cannot find information, order a product, find a phone number or address, find directions or anything else related to the company within four clicks, you had better seriously consider a better way to arrange your site (a mechanical check for this is sketched at the end of this list). I have been doing this for years and I find it very helpful to keep in mind. Take this site, for instance... I type in "webmasterworld.com," click on "Google Search News," click on a subject, and then I post a response... simple, elegant and very easy to do. Think about it, folks.

4. of, pertaining to, or occupied with the study of website science: conducting website experiments. - This is something we, as SEO people, do all the time. You need to experiment to see what works and what doesn't. The problem I have found is that a bit of experimenting can be devastating, but you must experiment nonetheless. I have a hobby site that generates no income. It started out as an experiment to see how and why SEO works. Now the site in question happens to be something that I am very interested in. I have a forum, informational pages, lots of links to other related sites, a very comprehensive directory of related sites (very much like Yahoo), and this site actually does very well in the SERPs. That is not the point, though. The point is it started out as something I could experiment with. Granted, I now really enjoy the site and have over 5000 members in the forum, but I will still tweak it once in a while to see how changes affect ranking. The bottom line is that if this site drops out or ends up on page 50 of the SERPs, it is not that big of a deal for me. I do not end up on welfare and hop on the forum here to slam G for doing me wrong. Let's face it folks, if you want to get into the SEO business, you need to experiment. I would recommend a throw-away site just to see what works and what doesn't.

5. in a state of uniqueness; unique, as in uncultivated information. - This has been beaten to death on the forums here. YOUR CONTENT HAS TO BE UNIQUE! What exactly does this mean? Take a look at this directory structure...

Lodging/NorthAmerica/US/Minnesota/StLouisCo/Hibbing/

There are probably hundreds of thousands of sites out there for lodging. Optimizing such a site is going to be tough. Not only are you competing with those thousands upon thousands of sites, you are probably going to be competing with some pretty savvy webmasters, especially in the SEO business. The likelihood of getting any of your pages in the first ten is pretty close to zero. That's just the way it is. If you are planning on starting a lodging site like this, you are pretty much adding to the Web something it already has way too much of. Lodging in North America gets a bit easier, but that is still going to be a tough nut to crack. Even narrowing it down to the US will be difficult. Now we get to Minnesota. It starts getting a bit easier. Then to St. Louis County, easier still. We end up at Hibbing. This is going to be the easiest. Why? Well, it doesn't take a rocket scientist to figure this out. There probably are no web sites dedicated to lodging in Hibbing! You could probably develop this site and have it rank #1 for "lodging in Hibbing" within a month, if you know what you are doing. In fact, you could use this as your experimental site ;-) Uncultivated information is the key here. THAT is what it means to be unique. Not rewriting something that has been beaten to death, but writing something that has never been written about! I know this is getting tougher as time passes, but that is just the way it is. You need to be frank with your clients about this too. If they come to you and want to sell or advertise something that has been beaten to death, you need to tell them the likelihood of great rankings is pretty much nil. If they can narrow their focus to exactly what they want in the way of users and how those users are going to find them, you will have a better chance of helping them.

6. growing spontaneously, without being manipulated or forced, as in an authority site. - Okay, I am kind of stretching on this one, but this has to do with natural links. Let's go back to the guy who added 10,000 new pages. What if that guy added 10,000 new reciprocal links? I don't think that looks natural. Come on, people... a good, well written site with useful information that is unique will garner links on its own. You need to kick-start it sometimes (adding links to some popular directories), but if the content is good and unique, they will come. This is what happened to my hobby site. I was interested in the subject, found useful information, added a forum for others who were interested, made outbound links to other sites that had useful information, finally created a directory to make the outbounds easier for users and easier for me to manage, and generally created a buzz. Now I have forums on other sites linking to specific topics in my forums, links coming in to specific pages to find information, etc. The site became an authority. What is really wacko is that I am not actually an authority on the subject; it just sort of happened on its own.

7. having undergone little or no processing and containing no manipulative additives: natural sites; natural code. - I would refer here to black hat stuff. You are going to get caught! Here is a case in point. I took over a site that was being hosted on another server. When editing the site, I noticed that the old hosting company was linked at the bottom of the home page. You know what I mean... "Developed By..." I called the client, asked if I could change that link to my company, and was given permission to do so. I noticed that this link was on some other pages too, so I did the ol' "Replace All" and had my link added. I then started redoing pages and working along developing the site. For quite a while, no matter what I did, I could never get that site to rank better than the 3rd page of G. I was baffled. Unique content, very narrow subject, some unique products, but still massive problems with ranking. I had garnered some very good inbounds and outbounds but was still having problems. I was at my wit's end, and after telling the client I thought I could do well with ranking, I had egg on my face. Well, after a couple of months of this, I was using link: on Yahoo for my company's site and noticed a hundred or so inbounds from the client's site. I started clicking on these links and, what the heck? I couldn't find any reference to our company! I did a "view source" and found the link, and guess what... most of the links were on a small gif file used as the tag line at the bottom of each page, and it linked to my company's web site. Now, that certainly looked bad to me, and I am not certain that the change I immediately made was why the site FINALLY got onto page 1, but 2 weeks after removing that link on every page, the site shot up to #2. So what does this mean? Be careful! Do not hide links or text, or try to manipulate code in any way to fool G. You will get caught.

8. having a real or physical existence, as opposed to one that is an affiliate, fictitious, etc. - I have found that it is much easier to rank companies or businesses that are real brick and mortar establishments, or companies that manufacture or warehouse their products. Why? Because they are viable companies. They have an address, offices, people working for just that business, etc. Now there are a lot of sites out there that sell stuff but don't really exist in a brick and mortar sense. Their sole purpose is to rely on the SERPs for their business, and if they take a hit in their ranking, they starve. This is another point that has been beaten to death in the forums here. Folks, if you rely 100% on G for your revenue, things can and will get tough for you. I don't rely on G for my business, though it's nice to have; I still find myself networking, calling prospective clients, advertising, etc., all without using the Web at all. If my ranking tanks, I'm still in business. The same goes for all of my clients. You cannot rely on something as finicky as ranking as your sole source of revenue. I even got the boot from DMOZ once when I tried to run an affiliate site that had the same address as the company I work for. They claimed that the sideline site I had created wasn't the sole purpose of the business, so they just took it out. This was back when DMOZ played a significant role in ranking in Google. So, if you are a webmaster who has 50 sites that sell various products, all with the same business address and/or phone number, you may want to rethink your business plan. DMOZ looks to see if you have a contact page with your address and phone number; I bet it would be easy for G to do the same. Or do you have a contact page at all? Maybe you don't want people to find you... hmm?
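
Following up on point 3 above: the "four clicks" rule is easy to check mechanically. Here is a minimal sketch in Python - a breadth-first search over a site's internal link graph that reports each page's click depth from the home page. The site_links data is a hypothetical stand-in; in practice you would build the graph from a crawl of your own site.

from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
site_links = {
    "/": ["/products/", "/contact/", "/about/"],
    "/products/": ["/products/red-widgets/", "/products/blue-widgets/"],
    "/products/red-widgets/": ["/products/red-widgets/small/"],
    "/products/red-widgets/small/": ["/products/red-widgets/small/sku-123/"],
}

def click_depths(links, start="/"):
    """Return the minimum number of clicks from start to each reachable page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit is the shortest path, since this is BFS
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

for page, depth in sorted(click_depths(site_links).items(), key=lambda kv: kv[1]):
    print(f"{depth} click(s): {page}" + ("  <- fails the four-click rule" if depth > 4 else ""))

Any page that reports more than four clicks is a candidate for moving up the menu tree.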

Anyway, sorry this got so long-winded. I just wanted to contribute some very basic observations from reading posts and how I find things. Some of the posts here are laughable. It always seems to be a game... how many times can I use a keyword before I trip a filter, how many pages can I add a day, how many links can I add a day, how many links can I have on a page, etc., etc., etc. Don't you see? Google creates the algo to try to find what is natural, and we try to find ways to create what appears natural. Why not just create websites for the user? I think you will find everything else will fall into place.

Okay, I tackled the word "natural." Anyone want to try the others?

"manipulation," "geniune," "trust," "contrived," "authority"

Comments are welcome but remember, I am just a hack in SEO...

[edited by: tedster at 3:55 am (utc) on April 17, 2008]
[edit reason] spelling fixes - member requested [/edit]

 

Digimon
msg:3151264 - 12:58 pm on Nov 9, 2006 (gmt 0)

I do agree with most of what you say. Good post.

However, bear in mind that some people here are not developers but SEO professionals who need to get the most in the shortest amount of time. That means finding the limits and pushing things as much as they can. What makes WebmasterWorld great is that you can find pretty much every approach you could think of.

tedster
msg:3151962 - 12:47 am on Nov 10, 2006 (gmt 0)

That is an excellent thread starter, webdude - thanks. I'll offer my vision of what "natural" versus "un-natural" might mean to Google in the area of websites. This is educated conjecture. I've never been to the Plex, never looked over a Googler's shoulder and had a peek at their special tools. But I do know such tools exist, and I have been in the room when their power was made clear.

REAL DATA
Google has been keeping historical records for quite a while now -- records of how fast content appears, changes and vanishes over different types of web properties. They also keep historical records of backlinks -- their rate of appearance, change, and disappearance in various types of scenarios.

Google has been studying all this quite directly, en masse and "in the wild" rather than just generating ivory-tower abstractions. By now Google has an immense data set.

FOOTPRINTS and PROFILING
Google has established statistically significant footprints for what is natural and un-natural in the areas of appearance, change and disappearance for content, links, and who knows what else. And they can generate such footprints for various "types" of sites and market areas.

With that pile of data, Google can generate very sophisticated web-maps and visual representations of their data -- and confine those maps to various slices of the whole. They can build extremely accurate link profiles and then see visually what the mean distribution really looks like -- with regards to rate of link acquisition, the ratio of deep linking to domain root linking, the differences comparing branded corporate sites to free hosted pages -- on and on.

THE BIG VIEW and CLOSE-UP
Google can zoom out for an overview of the entire web, or zoom in to look at just e-commerce sites, or just sites without affiliate links. They can profile one single domain and overlay its footprint with the mean profile or footprint they've collected for similar sites or the web as a whole.

They can designate certain hot spots such as "manipulative linking nodes" and display them in red on a link map. Once the available data set grows to a certain level, amazing and apparently magical learnings become simple.

TECHNICALLY PRECISE MEANINGS
In a situation like this, words like "natural" or "manipulative" can take on very precise and rigorous technical meaning. And deviation from the normal footprint can be measured algorithmically and have automated consequences.

Statistically significant deviation can also raise a flag for a human to visually inspect the webmap and associated footprints. When major deviations are spotted, they will not commonly be "false positives" -- although with statistics, anything is still possible in a single isolated case.
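
To make "measured algorithmically" a little more concrete, here is a minimal sketch of one such deviation test - a plain z-score against a population profile. All of the numbers are hypothetical, and a real system would use far richer features than a single weekly link count:

import statistics

# Hypothetical "mean profile": weekly counts of newly discovered
# backlinks across a population of comparable sites.
population_weekly_links = [3, 5, 2, 4, 6, 3, 5, 4, 2, 6, 4, 3]
site_weekly_links = 480  # this week's new links for the site under inspection

mean = statistics.mean(population_weekly_links)
stdev = statistics.stdev(population_weekly_links)

# z-score: how many standard deviations this site sits from the population mean.
z = (site_weekly_links - mean) / stdev

THRESHOLD = 3.0  # arbitrary cut-off for "statistically significant deviation"
if abs(z) > THRESHOLD:
    print(f"z = {z:.1f}: far outside the natural footprint; flag for review")
else:
    print(f"z = {z:.1f}: within the normal range")

A single outlier score like this proves nothing on its own, which is presumably why human inspection follows the automated flag.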

GROWING SOPHISTICATION
A lot of the oddities that we see recently in search results are improvements to the sophistication of this kind of data modeling. Big Daddy gave Google a lot more elbow room to crunch many more numbers, keep more records, and so on.

Google has evolved a long way from the crude text matching that characterised early attempts at web search. When we work to understand what is happening with the SERPs today, we should appreciate the near-magic that has been created at Google and not be too primitive in our assumptions about what is, or soon will be, possible on their back end.

And the beauty of such an approach - it scales, and it IMPROVES as scale increases.

CAVEAT
And as I said, what I wrote here contains a lot of educated guesswork, fueled both by studying Google patents and by close listening to what Googlers say and how they choose their words. I don't "know" that this is all true, but I'd be very surprised if it isn't.

[edited by: tedster at 5:15 pm (utc) on Dec. 28, 2006]

g1smd
msg:3152000 - 1:23 am on Nov 10, 2006 (gmt 0)

>> pushing things as much as they can <<

But now, Google is pushing back, and in multiple different ways: and in different ways for different types/sizes/ages/etc of site.

nippi
msg:3152037 - 1:59 am on Nov 10, 2006 (gmt 0)

I'm always curious about the "you added too much content, too fast" argument. I have two new clients about to do just that:

(1) A parenting magazine with 10 years of monthly issues that decided to put its whole content on a site, including all the worthwhile articles it did NOT print due to magazine size constraints.

20,000 articles. As far as the net goes, it's all fresh, fat content.

(2) Another client has a product database going online for the first time that is 70,000 products strong (jewelry, gemstones... so many types and models).

There are many occasions where a smallish offline business that goes online immediately has a massive site.

It seems unreasonable to penalise simply because they went online with it all at once, rather than adding it at the same speed as if the content were being created now, when in fact it was created historically but never web-published.

jtara
msg:3152063 - 2:28 am on Nov 10, 2006 (gmt 0)

Here's a testimony to the power of natural links...

I have a domain name that once had a stock discussion site. I abandoned the site a few years ago. (Before keyword-based text advertising. Banners just weren't bringing in the bucks.) Got to be too much trouble. I kept the domain name, though.

The empty web site consists of a single line containing the domain name.

It is a 9-letter .com, consisting of two keywords - a financial term and a common word for a kind of group.

It is the first listing in Google for those keywords.

Google just can't bear to drop a site registered in 1996 with 100's of links that pre-date Google. ;)

It's amazing how those links persist. I think the site has been dead now for 5 years, yet the links are still there. Maybe I should give it another go, but I don't think the world needs another stock discussion site. ;)

I sold another site that was #4 for the keyword "San Diego". The only things above it were the city's official web site, chamber of commerce, etc. (The Zoo was just below my site...) Of course, all the sites above it linked to my site, which helps.

I never lifted a finger to get links. They just accumulated on their own over several years.

tedster
msg:3152077 - 2:39 am on Nov 10, 2006 (gmt 0)

It seems unreasonable to penalise simply because they went online with it all at once

Agreed, and I don't think that will happen in the cases you described. Again, it's about footprints, and the situation you described will have a very different footprint than a site that decides to autogenerate a huge pile of new pages by slicing and dicing a database in new and "creative" ways, or with less-than-original content.

I would appreciate a follow-up report on those situations a few weeks after launch. I have a few clients in somewhat parallel situations, and some others who were planning a 400-fold increase in their urls through various database manipulations.

Even though the 400-fold increase was planned for visitor convenience, I do have a lot of caution around exposing 80 million new urls to Google all at once and I have advised some protective measures. Those urls do not even all NEED to be in the index, in my view, and trying to get them in is just asking for trouble.

shri
msg:3152104 - 3:26 am on Nov 10, 2006 (gmt 0)

One thing we tend to ignore is that a fair bit of the research at Google (and most other search engines) is driven by academics, and the algorithms are based largely on the citation model.

Sometimes it helps to step back from obvious SEO manipulations and step into the shoes of someone writing a research paper... what do they cite and what do they avoid citing?

Academics, after all, invented the whole publish-or-perish thing. Plenty of what Google has to deal with - copied papers, bad quality, etc. - stems from issues that academics both create and have to judge on a daily basis.

Hopefully it brings a focus on quality, expert vs. authority, etc.

hellraiser1
msg:3152202 - 6:32 am on Nov 10, 2006 (gmt 0)

"It seems unreasonable to penalise simply becuase they went online with it all at once"

completely agree --- actually this is an interesting topic for me. I have an ecommerce site and what it really is is a database of products and their related categories. there are sub categories to those categories. I have handmade widgets, red widgets, mass produced widgets, etc. why shouldnt i have a page for "handmade red widgets" as well as a page for each handmade red widget i sell --- (small handmade red widget, large handmade red widget.) the more i think about it, the more i sub catogorize and add product pages, the more focused parts of my site become, and when someone looks for "handmade red widget" theres a page with that keyword in the title. Most of my traffic comes from google, and because of my many many pages, my landing pages are well distributed. about 1% search for my front page keywords, they are too broad, instead 90% land on my sub category and product pages, as the keywords become more specific. In fact, if one pages keywords falls, another page creeps up, thus traffic is steady and only grows as i add more products and categories.

google has created numnrous APIs, sitemap and GOOGLE BASE FEEDS thus encourging the automation and bulk processing of data. If i have 10,000 products, i want google to recieve ALL 10,000 of them immediately, as each product should be searchable. In fact, if i dedicate a static page for each product, then the searcher will get to what theyre looking for quicker.

I think the problem with too much content is too much duplicate content, versus good content. like people who recreate the SMC catalog and assume they will be successfull.

anyway, thats my rant

webdoctor
msg:3152214 - 6:51 am on Nov 10, 2006 (gmt 0)

In fact, if I dedicate a static page to each product, then the searcher will get to what they're looking for quicker.

This is fine as long as we're talking about unique content that your visitors are after.

If I'm looking for a "handmade red widget" and I find your page www.example.com/widgets/handmade/red.html and it's got all the information I need, then good for you - I'll buy your product.

On the other hand, there are countless sites out there that are taking someone else's "feed" and generating pages this way - often without any unique content at all - and from the visitor's point of view, this stinks.

I was using Google and Froogle yesterday, looking for a 'small widget'. It seems that there's only one supplier of 'small widgets' where I am - but guess what, they run an affiliate program :-) So there were hundreds of matches. Sadly they were all for the same product. With the same price... and identical content on the product pages.

How is this a good experience for the visitor?

I gave up searching for my 'small widget' online and decided to try a local bricks-and-mortar store instead.

beanfortez
msg:3152248 - 8:00 am on Nov 10, 2006 (gmt 0)

Excellent post. Content and relevant links are what will carry the site in the future. Instead of generating garbage, you need to generate actual, factual data. I wouldn't be surprised if Google one day checks the data on a site to see if it is actually true. Sounds outlandish, maybe, but we already have robots that vacuum.

I would call it the truth filter. It would filter out the bullshiiiters. lol...

davidof
msg:3152934 - 7:30 pm on Nov 10, 2006 (gmt 0)

> Google has been keeping historical records for quite a while now -- records of how fast content appears,

and this is probably one reason why Live.com is having so much trouble trying to play catch-up, coupled with the fact that they don't seem to stick with any one strategy for long, so they probably don't build up a good picture of what constitutes a good web site.

Oliver Henniges
msg:3153101 - 10:12 pm on Nov 10, 2006 (gmt 0)

Excellent post. Let me add: "natural" comes from the Latin "natura," which in turn comes from the verb stem "nasci" (to be born), so natura is "that which will be born." Maybe as a consequence, some of these things are also bound to die. Has anyone yet analysed the dying of parts of a website as a possible ranking factor drawn from historical data?

> Kind of like a tree growing

Yes. Always. Aristotle was the one who invented this prototype in his description of nature.

In a more modern terminology I would say "fractal."

Is there any standard shopping-cart system able to account for such a fractal structure of product groups, with varying if not infinite depths of the tree? Is this the reason for the success of the self-made systems of Amazon and others?

And "inhomogeneous."
Nature often is so ugly and asymmetric.
In contrast to databases.

jd01
msg:3153115 - 10:44 pm on Nov 10, 2006 (gmt 0)

WOW --- Early 'thread of the year' nomination for this one. Thanks.

Justin

g1smd
msg:3153164 - 11:47 pm on Nov 10, 2006 (gmt 0)

I still think that a large part of the algorithm simply asks of a URL: "what type of spam are you trying very hard not to look like today?"

webdude
msg:3155124 - 3:06 pm on Nov 13, 2006 (gmt 0)

tedster,

I found your reply very enlightening.

I think "Historical" is a key ingredient in the mix. In fact, if you combine "historical" and "natural," that is probably the best winning combination. I have several sites that I have taken over of companies that have been in business for many years. Their websites were usually written by the boss's son or the secretary and these people really knew nothing of SEO and in many cases the code was poorly written, lots of huge images... you know the type of site I am talking about ;-)

What I found, though, even though I would have developed these sites totally differently from a developer's point of view, is that they still did fairly well in the SERPs. Granted, they weren't #1, but for some phrases they were popping up in the top 30. I would consider that pretty good for sites without a single incoming link, maybe a handful of outgoing links, and the same title on every page.

So what placed these sites where they are? They had a history and they were built naturally. G knew of their history and perceived the growth of the site as natural. In fact, I love these types of sites. It's amazing what you can do for them.

A case in point...

I was approached by a company that sold and serviced widgets in a large metropolitan area. Their site consisted of 6 pages - a home page, a contact page, a list of their suppliers with outgoing links, a page that talked about their great service, a page of photos of their showroom, and a page with a map showing how to get to their brick and mortar store. The business was not large, probably around 30 employees, and the site was put together by the daughter of the owner. It had been up for over 8 years and had changed very little in that time. The title of each page was just the company name along with the city and state, and the description was the same on each. Even though the site had zero optimization, it still ranked around #35 for various key phrases along with the state the site was located in. The business was very local and only sold, installed and serviced within this metropolitan area. I was asked if I could improve their ranking along with doing a new design. The only page indexed by G at the time was the home page.

I redesigned the site from scratch. I broke down their widgets into 8 main categories and created slideshows for each, with outgoing links to their suppliers on each of these pages. I tweaked the titles and descriptions on each page to make them unique. I created a reciprocal links page and an administrative front end so the company could add links, sort them, etc. I then asked them to contact their suppliers and ask for links back to the site. I added the site to a couple of dozen directories, usually in the regional sections, because that was their target audience. The incoming links took the suppliers about a month to add.

The site now ranks #1 for a very wide range of phrases along with locality. In other words, searching for widgets in (city or state), they are popping up #1. The widgets are something everyone has in their home so they are very pleased with the results. Even breaking down the categories ranks them #1... Thingamajings in (city or state), Doodads in (city or state), Thingamabobs in (city or state). Even the reverse is true... (state) widgets, (city) widgets, (state) thingamajings, (city) thingamajings, etc, etc. etc.

Now, what I did was not very hard to do. I just followed some of the suggestions of some of the fine folks here at webmasterworld. What this site had going for it was that it was a real brick and mortar business, it was built naturally and it had a history, albeit a simple one.

This was the reason for starting this thread. Are we getting bogged down in the minutiae of the algo? Could be...

zpbs1914
msg:3155187 - 4:15 pm on Nov 13, 2006 (gmt 0)

webdude, yours is a very pertinent and useful example of exactly what you describe in your OP. That said, I wonder if the example was a little simplistic?

You describe a situation with a lot of "low hanging fruit". A well established company with a quality product and unique content for the web. Additionally, you were able to procure great link relationships with what were assuredly very popular sites. Finally, you took great care in the method you used for presenting this unique content. Facing facts, the surprise in that situation would be if the site DIDN'T do well in a reasonable amount of time.

Conversely, what happens when you are working on a PR 7 site (I know, PR means little, but it's a good indicator of a site's status) that has been around for 10+ years, and you are trying to increase your rankings from 13-14 to 1-2? In that case, the details are all you have to focus on.

webdude
msg:3155259 - 5:21 pm on Nov 13, 2006 (gmt 0)

zpbs1914,

I agree that the illustration was simplistic. That was the point of the post - in fact, the point of this whole thread. Your competitors, how savvy they are, and what is currently out there for any given subject play a vital role in ranking. Please refer to 5. - in a state of uniqueness; unique, as in uncultivated information. When I say uncultivated, that is exactly what I mean: something that hasn't been written about. The site I gave as an example ranks nowhere if you are looking for just that widget. The fact that I was able to narrow it down to a widget within a specific geographic location is the point.

Look, I am not espousing any type of expertise in any of this. I am just taking real-world examples from some of the stuff I look at. Ranking for snickerdoodles will come much easier than ranking for travel. That's just the way it is. I am an SEO hack and these are just very simple interpretations of what I see.

circusboy
msg:3155324 - 6:29 pm on Nov 13, 2006 (gmt 0)

Great thread, Webdude. Thanks.

This brings up a case which has been keeping me up at night...

We have a client that we have re-built a site for, and they offer widgets around the country - all kinds of widgets, and they have an article/landing page about each type of widget.

We developed a CMS that will take each article (e.g. the article about 'large widgets') and optimize it for hundreds of cities, thus creating hundreds of localized pages about that one product, i.e. '/large-widgets-city1.htm', '/large-widgets-city2.htm', '/large-widgets-city3.htm', etc...

The client currently has 200 pages about 200 widget types (ranking well for many), and we have 1500 cities in the system (the client offers all these widgets in all these cities) - that means an increase from 200 pages to 300,000 pages. The site is 5 years old, has well-established links (and growing), PR5, and has unique, valuable, relevant content for localized searches. BUT, they also have AdSense and some affiliate ads on these landing pages in addition to their own calls to action. :)

In your opinion (to all) - will (or should) the site be penalized?

Is it 'natural' to choose to optimize for your entire market base, when you haven't in the past? Is there a more 'natural' way to do this?

jtara
msg:3155368 - 7:06 pm on Nov 13, 2006 (gmt 0)

The client currently has 200 pages about 200 widget types (ranking well for many), and we have 1500 cities in the system (the client offers all these widgets in all these cities) - that means an increase from 200 pages to 300,000 pages.

What does this do for the company's customers or potential customers?

That's the question they should be asking, as it is fundamentally the one that Google asks.

Are the widgets any different depending on what city they are sold in or to?

econman
msg:3155392 - 7:24 pm on Nov 13, 2006 (gmt 0)

An increase from 200 pages to 300,000 pages is certainly going to be noticed by Google. They are also likely to notice the fact that a lot of those pages have a strong amount of overlap with each other. Google might classify this as a carpet rather than a lawn. (Manufactured, rather than natural).

circusboy
msg:3155455 - 8:13 pm on Nov 13, 2006 (gmt 0)

What does this do for the company's customers or potential customers?

Our keyword research revealed that the vast majority of searches for what the client offers are localized, i.e. 'large widgets phoenix', 'large widgets tampa', 'large widgets chicago', etc... (this is mainly because their widgets are local services, not products - think "plumbers"), and as mentioned, the client offers all these services in all these cities. So to answer the question above: it exposes the client to searchers and offers the client's valuable services where the searcher may not have known about the client previously - because the client's business is completely web based. In fact, a searcher who uses a localized phrase would never find the client without these localized pages. The client's off-line marketing takes this same approach, focusing ads on specific services in all cities (yellow pages, etc...).

How else can our client capture searches for the 200 various services they offer in all these 1500 cities? Any suggestions? - although a little late! :) Does the client not have the right to fine-grain focus their pages without being penalized?

I should note, duplication was a huge concern for us, so we built a certain amount of randomization into the pages so they wouldn't all be the same, except for the city names. Also, we added the feature to allow comments/questions on these articles as well as rating them, to help generate user content on these pages.
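
For concreteness, that mechanism - one article template expanded across a city list, with a randomized text block chosen per page - can be sketched in a few lines of Python. Everything below is a hypothetical illustration, not the actual CMS:

import random

# Hypothetical inputs: services, cities, and interchangeable intro
# paragraphs used to vary the boilerplate from page to page.
services = ["large-widgets", "small-widgets"]
cities = ["tampa", "phoenix", "chicago"]
intro_variants = [
    "Finding a reliable {service} provider in {city} can be difficult.",
    "Residents of {city} have several options for {service} services.",
    "Here is what to look for when hiring {service} help in {city}.",
]

random.seed(42)  # deterministic, so regenerating the site doesn't reshuffle pages

def build_page(service, city):
    """Render one localized landing page as a (url, html) pair."""
    nice = service.replace("-", " ").title()
    url = f"/{service}-{city}.htm"
    intro = random.choice(intro_variants).format(service=nice.lower(), city=city.title())
    html = (f"<title>{nice} in {city.title()}</title>\n"
            f"<h1>{nice} in {city.title()}</h1>\n"
            f"<p>{intro}</p>")
    return url, html

pages = [build_page(s, c) for s in services for c in cities]
print(f"{len(pages)} pages generated, e.g. {pages[0][0]}")

Whether a few shuffled paragraphs are enough to make 300,000 near-identical pages look "natural" is exactly the question under debate here.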

webdude
msg:3155474 - 8:27 pm on Nov 13, 2006 (gmt 0)

jtara,

Excellent point. The widgets are the same. In fact, if 1000's of sites sell the widget, the widget is still the same. However, if you are the only company selling that widget in Timbuktu, and you have your site optimized for that, searching for Timbuktu widgets will certainly place you near the top. I would think the opposite would be true also... If you are the only seller of the widget, then optimizing for the world would place you at the top. Kind of a conundrum. This is why I believe a brick and mortar business has an advantage. Hence - 8. having a real or physical existence, as opposed to one that is an affiliate, fictitious, etc.

webdude
msg:3155503 - 8:50 pm on Nov 13, 2006 (gmt 0)

circusboy,

How else can our client capture searches for the 200 various services they offer in all these 1500 cities? Any suggestions? - although a little late! :) Does the client not have the right to fine-grain focus their pages without being penalized?

I think the problem is this is not fine-grained optimization - this is large-boulder optimization. You are trying to optimize for many cities. Tell me, is this service something only your client does? If so, you should have no problem. If the only plumber in the world lived in California, well, I guess I would have to call him to fix the toilet.

I should note, duplication was a huge concern for us, so we built a certain amount of randomization into the pages so they wouldn't all be the same, except for the city names. Also, we added the feature to allow comments/questions on these articles as well as rating them, to help generate user content on these pages.

mmm. 2. based on the state of things on the web; constituted by websites: Growth is a natural process -

joeduck
msg:3155563 - 9:36 pm on Nov 13, 2006 (gmt 0)

"what type of spam are you trying very hard not to look like today?"

Good discussion here, and I think shri's and g1smd's are key points - it seems to me the emphasis shifted around 2004 from giving many sites the benefit of the doubt to applying rules that penalized a lot of OK sites, in an effort to kill off the swelling numbers of junk sites.

circusboy
msg:3155566 - 9:39 pm on Nov 13, 2006 (gmt 0)

Webdude;

is this service something only your client does?

Without letting the cat out of the bag, let's say the client is a 'national online plumber's referral service', where local plumbers sign up to be listed locally in the client's national online database.

So, unless my client optimizes a page for 'plumbers in Tampa', they have no chance of being found. By NOT having - or by penalizing - this page, it is a disservice to:

a) web searchers doing LOCALIZED searches for these services (which are proven to be the vast majority - makes sense), because they won't see this full list of licensed plumbers in Tampa - user value
b) the various plumbers in Tampa who signed up to be listed
c) my client, who has a great business model, providing a referral service for busy plumbers who aren't marketers

Now, if a searcher just searches 'plumbers' we're cool, and rank well, but the VAST MAJORITY (90%) of searches are LOCALIZED long-tail searches.

So, is it not natural to grow pages for markets that you serve? And, just because (through research) you just realized this now, you should be penalized for handling it all at once?

Is it not natural for a caterpillar to realize it's time to grow, cocoon itself, and then re-emerge as something completely new and much larger? Should the new butterfly get penalized because the bird grows more gradually?

I totally see your point(s) - which is why I've been losing sleep. We're taking a chance here, and we're banking that Google has enough sense to know that 'natural' comes in many forms. If not, there's always the reinclusion request - LOL!

RonnieG
msg:3155611 - 10:18 pm on Nov 13, 2006 (gmt 0)

What econman said, except:

G appears to actually be rewarding large eCom sites that do this with better PR and SERP positions, because of their sheer size and the artificial appearance of a national and/or global presence. Therefore, sites that resort to these unnatural methods to clone and create hundreds of thousands of locale-specific pages will continue to do so and thrive, and will continue to suck up G's index and database capacity, to the detriment of truly local businesses that have a real brick & mortar presence in those communities and sell the same products or services.

This is happening in many industries. Big-bucks national aggregators are artificially creating locale-specific pages for certain services, even though they have no real presence there, and even though they do not actually offer or provide those services themselves. They then use their prominence on G and other SEs to monopolize consumer online inquiries for those services in all those markets, only to turn around and sell the resulting inquiries as "leads" to the truly local service providers and dealers. The effect is usually a higher cost to the consumer, not a savings, because the true service providers have to mark up their charges to cover the cost of the leads (in some industries, a substantial percentage of sales).

In some industries and services, this may make good business sense, since the locals may not have the ability or desire to build their own web sites. In others, it is not good business, and detracts from good, local providers who do go to the effort of developing their own web presence. So it can go either way.

[edited by: RonnieG at 10:23 pm (utc) on Nov. 13, 2006]

circusboy
msg:3155684 - 11:38 pm on Nov 13, 2006 (gmt 0)

to the detriment of truly local businesses that have a real brick & mortar presence in those communities and sell the same products or services

This is a service that MANY local small businesses have joined or can join, nationally (almost like an online co-op), precisely because they do not have the time or capability to build an online presence. This service PUTS THEM ONLINE, easily, quickly, and inexpensively - how is this a detriment?

I think G embraces these "national aggregators" because G has the data showing that everyone is happy. Here's the scenario: user does a search, clicks the top-ranking 'national' site, lands on the page, sees a list of what they're looking for with reviews and testimonials, stays, and converts. VS: user does a search, clicks the top-ranking local site, isn't happy, wants to compare, goes back to G, tries another result, takes down a number, wants to compare more, goes back to G, tries another search, checks out a few other local sites, etc... it's a waste of time and energy.

The effect is usually a higher cost to the consumer, not a savings, because the true service providers have to mark up their charges to cover the cost of the leads (in some industries, a substantial percentage of sales).

(sorry to get off-topic everyone)

RonnieG, I'm glad you qualified that with the statement ending in "So it can go either way," because, if you had access to the data I have, you'd learn that it is completely untrue. Costs actually go down for the plumber, because the cost per lead acquisition is a fraction of the cost and energy of traditional marketing - in some cases it's even free. And if a 'truly local' plumber has to raise prices because marketing costs are going up..., is that plumber making wise marketing decisions, when there are less expensive alternatives that obviously work?

"what type of spam are you trying very hard not to look like today?"

If a search is done for 'plumbers in Tampa'... how can a page - with an article about hiring plumbers and a LIST of licensed plumbers in Tampa - be considered spam? ...even IF that page was created along with 1,499 other pages carrying the same article about hiring plumbers, but focused on other locations. None of them are spam, but the process of creating them might be? I don't get it.

I think the broader issue is 'what is natural?' (vs. unnatural), and that this has to be interpreted many ways by Google to accommodate the many ways in which 'natural' can and does occur.

It is natural for an established online business to recognize new marketing avenues and build a new site that exploits them. It is unnatural for Jimmy Teenager in his basement to build the exact same site, just because he recognizes the same market.

'Natural', for purposes here, is subjectively interpreted and applied by G based on observation of statistics that we have no clue about. So, in light of that, I'll tell you in a few weeks if we get banned or penalized. We launch tomorrow. Wish us luck. :)

jtara
msg:3155722 - 12:14 am on Nov 14, 2006 (gmt 0)

As a consumer, I don't want to see ranking, directory, or comparison sites. My experience with them has been nothing but negative.

Google should offer consumers the ability to permanently opt out of seeing these kinds of results.

In the meantime, if you feel the same way, many browser search extensions permit a string to be added to the end of every search you do. This will at least help take out the worst of the trash. When Google sees these on a significant number of searches, perhaps they will wake up.

Here's mine:

-inurl:(dealtime|nextag|bizrate|epinions)

But that only solves half the problem. Unfortunately, this only filters natural results, not ads. And it doesn't solve the problem that Google is pricing local merchants out of the advertising market.

[edited by: tedster at 5:12 pm (utc) on Dec. 28, 2006]
[edit reason] turn off smile graphic [/edit]

Pirates
msg:3155772 - 1:29 am on Nov 14, 2006 (gmt 0)

I can always roughly read an algo. This algo, in my opinion, emphasises inbound links from so-called relevant sites. So with this algo, in my opinion, if your competitor has registered loads of domains with their keyword as part of the domain name and linked them to the target site, they will win. So everyone out there, go create a bull#*$! domain containing the keyword in the URL and link it to your main site if you wanna succeed on Google with this algo. Of course I don't mean this, as I consider it spam....... shame Google doesn't.
