Forum Moderators: open


Cookie Cutter approach


incywincy

4:17 pm on May 9, 2003 (gmt 0)

10+ Year Member



Hi,

A competitor of mine has a directory of widgets organized by geographic location. Rather than open up the database to Google, they have created thousands of cookie-cutter templates, each varying only by town name. So each page has an optimal number of words and links containing 'widget townxyz' that point to their home page. The pages themselves contain no original content.

In one case I searched for 'widgets townabc' and their home page was returned at #1. I looked at Google's cache and saw 'These terms only appear in links pointing to this page'; I guess this is due to the many internal links.

To be fair, their database does cover the geographic areas that their doorway pages are promoting, but does protecting the database justify spamming?

Is this technique safe to use?

heini

4:44 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A very common technique; you'll find it in the travel area all the time. Understandable from both sides: they do cover all those areas/towns, so they want to be ranked for a query on that.
OTOH this tends to clutter up the SERPs, as well as make it hard to find specific local services.

Justified? Not my problem. If a SE finds this is against its interests, it should improve its algo.

Safe? Looks like a bit of a gamble to me. That said, technically it's probably not spam at all: all legit pages, all linked within the site, no hidden stuff, I assume. At least if they are clever.

Yidaki

4:45 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is the content they create using templates the same as what the database offers / the db queries return?

>the pages themselves contain no original content.
>to be fair their database does cover the geographic areas that their doorway pages are promoting but does database protection justify spamming?

Sorry, I can't see your point. If the database covers what they are promoting, what's the problem? What do you mean by "protecting"? Using templates to generate static content out of a db is the same as "opening the database" for queries, and perfectly legit if the content isn't just artificial.

Marcia

5:08 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If the pages and content are relevant to what people are searching for I don't see what the problem is.

>database

It makes more sense than sitting and hand-coding each page in Notepad.

incywincy

5:08 pm on May 9, 2003 (gmt 0)

10+ Year Member



Yidaki,

Their pages are like this:

'our site is designed to give you information about <link to homepage>townxyz blah blah if you want to know about widgets in <link to homepage>townxyz we have everything you need etc etc'

This page doesn't actually contain widget information; it's just a doorway into their homepage and their search facility. Do you see what I mean about it not being original content?

These pages are replicated thousands of times, with the townxyz changed, to rank for widget-anytown. They also do this for widgeta, widgetb, etc.

When you actually use their internal search facility, you can get at the widget information.

martinibuster

5:10 pm on May 9, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



>each varying only by town name.

Unless I'm misinterpreting this, you're talking about a duplicate content penalty.

From the Google Webmaster Guidelines [google.com]: "Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches... with little or no original content."

Some people complain that not enough of it is caught.

[edited by: martinibuster at 5:11 pm (utc) on May 9, 2003]

incywincy

5:10 pm on May 9, 2003 (gmt 0)

10+ Year Member



marcia,

They could generate dynamic pages with the real content on them and trap non-spider IPs if they try to leech the database. Then Google would see what visitors ultimately see: the real site content, not a doorway to it.
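To illustrate the "trap non-spider IPs" idea, here is a minimal sketch (not the competitor's actual setup; the spider list, threshold, and class name are all assumptions): serve the real content to everyone, but throttle any non-spider IP that pulls too many pages per minute.

```python
import time
from collections import defaultdict

KNOWN_SPIDERS = ("googlebot",)   # assumption: identify crawlers by user-agent
MAX_PAGES_PER_WINDOW = 50        # assumption: a human browses fewer pages than this
WINDOW_SECONDS = 60

class LeechTrap:
    def __init__(self):
        self.hits = defaultdict(list)  # ip -> timestamps of recent requests

    def allow(self, ip, user_agent, now=None):
        """Return True if this request should get the full dynamic page."""
        now = time.time() if now is None else now
        if any(s in user_agent.lower() for s in KNOWN_SPIDERS):
            return True  # never throttle the spider we want to index the content
        recent = [t for t in self.hits[ip] if now - t < WINDOW_SECONDS]
        recent.append(now)
        self.hits[ip] = recent
        return len(recent) <= MAX_PAGES_PER_WINDOW
```

A real deployment would also verify spider identity (reverse DNS) rather than trust the user-agent string, which is trivially forged.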

Yidaki

5:13 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>they could generate dynamic pages with the real content on it ...

Unfortunately they're your competitors, so you can't give them this great advice, can you!? ;)

Marcia

5:13 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>just a doorway into their homepage and their search facility

If they're useless doorways that specifically violate guidelines and provide no useful information for searchers, fill out a report and let the search quality people have a look.

If the pages themselves actually provide searchers with what they're looking for, but are contrary to how we think they should be done, all we can do is just not do it that way ourselves.

This is really all we can do that's productive:
[google.com...]

Yidaki

5:16 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>fill out a report and let the search quality people have a look.

But don't forget: if they change their tactics and create dynamic pages with their real db content, it can get even harder for you to beat them. Sometimes it's wise to accept / ignore spam. ;)

incywincy

6:07 pm on May 9, 2003 (gmt 0)

10+ Year Member



You're right, Yidaki; I need to learn to live with the competition. Google will eventually detect duplicate pages, I guess. I suppose I'm fascinated by people's perception of spam.

allanp73

6:37 pm on May 9, 2003 (gmt 0)

10+ Year Member



I do something similar to the "town widgets" approach. I start off with templated pages that vary only by keyword and look (different HTML code), because the template has information that is relevant to any town. Then to the template I add specific information related to each town. It is a time-consuming process. My fear is that people who only create the template but not the follow-up will cause problems for me. I don't want Google to think I am spamming because some of these templated pages are up and awaiting their unique content.
So far there hasn't been a problem, but it is a concern for me. What do others think of this approach?

~~edited for spelling~~
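The template-plus-unique-content approach can be sketched like this (a minimal illustration, not the poster's actual code; the template, town names, and function name are invented). The key point is the guard: a page is only generated once its unique per-town block exists.

```python
# Shared template: generic content valid for any town, plus a per-town slot.
TEMPLATE = """<h1>Widgets in {town}</h1>
<p>General widget advice that applies to any town.</p>
{unique}
"""

# Hypothetical data: the per-town "follow-up" content described above.
TOWNS = {
    "Hull": "<p>Local widget dealers near the marina; emergency numbers...</p>",
    "Halifax": "<p>Widget repair shops in the town centre...</p>",
}

def render_pages(towns):
    """Return {town: html}, skipping towns whose unique content is missing."""
    return {
        town: TEMPLATE.format(town=town, unique=unique)
        for town, unique in towns.items()
        if unique.strip()  # don't publish a page that is template-only
    }
```

Holding back template-only pages is exactly the safeguard against the "awaiting their unique content" worry above.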

onionrep

8:48 pm on May 9, 2003 (gmt 0)



I really don't see what is wrong with creating pages whose sole purpose is to catch specific queries from search engines. If pages have a similar structure and are calibrated towards different sets of kw's/kp's, then I fail to see what the problem is.

It's hardly fair to class pages that rank well and provide users with the information they require as spam or cookie cutters. Pages optimised for single generic products seldom rank highly and are often found as a result of a click from some other umbrella page that focuses on a broader range of terms.

An area may have a number of particular divisions, each division a number of subdivisions, which in turn have a further number of subsections that lead to various different types of similar product lines.

If we consider that a user or visitor may try to find a particular product via a variety of methods and terms, then it is easier to appreciate the need for casting the net as wide as possible.

He may, for example, look for a "generic product in Area" or a "generic product in division"; he may even look for a "generic product in subsection", or use a combination of area, division or subsection within his query. In many cases he won't even know the name of the particular product that he will eventually go on to purchase, simply because there are so many different types that he has yet to discover.

Hotel Tropicana (the useful product) is one of *many* beautiful hotels (generic products) on some island in a place called "Lovely"; Lovely is located within a region named "Fabulous" in the country named "Wonderful". It offers a variety of hotels (generic products) that the querying visitor may require. In order to drive the visitor to these products, or choices, it is vital to consider *how* he may actually try to find them. The user is therefore presented with as many useful routes as possible.

Pages are optimised to catch queries for combinations of all of the possible locations in which a user might try to find a hotel (the useful product), one of many.

For visitors looking for "Hotels in Lovely" (which in this example happens to have around 60 hotels), the page that is optimised for "Hotels in Lovely" will quite probably have a required degree of kw anchor-text IBLs as well as a number of on-page standard markup techniques. The page itself is likely to contain nothing more than perhaps a couple of token images, links to other regions/places of the same type and structure, and a definitive number of links to hotels that the visitor may find of interest.

In some ways it's akin to a directory structure, without the tree. We wouldn't say Yahoo is full of useless doorway pages, would we? So I'd argue that it would be equally wrong to argue that pages designed along similar hierarchical principles are.

Area

url=domain.com/country/wonderful.htm (mod_rewrite db manipulated url)

Page Title = Wonderful Country Hotels
Content = Wonderful hotels and links to places and hotels in Wonderful, as well as some interesting or useful info for the visitor. Suitable levels of kw freq for target terms.

Region

url=domain.com/region/Fabulous.htm (mod_rewrite..........)

Page Title = Fabulous Region Hotels - A region of Wonderful
Content = Fabulous hotels and links to other places and hotels in Fabulous, as well as some interesting or useful info for the visitor. Suitable levels of kw freq for target terms.

Place

url=domain.com/place/Lovely.htm (mod_rewrite..........)

Page Title = Lovely Place Hotels - A place in Fabulous, Wonderful
Content = Lovely hotels and links to hotels in Lovely, as well as some interesting or useful info for the visitor. Suitable levels of kw freq for target terms.

Hotel

url=domain.com/hotel/hotel-tropicana.htm (mod_rewrite..........)

Page Title = Hotel Tropicana in Lovely - one of many hotels in Lovely
Content = Specific information relevant to the hotel, enabling the user to make a full and informed choice.
Page is *not* calibrated for a particular locality, other than perhaps a road or street.
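The mod_rewrite step glossed over above might look roughly like this (a sketch only; the backing script `hotels.php` and its query parameters are assumptions, not the actual setup):

```apache
# Map the static-looking hierarchy onto one db-backed script.
RewriteEngine On
RewriteRule ^country/([A-Za-z-]+)\.htm$  hotels.php?country=$1  [L,QSA]
RewriteRule ^region/([A-Za-z-]+)\.htm$   hotels.php?region=$1   [L,QSA]
RewriteRule ^place/([A-Za-z-]+)\.htm$    hotels.php?place=$1    [L,QSA]
RewriteRule ^hotel/([a-z0-9-]+)\.htm$    hotels.php?hotel=$1    [L,QSA]
```

This way the spider and the visitor both see clean, crawlable URLs while every page is still generated from the same database.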

It would be utterly futile to try to optimise each individual hotel page to catch the inputted query "Hotel in Lovely" or "Hotel in Wonderful", simply because there are too many options and too many competitors trying to achieve the same; besides, it wouldn't even work (unless you were cloaking or using some method likely to get you banned). Why would you try to optimise several hundred pages for the same locality? I can't help but think that such an approach would fail.

Heini remarked that such approaches can clutter up the SERPs, and I agree; if done poorly they can, and occasionally do, make an absolute mess.

Only two weeks back I encountered a domain that was basically dominating the first two pages for a competitive kp. For the SEO this was an absolute disaster, and in many ways worse than not appearing at all, as it damaged search quality and led to the domain's disappearance from the SERPs (probably due to a complaint from a competitor).

I've waffled on for far too long, so I'll close by saying that if done correctly, as borne out by the hundreds of well-ranked domains that use this technique, it is a sure-fire winner. If it is done poorly then yes, it could be classed as spam and lead to all sorts of unwelcome scrutiny.

JudgeJeffries

9:39 pm on May 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Allan73...I adopted the same approach as you having received advice from a number of the more savvy senior members here and included lots of unique individual town info, a list of useful and emergency phone numbers etc etc and apart from satisfying my sales needs seem to have unintentionaly created quite a comprehensive tourist guide to the larger and less attractive UK industrial cities. Anyone for a day out in Hull, Hell or Halifax? lol.

CCowboy

10:00 pm on May 9, 2003 (gmt 0)

10+ Year Member



incywincy,

I am doing the very same thing.
Glad to hear it's not spam!

incywincy

8:14 am on May 10, 2003 (gmt 0)

10+ Year Member



I'm not sure that I've communicated the situation very clearly. The site in question has, say, 5,000 pages, where each and every one of them is identical save for the search term 'widget townabc', which occurs, say, 10 times as a hyperlink throughout the page. If you click on the hyperlink you arrive at a search form; then you can actually access the data you are looking for.

The majority of people here use a template and fill it in with the info the searcher is looking for. That is entirely different: at least you guys are presenting Google with the same thing that the searchers are looking for.

Yidaki

8:32 am on May 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



incywincy, that's their way of opening their database to the public. The thing is simple: if their database really contains the content they are promoting with doorway pages, it sounds OK to me. Yes, they could do it better with more relevant content, linked wisely, instead of creating simple title-optimized doorway pages. Just be happy that they don't. If their positions remain above yours after the update is finished, either think about using the same tactics or improve your site using other, better tactics than your current ones.

Personally, I don't report sites that have the content people search for. If they had doorway pages that promoted Viagra but they sold yachts, I'd report it.

Stop whining - start working. ;)

>to be fair their database does cover the geographic areas
>that their doorway pages are promoting

... don't forget what you wrote yesterday: stay fair. ;)

allanp73

8:45 am on May 10, 2003 (gmt 0)

10+ Year Member



JudgeJeffries,

Are you Canadian? I was originally from Halifax. :) I think the approach that I take makes sense, especially for city/town-related information. There is generic information that applies to every area, and this information can be tweaked to make it more targeted to the area it serves. I think this type of information is useful to the user and should rank well.

Bernie

10:01 am on May 10, 2003 (gmt 0)

10+ Year Member



As far as I remember, a very similar question was asked in Boston, and Matt replied: don't worry about taking care of different kw combinations like 'buy widgets' and 'cheap widgets', as long as you don't overdo it with hundreds or thousands.

The duplicate content issue is still subject to further interpretation.

As you can never talk about absolute truth in SEO, I did a risk/legitimacy case breakdown for myself, with the following structure:

Let's take heini's good example of the travel market, with product + region/country/town combinations.

Strategy -> risk/legitimacy:

1. DWPs differ only in keywords (title, Hx, text) and all point back to the homepage.
-> high risk / low legitimacy

2. DWPs differ only in keywords, but every link points to a specific database query related to the optimized keyword - meaning the user clicks only once and gets the results for the region/country/town he was looking for.
-> medium risk / medium legitimacy

3. Pages differ content-wise, e.g. give information about the region/country/town, and have a deep link to the database query.
-> low risk / high legitimacy

I would go for 3 and don't see a difference from the option of putting the real database online with mod_rewrite. Plus: no. 3 is also an option for affiliates who have to accept that e.g. the merchant's booking engine is inaccessible to them.
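Strategy 2 above can be sketched like this (a minimal illustration; the search endpoint and parameter names are invented, not an actual site's API): instead of every town page linking to the bare homepage, each link is a pre-filled query URL, so the user's first click lands directly on the results for that town.

```python
from urllib.parse import urlencode

SEARCH_URL = "https://example.com/search"  # hypothetical search endpoint

def deep_link(product, town):
    """Build the one-click query URL for a product/town combination."""
    return SEARCH_URL + "?" + urlencode({"q": product, "town": town})
```

Generating these links from the same database that backs the search form keeps the doorway page and the results it promises in sync.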