|Creating a static replica site - is this "unethical"?|
HTML site copy of dynamic site
| 3:26 pm on Sep 13, 2004 (gmt 0)|
For many good reasons I have gone down the route of creating a separate site on a separate URL to optimise:
1) the e-commerce site is dynamically generated and a pain to optimise. Much easier to optimise static HTML.
2) I am aiming to get as much traffic to my replica as possible, so the customer only pays for traffic I generate on my URL.
3) The customer's web designers can continue to develop their site and I can continue to develop mine.
Looking at the Ts & Cs of some of the search engines and directories makes me concerned that my site would be considered cloaked, non-original content, or an affiliate link site. I don't think it is, as it is essentially a joint venture with the site owner.
I have 2 such sites and want to make sure I can get backlinks to them from the likes of DMOZ, Skaffe, Joeant etc.
Site 1: Original site is www.brandname.com and my site will be on www.brandname.NET
Site 2: Original site is www.brandname.com and my site will be www.keyword.com (but the logo and branding will be the same as or similar to Brandname's website, and it will click through seamlessly to www.brandname.com/catalog/product).
Not sure which of these is the better model, but if this works I think it's a great idea -- I become an "agent" and own these new sites. Over time, my sites could become better sources of information than the e-commerce stores. I can close them if the contract finishes. I don't see why this could be thought of as "unethical".
| 4:52 pm on Sep 13, 2004 (gmt 0)|
The issue isn't ownership but content. Yes, Google will slap a duplicate content penalty on the site if it's not original content. I'm not sure what the threshold is before you run into trouble, but the content clearly does not have to be 100% identical for Google to see it for what it is.
| 6:48 pm on Sep 13, 2004 (gmt 0)|
If your sites are informational, meaning they have UNIQUE information only appearing on your site, then I guess it's not horrible. It's like a sales extension... but if you only copy what they have, with the only difference being that it's static, then I wouldn't think it's ethical at all.
It's really not that hard to write search engine friendly dynamic pages. What language/db does the site use? PM me that info and I might be able to suggest some free scripts you could implement.
| 8:07 pm on Sep 14, 2004 (gmt 0)|
Maybe consider using some sort of caching solution.
That way, you have ONE site -- with no duplicate content issues -- but the pages that have previously been requested are served out of cache rather than being created anew at the time of each request for the same URL (when the data making up the dynamic request hasn't yet changed).
When the data that makes up the page changes, the old version of the page in cache gets flushed and replaced with the new dynamically built version of the page -- stored in cache for the next request of the same URL.
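The request/flush cycle described above can be sketched in a few lines. This is only an illustration of the pattern, not any particular caching product; the names (page_cache, render_page, etc.) are made up for the example.

```python
# Minimal sketch of the cache-and-flush pattern described above.
# render_page() stands in for whatever actually builds the dynamic
# page (database queries, templating) -- it is a placeholder here.

page_cache = {}  # URL -> rendered HTML, held in memory


def render_page(url):
    # Placeholder for the real dynamic page build.
    return "<html>content for %s</html>" % url


def get_page(url):
    # Serve from cache when we have the page; build and store otherwise.
    if url not in page_cache:
        page_cache[url] = render_page(url)
    return page_cache[url]


def flush_page(url):
    # Call this when the data behind the page changes; the next
    # request for the same URL rebuilds the page and re-caches it.
    page_cache.pop(url, None)
```

The first request for a URL pays the full build cost; every repeat request is a dictionary lookup until `flush_page` is called for that URL.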
Note that this kind of cache lives in the server's RAM, so the amount of memory on the server will play a large part in exactly how caching is set up -- and of course, in how much "dynamic" content you can serve from the memory cache (content which to all the world appears to be dynamic, just served up very fast).
Moreover, with caching software, a little memory can often go a long way, as accessing data from memory is many, many times faster than getting it from even the fastest hard drive.
Heck, some folks keep their entire database in memory and NEVER go to disk to fetch the data unless the data updates, and thus needs to be refreshed both on disk and in memory. Indeed, setting up a server with lots of memory -- and proper memory use via caching software -- can sometimes let you avoid buying a much more powerful server setup.
Various caching software handles questions like when to refresh the cache, etc. Sorry to say I'm not experienced with this stuff -- just familiar with it from reading a couple of articles. One of them was at aceshardware dot com, a computer hardware enthusiast site. I think the article was in the "server" section of their articles area -- the one that went into detail about how they set up their latest "webserver for the 21st century", with benchmarks showing the difference between using and not using caching software. A VERY interesting read!
Hope the above helps!
| 4:38 pm on Sep 15, 2004 (gmt 0)|
A good rule of thumb, oft quoted by Google, is not to do anything to your site that you're doing only for search engines.
At a glance, I think your proposals will cause trouble. You will split your inbound links between two sites, and you will get into trouble with duplicated copy, (effectively) affiliate links, or doorway pages.
Your static copy of the original dynamic site needs to be substantially different from the original and have enough unique copy of its own (otherwise it's a doorway). At that point you might as well have built a whole new, unrelated, site. There aren't any issues in having two sites (as long as they don't cross-link too heavily) supplied by one warehouse/bank/etc.
| 11:25 pm on Sep 15, 2004 (gmt 0)|
I think you're right and I think it's unfair...
I am working as an "agent" and I will work hard to refer customers to her site. This goes on in the real world: I'm not too fond of estate agents, recruitment agents, insurance agents and the like, but I see why they are needed, and others don't stop them trading. I will put in up to 3 days' effort a month making my site accessible via search engines, directories, keyword optimisation, content additions, etc. Why should I be penalised for this?
Anyway, let's see: I've been accepted by a few directories, even if I did need to do some editing for the better cause. Took me a few tries to pass the tests, though ;-)