The idea may go against the grain of automation, but I think it would be cheap, simple, and logical.
I have built a webshop that is generated from a database, but the content doesn't change dramatically from month to month.
So the idea is the following: since the purpose of the shop is to sell, I need to bring in people who are searching for these products or similar ones.
My idea is simply to surf through the site, copy each relevant page via "view source", and turn it into a static page. Then I add meta tags relevant to that page and save it into the root.
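The copy-and-tag step above could even be scripted rather than done by hand. A minimal sketch in Python, assuming the page HTML has already been grabbed via "view source" (the sample page, meta tag values, and output filename below are all illustrative, not from the post):

```python
# Sketch: take the HTML of a dynamic page, inject page-specific meta tags
# just after <head>, and save the result as a static file in the site root.

def add_meta_tags(html: str, description: str, keywords: str) -> str:
    """Insert description/keywords meta tags right after the <head> tag."""
    tags = (f'<meta name="description" content="{description}">\n'
            f'<meta name="keywords" content="{keywords}">\n')
    # Only patch the first <head> occurrence; leave the rest of the page as-is.
    return html.replace("<head>", "<head>\n" + tags, 1)

# Stand-in for a page copied via "view source" from the dynamic shop.
page = "<html><head><title>Blue Widgets</title></head><body>...</body></html>"

static_page = add_meta_tags(page, "Buy blue widgets online",
                            "widgets, blue widgets")

# Save the static copy into the web root so robots find it on the next crawl.
with open("blue-widgets.html", "w") as f:
    f.write(static_page)
```

A real version would loop over every relevant product page and pick the description and keywords from the database record for each page.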
The next time a robot comes by, it will crawl the root and see all the relevant content.
If the search engines get a request, they might direct visitors to one of the static pages, but since all the links point back to the site itself, the pages will work just fine.
My question is this: could this be seen as cheating by one of the bots, or is it maybe a damn good solution?
Maybe somebody will say: why don't you talk about getting listed high up? Well, that is step two, because of course I could list a group of products with a description full of keywords. But first I want to make sure this is a good way to do it, and then I will work on step two.