joined:Dec 13, 2003
Did anyone ever try this approach...
Fooling the search engine into believing you have millions of pages... imagine scrambling the semantics of your content to generate new pages on the fly, and putting a link on every page which, when crawled, generates a new page scrambled from a random page on the site, with a link back to the main page. The spider gets fresh content every time... what is the theoretical take on this?
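Just to make the idea concrete, here's a rough sketch of what I mean (Python, purely illustrative; the `/scramble?seed=` URL scheme and every name in it are made up by me, not from any real site or tool):

```python
import random

def scramble_page(source_text, seed=None):
    """Shuffle the sentences of an existing page to fake a 'new' page."""
    rng = random.Random(seed)
    sentences = [s.strip() for s in source_text.split(".") if s.strip()]
    rng.shuffle(sentences)
    return ". ".join(sentences) + "."

def render_page(source_text, seed):
    body = scramble_page(source_text, seed)
    # Every generated page links to yet another freshly scrambled page
    # and back to the main page, so the spider never runs out of URLs.
    return (
        "<html><body><p>" + body + "</p>"
        '<a href="/scramble?seed=' + str(seed + 1) + '">more</a> '
        '<a href="/">home</a>'
        "</body></html>"
    )

print(render_page("Widgets are great. We sell widgets. Buy widgets today", 1))
```

Each visit with a new seed gives the crawler a page it has never seen before, even though no real content was added.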
No, I am not for it (I have over 2000 pages myself), but has anyone tried this idea? :-)
Or is it too naive? :-( Just thought of sharing it... let me know your opinion.