Question: Because all search engines differ in their algorithms and so on, is it worth serving each engine a different page based on the user_agent? I'm tempted to try this using an if-then statement. Is this standard practice on most web sites? What's the downside?
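The if-then idea in the question can be sketched as a simple lookup on the User-Agent header. This is a minimal, hypothetical illustration only; the bot tokens and page filenames are assumptions for the example, not a complete or current list of crawler signatures (and note that serving crawlers different content than users violates most search engines' guidelines).

```python
# Illustrative sketch of user-agent-based page selection.
# BOT_TOKENS and the page filenames are assumed examples, not real config.
BOT_TOKENS = ("googlebot", "bingbot", "slurp")  # substrings found in common crawler UAs

def choose_page(user_agent: str) -> str:
    """Return which page variant to serve for a given User-Agent string."""
    ua = user_agent.lower()
    if any(token in ua for token in BOT_TOKENS):
        return "crawler_page.html"   # the version intended for search engines
    return "visitor_page.html"       # the normal page shown to human visitors

print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # crawler_page.html
print(choose_page("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # visitor_page.html
```

One downside this sketch makes obvious: the check depends entirely on the crawler announcing itself honestly, and engines routinely crawl with ordinary browser user-agents precisely to detect this kind of mismatch.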
I would assume that by cloaking a page you could do the following: cover up affiliate links? Build a controlled internal link structure? Strip away extra HTML and scripts? Hide meta tags? So why do engines not like it?
I can see the possibility of that. I think they don't like it because you can create and design a more controlled site that only you and the engine know about. I have now built and completely cloaked a site, and I'm quite impressed with it, to the point of showing the user one thing and the engine something else (not spamming). This has become quite an interesting project for me now. What pitfalls should I be careful of?