If you use a script that dynamically creates thousands of pages, how are you going to get incoming links to all those dynamically created pages?
One way is to use the SEs themselves. Once an SE requests a page, the script generates the page content, including links to further dynamic pages, at that moment. The SE then propagates the page content, links and all, into its index. In turn those other links get crawled too, and the whole process repeats indefinitely.
Once a visitor accesses any of these links from the spider's index, the application logic can vary depending on what the operator wants to do: redirect the visitor, serve different content, hijack the browser, reconfigure the router, and so on.
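The mechanism above boils down to user-agent cloaking: serve link-stuffed pages to crawlers so they keep queueing more dynamic URLs, and something else entirely to humans. Here's a minimal sketch of how such a handler works; every name in it (the agent strings, paths, and the landing URL) is purely illustrative, not taken from any real spam kit.

```python
SPIDER_AGENTS = ("googlebot", "bingbot", "slurp")

def is_spider(user_agent):
    """Crude user-agent sniffing: treat known crawler strings as spiders."""
    ua = user_agent.lower()
    return any(bot in ua for bot in SPIDER_AGENTS)

def handle_request(path, user_agent):
    """Return a (status, body) pair that differs by who is asking."""
    if is_spider(user_agent):
        # Spider: generate keyword text plus fresh links on the fly,
        # so the crawler queues even more dynamically created pages.
        links = "".join(f'<a href="{path}/sub{i}">more</a>' for i in range(3))
        return 200, f"<html><body>keyword text for {path} {links}</body></html>"
    # Human visitor: redirect to whatever the operator is pushing.
    return 302, "Location: http://affiliate.example/landing"

# A crawler request expands the link graph; a browser gets redirected.
print(handle_request("/widgets", "Mozilla/5.0 (compatible; Googlebot/2.1)")[0])  # 200
print(handle_request("/widgets", "Mozilla/5.0 (Windows NT 10.0) Firefox")[0])    # 302
```

Each page a spider fetches spawns new links into the index, which is exactly the self-feeding loop described above; a real browser never sees that content.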
It's cloaking, spam, and scam combined, and it isn't fading at all.