Forum Moderators: Robert Charlton & goodroi
Ours is a database-driven PHP site that, of course, generates long gibberish-looking URL strings for pages deeper than the homepage, so we are in the process of setting up mod_rewrite to create the appearance of URLs that follow a logical, spider-friendly syntax.
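For context, a rewrite like that is typically a single rule in the site's Apache config or .htaccess. This is only a hedged sketch; the script name (`index.php`), query parameters (`page`, `slug`), and URL pattern are hypothetical placeholders, not the poster's actual setup:

```apache
# Assumed example: map a clean URL like /products/widget-123
# to the real database-driven script behind it.
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ /index.php?page=product&slug=$1 [L,QSA]
```

Note that the rewrite happens internally at request time; no file named `products/widget-123` ever exists on disk, which is exactly why the question below matters.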
My question: if we want to use the Google Sitemap Generator, we have to run a Python script that crawls the files on our web server. Will it see the pages under their neat-looking mod_rewrite URLs? My understanding is that these URLs don't really exist; they are more of a shiny "paint job" on top of the database-driven URLs.
Please advise.
However, before submitting the URL of your sitemap to Google, you should always do a visual inspection of the generated URL list anyway. So why not give it a try and see what actually happens in your case?