Forum Moderators: LifeinAsia
You can do a search for Mod_rewrite
I don't know how much it will help; it depends on what web server you run. I haven't found a good solution for rewriting URLs on IIS.
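For Apache, a mod_rewrite rule along these lines maps a clean URL onto the real query string (purely illustrative - the path and script name here are made up):

```apache
# hypothetical example: serve /product/123 from product.asp?id=123
RewriteEngine On
RewriteRule ^product/([0-9]+)$ /product.asp?id=$1 [L]
```

The spider only ever sees /product/123, so there is no "?" in the URL at all.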
I would say that FAST is the best search engine at indexing dynamic content; Google and Inktomi are good too.
If you save a query in Access, you can call that query without having a "?" in the URL. If you use FrontPage it's pretty easy: just go to the Database Results Wizard, use your DB connection and the saved query as your database results, and save the file as .asp. All should work well, but you would need the FP extensions - though no doubt you already have them on the server if you can use Access(?)
Google, for sure, will index ASP files so long as you don't have session information in the URL (i.e. so long as the URL that Googlebot gets for a page is the same URL a user of your site gets for that page).
I'm not using FrontPage, and yes, I am using an Access DB.
First, I must confess I didn't check out lazerzub's links yet - thanks for compiling them!! :)
So if I understand right, the search engines you mentioned do read the DB output and index it as the content of the page, if it's linked as [mysite.nl...] in an HTML/ASP file? Yes? :)
But like I said, with Access, if you make a query in Access ("WHERE id = 1") then you can make a "static" ASP page without the "?" in it, so you don't have to bother with the whole "?" issue.
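The idea, sketched in Python with SQLite purely for illustration (the thread is about ASP + Access, and the table and column names here are invented): the record selector is baked into the page's own query, so nothing comes in from the URL.

```python
import sqlite3

# throwaway in-memory table, standing in for the Access DB
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO products VALUES (1, 'Widget')")

# a "static" page: the "WHERE id = 1" lives inside the page's own
# saved query, so the URL needs no "?id=1" query string at all
def render_product_1(conn):
    row = conn.execute("SELECT name FROM products WHERE id = 1").fetchone()
    return "<h1>%s</h1>" % row[0]

print(render_product_1(conn))  # <h1>Widget</h1>
```

You would then have one such page per record, each with its own plain URL.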
I would definitely check out the mod_rewrite threads too. I have read them, but I don't use mod_rewrite myself.
But I'm certainly interested in BrotherhoodZ's query solution! I've been playing with it, but I don't understand how I could use it as a "direct" entrance to the DB. (BTW, I'm using 2 DBs: 560 and 1,140 records.)
Hoping you can show me ;)
firstly, the database itself doesn't get spidered. ASP scripts will read from the database and display the output as HTML - it is this HTML that gets spidered.
spiders can read URLs with name/value pairs (?id=1&xyz=123) and so on, but they dislike session IDs in the URL - keep session IDs out of it. there may be problems if your URL is too long or if you use too many name/value pairs, but i've never encountered any, as i use no more than 2 name/value pairs and keep the URLs short just in case.
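to see what those name/value pairs look like to code, here's a quick sketch (Python, just for illustration - the URL is made up):

```python
from urllib.parse import urlsplit, parse_qs

# hypothetical dynamic URL: two short name/value pairs, no session ID
url = "http://www.example.com/product.asp?id=1&cat=12"
pairs = parse_qs(urlsplit(url).query)
print(pairs)  # {'id': ['1'], 'cat': ['12']}
```

two short, stable pairs like these are exactly what spiders handle well; a long session token tacked onto the end is what trips them up.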
to get the entire database contents spidered, ensure your scripts generate links to every row in the database, ie:
[mydomain.com...]
[mydomain.com...]
[mydomain.com...]
and so on. this will allow the spider to find everything.
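to make that concrete, here's a sketch in Python with SQLite (the thread is about ASP + Access, but the loop is the same idea in any language; table and column names are invented):

```python
import sqlite3

# throwaway in-memory table, standing in for the real DB
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [(1, "Widget"), (2, "Gadget"), (3, "Sprocket")])

# emit one crawlable link per row so the spider can reach every record
links = ['<a href="/product.asp?id=%d">%s</a>' % (pid, name)
         for pid, name in conn.execute(
             "SELECT id, name FROM products ORDER BY id")]
print("\n".join(links))
```

put that list of links on a page the spider already knows about (the home page, a sitemap page) and every row becomes reachable.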
spiders cannot submit forms, so don't use a form where the user enters the id number and hits submit.
I can only really tell you the way I do it, because I use FrontPage and Access for a website and never really bothered looking into how the ASP "worked the database". Instead I chose to start learning PHP.
Basically, yeah, if you save a query in Access then you can make pages without the query string in them. I'll send you a sticky about the way I do it, but I know there are cleaner, better ways... that's why I'm learning PHP and saving myself the embarrassment of FP-generated ASP.
think of a spider as being like a browser. when a link to a script gets a hit, the server processes the script and generates HTML. a browser will interpret that HTML and display it as a web page. a spider will see the source code of the generated HTML (not the source code of the ASP script), just the same as if you used View Source from the menu.
i have no idea whether blind links get spidered, but discussion in the Google forum suggests that weighting is given to the text used in the link, so it's probably best to use some RELEVANT text there.