Forum Moderators: phranque
loads of queries, session IDs, the lot.
Has anyone had any experience of working around this?
Or perhaps, applying a Mod rewrite type of solution?
I found this thread [webmasterworld.com] from December 2002 where this question was asked, but without a positive response.
Any progress since then?
There are several ways to work around the terrible limitations of BV, but none of them are great. Generally, cloaking is required.
Consultants get $$$ for this, BTW. As an aside, this has to be the no. 1 or no. 2 most frequently asked question of the dynamic sites panel at SES.
Here is what I would do.
1) Set up another domain to use for search engines.
2) Use aspseek (or something similar) to index your site.
3) Create a translation system. You could md5 all the urls, but it would probably be better to use some sensible words in the urls for better search engine performance.
4) It looks like the session id can be omitted. I'm not totally familiar with this BroadVision thingy, but I experimented a bit on their site: I can remove the BV_SessionID=xxx and still get the proper page to display. I would assume the session id identifies the user? You likely don't want this in your urls.
5) After you build your translation methodology, create a database to match up the BV urls on your cms site to good search engine urls.
6) Do a mod_rewrite on your new domain and feed everything to index.php. index.php will figure out which page is being asked for, pull the content from the cms site, and convert all the urls from the cms site into search-engine-friendly urls.
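Step 6 might look something like this in an .htaccess on the new domain (just a sketch; the file names and parameter name are my assumptions, nothing BroadVision-specific):

```apache
# Illustrative .htaccess for the seo domain
RewriteEngine On
# don't rewrite real files on disk (images, css, index.php itself)
RewriteCond %{REQUEST_FILENAME} !-f
# send everything else to index.php with the requested path as "page"
RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]
```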
example:
cms url is
[(x).com...]
seo url is
[(y).com...]
If you md5'd the urls it would be downright ugly, but it would still work, and it would save you from figuring out appropriate names for all the pages. One idea would be to use the title of the page to create the page name.
ex:
[(y).com...]
This gets sent to index.php as something like index.php?page=applesandoranges.html.
index.php looks up the matching cms url, fetches the html content, and adjusts all the links by matching the urls against the database (getting the seo urls).
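That lookup-and-rewrite logic can be sketched like this (Python here for brevity, though the thread assumes index.php; the urls, the mapping table, and fetch() are made-up placeholders, with fetch() standing in for a real http request to the cms site):

```python
# Hypothetical mapping table: in practice this lives in the
# database built in step 5 (cms url <-> seo url).
CMS_TO_SEO = {
    "http://cms.example.com/page?doc=123": "applesandoranges.html",
}
SEO_TO_CMS = {seo: cms for cms, seo in CMS_TO_SEO.items()}

def fetch(cms_url):
    # stand-in for fetching the html of the cms page
    return '<a href="http://cms.example.com/page?doc=123">fruit</a>'

def serve(page):
    """Look up the cms url for the requested seo page, fetch it,
    and rewrite any cms links in the html to their seo equivalents."""
    cms_url = SEO_TO_CMS[page]
    html = fetch(cms_url)
    for cms, seo in CMS_TO_SEO.items():
        html = html.replace(cms, seo)
    return html
```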
You'll also have to do something about images...
You can also read the content of the fetched page and create keywords and description meta tags before sending the content out of the server.
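Generating those meta tags could be as crude as this (an illustrative sketch; the stopword list and limits are arbitrary, and real code would strip the html markup first):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "are"}

def make_meta(text, n_keywords=5, desc_words=25):
    """Build keywords/description meta tags from page text:
    keywords = most frequent non-stopwords, description = opening words."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    keywords = ", ".join(w for w, _ in counts.most_common(n_keywords))
    description = " ".join(text.split()[:desc_words])
    return (f'<meta name="keywords" content="{keywords}">\n'
            f'<meta name="description" content="{description}">')
```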
Anyhow, I realize it may sound complicated, but it's quite doable.
Maybe you would keep your BroadVision url on your intranet and only publish the 'seo friendly' url...
That way you could build a script to automagically index your site, create a static version of it, and publish that.
If you depend on the session ids for something like an online shop checkout, that might be a little tricky, but still doable...
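If you do strip the session id the way step 4 suggests, a small helper like this would do it (Python sketch; BV_SessionID is the parameter name seen in the thread, the example url is made up):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session(url, param="BV_SessionID"):
    """Drop the session id parameter from a url, keeping everything else.
    Based on the observation that BV pages still render without it;
    keep the id for logged-in flows like checkout."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))
```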
take care,