Forum Moderators: open
I have a web app that has 150,000+ aspx files, without a code-behind. Every time I transfer an updated dll to the server, it takes 10+ minutes to load the first time.
I understand that .net must compile the app the first time it's run, but this is excessive! Are there any ways to speed this process up, or do it offline?
Thanks!
Why do you have 150,000 aspx pages? Sounds like you should use a database and some URL rewriting if you are concerned about having unique file names.
I have a TON of content in my databases... The vast majority of aspx files on my site do not have a code-behind. They're created dynamically from a database on another machine, and FTP'd to the site. So, my dll is not very big - 230KB.
I am concerned about unique file names, for spidering purposes. Not sure what you mean by URL rewriting, but each of my files is named uniquely.
Is there a way to precompile the aspx pages when they don't all exist on my development machine?
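(For what it's worth: if this is ASP.NET 2.0 or later, the aspnet_compiler.exe tool that ships with the framework can precompile a site, and since it can run in-place on the server itself, the pages don't need to exist on the development machine. A sketch, assuming the app lives in an IIS virtual directory called /MyApp:)

```cmd
rem Sketch -- assumes ASP.NET 2.0+ and a virtual directory named /MyApp.
rem In-place precompilation, run on the server where the aspx files live:
aspnet_compiler -v /MyApp

rem Or precompile from a source folder into a deployable target folder:
aspnet_compiler -v /MyApp -p C:\source\MyApp C:\target\MyApp
```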
If you just want to use them as static content pages, it would probably have been better to give them standard htm extensions.
URL rewriting is taking something like
www.example.com/MyPage.aspx?PageID=295678 and turning into
www.example.com/pages/MyPageTitle/295678/Page.aspx
That way you could have one page that pulls the data from the database for every page on your site, and then rewrite the URL so it appears to visitors and spiders as a different physical page each time.
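A minimal sketch of that idea using ASP.NET's HttpContext.RewritePath. The single data-driven page (PageView.aspx) and the URL pattern are hypothetical placeholders, not something from this thread:

```vb
' In Global.asax (inside <script runat="server">, with
' System.Text.RegularExpressions imported).
' PageView.aspx and the URL pattern below are hypothetical.
Sub Application_BeginRequest(ByVal sender As Object, ByVal e As EventArgs)
    ' Match /pages/MyPageTitle/295678/Page.aspx and capture the ID
    Dim m As Match = Regex.Match(Request.Path, _
        "^/pages/[^/]+/(\d+)/Page\.aspx$", RegexOptions.IgnoreCase)
    If m.Success Then
        ' One real page serves every "virtual" page
        Context.RewritePath("/PageView.aspx?PageID=" & m.Groups(1).Value)
    End If
End Sub
```

With this in place, a request for /pages/MyPageTitle/295678/Page.aspx would be served by PageView.aspx with PageID=295678, without a physical file per page.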
I guess I am sort of using url rewriting in a way, but all of the files are loaded onto the server with their own unique file name like:
www.example.com/articles/20050601/3847_some_file_name.aspx
The reason I decided to use aspx files in the first place was for the dynamic header and footer, to allow easy changes to the look and feel of the pages... well, that seems to have backfired. I'd rather have static htm files now... but thousands of my aspx files are already spidered by the search engines.
So, every single aspx file must go through JIT compilation, regardless of whether it has a code-behind or not? Is there any way around this - something like classic ASP's interpret-at-request-time model? If not, I wonder if the best thing to do would be to republish everything as htm files, 301 permanent redirect from the current aspx files to the htm versions, and then remove the aspx versions after a couple of months...
Have any thoughts on that?
If you can, then do some research on ASP.Net url rewriting and you could then keep the same file names and paths that you already have...except you wouldn't have 150,000 individual files. The URL re-writing is done in an HTTP handler that you can add to your application.
This method would help you avoid messing up any search engine rankings, since you wouldn't be changing your paths or file extensions.
I'm not an expert on it but there are some tutorials out there.
If you also turn on caching in the pages, there really shouldn't be any performance problem compared to static html.
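For the caching piece, the standard approach is an OutputCache directive at the top of the page; the Duration and VaryByParam values here are just examples:

```aspx
<%@ OutputCache Duration="3600" VaryByParam="PageID" %>
```

This caches one rendered copy per PageID value for an hour, so repeat requests are served without re-running the page.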
write some logic into my 404 page to do a 301 redirect
I would not do that; it will confuse search engines very much. If you want to rank well, that is :)
I've just finished re-publishing my entire content base in html and am preparing to do the 301 redirect switch. It works like this:
1. In web.config, I have the following code:
<customErrors mode="On">
<error statusCode="404" redirect="/404.aspx" />
</customErrors>
2. In the 404.aspx.vb file I have the following code:
If InStr(LCase(Request("aspxerrorpath")), "/articles/") > 0 Then
    'if this user is looking for an article, redirect them to the html version
    Dim sReDir As String = Request("aspxerrorpath")
    sReDir = Replace(sReDir, ".aspx", ".html")
    Response.Status = "301 Moved Permanently"
    Response.AddHeader("Location", sReDir)
    Response.End()
End If
Why do you say that this is bad for the search engines?
If the requested page really does not exist (not even as static html), make sure the client receives a 404. Otherwise you may run into duplicate content issues. Google seems to be requesting lots of gibberish URLs lately to detect generated spam.
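One way to follow that advice in the 404.aspx.vb code quoted above (a sketch; it assumes the html files sit at the same paths as their aspx versions): only issue the 301 when the html target actually exists on disk, and otherwise send a genuine 404 status.

```vb
' In 404.aspx.vb -- sketch: 301 only when the .html replacement
' really exists; otherwise return a genuine 404, not a 200.
Dim sPath As String = LCase(Request("aspxerrorpath"))
If InStr(sPath, "/articles/") > 0 Then
    Dim sReDir As String = Replace(Request("aspxerrorpath"), ".aspx", ".html")
    If System.IO.File.Exists(Server.MapPath(sReDir)) Then
        Response.Status = "301 Moved Permanently"
        Response.AddHeader("Location", sReDir)
        Response.End()
    End If
End If
' Fall through: the page is truly gone, so spiders should see a 404.
Response.Status = "404 Not Found"
```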