
Dataset in half

Dataset memory sectioned table

makeit

1:46 am on Sep 29, 2004 (gmt 0)

10+ Year Member




I was wondering if it is possible to load only half of a table into a DataSet, perform specific instructions, then fill the DataSet with the second half of that table and perform the same instructions. I'm afraid that the whole table will be loaded into the DataSet (memory not big enough). Imagine having to search through text files. If I load about 300 of them into a DataSet (let's say text files of at least 4,000 words each), then I might be OK. But if I have, let's say, 7,000 text files, I only want to fill the DataSet with text files 1 to 1,200, perform my instructions, then continue with the next batch (1,201 onward), and so on until all 7,000 are processed.
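The batch idea described above can be sketched in a few lines. This is not the poster's ADO.NET DataSet code, just a minimal Python/sqlite3 illustration of the same technique: pull a fixed-size batch of rows at a time, scan it in memory, then fetch the next batch, so only one batch is ever resident. The `documents` table, `BATCH_SIZE`, and `search_in_batches` are all hypothetical names for this sketch.

```python
import sqlite3

# Hypothetical setup: a "documents" table standing in for the text files.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO documents (body) VALUES (?)",
    [("keyboard and mouse",), ("monitor",), ("keyboard tray",), ("desk",)],
)

BATCH_SIZE = 2  # in the poster's scenario this would be ~1200

def search_in_batches(conn, term):
    """Load the table one batch at a time and scan each batch in memory."""
    matches = []
    offset = 0
    while True:
        rows = conn.execute(
            "SELECT id, body FROM documents ORDER BY id LIMIT ? OFFSET ?",
            (BATCH_SIZE, offset),
        ).fetchall()
        if not rows:
            break
        # Only BATCH_SIZE rows are resident at once, so memory stays bounded.
        matches.extend(doc_id for doc_id, body in rows if term in body)
        offset += BATCH_SIZE
    return matches

print(search_in_batches(conn, "keyboard"))  # [1, 3]
```

The trade-off is extra round trips to the database: one query per batch instead of one query total.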

Well, first of all: is it a good idea?

Let me know, please. Keep the positive vibes!

MozMan

6:54 pm on Sep 29, 2004 (gmt 0)

10+ Year Member



Not sure if this is what you are looking for, but the Recordset object has a PageSize property you could use. It goes something like objRS.PageSize = 4000 (where 4000 is the maximum number of records you want on each page).
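Roughly, PageSize (together with AbsolutePage) splits a recordset into fixed-size pages you can jump to by number. A Python/sqlite3 sketch of the same idea, assuming a hypothetical `records` table (this is an analogue, not the ADO API itself):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO records (name) VALUES (?)",
    [(f"item {n}",) for n in range(1, 11)],  # 10 rows
)

PAGE_SIZE = 4  # analogue of objRS.PageSize = 4000

def fetch_page(conn, page_number):
    """Return one page of rows; page_number is 1-based, like ADO's AbsolutePage."""
    offset = (page_number - 1) * PAGE_SIZE
    return conn.execute(
        "SELECT id, name FROM records ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()

print([row[0] for row in fetch_page(conn, 2)])  # [5, 6, 7, 8]
```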

Hope that at least points you in the right direction.

-Moz

TheNige

8:07 pm on Sep 29, 2004 (gmt 0)

10+ Year Member



What exactly are you going to do with the data once it is in the dataset? If you are going to search through text, filter, etc., why not do it on the database side in SQL before you return that many records?
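To illustrate the point above: instead of pulling every row into memory and scanning it yourself, push the text match into the WHERE clause so only matching rows ever leave the database. A minimal Python/sqlite3 sketch (table and function names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO documents (body) VALUES (?)",
    [("keyboard and mouse",), ("monitor",), ("wireless keyboard",)],
)

def search_server_side(conn, term):
    """Let the database do the text match; only matching rows come back."""
    pattern = f"%{term}%"
    return [row[0] for row in conn.execute(
        "SELECT id FROM documents WHERE body LIKE ?", (pattern,)
    )]

print(search_server_side(conn, "keyboard"))  # [1, 3]
```

The client-side memory cost then scales with the number of matches, not with the size of the whole table.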

makeit

8:35 pm on Sep 29, 2004 (gmt 0)

10+ Year Member



I'm afraid that one day I'll have so many text files (let's suppose 9,000 text files, which we can call records here) that it considerably slows down the search. A search in memory is a lot quicker than a search in the database (SQL). If there is no problem with having thousands of records in a DataSet without killing the server's memory, then my problem is solved. But I am trying to be preventive. Having 100 text records filled into a DataSet (cache) is not supposed to be a problem. But if I ask the program to pull out all the records that have the word "keyboard", I think (maybe I'm wrong) that 9,000 records in a DataSet might kill the memory, or just not perform as it should. Let me know if I have the right way of thinking.

Keep the positive vibes!