Msg#: 3603417 posted 10:14 am on Mar 18, 2008 (gmt 0)
That sounds, ummm, wildly inefficient. Can I ask exactly what you're trying to do? and what your database/tables are trying to store? Maybe there's a better way of organising the tables so that you don't need to do so many inserts.
Msg#: 3603417 posted 10:38 am on Mar 18, 2008 (gmt 0)
I do agree that the method I posted sucks, but it's the only thing I've found in relation to performing bulk inserts from dynamically generated data.
I don't think I have access on the DB server to create stored procedures, but if I do, that'll be the route I'll take. Even though, at the end of the day, the same while-loop methodology would be applied...I think. =/
Oh, and the data being stored (in a single table) is a combination of consecutive numbers, random numbers, and constants.
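If you end up stuck with individual INSERTs, one cheap improvement over a one-row-per-statement while loop is batching rows into multi-row INSERT statements (supported in SQL Server 2008+). A rough sketch of how the statements could be built from generated data — the table name comes from this thread, but the column names (seq, rnd, const) and batch size are just illustrative assumptions:

```python
# Sketch: batch generated (consecutive, random, constant) rows into
# multi-row INSERT statements instead of one INSERT per row.
# Column names and batch size are assumptions for illustration.
import random

BATCH = 3  # rows per statement; real code might use several hundred

def batched_inserts(rows, batch=BATCH):
    """Build multi-row INSERT statements for (seq, rnd, const) tuples."""
    stmts = []
    for i in range(0, len(rows), batch):
        chunk = rows[i:i + batch]
        values = ", ".join("(%d, %d, '%s')" % r for r in chunk)
        stmts.append(
            "INSERT INTO tblMyTable (seq, rnd, const) VALUES %s" % values
        )
    return stmts

# Six generated rows at a batch size of 3 -> two statements
rows = [(n, random.randint(1, 100), 'X') for n in range(1, 7)]
stmts = batched_inserts(rows)
```

Fewer statements means fewer round trips and less per-statement overhead, though it's still no substitute for a proper bulk load.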
Msg#: 3603417 posted 11:02 am on Mar 18, 2008 (gmt 0)
I've been away from the DB scene for a while, but we used to be able to do bulk inserts from a text file in Sybase SQL. Essentially one INSERT statement that reads the contents of a file (CSV, tab-separated, etc.), without the overhead of multiple INSERTs and continuous index rebuilding.
A bit of a google... does your version of MSSQL support this sort of thing? (Is there a 'bcp' - bulk copy program?)
BULK INSERT tblMyTable FROM 'c:\mydata.txt' WITH (FIELDTERMINATOR = ',')
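Since the data is generated dynamically anyway, it could be written out to a delimited file first and then loaded in one shot with a statement like the above. A rough sketch of the file-writing side — the file name matches the BULK INSERT example, but the row count, column order, and constant value are assumptions that would need to match the real table:

```python
# Sketch: write generated rows (consecutive number, random number,
# constant) as comma-separated lines, ready for a single BULK INSERT.
# Path, row count, and column layout are assumptions for illustration.
import csv
import random

def write_rows(path, n_rows, constant='X'):
    """Write n_rows of (seq, random, constant) as CSV to path."""
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        for seq in range(1, n_rows + 1):
            writer.writerow([seq, random.randint(1, 100), constant])

write_rows('mydata.txt', 5)
```

One caveat: BULK INSERT reads the file path on the database server, not the client, so the generated file has to land somewhere the server can see.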
Maybe the database forum [webmasterworld.com] can offer more help...?