Design issues - bulk insert

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2006-08-08 : 10:06:30
Hi all,

I've had a meeting with my managers and we have decided to replace a DTS solution with a BULK INSERT solution (http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=68124). We have about 200 tables split across 5 different systems that we need to recreate every day, and I'm thinking of designing a database to take care of everything.

What I'm thinking is that if I create a file/table/record description with column names and everything in a separate database, I can generate the BULK INSERT scripts automatically using one single stored procedure for everything, instead of having a separate procedure for every file that needs to be imported.
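
Just to illustrate, something along these lines is what I have in mind (a rough sketch only; all table, column and procedure names are made up):

-- Description of each file to import, one row per file
CREATE TABLE ImportFile (
    FileID      int IDENTITY(1,1) PRIMARY KEY,
    SystemName  varchar(50)  NOT NULL,
    FilePath    varchar(255) NOT NULL,  -- e.g. 'D:\Import\customers.txt'
    TargetTable sysname      NOT NULL,  -- e.g. 'dbo.Customers'
    FieldTerm   varchar(10)  NOT NULL DEFAULT ';',
    RowTerm     varchar(10)  NOT NULL DEFAULT '\n'
)
GO

-- One generic procedure that builds and runs the BULK INSERT for any file
CREATE PROCEDURE usp_BulkLoadFile @FileID int
AS
DECLARE @sql varchar(1000)

SELECT @sql = 'BULK INSERT ' + TargetTable
            + ' FROM ''' + FilePath + ''''
            + ' WITH (FIELDTERMINATOR = ''' + FieldTerm + ''''
            + ', ROWTERMINATOR = ''' + RowTerm + ''', TABLOCK)'
FROM ImportFile
WHERE FileID = @FileID

EXEC (@sql)

IF @@ERROR <> 0
    RAISERROR ('Load failed for FileID %d', 16, 1, @FileID)
GO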

I also wonder if I would see any performance gain from running several BULK INSERT statements at the same time, say 3-5 concurrently, or if it would just give me problems...

My question is: am I thinking the right way here? Are there any considerations I need to make when creating dynamic BULK INSERT statements? This will be a somewhat "large" project in the sense that it will take some time to create and will have quite a large impact on our daily routines... I want to make sure I build a sturdy, robust and scalable solution... any insights are welcome.

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2006-08-08 : 11:44:47
You will probably see a performance gain by running several streams of loads in parallel.

Why are you switching from DTS?

You could also use the file/table/record description to generate stored procedures from a common template with standard parameters, error handling, logging, etc. Even if you choose to generate them dynamically each day, I have found it better to create a temporary stored procedure and then execute it, because that makes it easier to do the error handling and logging.
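
Just a skeleton to show what I mean; the names and the logging table are invented, and in practice you would generate the body from your description tables. Note the ## prefix, which makes it a global temporary procedure so it survives the EXEC() batch that creates it:

-- Build the procedure text dynamically (generated from metadata in real life)
DECLARE @proc varchar(2000)

SELECT @proc = '
CREATE PROCEDURE ##LoadCustomers
AS
DECLARE @err int, @rows int

INSERT INTO LoadLog (TableName, StartTime) VALUES (''Customers'', GETDATE())

BULK INSERT dbo.Customers FROM ''D:\Import\customers.txt''
WITH (FIELDTERMINATOR = '';'', ROWTERMINATOR = ''\n'', TABLOCK)

SELECT @err = @@ERROR, @rows = @@ROWCOUNT

UPDATE LoadLog
SET EndTime = GETDATE(), RowsLoaded = @rows, ErrorCode = @err
WHERE TableName = ''Customers'' AND EndTime IS NULL
'

EXEC (@proc)              -- create the temporary procedure
EXEC ##LoadCustomers      -- run it; errors and row counts get logged
DROP PROCEDURE ##LoadCustomers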

You might also take it a step further and create a table of job streams, with the files/tables to be loaded in each stream, and use that to generate stored procedures that call the lower-level procedures. That way you can regenerate the job streams as needed to keep the server working at maximum capacity and shorten the load times.
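
Roughly like this (names invented; usp_BulkLoadFile stands in for whatever your lower-level load procedure ends up being):

-- Which files go in which stream, and in what order
CREATE TABLE JobStream (
    StreamID  int NOT NULL,   -- e.g. 1 through 4 for four parallel streams
    FileID    int NOT NULL,   -- points at the file/table description
    LoadOrder int NOT NULL    -- position within the stream
)
GO

-- One wrapper per stream; each stream loads its files one after another
CREATE PROCEDURE usp_RunStream @StreamID int
AS
DECLARE @FileID int

DECLARE stream_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileID FROM JobStream
    WHERE StreamID = @StreamID
    ORDER BY LoadOrder

OPEN stream_cur
FETCH NEXT FROM stream_cur INTO @FileID

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC usp_BulkLoadFile @FileID   -- the lower-level load procedure
    FETCH NEXT FROM stream_cur INTO @FileID
END

CLOSE stream_cur
DEALLOCATE stream_cur
GO

-- The streams themselves run side by side, e.g. one SQL Agent job per stream:
-- EXEC msdb.dbo.sp_start_job @job_name = 'Nightly Load - Stream 1'
-- EXEC msdb.dbo.sp_start_job @job_name = 'Nightly Load - Stream 2'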



CODO ERGO SUM

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2006-08-09 : 02:15:26
DTS: these babies are slooooow, and the reason is that they use something called AcuODBC to connect to our main business application. It's an old COBOL application from the 80's based on Vision files (if that makes any sense to you; it sure doesn't to me!), and performance isn't all that good. We will instead create flat txt files in COBOL, move them over via sftp or something, and import them into SQL Server using BULK INSERT. I have tested it and the performance gains are massive.
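
For the record, a single load ends up looking something like this (path and terminators are just examples of what the COBOL export might produce):

BULK INSERT dbo.Customers
FROM 'D:\Import\customers.txt'
WITH (
    FIELDTERMINATOR = ';',   -- whatever the COBOL export uses
    ROWTERMINATOR = '\n',
    TABLOCK,                 -- table-level lock, lets the load run at full speed
    BATCHSIZE = 50000        -- commit in chunks instead of one huge transaction
)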

Procedures: so what you're saying is that it would be better to create a separate procedure for each file I need to import, but to generate them dynamically instead of coding them by hand?

And you lost me a little on the last part about job streams... can you please explain a bit more?

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
   
