fisherman_jake (Slave to the Almighty Yak, 159 Posts)
Posted - 2001-10-26 : 02:27:24
Friends, I am about to import roughly 7 GB of data from 20+ data files via DTS. Here is what I was planning to do to be as economical with space and time as possible.

Before I give my approach, I will explain briefly what the data I'm importing is about:
- each data file contains several sections
- the first character of each line determines what kind of data it is
- there are various flags within the row that tell you how to handle the data
- the data files are classified as SUPPORTING and STANDARDIZED

Now, what I propose to do:
1) Use one staging table.
   a. Since each row has to be run through a set of instructions, I see no need to import the other data files into separate staging tables, thus saving space.
2) Process each data file, one at a time, through that single staging table.

Seems pretty straightforward. The only problem I see here is SPEED: since each file needs to be fully processed before the staging table can be cleared for the next one, it may take a very long time to work through 20+ files totalling about 7 GB. I obviously don't have to do it this way, so your opinions would help.

I may need a few beers while I'm at it, processing this monster!!

==================================================
World War III is imminent, you know what that means... No bag limits!!!
Master Fisherman
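To make the per-file pass concrete, here is a minimal sketch (in Python, outside DTS) of the dispatch step the poster describes: each line is routed by its first character to type-specific handling before the shared staging area is reused for the next file. The file layout, tag values, and handler names are assumptions for illustration only, not the actual format of these data files.

```python
def classify(line):
    """Split a row into its leading record-type tag and the payload.
    Assumes the first character is the type tag, as the post describes."""
    return line[0], line[1:].rstrip("\n")

def process_file(path, handlers):
    """Run every row of one file through its tag-specific handler,
    mimicking one shared staging pass per file. Returns per-tag counts."""
    counts = {}
    with open(path) as f:
        for line in f:
            if not line.strip():        # skip blank lines between sections
                continue
            tag, payload = classify(line)
            handler = handlers.get(tag)  # hypothetical per-type instructions
            if handler:
                handler(payload)
            counts[tag] = counts.get(tag, 0) + 1
    return counts
```

With one staging structure reused across files, total space stays bounded by the largest file, but the files are necessarily processed serially, which is exactly the speed concern raised above.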