TSQLMan
Posting Yak Master
160 Posts
Posted - 2003-12-29 : 19:04:20
In the query below, I need to add a control to prevent duplicate values from hitting the CH_BANKCHECKHISTORY table. I know how to deal with the duplicates once they are in the CH_BANKCHECKHISTORY table, but I would like to know how to keep them from ever being inserted.

-- Removes all data
TRUNCATE TABLE CH_TEMPACCOUNT

-- This statement executes the DTS package that imports a CSV file into the CH_TEMPACCOUNT table.
EXECUTE master.dbo.xp_cmdshell 'DTSRun /~Z0x81918DBE0D127DEF809F4FA5DDE880D5FB939D864269FA1ED8C47141D18E907AEBD1DAE78C834398654816053CA0FD08AD64DC4F47B6B9134A69B789F23D91C5577E9B1B6FA43978597E045E26C555CF92DE50589E2B5CDA897260'

-- Inserts the temporary data into the CH_BankCheckHistory table.
INSERT INTO CH_BANKCHECKHISTORY
    (TransactionDate, AccountNumber, TransactionType, Amount, CheckNumber, Reference)
SELECT col001                    AS CheckDate,
       col003                    AS AccountNumber,
       col006                    AS TransactionType,
       CONVERT(money, col008)    AS Amount,
       col009                    AS CheckNumber,
       SUBSTRING(col011, 1, 250) AS Reference
FROM CH_TEMPACCOUNT
GO
TSQLMan
Posting Yak Master
160 Posts
Posted - 2003-12-29 : 19:22:30
Actually, I had better clarify. I want to keep the existing data in the CH_BANKCHECKHISTORY table from being re-imported. The accountants get the CSV file from a bank site daily, and they will run the previous date to see what new activity has cleared the bank for that day. If they import the same day again, it adds all the same transactions. There are no distinct fields, only complete distinct rows.
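Given that constraint (rows are only distinct as whole rows), one option is a whole-row NOT EXISTS test when moving data out of the staging table. This is a hedged sketch, not the poster's actual code: it assumes the column names and conversions from the INSERT shown at the top of the thread, and it assumes none of the compared columns are NULL (equality tests never match NULLs, so nullable columns would need explicit IS NULL handling).

```sql
-- Sketch: insert only staging rows that do not already exist, as a complete
-- row, in CH_BANKCHECKHISTORY. Column names follow the original INSERT.
INSERT INTO CH_BANKCHECKHISTORY
    (TransactionDate, AccountNumber, TransactionType, Amount, CheckNumber, Reference)
SELECT t.col001, t.col003, t.col006,
       CONVERT(money, t.col008), t.col009, SUBSTRING(t.col011, 1, 250)
FROM CH_TEMPACCOUNT t
WHERE NOT EXISTS
      (SELECT *
       FROM CH_BANKCHECKHISTORY h
       WHERE h.TransactionDate = t.col001
         AND h.AccountNumber   = t.col003
         AND h.TransactionType = t.col006
         AND h.Amount          = CONVERT(money, t.col008)
         AND h.CheckNumber     = t.col009
         AND h.Reference       = SUBSTRING(t.col011, 1, 250))
```

Because the CSV is month-to-date, re-running the same file simply finds every row already present and inserts nothing new.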
 |
ehorn
Master Smack Fu Yak Hacker
1632 Posts
Posted - 2003-12-30 : 10:58:13
How many rows are typically in your CSV file, and does it always contain a full snapshot of the day's transactions?
 |
TSQLMan
Posting Yak Master
160 Posts
Posted - 2003-12-30 : 11:06:59
Between 6 and 10 thousand. The CSV file is month to date.
 |
ehorn
Master Smack Fu Yak Hacker
1632 Posts
Posted - 2003-12-30 : 11:18:04
quote: I want to keep the existing data in the CH_BANKCHECKHISTORY table from being re-imported
Why? Is that not an option? Has the data in CH_BANKCHECKHISTORY been further modified?
 |
X002548
Not Just a Number
15586 Posts
Posted - 2003-12-30 : 11:18:51
Why not just import them into a work table and manipulate that? I guess you could code ActiveX scripts in the DTS, but that'll be slow. T-SQL against a work table will be much faster and cleaner.

MOO

Brett
8-)
 |
TSQLMan
Posting Yak Master
160 Posts
Posted - 2003-12-30 : 11:21:33
No. The accountants download the file from the bank Web site. It gives the month-to-date transaction detail. I don't want them to insert data that is already in the table. I could deal with duplicates once they are in the table, but I would rather keep them from ever getting there.

Thanks,
 |
ehorn
Master Smack Fu Yak Hacker
1632 Posts
Posted - 2003-12-30 : 11:30:27
One option might be to add an identity column to CH_TEMPACCOUNT and then use logic such as the following:

-- Staging table with an identity column so each row has a unique id
CREATE TABLE tmp (id int IDENTITY(0,1), col1 int, col2 int)
INSERT INTO tmp
SELECT 0, 0
UNION SELECT 0, 1
UNION SELECT 0, 2

-- Target table already holds two of the three staging rows
CREATE TABLE base (col1 int, col2 int)
INSERT INTO base
SELECT 0, 0
UNION SELECT 0, 1

SELECT * FROM tmp
SELECT * FROM base

-- Insert only the staging rows whose (col1, col2) pair is not already in base
INSERT INTO base (col1, col2)
SELECT col1, col2
FROM tmp
WHERE id NOT IN (SELECT t.id
                 FROM tmp t
                 JOIN base b ON b.col1 = t.col1
                            AND b.col2 = t.col2)

SELECT * FROM base

DROP TABLE tmp, base
 |
X002548
Not Just a Number
15586 Posts
Posted - 2003-12-30 : 11:33:13
Do you have to use a DTS package? Why can't you use a sproc? bcp the data into a temp table, then interrogate that and insert only the "good" rows?

Brett
8-)
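The sproc shape Brett describes could look something like the sketch below. This is an illustration only: the procedure name, server, database, and file path are invented for the example, and security policy may rule out xp_cmdshell in some shops.

```sql
-- Sketch of the sproc approach: bcp the CSV into the work table via
-- xp_cmdshell, then let T-SQL decide which rows to keep.
CREATE PROCEDURE CH_LoadBankFile
AS
BEGIN
    -- Empty the work table before each load
    TRUNCATE TABLE CH_TEMPACCOUNT

    -- Load the daily CSV (server, database, and path are illustrative;
    -- -c = character mode, -t, = comma field terminator, -T = trusted connection)
    EXECUTE master.dbo.xp_cmdshell
        'bcp YourDB.dbo.CH_TEMPACCOUNT in C:\Imports\bankfile.csv -c -t, -SYourServer -T'

    -- ...followed by an INSERT that interrogates CH_TEMPACCOUNT and moves
    -- only the rows not already present in CH_BANKCHECKHISTORY.
END
```

The whole job then runs as a single `EXEC CH_LoadBankFile`, with no DTS package or ActiveX scripting involved.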
 |
TSQLMan
Posting Yak Master
160 Posts
Posted - 2003-12-30 : 11:40:48
Actually, it does have a clustered ID. I don't know why I didn't think of that. It is basically the same way I would deal with it once the rows were in the table, to get rid of the duplicates. Thanks for your time and energy.
 |
TSQLMan
Posting Yak Master
160 Posts
Posted - 2003-12-31 : 10:02:14
The last field in the CSV can be up to 350 characters long. I would like to do that, but I can't seem to make the bulk insert work on that field. It has something to do with ROWTERMINATOR, and I was having a bad time getting it to import the data. Maybe I am just lacking the correct syntax.

quote: Originally posted by X002548
Do you have to use a DTS package? Why can't you use a sproc? bcp the data into a temp table, then interrogate that and insert only the "good" rows?
Brett
8-)
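For what it's worth, the two usual culprits with BULK INSERT here are a staging column too short for the long final field, and a mismatch between the file's actual line endings and the ROWTERMINATOR setting. A hedged sketch, with invented table, column widths, and path:

```sql
-- Sketch: staging table sized for the wide final field, then a BULK INSERT.
-- All names and widths here are assumptions, not the poster's real schema.
CREATE TABLE CH_WORKACCOUNT (
    col001 varchar(30),
    col003 varchar(30),
    col006 varchar(30),
    col008 varchar(30),
    col009 varchar(30),
    col011 varchar(400)   -- wide enough for the ~350-character last field
)

BULK INSERT CH_WORKACCOUNT
FROM 'C:\Imports\bankfile.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'   -- if rows still run together, the file's line
                             -- endings may not match; check CR/LF vs LF
)
```

If the last field can itself contain commas or quoted text, BULK INSERT's simple field terminator won't cope, which may be why the DTS import (with its text qualifier support) was working where bcp was not.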