Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
spareus
Yak Posting Veteran
52 Posts
Posted - 2013-06-05 : 10:03:32
I have to upload many report files to the server each day. All have the same columns but different file names. Currently I am using the Import/Export utility provided, and each time I upload data I have to map the columns in the utility. There are 84 columns in total, and for 18 of them I have to change the mapping manually because of invalid characters in the column names.

Is there a way to automate the process so that each time I only have to select the file and the mapping is done automatically for all columns (including those 18)? Please suggest.

Regards,
Spareus
MIK_2008
Master Smack Fu Yak Hacker
1054 Posts
Posted - 2013-06-05 : 10:17:07
You might want to check/read about OPENROWSET and OPENDATASOURCE. Through either of these you can read the data of an Excel file, irrespective of mapping. Sample usage is like SELECT * FROM OPENDATASOURCE(..........) -- the * will pull all columns' data from the Excel file. Check Google/BOL for more details and examples.

Cheers
MIK
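A minimal sketch of MIK's suggestion. The file path, sheet name, and provider string are assumptions for illustration; the ACE OLE DB provider must be installed on the server, and ad hoc distributed queries must be enabled:

```sql
-- Enable ad hoc distributed queries once per server (requires sysadmin).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

-- Read every column from the first sheet of an Excel file.
-- 'C:\Reports\daily.xlsx' and [Sheet1$] are placeholders for your file and sheet.
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Reports\daily.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');

-- Equivalent with OPENDATASOURCE:
SELECT *
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
    'Data Source=C:\Reports\daily.xlsx;Extended Properties=Excel 12.0')...[Sheet1$];
```

Because SELECT * pulls the columns by position, the invalid characters in the 18 column names stop mattering: you can alias each column however you like in an INSERT ... SELECT into the target table.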
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2013-06-05 : 13:47:53
Why not save the package created by Export/Import as a .dtsx file for later use? Provided you've done the mapping correctly and there's no change in metadata (column information) in any of the other files, the package should be able to process all of them.

If you want the processing to happen automatically, you could extend the package to watch a folder where you upload the files and do the processing from there. This will work for any number of files put there. You can also schedule the package as a SQL Agent job to get it executed without any user intervention.

See an example of a similar package here, which does automatic daily processing of files:
http://visakhm.blogspot.in/2012/05/package-to-implement-daily-processing.html

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs
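The SQL Agent scheduling visakh16 describes can be sketched with the standard msdb procedures. The job name, package path, and schedule below are assumptions, not anything from the thread:

```sql
USE msdb;
GO
-- Create a job that runs the saved package via the SSIS subsystem.
EXEC dbo.sp_add_job @job_name = N'DailyReportImport';

EXEC dbo.sp_add_jobstep
     @job_name  = N'DailyReportImport',
     @step_name = N'Run import package',
     @subsystem = N'SSIS',
     @command   = N'/FILE "C:\Packages\ReportImport.dtsx"';  -- placeholder path

-- Run every day at 06:00.
EXEC dbo.sp_add_jobschedule
     @job_name          = N'DailyReportImport',
     @name              = N'Daily 6am',
     @freq_type         = 4,      -- daily
     @freq_interval     = 1,
     @active_start_time = 060000;

-- Target the local server so the job actually runs.
EXEC dbo.sp_add_jobserver @job_name = N'DailyReportImport';
```

The `@command` string is the same argument list you would pass to dtexec; the SQL Agent service account needs read access to the package file and the upload folder.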
spareus
Yak Posting Veteran
52 Posts
Posted - 2013-06-10 : 06:58:06
Hi Visakh16,

Sorry for the delay in replying; I was on leave. The link you provided is quite promising and will mostly suffice. But I have to ensure that a file, once processed, is not processed again. Is there any way to restrict importing data to only once per file? It would help prevent any duplicate records in the table.

Regards,
Spareus
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2013-06-10 : 07:44:23
quote: Originally posted by spareus

Hi Visakh16, sorry for the delay in replying; I was on leave. The link you provided is quite promising and will mostly suffice. But I have to ensure that a file, once processed, is not processed again. Is there any way to restrict importing data to only once per file? It would help prevent any duplicate records in the table. Regards, Spareus
Yep, it's possible. You can add a step to archive each file to another folder once it is processed, so at any time the main folder will only contain unprocessed files.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs
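Alongside (or instead of) the archive folder, a log table can guard against reprocessing. A sketch, with the table, column, and variable names all hypothetical:

```sql
-- One row per file that has already been imported.
CREATE TABLE dbo.ProcessedFiles
(
    FileName    nvarchar(260) NOT NULL PRIMARY KEY,
    ProcessedAt datetime      NOT NULL DEFAULT (GETDATE())
);
GO
-- Inside the package (e.g. an Execute SQL task), skip files already logged;
-- @FileName would be supplied from a package variable.
IF NOT EXISTS (SELECT 1 FROM dbo.ProcessedFiles WHERE FileName = @FileName)
BEGIN
    -- ... run the import for this file here ...
    INSERT dbo.ProcessedFiles (FileName) VALUES (@FileName);
END
```

The PRIMARY KEY on FileName makes a second insert of the same name fail outright, so even if the check is bypassed the same file cannot be logged as processed twice.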