SQL Server Forums
 Import data to SQL

spareus
Yak Posting Veteran

India
52 Posts

Posted - 06/05/2013 :  10:03:32
I have to upload many report files to the server each day.
All of them have the same columns but different file names.

Currently I am using the Import/Export wizard.
Each time I upload data, I have to map the columns in the wizard.
There are 84 columns in total, and for 18 of them I have to change the mapping manually because of invalid characters in the column names.

Is there a way to automate the process so that each time I only have to select the file, and the mapping is done automatically for all columns (including those 18)?

Please suggest.

Regards,
Spareus

MIK_2008
Flowing Fount of Yak Knowledge

Pakistan
1054 Posts

Posted - 06/05/2013 :  10:17:07
You might want to read about OPENROWSET and OPENDATASOURCE. Either of these lets you read the data from an Excel file directly in T-SQL, irrespective of the mapping. Sample usage is like:

SELECT * FROM OPENDATASOURCE(..........) -- * will pull all columns' data from the Excel file.

Check Google/BOL for more details and examples.
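A fuller sketch of the OPENROWSET variant, assuming the ACE OLE DB provider is installed (the file path, sheet name, and staging table below are placeholders for your environment, and ad hoc distributed queries must be enabled first):

```sql
-- One-time setup (requires sysadmin): allow ad hoc distributed queries.
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

-- Pull every column from Sheet1 of the workbook into a staging table.
-- 'C:\Reports\daily.xlsx' and 'Sheet1$' are assumed names; adjust to suit.
SELECT *
INTO   dbo.ReportStaging
FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                  'Excel 12.0;Database=C:\Reports\daily.xlsx;HDR=YES',
                  'SELECT * FROM [Sheet1$]');
```

Because the whole sheet is read as-is, the 18 columns with awkward names come across without any manual remapping; you can rename them when you copy from the staging table into the final table.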

Cheers
MIK

visakh16
Very Important crosS Applying yaK Herder

India
52317 Posts

Posted - 06/05/2013 :  13:47:53
Why not save the package created by the Import/Export wizard as a .dtsx file for later use? Provided you've done the mapping correctly and there's no change in metadata (column information) in any of the other files, the package should be able to process all of them.
If you want the processing to run automatically, you could extend the package with a Foreach Loop that watches the folder where you upload the files and processes whatever it finds there. This will work for any number of files placed there. You can also schedule the package as a SQL Agent job so it executes without any user intervention.

see example of similar package here which does automatic processing of files daily

http://visakhm.blogspot.in/2012/05/package-to-implement-daily-processing.html
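If you prefer to create the agent job from T-SQL rather than through SSMS, a sketch using the documented msdb procedures looks like this (the job name, package path, and schedule are assumptions for illustration):

```sql
USE msdb;

-- Create the job.
EXEC dbo.sp_add_job @job_name = N'Daily report import';

-- One step that runs the saved .dtsx through the SSIS subsystem.
-- The /FILE path is a placeholder for wherever you saved the package.
EXEC dbo.sp_add_jobstep
     @job_name  = N'Daily report import',
     @step_name = N'Run import package',
     @subsystem = N'SSIS',
     @command   = N'/FILE "C:\Packages\ImportReports.dtsx"';

-- Run once a day at 07:00.
EXEC dbo.sp_add_jobschedule
     @job_name          = N'Daily report import',
     @name              = N'Daily 7am',
     @freq_type         = 4,      -- daily
     @freq_interval     = 1,
     @active_start_time = 70000;  -- HHMMSS

-- Target the local server.
EXEC dbo.sp_add_jobserver @job_name = N'Daily report import';
```

With this in place the package runs unattended every morning and picks up whatever files are in the folder.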



------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs

spareus
Yak Posting Veteran

India
52 Posts

Posted - 06/10/2013 :  06:58:06
Hi Visakh16,
Sorry for the delay in replying; I was on leave.
The link you provided is quite promising and should mostly suffice.
But I have to ensure that a file, once processed, is not processed again.
Is there any way to restrict importing data from the same file to only once?
That would help prevent duplicate records in the table.
Regards,
Spareus
Go to Top of Page

visakh16
Very Important crosS Applying yaK Herder

India
52317 Posts

Posted - 06/10/2013 :  07:44:23
quote:
Originally posted by spareus

Is there any way to restrict importing data only once from the same file? It will help to stop any duplicate records in table.


Yep, it's possible.
You can add a step that archives each file to another folder once it's processed. That way the main folder only ever contains unprocessed files.
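Inside the package, the usual way to do this is a File System Task with a Move File operation at the end of the Foreach Loop. If you'd rather drive it from T-SQL instead, a minimal sketch uses xp_cmdshell (the folder paths are placeholders, and xp_cmdshell must be enabled, which has security implications):

```sql
-- Move a processed file into an Archive folder so it is never re-imported.
-- 'C:\Reports\daily.xlsx' and 'C:\Reports\Archive\' are assumed paths.
EXEC xp_cmdshell 'move "C:\Reports\daily.xlsx" "C:\Reports\Archive\"';
```

Either way, the import step only ever sees files that have not yet been processed, so no duplicates reach the table.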

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs
SQL Server Forums © 2000-2009 SQLTeam Publishing, LLC