 Analyzing large delimited extracts for import

Author  Topic 

phud
Starting Member

1 Post

Posted - 2009-03-17 : 01:41:55

Hello... I am new to SQL Server. I am using SQL Server 2008 Management Studio. I do a lot of importing of fairly large (up to 1-million-row by 100-column or so) extracts that are delivered to me as tab-delimited files. Is there a tool in the SQL 2008 suite that will analyze the maximum string length in each column of the tab file and suggest appropriate char field lengths? Right now I guess, the import bombs due to truncation, then I guess again, etc. -- not very efficient.

I guess I could just set all char fields to 200 or 300 chars, but that would make the table really bloated. Any suggestions would be greatly appreciated.

Thanks so much!
phud

sunitabeck
Master Smack Fu Yak Hacker

5155 Posts

Posted - 2009-03-17 : 07:32:16
I had to do something like this at my last job. I was using SQL 2005, and there wasn't anything built into SSIS that I could find to do this. I don't know whether SQL 2008 has anything.

My boss, who was a Perl aficionado, made me write a Perl script to do this task: read the files, split each line on the delimiter, and print out the maximum length of each column.

It turned out to be very simple -- 6 or 8 lines of code -- and fast. Unfortunately, I don't have the code; I lost it along with my job.

But you may be able to find examples in the Perl Cookbook. Google Books has the Perl Cookbook online.

[Shuts up before anyone notices that she is discussing Perl in a SQL forum.]