X002548
Not Just a Number
15586 Posts |
Posted - 2003-07-15 : 12:38:36
|
OK, I searched the forums, and they all talk about creating headers and trailers. No problems there. I'm getting (or soon to get) COBOL files with headers and trailers (1 each per file). The only thing I can think of is loading the files into a work table with 1 column (varchar(8000)), performing my audits between the header/trailer and the data, then doing an insert (inefficient) with SUBSTRINGs into a table. Of course the file is fixed width. Maybe I could bcp the data back out sans the H/T records... Any thoughts? Seems like a lot of hoop jumping....

Brett
8-)
|
setbasedisthetruepath
Used SQL Salesman
992 Posts |
Posted - 2003-07-15 : 13:43:04
|
If you've got a file with specific, machine-generated formatting then DTS is the way to go. I would use a staging table with the same layout as the file though, not a varchar(8000).

You might consider stripping off the last line of the file to get rid of the trailer; there are many file utilities out there that do it. Alternatively, force the COBOL source to not include it, as it is bad data.

Edited by - setbasedisthetruepath on 07/15/2003 13:45:00
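If the file does land in a staging table first, a hedged alternative to editing the file itself is to keep the header/trailer rows long enough to audit them, then delete them before parsing. A minimal sketch only — the table name, column name, and record-type flags here are assumptions, not anyone's actual layout:

```sql
-- Assumes a one-column staging table and a record-type character in
-- position 1 ('H' = header, 'T' = trailer); all names are hypothetical.
DELETE FROM StagingTable
WHERE SUBSTRING(LineData, 1, 1) IN ('H', 'T')
```

That avoids touching the source file at all.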
|
|
nr
SQLTeam MVY
12543 Posts |
Posted - 2003-07-15 : 13:49:37
|
>> The only thing I can think of is loading the files in to a work table with 1 column (varchar(8000)).

That's what I always try to do if possible - a staging table in a staging database. Whether you can do plain inserts or not depends on how much data you have and how fast you want it to be.

What time constraints do you have? i.e. do you need it to not impact the production table for very long, or do you have to get the data into the database quickly after the file is delivered?

You could also try importing with max errors set so that it fails on the header and trailer but accepts the rest - but you would have to be careful that the header did not lose the next record. Not a very nice option anyway. The bcp in / out / in would work but, as you say, is a bit finicky. You could write an app which parses the file and creates a file to import after the checks - but that would be the same as the multiple bcps.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
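The "max errors" idea can be sketched with BULK INSERT's MAXERRORS option (the table name and path are assumptions, and whether the header/trailer rows actually fail conversion depends on your column types):

```sql
-- Allow up to 2 bad rows (header + trailer) before the load aborts;
-- StagingTable and the path are hypothetical.
BULK INSERT StagingTable
FROM 'D:\Data\feed.txt'
WITH (MAXERRORS = 2)
```

As noted above, this is fragile: a genuinely malformed detail row would silently use up one of the two allowed errors.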
|
|
setbasedisthetruepath
Used SQL Salesman
992 Posts |
Posted - 2003-07-15 : 13:52:36
|
I also like the staging idea, but not the varchar(8000), b/c then you need to perform manually the data conversion checks that DTS would do automatically. Every date column, every currency column, every numeric column, etc.

Jonathan
{0}
|
|
robvolk
Most Valuable Yak
15732 Posts |
Posted - 2003-07-15 : 13:57:46
|
Using the SUBSTRING() method isn't bad at all; it depends on the file format. If the data is stored fixed-length, you can define a record layout in another table and CROSS JOIN it with the imported rows. The record layout table has Start and Length columns that you feed to SUBSTRING(). As long as the format is consistent within the file this works like a charm. And if there is something in the header and trailer records that identifies them, you can include extra layout records to parse them as well. These can then be included or excluded as needed using a WHERE clause.

I have something like this set up at work to import NACHA-formatted bank transaction files. This format is replete with file headers, batch headers, footers, etc. in addition to the detail records, all in the same file. Every record type is identified in the spec and is easily parseable using a record layout table like I described. I'll try posting it here later today after I get home from work.
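The core of the technique can be sketched in a few lines (robvolk posts a full working example later in the thread; the table and column names here are simplified assumptions):

```sql
-- FileFormats(Item, Start, Length) describes the fixed-width layout;
-- every layout row is applied to every imported line, so each line
-- fans out into one row per field.
SELECT i.LineNum, f.Item,
       SUBSTRING(i.LineData, f.Start, f.Length) AS Value
FROM ImportData i
CROSS JOIN FileFormats f
ORDER BY i.LineNum, f.Start
```

Header and trailer rows can then be included or excluded with a WHERE clause on whatever record-type marker the file carries.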
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-07-15 : 14:07:32
|
Thanks everyone. Rob, I love the config table idea! We're talking dynamic SQL here, correct? The config table holds the column definitions and column names. Yeah, I got it... and yes, they are all fixed. In all I have many of these files. I can add a column that identifies the source file and use a predicate. Cool, I can write one process and config it on the fly... I'm jazzed up now. Thanks guys.

Brett
8-)
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-07-15 : 14:11:18
|
AND >>>> The sproc will determine whether the process needs to be a bcp out (if the data is too large) or whether it can be an insert... I'll have the whole thing kick off overnight during a maint window or something...

Brett
8-)
|
|
setbasedisthetruepath
Used SQL Salesman
992 Posts |
Posted - 2003-07-15 : 14:19:17
|
Rob - What would you say are the advantages of your method? I'm intrigued by the approach, but I wonder what functionality is there aside from an elegant re-write of the DTS data pump task.

Jonathan
{0}
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-07-15 : 14:56:37
|
I like the fact that it will be table driven... I don't even have to write anything new when I'm done. I could even add the frequency of the load per file, so that it could be part of a SQL predicate to know what to look for. All the files will audit the same way, and any exception will be handled the same. I'll even be able to do data integrity checking BEFORE I load... OH, lots of upside here....

Downside of course is if the data becomes large... just have a process to check, and bcp out and then in to the waiting table instead of insert. (Besides, I'm not a fan of DTS.) Also I'm dealing with money on this one, so I want to be squeaky clean.

Brett
8-)
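The trailer audit Brett describes might look something like this — a sketch only, where the table name, record-type flags, and the count's position in the trailer are all assumptions, not his actual config:

```sql
-- Compare the record count claimed by the trailer ('T') row against the
-- number of detail ('D') rows actually staged; positions are hypothetical.
DECLARE @expected int, @actual int

SELECT @expected = CONVERT(int, SUBSTRING(DataRow, 48, 11))
  FROM FixedWidthLoad
 WHERE SUBSTRING(DataRow, 1, 1) = 'T'

SELECT @actual = COUNT(*)
  FROM FixedWidthLoad
 WHERE SUBSTRING(DataRow, 1, 1) = 'D'

IF @expected <> @actual
    RAISERROR('Trailer count does not match detail rows loaded', 16, 1)
```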
|
|
setbasedisthetruepath
Used SQL Salesman
992 Posts |
Posted - 2003-07-15 : 15:58:34
|
quote: (besides, I'm not a fan of DTS)
Why? I suspect many developers tried DTS w/ SQL Server 7.0 and formed a bad opinion of the product (deservedly so), and never tried it again w/ SQL 2K. DTS under SQL 2K is a far superior product, and from what literature is becoming available about Yukon, it will be significantly enhanced yet again.

Aside from the more mechanical aspects of DTS - the actual movement of data from source to destination - you also get many fringe benefits: versioning, logging, a graphical interface that practically self-documents, etc.

Thoughts?

Jonathan
{0}
|
|
robvolk
Most Valuable Yak
15732 Posts |
Posted - 2003-07-16 : 08:28:34
|
OK, first let's get the data file out of the way. Save this into a file named flintstones.txt (or whatever) - the columns are padded to the widths defined in FileFormats below:

1 Flintstone         Bedrock
2 Fred         Father
2 Wilma        Mother
2 Pebbles      Daughter
2 Dino         Pet
1 Rubble             Bedrock
2 Barney       Father
2 Betty        Mother
2 Bam Bam      Son
1 Slate              Bedrock
2 Mr.          Boss
1 Rockhead           Bedrock
2 Joe          Co-worker

And here's the SQL to get you started:

SET NOCOUNT ON

CREATE TABLE FileFormats (
  type varchar(4) NOT NULL
, Item varchar(10) NOT NULL
, Start smallint NOT NULL
, Length smallint NOT NULL
, CONSTRAINT PK_FileFormats PRIMARY KEY (type, Start))

CREATE TABLE ImportData (
  LineData varchar(8000) NULL
, linenum int NOT NULL)

INSERT INTO FileFormats VALUES ('1', 'Family',    1,  2)
INSERT INTO FileFormats VALUES ('1', 'LastName',  3, 19)
INSERT INTO FileFormats VALUES ('1', 'City',     22, 15)
INSERT INTO FileFormats VALUES ('2', 'Person',    1,  2)
INSERT INTO FileFormats VALUES ('2', 'FirstName', 3, 13)
INSERT INTO FileFormats VALUES ('2', 'Title',    16, 20)

EXEC('alter table importdata drop column linenum') -- this is not needed if you use a format file

-- change this to match the path and filename you're using
BULK INSERT importData FROM 'Z:\documents\flintstones.txt'

EXEC('alter table ImportData ADD linenum int not null identity(1,1), family int null') -- this is not needed if you use a format file

CREATE CLUSTERED INDEX CL_LineNum ON ImportData(LineNum)

SELECT C.LineNum, A.Type, A.Item AS Class, B.Item,
       SUBSTRING(C.LineData, B.Start, B.Length) AS Value
FROM FileFormats A
INNER JOIN FileFormats B ON A.Start = 1 AND A.Type = B.Type AND B.Start <> 1
INNER JOIN ImportData C ON SUBSTRING(C.LineData, A.Start, A.Length) = B.Type
ORDER BY C.LineNum, B.Start

SELECT DISTINCT LineNum AS Start, LineNum * LineNum AS Finish
INTO #Nums
FROM ImportData
WHERE SUBSTRING(LineData, 1, 1) = '1'

UPDATE #Nums
SET Finish = (SELECT ISNULL(MIN(A.Start), #Nums.Finish) - 1 FROM #Nums A WHERE A.Start > #Nums.Start)

SELECT N.Start AS FamilyID, C.LineNum, A.Item AS Class, B.Item,
       SUBSTRING(C.LineData, B.Start, B.Length) AS Value
FROM FileFormats A
INNER JOIN FileFormats B ON A.Start = 1 AND A.Type = B.Type AND B.Start <> 1
INNER JOIN ImportData C ON SUBSTRING(C.LineData, A.Start, A.Length) = B.Type
INNER JOIN #Nums N ON C.LineNum BETWEEN N.Start AND N.Finish
ORDER BY C.LineNum, B.Start

DROP TABLE FileFormats
DROP TABLE ImportData
DROP TABLE #Nums

I was hoping to set up a cross-tab that would combine the header (family) lines with the detail (person) lines, but I blew it. You can probably figure that out if you needed to anyway.

This example is pretty simple: there are two line types (1 and 2) that have different formats. The Type column defines a line type; in this example the line type is actually in the line data itself (the NACHA format is set up this way, and was my inspiration). If your lines don't have a clearly identifiable bit of data then you might have to tweak the format definition, or use it with a WHERE clause to get only the headers, details, footers, etc.

Since the line type is stored in the data itself, I used a self-join on the FileFormats table to make the parsing work right. I also use an identity column to maintain the line numbers; go ahead and BOOOOO HISSSSS!!! me, but in this case position DOES matter! Again, this was from the NACHA format.

Most of the advantages Brett already mentioned: the import process becomes data-driven and very easy to maintain, and it's very easy to add new formats. Right now I have three different text file formats for various payment files. They all have the same basic info (account number, payment amount, payment date) from 3 different sources (credit cards, lockbox process, and NACHA transactions). All of them are handled using the table format, and fortunately only the NACHA file has multiple formats; the others are plain text files. I also had an ImportType column to distinguish LBOX from CREDIT from NACHA types so that the parsing is done correctly for each file type.

I also have bcp format files for some file types, basically to discard extraneous crap. As far as converting data types, well, you kinda have to do that anyway; DTS just does it behind the scenes. I started playing with storing column names and types in the fileformat table, so that I could possibly dynamically construct the SQL to insert into the proper tables, but I don't think I'll bother since I don't have that many formats.

I also started setting up a batch import queue table, very much like Brett described, to manage the actual import and processing of the files. Combined with this technique:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=24371

I can take a load of text files in any format, dump them into a folder, and process each type as I need to. I can BULK INSERT all of them in one go and parse them as one batch, or individually. I can use the DIR output to manage the file names & sizes, and update it to include rows imported, import date, whatever. I'm also saving the file name in addition to the line data so I can go back and determine which file it came from. I don't think I need all of this, but at least I won't be crying later if I do need it. It's one of those things that just makes more sense to do in tables than it does to manage using DTS.

And for the record, I LOVE DTS, I just don't use it for text files anymore. BULK INSERT and bcp rule there.

Edited by - robvolk on 07/16/2003 08:29:36
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-07-29 : 17:58:15
|
OK... a couple of things. This does not work as a whole (my brain is about to explode), but executing the individual snippets in the stream will.

EDIT: Plus, unless I miss my guess, I've done everything in this code that I have protested against... dynamic SQL.

EDIT2: Did I forget to mention identities? Luke, I am... no, it can't be true. Not only dynamic SQL, but teetering on the line of dynamic object creation (I pulled back a little from that... still don't think it's a good idea).

It needs lots of error handling, etc., but based on config tables it will:

Locate a directory of a source file
Load it to a work table
Identify the name of the table to load
Parse it out and load it
Do data integrity checks on date and numeric data
Audit a trailer file and check to see if you have everything (about 80% complete)

It can handle many files, and many record types within a file.

What it needs:

A compilable sproc
A better "schedule" method (well, there isn't one; if the file exists it's loaded)
An alternative method to bcp out/in larger files
A log of all the events that happened during the load
...and probably more than I can think of now (what's this grey matter oozing from my ear?)

I'll get to it more tomorrow (as I've got to get it in soon), and I'll post the final when it's 100%. But I was having WITHDRAWAL, SEE! (Plus I need a break.)

Another note: the first is my "draft" code.
The supporting data and tables follow. AND 2000!

CREATE PROC isp_FixedWidthFileLoader
AS
DECLARE @FeedName varchar(255)
      , @FileRootName varchar(20)
      , @FileExt varchar(3)
      , @FilePath varchar(255)
      , @LoadFreq char(1)
      , @FileNameAndPath varchar(278)
      , @CommandString varchar(8000)
      , @BatchId int
      , @LayoutCD varchar(5)
      , @LayoutName varchar(50)
      , @RecordIdentStart int
      , @RecordIdentLen int
      , @RowCount int
      , @SQL varchar(8000)

DECLARE CSR CURSOR FOR
 SELECT FeedName, FileRootName, FileExt, FilePath, LoadFreq FROM FixedWidthFeed
OPEN CSR
FETCH NEXT FROM CSR INTO @FeedName, @FileRootName, @FileExt, @FilePath, @LoadFreq
WHILE @@FETCH_STATUS = 0
BEGIN
-- ----------------------------------------------------------------
-- Let's go fishing... does the file even exist?
-- ----------------------------------------------------------------
 SELECT @CommandString = 'DIR ' + @FilePath + @FileRootName
      , @FileNameAndPath = @FilePath + @FileRootName

 DELETE FROM Load_Folder
 INSERT INTO Load_Folder EXEC master..xp_cmdshell @CommandString

 DELETE FROM Load_Folder_Parsed
 INSERT INTO Load_Folder_Parsed (Create_Time, File_Size, File_Name)
 SELECT CONVERT(datetime, SUBSTRING(dir_output,1,8) + ' '
             + (SUBSTRING(dir_output,11,5)
             + CASE WHEN SUBSTRING(dir_output,16,1) = 'a' THEN ' AM' ELSE ' PM' END)) AS Create_Time
      , CONVERT(int, LTRIM(RTRIM(REPLACE(SUBSTRING(dir_output,17,22),',','')))) AS File_Size
      , SUBSTRING(dir_output,40,(LEN(dir_output)-39)) AS File_Name
   FROM Load_Folder
  WHERE SUBSTRING(dir_output,1,1) <> ' '
    AND SUBSTRING(dir_output,25,5) <> '<DIR>'

-- Does the file exist?
 IF EXISTS (SELECT 1 FROM Load_Folder_Parsed WHERE File_Name = @FileRootName)
-- yup, let's prepare the Batch Control Table
 BEGIN
  INSERT INTO BatchControl (FeedName, FeedDate, FileName, FileSize, FileDate)
  SELECT @FeedName, GETDATE(), File_Name, File_Size, Create_Time
    FROM Load_Folder_Parsed
   WHERE File_Name = @FileRootName
-- Some "cheating" here because I'm running the code in snippets and don't want to redeclare local vars
-- INSERT INTO BatchControl (FeedName, FeedDate, FileName, FileSize, FileDate)
-- SELECT 'Bretts Test Feed', GetDate(), 'Test.txt', 0, '1960-10-24'

-- Let's go get the BatchId so we can keep track of things
  SELECT @BatchId = MAX(BatchId) FROM BatchControl

-- -----------------------------------------------------------------
-- Let's make some magic
-- -----------------------------------------------------------------
  DELETE FROM FixedWidthLoad
  SELECT @CommandString = 'BULK INSERT FixedWidthLoad FROM '
       + '''' + @FileNameAndPath + ''''
       + ' WITH (FORMATFILE = ' + '''' + 'D:\Data\FixedWidthLoad.fmt' + '''' + ')'
  EXEC(@CommandString)
-- Again the cheating thing
-- BULK INSERT FixedWidthLoad FROM 'D:\Data\Test.txt'
-- WITH (FORMATFILE = 'D:\Data\FixedWidthLoad.fmt')

-- OK, we're loaded... assign the BatchId
  UPDATE FixedWidthLoad SET BatchId = @BatchId WHERE BatchId IS NULL
-- Again with the cheating thing
-- UPDATE FixedWidthLoad SET BatchId = 1 WHERE BatchId IS NULL
-- SELECT * FROM BatchControl

-- -----------------------------------------------------------------
-- Need to build a loop here for every different record type within the feed...
-- -----------------------------------------------------------------
  DECLARE CSR2 CURSOR FOR
   SELECT LayoutCD, LayoutName, RecordIdentStart, RecordIdentLen
     FROM FixedWidthLayoutType
    WHERE FeedName = @FeedName
  OPEN CSR2
  FETCH NEXT FROM CSR2 INTO @LayoutCD, @LayoutName, @RecordIdentStart, @RecordIdentLen
  WHILE @@FETCH_STATUS = 0
  BEGIN
-- -----------------------------------------------------------------
-- Let's perform some Data Integrity Checks
-- -----------------------------------------------------------------
   SELECT @SQL = ''
   SELECT @SQL = @SQL + SQL
     FROM (SELECT 'SELECT * FROM FixedWidthLoad' AS SQL
           UNION ALL
           SELECT ' WHERE 1=1'
           UNION ALL
           SELECT ' AND ISNUMERIC(SUBSTRING(DataRow,'
                + CONVERT(varchar(5),FieldStartPos) + ','
                + CONVERT(varchar(5),FieldLength) + '))'
                + ' <> 1 AND '
                + 'SUBSTRING(DataRow,1,1) = ' + '''' + 'D' + ''''
--              + 'SUBSTRING(DataRow,'+@RecordIdentStart+','+@RecordIdentLen+') = '+''''+@LayoutCD+''''
             FROM FixedWidthLayout
            WHERE FieldDataType NOT IN ('char','varchar','datetime','smalldatetime')
              AND FeedName = 'Bretts Test Feed'
              AND LayoutCD = 'D') AS XXX
-- SELECT @SQL
   EXEC(@SQL)
   SELECT '@@ROWCOUNT:' = CONVERT(varchar(25),@@ROWCOUNT)

   SELECT @SQL = ''
   SELECT @SQL = @SQL + SQL
     FROM (SELECT 'SELECT * FROM FixedWidthLoad' AS SQL
           UNION ALL
           SELECT ' WHERE 1=1'
           UNION ALL
           SELECT ' AND ISDATE(SUBSTRING(DataRow,'
                + CONVERT(varchar(5),FieldStartPos) + ','
                + CONVERT(varchar(5),FieldLength) + '))'
                + ' <> 1 AND '
                + 'SUBSTRING(DataRow,1,1) = ' + '''' + 'D' + ''''
--              + 'SUBSTRING(DataRow,'+@RecordIdentStart+','+@RecordIdentLen+') = '+''''+@LayoutCD+''''
             FROM FixedWidthLayout
            WHERE FieldDataType IN ('datetime','smalldatetime')
              AND FeedName = 'Bretts Test Feed'
              AND LayoutCD = 'D') AS XXX
   SELECT @SQL
   EXEC(@SQL)
   SELECT '@@ROWCOUNT:' = CONVERT(varchar(25),@@ROWCOUNT)

-- -----------------------------------------------------------------
-- Now let's get the data out and into its new home
-- -----------------------------------------------------------------
   SELECT @SQL = ''
   SELECT @SQL = @SQL + SQL
     FROM (SELECT ' INSERT INTO ' + REPLACE(FileRootName+'_'+LayoutName,' ','_')
                + ' (' + FieldName AS SQL
                , l.FeedName, 1 AS SQL_GROUP, 1 AS ROW_ORDER
             FROM FixedWidthLayout l
            INNER JOIN FixedWidthLayoutType r1
               ON l.FeedName = r1.FeedName AND l.LayoutCd = r1.LayoutCD
              AND l.FieldOrdPos = 1 AND l.LayoutCD = 'D'
            INNER JOIN FixedWidthFeed r2 ON l.FeedName = r2.FeedName
            WHERE l.FeedName = 'Bretts Test Feed'
           UNION ALL
           SELECT ', ' + FieldName, FeedName, 2, FieldOrdPos
             FROM FixedWidthLayout
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed' AND FieldOrdPos <> 1
           UNION ALL
           SELECT ')', FeedName, 3, 1
             FROM FixedWidthLayout
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed' AND FieldOrdPos = 1
           UNION ALL
           SELECT ' SELECT SUBSTRING(DataRow,'
                + CONVERT(varchar(4),FieldStartPos) + ','
                + CONVERT(varchar(4),FieldLength) + ') AS ' + FieldName
                , FeedName, 4, 1
             FROM FixedWidthLayout
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed' AND FieldOrdPos = 1
           UNION ALL
           SELECT ', SUBSTRING(DataRow,'
                + CONVERT(varchar(4),FieldStartPos) + ','
                + CONVERT(varchar(4),FieldLength) + ') AS ' + FieldName
                , FeedName, 5, FieldOrdPos
             FROM FixedWidthLayout
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed' AND FieldOrdPos <> 1
           UNION ALL
           SELECT ' FROM FixedWidthLoad', FeedName, 6, 1
             FROM FixedWidthLayout
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed' AND FieldOrdPos = 1
           UNION ALL
           SELECT ' WHERE SUBSTRING(DataRow,'
                + CONVERT(varchar(4),RecordIdentStart) + ','
                + CONVERT(varchar(4),RecordIdentLen) + ') = ' + '''' + LayoutCD + ''''
                , FeedName, 7, 1
             FROM FixedWidthLayoutType
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed'
           UNION ALL
           SELECT ' AND BatchId = ' + CONVERT(varchar(15),1), FeedName, 8, 1
             FROM FixedWidthLayoutType
            WHERE LayoutCD = 'D' AND FeedName = 'Bretts Test Feed') AS XXX
-- ORDER BY FeedName, SQL_GROUP, ROW_ORDER
   SELECT LEN(@SQL), @SQL
   EXEC(@SQL)
-- Produces:
-- INSERT INTO Test_Detail_Record
-- ( Record_Type
-- , Row_Date
-- , Produce
-- , Produce_Amt
-- )
-- SELECT SUBSTRING(DataRow,1,1) AS Record_Type
-- , SUBSTRING(DataRow,2,10) AS Row_Date
-- , SUBSTRING(DataRow,13,8) AS Produce
-- , SUBSTRING(DataRow,21,11) AS Produce_Amt
-- FROM FixedWidthLoad
-- WHERE SUBSTRING(DataRow,1,1) = 'D'
-- AND BatchId = 1

-- -----------------------------------------------------------------
-- End of CSR2 WHILE LOOP
-- -----------------------------------------------------------------
   FETCH NEXT FROM CSR2 INTO @LayoutCD, @LayoutName, @RecordIdentStart, @RecordIdentLen
  END
  CLOSE CSR2
  DEALLOCATE CSR2
-- -----------------------------------------------------------------
-- End of File Existence Check
-- -----------------------------------------------------------------
 END

-- -----------------------------------------------------------------
-- Let's audit the data - so far only set up for record counts
-- -----------------------------------------------------------------
 SELECT SQL
   FROM (SELECT ' SELECT COUNT(*) AS ' + REPLACE(LayoutName,' ','_') + '_COUNT FROM '
              + REPLACE(FileRootName+'_'+LayoutName,' ','_') AS SQL
              , l.FeedName, 1 AS SQL_GROUP, 1 AS ROW_ORDER
           FROM FixedWidthLayout l
          INNER JOIN FixedWidthLayoutType r1
             ON l.FeedName = r1.FeedName AND l.LayoutCd = r1.LayoutCD AND l.FieldOrdPos = 1
--          AND l.LayoutCD = 'D'
          INNER JOIN FixedWidthFeed r2 ON l.FeedName = r2.FeedName
          WHERE l.FeedName = 'Bretts Test Feed') AS XXX
-- Produces:
-- SELECT COUNT(*) AS Detail_Record_COUNT FROM Test_Detail_Record

-- -----------------------------------------------------------------
-- This should be the end of the loop for CSR1... go on and process the next file.
-- -----------------------------------------------------------------
 FETCH NEXT FROM CSR INTO @FeedName, @FileRootName, @FileExt, @FilePath, @LoadFreq
END
CLOSE CSR
DEALLOCATE CSR
GO

And the supporting cast:

CREATE TABLE BatchControl (
  BatchId int IDENTITY(1,1)
, FeedName varchar(255)
, FeedDate datetime
, FileName varchar(255)
, FileSize int
, FileDate datetime)
GO
CREATE TABLE FixedWidthLoad (
  DataRow varchar(7500)
, BatchId int
, RowId int IDENTITY(1,1))
GO
CREATE TABLE FixedWidthFeed (
  FeedName varchar(255)
, FileRootName varchar(20)
, FileExt varchar(3)
, FilePath varchar(255)
, LoadFreq char(1))
GO
CREATE TABLE FixedWidthLayoutType (
  FeedName varchar(255)
, LayoutCD varchar(5)
, LayoutName varchar(50)
, RecordIdentStart int
, RecordIdentLen int
, AuditInd char(1)
, AuditStart int
, AuditLength int
, IncludeInCountsInd char(1))
GO
CREATE TABLE FixedWidthLayout (
  FeedName varchar(255)
, LayoutCD varchar(5)
, FieldName varchar(50)
, FieldOrdPos int
, FieldDataType varchar(20)
, FieldStartPos int
, FieldLength int)
GO
CREATE TABLE FixedWidthConstr (
  FeedName varchar(255)
, FieldName varchar(50)
, FieldConstr varchar(255))
GO
CREATE TABLE [dbo].[Load_Folder_Parsed] (
  [Create_Time] [datetime] NULL
, [File_Size] [int] NULL
, [File_Name] [varchar] (255) NULL) ON [PRIMARY]
GO
CREATE TABLE [dbo].[Load_Folder] (
  [dir_output] [varchar] (255) NULL) ON [PRIMARY]
GO
-- Need to add all constraints, FKs, PKs... sorry, lazy me again

INSERT INTO FixedWidthFeed (FeedName, FileRootName, FileExt, FilePath, LoadFreq)
SELECT 'Bretts Test Feed', 'Test', 'txt', 'D:\Data\', 'D'
UNION ALL
SELECT 'Bretts Test Feed 2', 'Test2', 'txt', 'D:\Data\', 'D'
GO
INSERT INTO FixedWidthLayoutType (FeedName, LayoutCD, LayoutName, RecordIdentStart, RecordIdentLen, AuditInd, AuditStart, AuditLength, IncludeInCountsInd)
SELECT 'Bretts Test Feed', 'H', 'Header Record', 1, 1, 'N', 0, 0, 'N'
UNION ALL
SELECT 'Bretts Test Feed', 'D', 'Detail Record', 1, 1, 'N', 0, 0, 'Y'
UNION ALL
SELECT 'Bretts Test Feed', 'T', 'Trailer Record', 1, 1, 'Y', 48, 11, 'N'
UNION ALL
SELECT 'Bretts Test Feed 2', 'H', 'Header Record', 1, 1, 'N', 0, 0, 'N'
UNION ALL
SELECT 'Bretts Test Feed 2', 'D', 'Detail Record', 1, 1, 'N', 0, 0, 'Y'
UNION ALL
SELECT 'Bretts Test Feed 2', 'T', 'Trailer Record', 1, 1, 'Y', 48, 11, 'N'
GO
DELETE FROM FixedWidthLayout
GO
-- CREATE FILE INPUT DEF #1
INSERT INTO FixedWidthLayout (FeedName, LayoutCD, FieldName, FieldOrdPos, FieldDataType, FieldStartPos, FieldLength)
SELECT 'Bretts Test Feed', 'H', 'Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed', 'H', 'Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed', 'H', 'Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed', 'D', 'Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed', 'D', 'Row_Date', 2, 'datetime', 2, 10
UNION ALL
SELECT 'Bretts Test Feed', 'D', 'Produce', 3, 'varchar', 13, 8
UNION ALL
SELECT 'Bretts Test Feed', 'D', 'Produce_Amt', 4, 'decimal(10,2)', 21, 11
UNION ALL
SELECT 'Bretts Test Feed', 'T', 'Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed', 'T', 'Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed', 'T', 'Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed', 'T', 'Record_Count', 4, 'Int', 50, 11
GO
-- CREATE FILE INPUT DEF #2
INSERT INTO FixedWidthLayout (FeedName, LayoutCD, FieldName, FieldOrdPos, FieldDataType, FieldStartPos, FieldLength)
SELECT 'Bretts Test Feed 2', 'H', 'Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed 2', 'H', 'Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed 2', 'H', 'Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed 2', 'D', 'Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed 2', 'D', 'Row_Date', 2, 'datetime', 2, 10
UNION ALL
SELECT 'Bretts Test Feed 2', 'D', 'Produce', 3, 'varchar', 13, 8
UNION ALL
SELECT 'Bretts Test Feed 2', 'D', 'Produce_Amt', 4, 'decimal(10,2)', 21, 11
UNION ALL
SELECT 'Bretts Test Feed 2', 'T', 'Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed 2', 'T', 'Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed 2', 'T', 'Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed 2', 'T', 'Record_Count', 4, 'Int', 50, 11
GO

-- Create the tables from your definitions
DECLARE @SQL varchar(8000)
-- not that you would, but you could: SELECT @SQL = SQL FROM (
SELECT SQL
  FROM (SELECT 'CREATE TABLE ' + REPLACE(FileRootName+'_'+LayoutName,' ','_') + '( ' AS SQL
             , l.FeedName, LayoutCD, 1 AS SQL_GROUP, 1 AS ROW_ORDER
          FROM FixedWidthFeed l
         INNER JOIN FixedWidthLayoutType r ON l.FeedName = r.FeedName
        UNION ALL
        SELECT '  ' + FieldName + ' '
             + CASE WHEN FieldDataType IN ('char','varchar')
                    THEN FieldDataType + '(' + CONVERT(varchar(4),FieldLength) + ')'
                    ELSE FieldDataType END
             , FeedName, LayoutCD, 2, 1
          FROM FixedWidthLayout l
         WHERE FieldOrdPos = 1
        UNION ALL
        SELECT ', ' + FieldName + ' '
             + CASE WHEN FieldDataType IN ('char','varchar')
                    THEN FieldDataType + '(' + CONVERT(varchar(4),FieldLength) + ')'
                    ELSE FieldDataType END
             , FeedName, LayoutCD, 2, FieldOrdPos
          FROM FixedWidthLayout l
         WHERE FieldOrdPos <> 1
        UNION ALL
        SELECT ')', FeedName, LayoutCD, 3, 1 FROM FixedWidthLayoutType
        UNION ALL
        SELECT 'GO', FeedName, LayoutCD, 4, 1 FROM FixedWidthLayoutType) AS XXX
 ORDER BY FeedName, LayoutCD, SQL_GROUP, ROW_ORDER

-- Which produces:
CREATE TABLE Test_Detail_Record (
  Record_Type char(1)
, Row_Date datetime
, Produce varchar(8)
, Produce_Amt decimal(10,2))
GO
CREATE TABLE Test_Header_Record (
  Record_Type char(1)
, Feed_Date DateTime
, Descr varchar(30))
GO
CREATE TABLE Test_Trailer_Record (
  Record_Type char(1)
, Feed_Date DateTime
, Descr varchar(30)
, Record_Count Int)
GO
CREATE TABLE Test2_Detail_Record (
  Record_Type char(1)
, Row_Date datetime
, Produce varchar(8)
, Produce_Amt decimal(10,2))
GO
CREATE TABLE Test2_Header_Record (
  Record_Type char(1)
, Feed_Date DateTime
, Descr varchar(30))
GO
CREATE TABLE Test2_Trailer_Record (
  Record_Type char(1)
, Feed_Date DateTime
, Descr varchar(30)
, Record_Count Int)
GO

And the data, of course:

-- FixedWidthLoad.fmt
-- 8.0
-- 3
-- 1 SQLCHAR 0 7500 "\r\n" 1 DataRow SQL_Latin1_General_CP1_CI_AS
-- 2 SQLCHAR 0 0    ""     0 BatchId SQL_Latin1_General_CP1_CI_AS
-- 3 SQLCHAR 0 0    ""     0 RowId   SQL_Latin1_General_CP1_CI_AS

-- Test.txt
--H        07/04/2003Header Record For Test File
--D07/04/2003 APPLES         1.00
--D07/04/2003 PEACHES 10000000.00
--D07/04/2003 PEACHES 10000000.00
--T        07/04/2003Trailer Record For Test File  00000000003
-- Don't include these last three lines
--123456789012345678901234567890123456789012345678901234567890
--         1         2         3         4         5         6

SELECT * FROM Test_Detail_Record
SELECT * FROM FixedWidthLayout
SELECT * FROM FixedWidthLayoutType
SELECT * FROM FixedWidthFeed
SELECT * FROM FixedWidthLoad
SELECT * FROM BatchControl

DROP TABLE FixedWidthLoad
GO
DROP TABLE FixedWidthFeed
GO
DROP TABLE FixedWidthLayout
GO
DROP TABLE FixedWidthLayoutType
GO
DROP TABLE BatchControl
GO
DROP TABLE FixedWidthConstr
GO
DROP TABLE Load_Folder_Parsed
GO
DROP TABLE Load_Folder
GO
DROP TABLE Test_Detail_Record
GO
DROP TABLE Test_Header_Record
GO
DROP TABLE Test_Trailer_Record
GO
DROP TABLE Test2_Detail_Record
GO
DROP TABLE Test2_Header_Record
GO
DROP TABLE Test2_Trailer_Record
GO

ooooo my head.... somebody call me a bartender....

Brett
8-)
|
|
robvolk
Most Valuable Yak
15732 Posts |
Posted - 2003-07-29 : 19:23:10
|
OK, you're a bartender. |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-07-31 : 12:32:16
|
Thanks, I feel much better now.... Still working on the final solution...

Brett
8-)

SELECT POST=NewId()
|
|
Todd Morrow
Starting Member
1 Post |
Posted - 2004-03-05 : 16:47:20
|
I'll post a function (later) that converts COBOL record layouts to a bcp format file, as well as one to convert those layouts to "create table" statements. |
|
|
|
|
|