SQL Server 2000 Forums - Import/Export (DTS) and Replication (2000)
Import File W/ Header and Trailers

X002548
Not Just a Number

Posted - 07/15/2003 :  12:38:36
OK, I searched the forums, and they all talk about creating headers and trailers.

No problems there.

I'm getting (or soon to get) COBOL files with headers and trailers (1 each per file)

The only thing I can think of is loading the files into a work table with 1 column (varchar(8000)).

Perform my audits between the header/trailer and the data, then do an insert (inefficient) with SUBSTRINGs into a table.

Of course the file is fixed width.

Maybe I could bcp the data back out sans the H/T records...

Any thoughts...seems like a lot of hoop jumping....



Brett

8-)

setbasedisthetruepath
Used SQL Salesman

Posted - 07/15/2003 :  13:43:04
If you've got a file with specific, machine-generated formatting then DTS is the way to go. I would use a staging table with the same layout as the file though, not a varchar(8000).

You might consider stripping off the last line of the file to get rid of the trailer; there are many file utilities out there that do it. Alternatively, force the COBOL source to not include it as it is bad data.
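
Or, if you'd rather deal with it on the SQL side after the load, you can just delete the first and last rows from the staging table - a rough sketch only, assuming the staging table has an identity column (LineNum here, a placeholder name) that preserves file order:

-- drop the header (first row) and trailer (last row) after the bulk load;
-- Staging and LineNum are made-up names, not from anything in this thread
DELETE FROM Staging
WHERE  LineNum = (SELECT MIN(LineNum) FROM Staging)
   OR  LineNum = (SELECT MAX(LineNum) FROM Staging)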

nr
SQLTeam MVY

Posted - 07/15/2003 :  13:49:37
>> The only thing I can think of is loading the files in to a work table with 1 column (varchar(8000)).

That's what I always try to do if possible - a staging table in a staging database.
Whether you can do straight inserts or not depends on how much data you have and how fast you need it to be.
What time constraints do you have? i.e. do you need it to not impact the production table for very long or do you have to get the data into the database quickly after the file is delivered?

You could also try importing with the max errors set so that it fails on the header and trailer but accepts the rest - but you would have to be careful that the header did not lose the next record. Not a very nice option anyway.
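
For what it's worth, a minimal sketch of that option (the staging table, file name and format file are made up, not from anything above) - the hope is that the header and trailer rows fail the type conversions and are counted against MAXERRORS while the detail rows load:

-- tolerate up to 2 bad rows (ideally just the header and trailer) before the load aborts
BULK INSERT dbo.DetailStaging
FROM 'D:\Data\Feed.txt'
WITH (
	FORMATFILE = 'D:\Data\Detail.fmt',	-- fixed-width layout of the detail rows
	MAXERRORS  = 2
)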

The bcp in / out / in would work but as you say is a bit finicky.

Could write an app which parses the file and creates a file to import after the checks - but that would be the same as the multiple bcps.



==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
setbasedisthetruepath
Used SQL Salesman

Posted - 07/15/2003 :  13:52:36
I also like the staging idea but not the varchar(8000) b/c then you need to perform the data conversion checks manually that DTS would do automatically. Every date column, every currency column, every numeric column, etc.
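
For example (the staging table, column name, and offsets below are placeholders), every one of those checks has to be hand-rolled against the raw varchar column:

-- manual type checks you would have to write for each date/currency/numeric field
SELECT *
  FROM Staging
 WHERE ISDATE(SUBSTRING(LineData, 2, 10)) = 0      -- a date column
    OR ISNUMERIC(SUBSTRING(LineData, 21, 11)) = 0  -- a currency column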

Jonathan
{0}
robvolk
Most Valuable Yak

Posted - 07/15/2003 :  13:57:46
Using the SUBSTRING() method isn't bad at all; it depends on the file format. If the data is stored fixed-length, you can define a record layout in another table and CROSS JOIN it with the imported rows. The record layout table has Start and Length columns that you feed to SUBSTRING(). As long as the format is consistent within the file this works like a charm. And if there is something in the header and trailer records that identifies them, you can include extra layout records to parse them as well. These can then be included or excluded as needed using a WHERE clause.

I have something like this set up at work to import NACHA formatted bank transaction files. This format is replete with file headers, batch headers, footers, etc. in addition to the detail records, all in the same file. Every record type is identified in the spec and is easily parseable using a record layout table like I described. I'll try posting it here later today after I get home from work.

X002548
Not Just a Number

Posted - 07/15/2003 :  14:07:32
Thanks everyone.

Rob, I love the config table idea!

We're talking dynamic SQL here, correct?

The config table holds the column definitions and column names.

Yeah, I got it...and yes, they are all fixed width.

In all I have many of these files. I can add a column that identifies the source file and use a predicate.

Cool. I can write one process and config it on the fly...

I'm jazzed up now.

Thanks guys..



Brett

8-)
X002548
Not Just a Number

Posted - 07/15/2003 :  14:11:18
AND>>>>

The sproc will determine whether the process needs to be a bcp out (if the data is too large) or whether it can be a straight insert...
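
Something like this is what I'm thinking for that check (the row threshold, table names, and column layout below are just placeholders):

DECLARE @RowCount int
SELECT @RowCount = COUNT(*) FROM WorkTable

IF @RowCount > 500000
	-- big batch: bcp the staged data back out and bulk load it into the target
	EXEC master..xp_cmdshell 'bcp MyDB.dbo.WorkTable out D:\Data\WorkTable.dat -c -T'
ELSE
	-- small batch: straight insert with the SUBSTRING parsing
	INSERT INTO DetailTarget (Record_Type, Row_Date, Produce, Produce_Amt)
	SELECT SUBSTRING(DataRow, 1, 1)
	     , SUBSTRING(DataRow, 2, 10)
	     , SUBSTRING(DataRow, 13, 8)
	     , SUBSTRING(DataRow, 21, 11)
	  FROM WorkTable
	 WHERE SUBSTRING(DataRow, 1, 1) = 'D'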

I'll have the whole thing kick off over night during a maint window or something...



Brett

8-)
setbasedisthetruepath
Used SQL Salesman

Posted - 07/15/2003 :  14:19:17
Rob-
What would you say are the advantages of your method? I'm intrigued by the approach, but I wonder what functionality is there aside from an elegant re-write of the DTS data pump task.

Jonathan
{0}
X002548
Not Just a Number

Posted - 07/15/2003 :  14:56:37
I like the fact that it will be table driven...don't even have to write anything new when I'm done.

I could even add the frequency of the load per file...so that it could be part of a sql predicate to know what to look for.

All the files will audit the same way, and any exception will be handled the same.

I'll even be able to do data integrity checking BEFORE I load...

OH, lots of upside here....


Downside of course is if the data becomes large...just have a process to check and bcp out and then in to the waiting table, instead of an insert..

(besides, I'm not a fan of DTS)

Also I'm dealing with money on this one, so I want to be squeaky clean..



Brett

8-)
setbasedisthetruepath
Used SQL Salesman

Posted - 07/15/2003 :  15:58:34
quote:
(besides, I'm not a fan of DTS)

Why? I suspect many developers tried DTS w/ SQL Server 7.0 and formed a bad opinion of the product (deservedly so), and never tried it again w/ SQL 2K. DTS under SQL 2K is a far superior product, and from what literature is becoming available about Yukon, it will be significantly enhanced yet again.

Aside from the more mechanical aspects of DTS (the actual movement of data from source to destination), you also get many fringe benefits: versioning, logging, a graphical interface that practically self-documents, etc.

Thoughts?

Jonathan
{0}
robvolk
Most Valuable Yak

Posted - 07/16/2003 :  08:28:34
OK, first lets get the data file out of the way. Save this into a file named flintstones.txt (or whatever):
1 Flintstone         Bedrock
2 Fred         Father
2 Wilma        Mother
2 Pebbles      Daughter
2 Dino         Pet
1 Rubble             Bedrock
2 Barney       Father
2 Betty        Mother
2 Bam Bam      Son
1 Slate              Bedrock
2 Mr.          Boss
1 Rockhead           Bedrock
2 Joe          Co-worker
And here's the SQL to get you started:

SET NOCOUNT ON
CREATE TABLE FileFormats(type varchar(4) NOT NULL,
Item varchar(10) NOT NULL,
Start smallint NOT NULL,
Length smallint NOT NULL, CONSTRAINT PK_FileFormats PRIMARY KEY(type, Start))

CREATE TABLE ImportData(LineData varchar(8000) Null, linenum int not null)

INSERT INTO FileFormats VALUES('1', 'Family', 1, 2)
INSERT INTO FileFormats VALUES('1', 'LastName', 3, 19)
INSERT INTO FileFormats VALUES('1', 'City', 22, 15)
INSERT INTO FileFormats VALUES('2', 'Person', 1, 2)
INSERT INTO FileFormats VALUES('2', 'FirstName', 3, 13)
INSERT INTO FileFormats VALUES('2', 'Title', 16, 20)

exec('alter table importdata drop column linenum') --this is not needed if you use a format file
--change this to match the path and filename you're using
BULK INSERT importData FROM 'Z:\documents\flintstones.txt'
exec('alter table ImportData ADD linenum int not null identity(1,1), family int null') --this is not needed if you use a format file

CREATE CLUSTERED INDEX CL_LineNum ON ImportData(LineNum)

SELECT C.LineNum, A.Type, A.Item AS Class, B.Item, SubString(C.LineData, B.Start, B.Length) AS Value
FROM FileFormats A INNER JOIN FileFormats B ON A.Start=1 AND A.Type=B.Type AND B.Start<>1
INNER JOIN ImportData C ON SubString(C.LineData, A.Start, A.Length)=B.Type
ORDER BY C.LineNum, B.Start

-- build a (Start, Finish) line-number range for each family (header) row;
-- LineNum*LineNum is just a placeholder upper bound that the UPDATE below fixes up
SELECT DISTINCT LineNum Start, LineNum*LineNum Finish INTO #Nums FROM ImportData WHERE SubString(LineData, 1, 1)='1'
-- set Finish to the line just before the next header; the last family falls back to the placeholder
UPDATE #nums SET Finish=(SELECT IsNull(Min(A.Start), #Nums.Finish)-1 FROM #Nums A WHERE A.Start>#Nums.Start)

SELECT N.Start AS FamilyID, C.LineNum, A.Item AS Class, B.Item, SubString(C.LineData, B.Start, B.Length) AS Value
FROM FileFormats A INNER JOIN FileFormats B ON A.Start=1 AND A.Type=B.Type AND B.Start<>1
INNER JOIN ImportData C ON SubString(C.LineData, A.Start, A.Length)=B.Type
INNER JOIN #Nums N ON C.LineNum BETWEEN N.Start AND N.Finish
ORDER BY C.LineNum, B.Start

drop table fileformats
drop table importdata
drop table #nums


I was hoping to set up a cross-tab that would combine the header (family) lines with the detail (person) lines, but I blew it. You can probably figure that out if you needed to anyway.
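
If you do want the combined view, here's one rough cut using the #Nums ranges from the script above (run it before the DROP TABLE statements); it hard-codes the start/length values from the FileFormats rows instead of joining to the layout table, so treat it as a sketch rather than the finished cross-tab:

SELECT SubString(H.LineData, 3, 19)  AS LastName
     , SubString(H.LineData, 22, 15) AS City
     , SubString(D.LineData, 3, 13)  AS FirstName
     , SubString(D.LineData, 16, 20) AS Title
  FROM #Nums N
 INNER JOIN ImportData H ON H.LineNum = N.Start                        -- the family (header) line
 INNER JOIN ImportData D ON D.LineNum BETWEEN N.Start + 1 AND N.Finish -- its detail lines
 WHERE SubString(D.LineData, 1, 1) = '2'
 ORDER BY H.LineNum, D.LineNum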

This example is pretty simple, there are two line types (1 and 2) that have different formats. The Type column defines a line type, in this example the line type is actually in the line data itself (the NACHA format is set up this way as was my inspiration). If your lines don't have a clearly identifiable bit of data then you might have to tweak the format definition, or use it with a WHERE clause to get only the headers, details, footers etc.

Since the line type is stored in the data itself, I used a self-join on the FileFormats table to make the parsing work right. I also use an identity column to maintain the line numbers; go ahead and BOOOOO HISSSSS!!! me, but in this case position DOES matter! Again this was from the NACHA format.

Most of the advantages Brett already mentioned: the import process becomes data-driven and very easy to maintain, and it's very easy to add new formats. Right now I have three different text file formats for various payment files. They all have the same basic info (account number, payment amount, payment date) from 3 different sources (credit cards, lockbox process, and NACHA transactions). All of them are handled using the table format, and fortunately only the NACHA file has multiple formats; the others are plain text files. I also had an ImportType column to distinguish LBOX from CREDIT from NACHA types so that the parsing is done correctly for each file type. I also have bcp format files for some file types, basically to discard extraneous crap.

As far as converting data types, well, you kinda have to do that anyway, DTS just does it behind the scenes. I started playing with storing column names and types in the fileformat table, so that I could possibly dynamically construct the SQL to insert into the proper tables, but I don't think I'll bother since I don't have that many formats.

I also started setting up a batch import queue table, very much like Brett described, to manage the actual import and processing of the files. Combined with this technique:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=24371

I can take a load of text files in any format, dump them into a folder, and process each type as I need to. I can BULK INSERT all of them in one go and parse them as one batch, or individually. I can use the DIR output to manage the file names & sizes, and update it to include rows imported, import date, whatever. I'm also saving the file name in addition to the line data so I can go back and determine which file it came from. I don't think I need to have all this but at least I won't be crying later if I do need it. It's one of those things that just makes more sense to do in tables than it does to manage using DTS.

And for the record, I LOVE DTS, I just don't use it for text files anymore. BULK INSERT and bcp rule there.



X002548
Not Just a Number

Posted - 07/29/2003 :  17:58:15
Ok...

A couple of things...this does not work as a whole
(my brain is about to explode)

but when executing the individual snippets in the stream it will

EDIT: Plus, unless I miss my guess...I've done everything in this
code that I have protested against...dynamic SQL

EDIT2: Did I forget to mention Identities?

Luke I am...No it can't be true

Not only dynamic SQL, but teetering on the line of dynamic object creation (I pulled back a little from that...still don't think it's a good idea).

It needs lots more (error handling, etc.), but it will, based on config tables:

Locate a directory of a source file
Load it to a work table
Identify the Name of the table to load
Parse it out and load it
Do Data integrity checks on date and numeric data
Audit the trailer record and check to see if you have everything (about 80% complete)
It can handle many files, and many record types within a file

What it needs.

A compilable sproc
A better "schedule" method (well there isn't one, if the file exists it's loaded)
An alternative method to bcp out/in larger files
A Log of all the events that happened during the load
..and probably more than I can think of now...

(what's this grey matter oozing from my ear)

I'll get to it more tomorrow (as I've got to get it in soon), and I'll post the final when it's 100%

But I was having WITHDRAWAL SEE! (plus I need a break)

Another note. The first is my "draft" code. The supporting data and tables follow.

AND 2000!



CREATE PROC isp_FixedWidthFileLoader
AS

DECLARE   @FeedName varchar(255)
	, @FileRootName varchar(20)
	, @FileExt varchar(3)
	, @FilePath varchar(255)
	, @LoadFreq char(1)
	, @FilenameAndPath varchar(278)
	, @CommandString varchar(8000)
	, @BatchId Int
	, @LayoutCD varchar(5)
	, @LayoutName varchar(50)
	, @RecordIdentStart Int
	, @RecordIdentLen Int
	, @RowCount int

DECLARE CSR CURSOR FOR

	SELECT FeedName, FileRootName, FileExt, FilePath, LoadFreq FROM FixedWidthFeed

OPEN CSR

FETCH NEXT FROM CSR INTO @FeedName, @FileRootName, @FileExt, @FilePath, @LoadFreq

WHILE @@FETCH_STATUS = 0
	BEGIN

-- -----------------------------------------------------------------
-- Let's go fishing..does the file even exist?
-- -----------------------------------------------------------------

	SELECT @CommandString = 'DIR '+ @FilePath+@FileRootName+'.'+@FileExt, @FileNameAndPath = @FilePath+@FileRootName+'.'+@FileExt

	Delete From Load_Folder

	Insert Into Load_Folder exec master..xp_cmdshell @CommandString

	Delete From Load_Folder_Parsed

	Insert Into Load_Folder_Parsed (Create_Time, File_Size, File_Name )
	Select	  Convert(datetime,Substring(dir_output,1,8)
		+ ' ' 
		+ (Substring(dir_output,11,5) 
		+ Case When Substring(dir_output,16,1) = 'a' Then ' AM' Else ' PM' End)) As Create_Time
		, Convert(Int,LTrim(RTrim(Replace(Substring(dir_output,17,22),',','')))) As File_Size 
		, Substring(dir_output,40,(Len(dir_output)-39)) As File_Name
	  From 	  Load_Folder
	 Where  Substring(dir_output,1,1) <> ' '
	   And (Substring(dir_output,1,1) <> ' ' 
	   And  Substring(dir_output,25,5) <> '<DIR>')

--Does the file exist?

	IF EXISTS (SELECT 1 FROM Load_Folder_Parsed WHERE File_Name = @FileRootName+'.'+@FileExt)

--yup, lets prepare the Batch Control Table

	BEGIN
		INSERT INTO BatchControl (FeedName, FeedDate, FileName, FileSize, FileDate)
		SELECT @FeedName, GetDate(), File_Name, File_Size, Create_Time
		FROM Load_Folder_Parsed WHERE File_Name = @FileRootName+'.'+@FileExt		


-- Some "cheating" here because of I'm running the code in snippets and don't want to redeclare local vars

--		INSERT INTO BatchControl (FeedName, FeedDate, FileName, FileSize, FileDate)
--		SELECT 'Bretts Test Feed', GetDate(), 'Test.txt', 0, '1960-10-24'


-- Let's go get the batchId so we can keep track of things

		SELECT @BatchId = Max(BatchId) FROM BatchControl

-- -------------------------------------------------------------------------------------
-- Let's make some magic
-- -------------------------------------------------------------------------------------

		DELETE FROM FixedWidthLoad

		SELECT @CommandString = 'BULK INSERT FixedWidthLoad FROM ' + '''' + @FileNameAndPath + ''''
				+ ' WITH (FORMATFILE = '+ ''''+ 'D:\Data\FixedWidthLoad.fmt'+ ''''+ ')'

		EXEC(@CommandString)

--Again the cheating thing

--BULK INSERT FixedWidthLoad FROM 'D:\Data\Test.txt'
--WITH (FORMATFILE = 'D:\Data\FixedWidthLoad.fmt')


-- OK We're loaded...assign the BatchId

		UPDATE FixedWidthLoad 
			SET BatchId = @BatchId 
	  	 WHERE BatchId IS NULL

-- Again with the cheating thing

--	UPDATE FixedWidthLoad 
--		SET BatchId = 1 
--	  WHERE BatchId IS NULL

--SELECT * FROM BATCHControl

-- ------------------------------------------------------------------------------
-- Need to build a loop here for every different record type within the feed...
-- ------------------------------------------------------------------------------


		DECLARE CSR2 CURSOR FOR
		
			SELECT LayoutCD, LayoutName, RecordIdentStart, RecordIdentLen 
			  FROM FixedWidthLayoutType
			 WHERE FeedName = @FeedName
		
		OPEN CSR2
		
		FETCH NEXT FROM CSR2 INTO @LayoutCD, @LayoutName, @RecordIdentStart, @RecordIdentLen 
		
		WHILE @@FETCH_STATUS = 0
			BEGIN

-- ---------------------------------------------------------------------------------------------------
-- Let's perform some Data Integrity Checks
-- ---------------------------------------------------------------------------------------------------

		DECLARE @SQL varchar(8000)
		SELECT @SQL = ''
		SELECT @SQL = @SQL + SQL FROM (
		SELECT 'SELECT * FROM FixedWidthLoad' AS SQL
		UNION ALL 
		SELECT  ' WHERE 1=1'
		UNION ALL
		SELECT	'   AND ISNUMERIC(SUBSTRING(DataRow,'
			+CONVERT(varchar(5),FieldStartPos)+','
			+CONVERT(varchar(5),FieldLength)+'))'
			+' <> 1 AND '
			+'SUBSTRING(Datarow,1,1) = '+''''+'D'+''''
		--	+'SUBSTRING(Datarow,'+@RecordIdentStart+','+@RecordIdentLen+') = '+''''+@LayoutCD+''''
		FROM FixedWidthLayout
		WHERE FieldDataType NOT IN ('char','varchar', 'datetime','smalldatetime')
		AND FeedName = 'Bretts Test Feed'
		AND LayoutCD = 'D'
		) AS XXX
		
		--SELECT @SQL
		
		EXEC(@SQL)
		
		SELECT '@@ROWCOUNT:' = Convert(varchar(25),@@ROWCOUNT)



		SELECT @SQL = ''
		SELECT @SQL = @SQL + SQL FROM (
		SELECT 'SELECT * FROM FixedWidthLoad' AS SQL
		UNION ALL 
		SELECT  ' WHERE 1=1'
		UNION ALL
		SELECT	'   AND ISDATE(SUBSTRING(DataRow,'
			+CONVERT(varchar(5),FieldStartPos)+','
			+CONVERT(varchar(5),FieldLength)+'))'
			+' <> 1 AND '
			+'SUBSTRING(Datarow,1,1) = '+''''+'D'+''''
		--	+'SUBSTRING(Datarow,'+@RecordIdentStart+','+@RecordIdentLen+') = '+''''+@LayoutCD+''''
		FROM FixedWidthLayout
		WHERE FieldDataType IN ('datetime','smalldatetime')
		AND FeedName = 'Bretts Test Feed'
		AND LayoutCD = 'D'
		) AS XXX
		
		SELECT @SQL
		
		EXEC(@SQL)
		
		SELECT '@@ROWCOUNT:' = Convert(varchar(25),@@ROWCOUNT)

-- ------------------------------------------------------------------------------------------------------------
-- Now let's get the data out and in to it's new home
-- -----------------------------------------------------------------------------------------------------------
		
		SELECT @SQL = ''
		SELECT @SQL =  @SQL + SQL 
		FROM (
		SELECT '  INSERT INTO '+REPLACE(FileRootName+'_'+LayoutName,' ','_')+ ' ('+FieldName AS SQL
			, l.FeedName
			, 1 AS SQL_GROUP
			, 1 AS ROW_ORDER
		  FROM FixedWidthLayout l
		INNER JOIN FixedWidthLayoutType r1 
			ON l.FeedName = r1.FeedName
		       AND l.LayoutCd = r1.LayoutCD
		       AND l.FieldOrdPos = 1
		       AND l.LayoutCD = 'D' 
		INNER JOIN FixedWidthFeed r2 
			ON l.FeedName = r2.FeedName
		     WHERE l.FeedName = 'Bretts Test Feed'
		UNION ALL
		SELECT ', ' + FieldName
			, FeedName
			, 2 AS SQL_GROUP
			, FieldOrdPos
		  FROM FixedWidthLayout
		 WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		AND FieldOrdPos <> 1
		UNION ALL
		SELECT ')'
			, FeedName
			, 3 AS SQL_GROUP
			, 1 AS FieldOrdPos
		  FROM FixedWidthLayout
		 WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		AND FieldOrdPos = 1
		UNION ALL
		SELECT '  SELECT SUBSTRING(DataRow,'
			+Convert(varchar(4),FieldStartPos)+','
			+Convert(varchar(4),FieldLength)+') AS ' + FieldName AS SQL
			, FeedName
			, 4 AS SQL_GROUP
			, 1 AS FieldOrdPos
		  FROM FixedWidthLayout
		 WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		AND FieldOrdPos = 1
		UNION ALL
		SELECT ', SUBSTRING(DataRow,'
			+Convert(varchar(4),FieldStartPos)+','
			+Convert(varchar(4),FieldLength)+')  AS ' + FieldName AS SQL 
			, FeedName
			, 5 AS SQL_GROUP
			, FieldOrdPos
		  FROM FixedWidthLayout
		 WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		AND FieldOrdPos <> 1
		UNION ALL
		SELECT ' FROM FixedWidthLoad'
			, FeedName
			, 6 AS SQL_GROUP
			, 1 AS FieldOrdPos
		  FROM FixedWidthLayout
		 WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		AND FieldOrdPos = 1
		UNION ALL
		SELECT ' WHERE SUBSTRING(DataRow,'+Convert(varchar(4),RecordIdentStart)+','+Convert(varchar(4),RecordIdentLen)+') = '
			+ '''' + LayoutCD + '''' 
			, FeedName
			, 7 AS SQL_GROUP
			, 1 AS FieldOrdPos
		FROM FixedWidthLayoutType
		WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		UNION ALL
		SELECT ' AND BatchId = ' + Convert(varchar(15),1)
			, FeedName
			, 8 AS SQL_GROUP
			, 1 AS FieldOrdPos
		FROM FixedWidthLayoutType
		WHERE LayoutCD = 'D'
		   AND FeedName = 'Bretts Test Feed'
		) AS XXX
		--ORDER BY FeedName, SQL_GROUP, ROW_ORDER
		
		--Produces:
		
		SELECT LEN(@SQL), @SQL
		
		EXEC(@SQL)


--Produces:

--INSERT INTO Test_Detail_Record ( Record_Type
--       , Row_Date
--       , Produce
--       , Produce_Amt
--)
--  SELECT SUBSTRING(DataRow,1,1) AS Record_Type
--       , SUBSTRING(DataRow,2,10)  AS Row_Date
--       , SUBSTRING(DataRow,13,8)  AS Produce
--       , SUBSTRING(DataRow,21,11)  AS Produce_Amt
--    FROM FixedWidthLoad
--WHERE SUBSTRING(DataRow,1,1) = 'D'
--AND BatchId = 1

-- ------------------------------------------------------------------------------------------------------
-- End of CSR2 WHILE LOOP
-- ------------------------------------------------------------------------------------------------------
			FETCH NEXT FROM CSR2 INTO @LayoutCD, @LayoutName, @RecordIdentStart, @RecordIdentLen
		END

		CLOSE CSR2
		DEALLOCATE CSR2
-- ------------------------------------------------------------------------------------------------------
-- End of File Existence Check 
-- ------------------------------------------------------------------------------------------------------
	END

-- --------------------------------------------------------------------------------------------------------
-- Lets Audit the Data - So far only set up for Record Counts 
-- --------------------------------------------------------------------------------------------------------

SELECT SQL FROM (

SELECT '  SELECT COUNT(*) AS '+REPLACE(LayoutName,' ','_')+'_COUNT FROM '
	+REPLACE(FileRootName+'_'+LayoutName,' ','_')  AS SQL
	, l.FeedName
	, 1 AS SQL_GROUP
	, 1 AS ROW_ORDER
  FROM FixedWidthLayout l
INNER JOIN FixedWidthLayoutType r1 
	ON l.FeedName = r1.FeedName
       AND l.LayoutCd = r1.LayoutCD
       AND l.FieldOrdPos = 1
--     AND l.LayoutCD = 'D' 
INNER JOIN FixedWidthFeed r2 
	ON l.FeedName = r2.FeedName
     WHERE l.FeedName = 'Bretts Test Feed'
) AS XXX

--Produces

--SELECT COUNT(*) AS Detail_Record_COUNT FROM Test_Detail_Record 


-- ------------------------------------------------------------------------------------------------------
-- This should be the end of the loop for CSR1...go on and process the next file.
-- ------------------------------------------------------------------------------------------------------

	FETCH NEXT FROM CSR INTO @FeedName, @FileRootName, @FileExt, @FilePath, @LoadFreq
END

CLOSE CSR
DEALLOCATE CSR
GO



And the supporting cast


CREATE TABLE BatchControl (
	  BatchId 	Int IDENTITY(1,1)
	, FeedName 	varchar(255)
	, FeedDate 	datetime
	, FileName 	varchar(255)
	, FileSize 	int
	, FileDate 	datetime
)
GO

CREATE TABLE FixedWidthLoad (
	  DataRow 	varchar(7500)
	, BatchId 	Int
	, RowId 	int IDENTITY (1,1)
)
GO

CREATE TABLE FixedWidthFeed (
	  FeedName	varchar(255)
	, FileRootName	Varchar(20)
	, FileExt	varchar(3)
	, FilePath	varchar(255)
	, LoadFreq	Char(1)
)
GO 

CREATE TABLE FixedWidthLayoutType (
	  FeedName		varchar(255)
	, LayoutCD		varchar(5)
	, LayoutName		Varchar(50)
	, RecordIdentStart 	Int
	, RecordIdentLen	Int
	, AuditInd		Char(1)
	, AuditStart		int
	, AuditLength		int
	, IncludeInCountsInd	Char(1)
)
GO 

CREATE TABLE FixedWidthLayout (
	  FeedName	varchar(255)
	, LayoutCD	varchar(5)
	, FieldName 	varchar(50)
	, FieldOrdPos	int
	, FieldDataType	varchar(20)
	, FieldStartPos	int
	, FieldLength	int
)
GO

CREATE TABLE FixedWidthConstr (
	  FeedName	varchar(255)
	, FieldName 	varchar(50)
	, FieldConstr	varchar(255)
)
GO

CREATE TABLE [dbo].[Load_Folder_Parsed] (
	[Create_Time] [datetime] NULL ,
	[File_Size] [int] NULL ,
	[File_Name] [varchar] (255) NULL 
) ON [PRIMARY]
GO

CREATE TABLE [dbo].[load_folder] (
	[dir_output] [varchar] (255) NULL 
) ON [PRIMARY]
GO

-- Need to add all constraints, FKs, PKs...sorry, lazy me again


INSERT INTO FixedWidthFeed (FeedName, FileRootName, FileExt, FilePath, LoadFreq)
SELECT 'Bretts Test Feed', 'Test', 'txt', 'D:\Data\', 'D'
UNION ALL
SELECT 'Bretts Test Feed 2', 'Test2', 'txt', 'D:\Data\', 'D'
GO

INSERT INTO FixedWidthLayoutType (FeedName, LayoutCD, LayoutName, RecordIdentStart, RecordIdentLen
				, AuditInd, AuditStart, AuditLength, IncludeInCountsInd)
SELECT 'Bretts Test Feed', 'H','Header Record',1,1, 'N', 0, 0, 'N'
UNION ALL
SELECT 'Bretts Test Feed', 'D','Detail Record',1,1, 'N', 0, 0, 'Y'
UNION ALL
SELECT 'Bretts Test Feed', 'T','Trailer Record',1,1, 'Y', 48, 11, 'N'
UNION ALL
SELECT 'Bretts Test Feed 2', 'H','Header Record',1,1, 'N', 0, 0, 'N'
UNION ALL
SELECT 'Bretts Test Feed 2', 'D','Detail Record',1,1, 'N', 0, 0, 'Y'
UNION ALL
SELECT 'Bretts Test Feed 2', 'T','Trailer Record',1,1, 'Y', 48, 11, 'N'
GO

DELETE FROM FixedWidthLayout
GO

-- CREATE FILE INPUT DEF #1

INSERT INTO FixedWidthLayout (FeedName, LayoutCD, FieldName, FieldOrdPos, FieldDataType, FieldStartPos, FieldLength)
SELECT 'Bretts Test Feed', 'H','Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed', 'H','Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed', 'H','Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed', 'D','Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed', 'D','Row_Date', 2, 'datetime', 2, 10
UNION ALL
SELECT 'Bretts Test Feed', 'D','Produce', 3, 'varchar', 13, 8
UNION ALL
SELECT 'Bretts Test Feed', 'D','Produce_Amt', 4, 'decimal(10,2)', 21, 11
UNION ALL
SELECT 'Bretts Test Feed', 'T','Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed', 'T','Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed', 'T','Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed', 'T','Record_Count', 4, 'Int', 50, 11
GO

-- CREATE FILE INPUT DEF #2

INSERT INTO FixedWidthLayout (FeedName, LayoutCD, FieldName, FieldOrdPos, FieldDataType, FieldStartPos, FieldLength)
SELECT 'Bretts Test Feed 2', 'H','Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed 2', 'H','Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed 2', 'H','Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed 2', 'D','Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed 2', 'D','Row_Date', 2, 'datetime', 2, 10
UNION ALL
SELECT 'Bretts Test Feed 2', 'D','Produce', 3, 'varchar', 13, 8
UNION ALL
SELECT 'Bretts Test Feed 2', 'D','Produce_Amt', 4, 'decimal(10,2)', 21, 11
UNION ALL
SELECT 'Bretts Test Feed 2', 'T','Record_Type', 1, 'char', 1, 1
UNION ALL
SELECT 'Bretts Test Feed 2', 'T','Feed_Date', 2, 'DateTime', 10, 10
UNION ALL
SELECT 'Bretts Test Feed 2', 'T','Descr', 3, 'varchar', 20, 30
UNION ALL
SELECT 'Bretts Test Feed 2', 'T','Record_Count', 4, 'Int', 50, 11
GO


-- Create the Tables from your definitions

DECLARE @SQL varchar(8000)

--not that you would but you could
--SELECT @SQL = SQL FROM (

SELECT SQL FROM (
    SELECT 'CREATE TABLE ' + REPLACE(FileRootName+'_'+LayoutName,' ','_') + '( ' AS SQL
	, l.FeedName, LayoutCD, 1 AS SQL_GROUP, 1 AS ROW_ORDER 
      FROM FixedWidthFeed l
INNER JOIN FixedWidthLayoutType r
	ON l.FeedName = r.FeedName
UNION ALL
    SELECT '     ' + FieldName + ' ' + 
	+ CASE WHEN FieldDatatype In ('char', 'varchar') 
	       THEN FieldDataType + '(' + CONVERT(varchar(4),FieldLength) +  ')'
	       ELSE FieldDataType
	  END AS SQL
	, FeedName, LayoutCD, 2 AS SQL_GROUP, 1 AS ROW_ORDER     
     FROM FixedWidthLayout l  
    WHERE FieldOrdPos = 1
UNION ALL
    SELECT '   , ' + FieldName + ' ' + 
	+ CASE WHEN FieldDatatype In ('char', 'varchar') 
	       THEN FieldDataType + '(' + CONVERT(varchar(4),FieldLength) +  ')'
	       ELSE FieldDataType
	  END AS SQL
	, FeedName, LayoutCD, 2 AS SQL_GROUP, FieldOrdPos AS ROW_ORDER     
     FROM FixedWidthLayout l  
    WHERE FieldOrdPos <> 1
UNION ALL
    SELECT ')' AS SQL
	, FeedName, LayoutCD, 3 AS SQL_GROUP, 1 AS ROW_ORDER 
      FROM FixedWidthLayoutType
UNION ALL
    SELECT 'GO' AS SQL
	, FeedName, LayoutCD, 4 AS SQL_GROUP, 1 AS ROW_ORDER 
      FROM FixedWidthLayoutType
) AS XXX
Order by FeedName, LayoutCD, SQL_GROUP, ROW_ORDER     

-- Which Produces:

CREATE TABLE Test_Detail_Record( 
     Record_Type char(1)
   , Row_Date datetime
   , Produce varchar(8)
   , Produce_Amt decimal(10,2)
)
GO
CREATE TABLE Test_Header_Record( 
     Record_Type char(1)
   , Feed_Date DateTime
   , Descr varchar(30)
)
GO
CREATE TABLE Test_Trailer_Record( 
     Record_Type char(1)
   , Feed_Date DateTime
   , Descr varchar(30)
   , Record_Count Int
)
GO
CREATE TABLE Test2_Detail_Record( 
     Record_Type char(1)
   , Row_Date datetime
   , Produce varchar(8)
   , Produce_Amt decimal(10,2)
)
GO
CREATE TABLE Test2_Header_Record( 
     Record_Type char(1)
   , Feed_Date DateTime
   , Descr varchar(30)
)
GO
CREATE TABLE Test2_Trailer_Record( 
     Record_Type char(1)
   , Feed_Date DateTime
   , Descr varchar(30)
   , Record_Count Int
)
GO


And the Data of course




-- FixedWidthLoad.fmt

--8.0
--3
--1	SQLCHAR	0	7500	"\r\n"	1	DataRow	SQL_Latin1_General_CP1_CI_AS
--2	SQLCHAR	0	0	""	0	BatchId	SQL_Latin1_General_CP1_CI_AS
--3	SQLCHAR	0	0	""	0	RowId	SQL_Latin1_General_CP1_CI_AS

-- Test.txt

--H	 07/04/2003Header Record For Test File
--D07/04/2003 APPLES         1.00
--D07/04/2003 PEACHES 10000000.00
--D07/04/2003 PEACHES 10000000.00
--T	 07/04/2003Trailer Record For Test File  00000000003
--Dont include these last three lines
--123456789012345678901234567890123456789012345678901234567890
--         1         2         3         4         5         6



SELECT * FROM Test_Detail_Record
SELECT * FROM FixedWidthLayout
SELECT * FROM FixedWidthLayoutType
SELECT * FROM FixedWidthFeed
SELECT * FROM FixedWidthLoad
SELECT * FROM BatchControl




DROP TABLE FixedWidthLoad
GO
DROP TABLE FixedWidthFeed
GO
DROP TABLE FixedWidthLayout
GO
DROP TABLE FixedWidthLayoutType
GO
DROP TABLE BatchControl
GO
DROP TABLE FixedWidthConstr
GO
DROP TABLE Load_Folder_Parsed
GO
DROP TABLE Load_Folder
GO
DROP TABLE Test_Detail_Record
GO
DROP TABLE Test_Header_Record
GO
DROP TABLE Test_Trailer_Record
GO
DROP TABLE Test2_Detail_Record
GO
DROP TABLE Test2_Header_Record
GO
DROP TABLE Test2_Trailer_Record
GO



ooooo my head....somebody call me a bartender....




Brett

8-)

robvolk
Most Valuable Yak

Posted - 07/29/2003 :  19:23:10
OK, you're a bartender.
X002548
Not Just a Number

Posted - 07/31/2003 :  12:32:16
Thanks I feel much better now....

Still working on the final solution...



Brett

8-)

SELECT POST=NewId()
Todd Morrow
Starting Member

Posted - 03/05/2004 :  16:47:20
I'll post a function (later) that converts COBOL record layouts to a bcp format file, as well as one to convert those layouts to "create table" statements.
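
In the meantime, here's a rough idea of the kind of mapping (just a sketch, not that function - it assumes the copybook fields have already been parsed into a table, and it only handles PIC X(n) plus a crude catch-all for numeric PICs):

-- hypothetical table of parsed copybook fields; names and sample rows are made up
CREATE TABLE CopybookFields (FieldOrd int, FieldName varchar(50), PicClause varchar(20))
INSERT INTO CopybookFields VALUES (1, 'Record_Type', 'X(1)')
INSERT INTO CopybookFields VALUES (2, 'Row_Date',    'X(10)')
INSERT INTO CopybookFields VALUES (3, 'Produce_Amt', '9(9)V99')

-- generate the column list of a CREATE TABLE statement from the PIC clauses
SELECT CASE WHEN FieldOrd = 1 THEN 'CREATE TABLE CobolFeed ( ' ELSE '   , ' END
     + FieldName + ' '
     + CASE WHEN PicClause LIKE 'X(%'
            THEN 'char(' + SUBSTRING(PicClause, 3, CHARINDEX(')', PicClause) - 3) + ')'
            ELSE 'numeric(18,2)'   -- crude: treat every numeric PIC as numeric(18,2)
       END AS SQL
  FROM CopybookFields
 ORDER BY FieldOrd   -- tack the closing ')' on by hand, or with another UNION ALL as in Brett's generator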