efelito
Constraint Violating Yak Guru
478 Posts |
Posted - 2002-05-23 : 10:13:58
|
| I have a DTS package that is giving me some problems. The package has 21 transformation tasks, 12 Execute SQL tasks, and 1 ActiveX Script task. All but one of the transformation tasks are working properly. When run in Query Analyzer, the query in the Source tab of the transformation returns 261,348 rows, but when the package was executed it only transformed 133,259 records. There were no errors displayed or recorded in any of the logs. Any ideas where the missing 128,089 records could have gone? This package has been a work in progress for the last month or so. The transformations have never had a problem like this, and I haven't changed any of them recently. I am definitely at a loss on this one. Any pointers in the right direction would be very helpful. The package was written in SQL Server 7.0 SP4 with MDAC 2.5 SP2.

Jeff Banschbach
Consultant, MCDBA

Edited by - efelito on 05/23/2002 10:20:36 |
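One way to narrow down where the rows went is to compare the source query's keys against what actually landed in the destination. This is only a sketch: the database, table, and key names below are hypothetical stand-ins for the real source and destination of the failing transformation.

```sql
-- Hypothetical names: SourceDB.dbo.Orders is the transformation's source,
-- DestDB.dbo.Orders its destination, and OrderID the business key.

-- 1. Row counts on each side of the transformation.
SELECT COUNT(*) AS SourceRows FROM SourceDB.dbo.Orders
SELECT COUNT(*) AS DestRows   FROM DestDB.dbo.Orders

-- 2. Which keys the data pump silently dropped.
SELECT s.OrderID
FROM SourceDB.dbo.Orders s
WHERE NOT EXISTS
      (SELECT 1 FROM DestDB.dbo.Orders d
       WHERE d.OrderID = s.OrderID)
```

If the missing keys cluster together (for example, everything after a certain ID), that suggests the pump stopped early; scattered gaps would point at rows being skipped individually.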
|
|
Page47
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2002-05-23 : 10:22:06
|
| what if you pull that one transform out and exec it in its own pkg as a test? same results? I am guessing your source query is somehow dependent on some other object in your pkg that, up until now, has been completing before this transform, but now, because of some other issue (maybe more data), is not complete by the time this transform kicks off . . . just a guess.<O> |
 |
|
|
efelito
Constraint Violating Yak Guru
478 Posts |
Posted - 2002-05-23 : 10:33:23
|
| I can't see anything that it would be dependent on, since the source query runs on its own in QA without a problem and returns the correct number of records. I also have the workflow configured to make sure all tables get loaded in the appropriate order. I will give the separate package a try to see what happens. Thanks for your thoughts, Page.

Jeff Banschbach
Consultant, MCDBA |
 |
|
|
efelito
Constraint Violating Yak Guru
478 Posts |
Posted - 2002-05-23 : 12:06:33
|
| Same result with the one transformation in its own package. It transformed about 400 more rows than the big package, which makes me think it's some kind of memory problem. I'm going to try setting the commit size to 1000 and hope that forces some cleanup and allows it to insert all of the records. I believe the only thing that's changed since the last time I ran this package has been an upgrade from SP3 to SP4. It's a little early in SP4's life to find much info on a problem like this, though.

Jeff Banschbach
Consultant, MCDBA |
 |
|
|
Page47
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2002-05-23 : 12:29:03
|
| note to self . . . don't upgrade to sp4 just yet. Let us know how it turns out, it'll be good to know.<O> |
 |
|
|
efelito
Constraint Violating Yak Guru
478 Posts |
Posted - 2002-05-23 : 15:32:04
|
| It's fixed, but I made the classic mistake of changing more than one option before re-running it, so I'm not exactly sure what fixed it. Since I was leaning towards a memory problem, I checked the Lock Table box on the Advanced tab to reduce the number of locks that had to be held, and set the Insert Commit Size to 1000 records. After running this for a couple of hours and getting almost nowhere, I took a look at server memory. For some reason the SQL Server process was only using 85 MB of RAM even though it was configured to dynamically use up to 385 MB. I stopped the DTS task and rebooted the server out of desperation. (Thankfully this is on a development box.) I changed the Insert Commit Size to 9000 records and restarted the DTS package. The SQL Server process immediately grabbed the full 385 MB and the package started cruising along. I also ran Profiler for a little bit to make sure it wasn't trying to log any of the inserts.

I'm still very nervous about the fact that DTS was dropping some 130,000 records with no warnings or errors.

Jeff Banschbach
Consultant, MCDBA |
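For anyone chasing the same kind of discrepancy (85 MB used vs. a 385 MB ceiling), SQL Server's own view of its memory can be watched from Query Analyzer instead of Task Manager. A sketch for SQL Server 7.0/2000; note that the exact counter-name spelling (spaces around "(KB)") can vary between versions, so check `sysperfinfo` on your build:

```sql
-- How much memory SQL Server believes it is currently using vs. targeting,
-- straight from its internal performance counters (master.dbo.sysperfinfo).
SELECT counter_name, cntr_value / 1024 AS MB
FROM master.dbo.sysperfinfo
WHERE counter_name LIKE 'Total Server Memory%'
   OR counter_name LIKE 'Target Server Memory%'
```

If "Total" sits far below "Target" while a large load is running, the server is not growing its buffer pool the way its dynamic-memory settings say it should.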
 |
|
|
Page47
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2002-05-23 : 16:04:25
|
| I am terrified to say that I fear some cosmic objects have come into line and are beginning to tear at the very fabric of all those things we hold true about DTS . . . I had a pkg today, for the first time ever (also in a dev environment), mysteriously skip 5K worth of records in a 100K data pump . . . it never reported failure, it never wrote to an exception file . . . I'm so scared.<O> |
 |
|
|
efelito
Constraint Violating Yak Guru
478 Posts |
Posted - 2002-05-23 : 16:21:19
|
| Yikes!! I think I'm going to take a little time and try to reproduce this problem so I can ship it off to our good buddies at MS.

Jeff Banschbach
Consultant, MCDBA |
 |
|
|
Page47
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2002-05-24 : 11:08:49
|
| did you leave the fetch at the default (1)?

EDIT: and how many concurrent threads are you letting fly at once?<O>

Edited by - Page47 on 05/24/2002 11:13:11 |
 |
|
|
setbasedisthetruepath
Used SQL Salesman
992 Posts |
Posted - 2002-05-24 : 11:23:34
|
| Perhaps [url]http://support.microsoft.com/default.aspx?scid=kb;en-us;Q301383[/url] is describing the issue; let me know if so.

quote: Yikes!! I think I'm going to take a little time and try to reproduce this problem so I can ship it off to our good buddies at MS.
Jeff Banschbach
Consultant, MCDBA

setBasedIsTheTruepath<O> |
 |
|
|
efelito
Constraint Violating Yak Guru
478 Posts |
Posted - 2002-05-24 : 15:08:31
|
| Yes, the fetch is set to the default (1) for all transformation tasks. I'm letting 4 threads run at a time, but it still only seems to really process one at a time.

EDIT: Setbased, I don't think that's the problem, since I wasn't getting any error messages in any of my logs. It looks like that's only a problem with the Transfer Objects task. Interesting article, though.

Jeff Banschbach
Consultant, MCDBA

Edited by - efelito on 05/24/2002 15:13:17 |
 |
|
|
Andraax
Aged Yak Warrior
790 Posts |
Posted - 2002-05-29 : 04:00:07
|
| Sounds like a nightmare... I have about 15 DTS packages running at different intervals here. Some of them contain reasonably large transformation tasks (8 million rows). I have never experienced any loss of data that I know of. But then, I'm running SQL 2000 SP2; maybe something has been fixed... |
 |
|
|
jesus4u
Posting Yak Master
204 Posts |
Posted - 2003-03-12 : 16:29:06
|
quote: Sounds like a nightmare... I have about 15 DTS packages running at different intervals here. Some of them contain reasonably large transformation tasks (8 million rows). I have never experienced any loss of data that I know of. But then, I'm running SQL 2000 SP2; maybe something has been fixed...
Question: When you run those 8-million-row tasks, does your server memory increase and never reclaim the memory? Ours does... |
 |
|
|
robvolk
Most Valuable Yak
15732 Posts |
Posted - 2003-03-12 : 16:32:43
|
| I don't know why this is such a mystery to people:

SQL SERVER DOES NOT RELEASE MEMORY AFTER IT GRABS IT!!!

Unless you're running something else on the same server (not recommended), you won't have any problems. If you do have to share the box with another service or services, you can always configure SQL Server to use a maximum amount of memory and leave the rest for the others. |
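Capping SQL Server's memory on a shared box, as described above, is done through `sp_configure`. A sketch; the 256 MB here is just an example ceiling, so pick a value that leaves enough for the other services on the machine:

```sql
-- Make the advanced memory options visible.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- Cap SQL Server at 256 MB; everything above that stays available
-- to the other services on the box.
EXEC sp_configure 'max server memory', 256
RECONFIGURE
```

With no cap set, SQL Server grows its buffer pool toward the default maximum and, as noted, holds on to that memory by design.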
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-03-12 : 16:39:19
|
| Jesus4u, why would you think that SQL Server should release the memory after running a DTS task? It has no need to release the memory unless it is done with it AND another application is requesting memory but not enough is available. Some of our production SQL Servers only have about 15 MB of memory available. This isn't a problem, because there are no other applications running on those database servers.

Tara

Edited by - tduggan on 03/12/2003 16:41:18 |
 |
|
|
|