 Big tables causing issues


ZoneFX
Starting Member

17 Posts

Posted - 2008-03-03 : 15:21:35
I have a very rubbish connection!! And I've been having problems with "Protocol error in TDS stream" when trying to save a copy locally through a DTS. Basically I have a table with about 11,000 rows, but this table had 28 columns. I've now normalised it to a certain extent and have 2 tables: one of 11 columns (mostly ints) and one of 18 columns, but they will both be in excess of 11,000 rows.

So, on to the question ... will these 2 tables likely ease my network issues? The application works fine on the 1 big table; it's just copying to my local machine that's always been the issue since the table hit 10,000 rows.

If only I knew what I was doing ... Thanks in advance for any guidance.

rmiao
Master Smack Fu Yak Hacker

7266 Posts

Posted - 2008-03-03 : 23:27:33
Possible to do backup/restore?

ZoneFX
Starting Member

17 Posts

Posted - 2008-03-04 : 04:38:36
Let me clarify a little!!

The database lives on a commercial host, and the web-based application it drives functions fine in terms of speed and reliability. I run a local copy of the application on my local machine for development purposes, and a nightly DTS transfers some tables to my local server. This transfer is what errors out. I live in the sticks and have a very poor internet connection, but things worked fine until a table within the database got to over 10,000 rows. On occasion the transfer does work, but only about once in every 7 or 8 attempts.

I can run backups and restores no problem. The only real issue is the DTS crapping out with the "Protocol error in TDS stream". The other possible issue is that the live SQL Server is SQL 2005 and my local copy is SQL 2000.
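(For anyone hitting the same mismatch: a quick way to confirm exactly which version and edition each server is running is the standard SERVERPROPERTY function, which works on both SQL 2000 and SQL 2005. This is a general diagnostic, not something from the thread itself.)

```sql
-- Run on each server to confirm the version mismatch (e.g. 8.x = SQL 2000, 9.x = SQL 2005).
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;
```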

ZoneFX
Starting Member

17 Posts

Posted - 2008-03-12 : 06:16:10
I realised that only a few rows (perhaps 180 currently) changed on a daily basis, so there was no need to copy all 11,000-plus rows. I split the table that contained the data into two: one table contains legacy rows (pre-Jan 2008) and the other current rows (post-Jan 2008). I now only copy across the 2nd table, and will append all the 2008 rows to the legacy table at year end. I had to add a few unions in my stored procedures to make the search facility function correctly, but all appears to work much better now...
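(A rough sketch of the split ZoneFX describes, for anyone finding this later. All table and column names here are invented for illustration; the thread doesn't give the real schema.)

```sql
-- Hypothetical legacy/current split: pre-Jan-2008 rows in one table,
-- current rows in another, so only the small current table is copied nightly.
CREATE TABLE RecordsLegacy (
    RecordID   int          NOT NULL PRIMARY KEY,
    RecordDate datetime     NOT NULL,
    Detail     varchar(100) NOT NULL
);

CREATE TABLE RecordsCurrent (
    RecordID   int          NOT NULL PRIMARY KEY,
    RecordDate datetime     NOT NULL,
    Detail     varchar(100) NOT NULL
);

-- Search procedures can union the two tables so callers still see one result set.
CREATE VIEW RecordsAll AS
SELECT RecordID, RecordDate, Detail FROM RecordsLegacy
UNION ALL
SELECT RecordID, RecordDate, Detail FROM RecordsCurrent;

-- At year end, append the year's rows to the legacy table and empty the current one:
-- INSERT INTO RecordsLegacy (RecordID, RecordDate, Detail)
-- SELECT RecordID, RecordDate, Detail FROM RecordsCurrent;
-- DELETE FROM RecordsCurrent;
```

UNION ALL (rather than UNION) is the usual choice here, since the two tables hold disjoint date ranges and a duplicate-removing sort would be wasted work.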