
 SQL Server Error '80040e31' (timeout)


ContACT1
Starting Member

5 Posts

Posted - 2004-07-12 : 20:23:18
I am having problems altering a SQL Server table from an ASP application. The SQL statement is:

ALTER TABLE Policy ADD thisisatest nvarchar(255);


And the error I am receiving is:

Microsoft OLE DB Provider for SQL Server error '80040e31'
Timeout expired
/mhia/addfields.asp, line 89

I have tried running the statement in Query Analyzer and the syntax is correct; however, the execution time is now up to 00:23:40 and still counting... so there is obviously a problem here.

I get the same error whether I run the statement over the network or locally on the server, so the link to the server does not appear to be the problem.

Is there a limit to the number of columns a table in SQL Server 2000 can have? This Policy table currently has 117 columns.

Incidentally, I got the same error when doing a search of these forums:
Microsoft OLE DB Provider for SQL Server error '80040e31'
Timeout expired
/forums/search.asp, line 381


Cheers
Stuart

timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-12 : 21:18:17


There is a limit of 1024 columns per table in SQL Server 2000, so that shouldn't be a problem. You may be hitting the byte limit, though: the maximum row length is 8060 bytes.


ContACT1
Starting Member

5 Posts

Posted - 2004-07-14 : 02:01:52
Thanks timmy

I will look into this. Is there an easy way to check the row length?


Cheers
Stuart

Kristen
Test

22859 Posts

Posted - 2004-07-14 : 04:04:35
The row length won't be the problem - SQL lets you create varchar columns that exceed the 8060 limit, but will complain if you try to store more BYTES in a ROW than that limit. (This COULD be a problem if you are already AT the 8060 limit, because this new field will need a byte or two to store the NULLs on each row - I think! - but that seems long odds to me.)
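
If you want to check, here's a quick way to see the defined row size (a rough sketch using the SQL 2000 system tables - syscolumns.length is the storage size in bytes, so an nvarchar(255) counts as 510):

-- add up the defined (maximum) byte length of every column in Policy
SELECT COUNT(*)      AS column_count,
       SUM(c.length) AS defined_row_bytes
FROM   syscolumns c
WHERE  c.id = OBJECT_ID('Policy')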

I think your problem is distinctly odd. *IF* you were adding a NOT NULL column with a DEFAULT value, then that default would have to be applied to every existing row in the table, and SQL would take a while to do that. But you have not said "NOT NULL", so I assume SQL is defaulting to "NULL", and in that case I don't think it actually updates any rows.

I suppose you could prove this by trying
ALTER TABLE Policy ADD thisisatest nvarchar(255) NULL
instead.
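
For comparison, the slow case would look something like this (a sketch only - the column name and default are made up):

-- every existing row gets the default value written to it,
-- so SQL has to touch the whole table
ALTER TABLE Policy ADD anothertest nvarchar(255) NOT NULL DEFAULT N''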

My only other thought is to ask: is your LOG file huge? Have a look in the x:\MSSQL\DATA folder and see what the relative sizes of your .MDF and .LDF are. If the .LDF is huge, that could be the problem. (Or right-click the database in Enterprise Manager and check the PROPERTIES to see the size of the Data and Log files.)

Kristen

Wanderer
Master Smack Fu Yak Hacker

1168 Posts

Posted - 2004-07-14 : 04:35:55
Hmm - possibly you are actually timing out while trying to get a schema lock on the table to make the change? How busy is the server when you are doing this change?
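
You could check for blocking while the ALTER is running - a rough sketch against the SQL 2000 system tables:

-- list the sessions that are blocked, and which spid is blocking them
SELECT spid, blocked, lastwaittype, cmd, hostname
FROM   master..sysprocesses
WHERE  blocked <> 0

-- or simply: EXEC sp_who2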



*##* *##* *##* *##*

Chaos, Disorder and Panic ... my work is done here!

ContACT1
Starting Member

5 Posts

Posted - 2004-07-14 : 06:17:15
Kristen:
The data file is 50MB and the log file is 292MB, so still very small for what SQL Server should handle. This application logs every transaction for auditing, so it is envisaged that it will grow very fast once in operation.
I tried using
ALTER TABLE Policy ADD thisisatest nvarchar(255) NULL
however, the same problem still occurs.

Wanderer:
This application is still in development and has been for the last two years (which is starting to feel like an eternity), so there are only a couple of developers and a couple of testers using the server at this stage. I don't think traffic should be a problem; in any case, I tested the statement above when I was the only user connected, and I was at the local console at the time.

Kristen
Test

22859 Posts

Posted - 2004-07-14 : 08:03:22
I would suggest you kill the logs and try again. 300MB for a 50MB DB is a lot ...

Are you doing transaction log backups every few minutes or so [at a maximum every hour] (i.e. to be able to recover to a point in time)? If you don't need that ability [for development], I suggest you change the Recovery Model to SIMPLE - that will stop the logs being retained.

ALTER DATABASE MyDataBase SET RECOVERY SIMPLE

(I don't know if you need to do anything other than that to kill off the current log; a Database Shrink might be necessary.)
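
The shrink step would look something like this (a sketch - the logical log file name is a guess, check it with EXEC sp_helpfile):

USE MyDataBase
-- shrink the log file down to roughly 10 MB
DBCC SHRINKFILE (MyDataBase_Log, 10)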

Kristen

Wanderer
Master Smack Fu Yak Hacker

1168 Posts

Posted - 2004-07-14 : 08:15:53
Here's an off-the-wall thought:

Go on the box, open the Client Network Utility, and check whether "Enable shared memory protocol" is ticked (it is on the General tab).

If this is off (the default is on), then even local traffic uses the TCP/IP stack (as I understand it). Enabling it lets local processes use shared memory and avoid the "network" stack. (I am sure I got some of these terms wrong - trying to remember from the last time we saw this.)



*##* *##* *##* *##*

Chaos, Disorder and Panic ... my work is done here!

ContACT1
Starting Member

5 Posts

Posted - 2004-08-04 : 06:41:32
I was not able to find the exact cause of the problem; however, I implemented a few of the ideas suggested here, such as shrinking the log file.

A lot of the columns in the database are nvarchar, as they are user-defined and the DB is altered by the application. I have modified the application to force the user to specify the length and to avoid using nvarchar where possible; however, the row length of the Policy table is still greater than the maximum of 8060 bytes.

The application is no longer timing out.

Thank you everyone for your help.

Cheers
Stuart


Wanderer
Master Smack Fu Yak Hacker

1168 Posts

Posted - 2004-08-04 : 10:20:13
Stuart,

I assume that you are dealing with other code pages, since you want to allow NVARCHAR data. Having said that - make sure you really need nvarchars. If you have to store extended character sets, then you have to - no bones about it - but remember that nvarchar is slower and bigger (... hm - it seems you have already seen that, so I guess you knew that).

If your row is over 8060 bytes, whether due to the use of necessary nvarchars or varchar data, I think you might want to SERIOUSLY review your DB design. I guess there can be entities that have 116 distinct elements ... I just can't think of any off hand. It sounds very much like a highly denormalized table that might benefit from a good dose of normalization.
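
For example, the user-defined fields could move out into rows instead of columns - a rough sketch (table and column names made up):

-- one row per policy per user-defined field, instead of one column each
CREATE TABLE PolicyUserField (
    PolicyID   int           NOT NULL,  -- points back to Policy
    FieldName  nvarchar(128) NOT NULL,  -- the user-defined field's name
    FieldValue nvarchar(255) NULL,      -- its value for this policy
    CONSTRAINT PK_PolicyUserField PRIMARY KEY (PolicyID, FieldName)
)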

Cheers

*##* *##* *##* *##*

Chaos, Disorder and Panic ... my work is done here!
   
