SQL Server Forums
 size of user tables in a database

nishithrn
Yak Posting Veteran

India
58 Posts

Posted - 11/04/2003 :  01:39:38
The following procedure will display the size of all the user tables in a database.

CREATE proc sp_tablesize
as
-- create the results table only if it does not already exist
if object_id('tempdb..#SpaceUsed') is not null
begin
goto calculate_tablesize
end
else
begin
CREATE TABLE #SpaceUsed
(
TableName sysname,
TableRows int,
TableSize varchar(18),
DataSpaceUsed varchar(18),
IndexSpaceUsed varchar(18),
UnusedSpace varchar(18)
)
goto calculate_tablesize
end

calculate_tablesize:
declare @tablename sysname
declare @cmd nvarchar(300)

-- loop over every user table and capture its sp_spaceused output
declare c1 cursor for select name from sysobjects where xtype = 'u'
open c1
fetch c1 into @tablename
while @@fetch_status = 0
begin
set @cmd = 'exec sp_spaceused [' + @tablename + ']'
insert into #SpaceUsed exec sp_executesql @cmd
fetch next from c1 into @tablename
end

select * from #SpaceUsed

drop table #SpaceUsed
deallocate c1
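
A quick usage sketch -- run it from the database where the proc was created (Northwind here is just an example); it takes no parameters:

USE Northwind   -- or whichever database you want to measure
GO
EXEC sp_tablesize
GO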

Stoad
Freaky Yak Linguist

*
1983 Posts

Posted - 11/04/2003 :  07:03:45
Works fine.
Only one thing... the sp_ prefix on the proc name. It's a bad idea.

Stoad
Freaky Yak Linguist

*
1983 Posts

Posted - 11/04/2003 :  07:38:00
... and where did the "close c1" before "deallocate c1" go?

nishithrn
Yak Posting Veteran

India
58 Posts

Posted - 11/05/2003 :  02:57:54
Thanks a lot for your suggestion Stoad...

nr
SQLTeam MVY

United Kingdom
12543 Posts

Posted - 11/05/2003 :  04:25:34
I always wonder why people create cursors for things like this.
Table names should be sysname, or at least nvarchar(128).
You might have a look at the information schema views too.

declare @tablename nvarchar(128) ,
@maxtablename nvarchar(128) ,
@cmd nvarchar(1000)


select @tablename = '', @maxtablename = max(name) from sysobjects where xtype='u'
while @tablename < @maxtablename
begin
select @tablename = min(name) from sysobjects where xtype='u' and name > @tablename
set @cmd='exec sp_spaceused ['+@tablename+']'
insert into #SpaceUsed exec sp_executesql @cmd
end
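
For the information schema route, a rough sketch of the same loop driven by INFORMATION_SCHEMA.TABLES instead of sysobjects (it still assumes the #SpaceUsed temp table from the proc above already exists):

declare @tablename nvarchar(128) ,
        @maxtablename nvarchar(128) ,
        @cmd nvarchar(1000)

select @tablename = '', @maxtablename = max(TABLE_NAME)
from INFORMATION_SCHEMA.TABLES
where TABLE_TYPE = 'BASE TABLE'
-- may also list system tables on some versions; add
-- OBJECTPROPERTY(OBJECT_ID(TABLE_NAME), 'IsMSShipped') = 0 if needed

while @tablename < @maxtablename
begin
    select @tablename = min(TABLE_NAME)
    from INFORMATION_SCHEMA.TABLES
    where TABLE_TYPE = 'BASE TABLE' and TABLE_NAME > @tablename

    set @cmd = 'exec sp_spaceused [' + @tablename + ']'
    insert into #SpaceUsed exec sp_executesql @cmd
end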


==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.

X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  14:32:24
quote:
Originally posted by nr

I always wonder why people create cursors for things like this.
Table names should be sysname or at least 128 bytes
You might have a look at the information schema views too.

declare @tablename nvarchar(128) ,
@maxtablename nvarchar(128) ,
@cmd nvarchar(1000)


select @tablename = '', @maxtablename = max(name) from sysobjects where xtype='u'
while @tablename < @maxtablename
begin
select @tablename = min(name) from sysobjects where xtype='u' and name > @tablename
set @cmd='exec sp_spaceused ['+@tablename+']'
insert into #SpaceUsed exec sp_executesql @cmd
end


==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.



You just blew away any reason I ever had for needing a cursor.....

Thanks....

Brett

8-)

tkizer
Almighty SQL Goddess

USA
37316 Posts

Posted - 11/05/2003 :  14:40:53
I've got a few "admin" scripts that use cursors. I'll have to check to see if they can be reworked so that they use this method. Thanks Nigel.

Even though you only run admin scripts every once in a while and they are not part of an application, it is still a good idea not to use cursors for the sake of good programming and performance.

Tara

X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  14:58:39
'cept that it doesn't work...

Got the cask next to the keyboard?



A little tweaking...and thanks again for the idea..

Question though (and I'll go test it). How many singleton selects have to be performed until they get outweighed by a cursor?



USE Northwind
GO

CREATE TABLE #SpaceUsed (				-- The Actual output sizes of sp_spaceused
							-- Anything else is a waste
			  [name] 	varchar(20)
			, [rows] 	char(11)
			, [reserved] 	varchar(15)
			, [data] 	varchar(15)
			, [index_size] 	varchar(15)
			, [unused] 	varchar(15)
)
GO

DECLARE   @tablename nvarchar(128)
	, @maxtablename nvarchar(128)
	, @cmd nvarchar(1000) 

SET NOCOUNT ON

 SELECT   @tablename = ''
	, @maxtablename = MAX(name) 
   FROM   sysobjects 
  WHERE   xtype='u'

WHILE @tablename < @maxtablename 
	BEGIN
		SELECT @tablename = MIN(name) 
		  FROM sysobjects 
		 WHERE xtype='u' and name > @tablename
	 
		   SET @cmd='exec sp_spaceused['+@tablename+']'
		INSERT INTO #SpaceUsed EXEC sp_executesql @cmd
	END

SELECT * FROM #SpaceUsed

SET NOCOUNT OFF
GO

DROP TABLE #SpaceUsed
GO





Brett

8-)

ehorn
Flowing Fount of Yak Knowledge

USA
1631 Posts

Posted - 11/05/2003 :  15:43:38
I believe the singleton always outperforms the cursor.

However, Beware The Lost Update...

When the selected row must be updated after it is retrieved,
using a cursor for update is recommended over a singleton SELECT.
If you use a singleton SELECT, the row can be updated by someone else after
the singleton SELECT but before the subsequent UPDATE, thereby causing the intermediate
modification to be lost.
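
For illustration, a rough sketch of the exposure and one way around it without a cursor (Inventory/item_id/qty are made-up names; the UPDLOCK hint is shown as an alternative to declaring a cursor FOR UPDATE):

-- exposed to the lost update: another session can change the row
-- between the SELECT and the UPDATE
declare @qty int
select @qty = qty from Inventory where item_id = 42
update Inventory set qty = @qty - 1 where item_id = 42

-- safer: hold an update lock from the read through the write
begin tran
select @qty = qty from Inventory with (updlock) where item_id = 42
update Inventory set qty = @qty - 1 where item_id = 42
commit tran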

Edited by - ehorn on 11/05/2003 15:49:14

X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  15:57:29
quote:
Originally posted by ehorn

I believe the singleton always outperforms the cursor.



What if you have to do a million of them?

Or maybe something not that crazy?

The cursor will bring back whole pages...



Brett

8-)

ehorn
Flowing Fount of Yak Knowledge

USA
1631 Posts

Posted - 11/05/2003 :  15:58:40
A CURSOR requires an OPEN, FETCH, and CLOSE to retrieve one row, whereas a singleton SELECT requires only SELECT…INTO.

Makes for an interesting examination...

Edited by - ehorn on 11/05/2003 16:00:18

X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  16:09:07
quote:
Originally posted by ehorn

A CURSOR requires an OPEN, FETCH, and CLOSE to retrieve one row


uhhh...no

A CURSOR requires one OPEN and CLOSE (you forgot DEALLOCATE) to instantiate (and de-instantiate... is that really a word?) a cursor (oops... don't forget the DECLARE)

And a fetch for each and every row, from pages that are in memory..

No more rows on the page, fetch another page..(do they have sequential prefetch in SQL server?)
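
A bare-bones skeleton of that lifecycle, just to put names on the steps (looping over user table names as a throwaway example):

DECLARE @name sysname

DECLARE c CURSOR FAST_FORWARD FOR      -- one DECLARE
	SELECT name FROM sysobjects WHERE xtype = 'U'

OPEN c                                 -- one OPEN
FETCH NEXT FROM c INTO @name           -- then one FETCH per row
WHILE @@FETCH_STATUS = 0
BEGIN
	PRINT @name
	FETCH NEXT FROM c INTO @name
END

CLOSE c                                -- one CLOSE
DEALLOCATE c                           -- one DEALLOCATE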






Brett

8-)

ehorn
Flowing Fount of Yak Knowledge

USA
1631 Posts

Posted - 11/05/2003 :  16:14:14
Sounds like a lot of overhead to me.

It seems logical that if there is more overhead in retrieving one row, then there will be more overhead in retrieving pages of data.

X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  16:17:15
quote:
Originally posted by ehorn

Sounds like a lot of overhead to me.

It seems logical that if there is more overhead in retrieving one row, then there will be more overhead in retrieving pages of data.



You don't consider a million singleton selects (ok 250,000) a lot of overhead?

I gotta test it out...let you know...



Brett

8-)

ehorn
Flowing Fount of Yak Knowledge

USA
1631 Posts

Posted - 11/05/2003 :  16:22:41
quote:

You don't consider a million singleton selects (ok 250,000) a lot of overhead?



OK.. OK...

More overhead than 250,000 cursor fetches?

Time to bust out the showplan.



X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  16:28:01
quote:
Originally posted by ehorn

quote:

You don't consider a million singleton selects (ok 250,000) a lot of overhead?



OK.. OK...

More overhead than 250,000 cursor fetches?

Time to bust out the showplan.






Remember now...you have to use a WHILE loop 250,000 times...

No set-based anything...



Brett

8-)

Stoad
Freaky Yak Linguist

*
1983 Posts

Posted - 11/05/2003 :  16:32:28
LOL, ehorn... don't get hypnotized by the cursor theme. One example for you:
suppose you have two tables. Both tables store coordinates (x, y) of points.
In the 1st table there are, say, 5 million rows, and in the 2nd about 500 rows.

Now you need to find, for each point in the small table, the nearest point in the huge table...

You can't even imagine how many times faster a cursor solution will be than ANY non-cursor one!!

OK. Don't be lazy and just check it out.

X002548
Not Just a Number

15586 Posts

Posted - 11/05/2003 :  16:39:29
quote:
Originally posted by Stoad

LOL, ehorn... don't get hypnotized by the cursor theme. One example for you:
suppose you have two tables. Both tables store coordinates (x, y) of points.
In the 1st table there are, say, 5 million rows, and in the 2nd about 500 rows.

Now you need to find, for each point in the small table, the nearest point in the huge table...

You can't even imagine how many times faster a cursor solution will be than ANY non-cursor one!!

OK. Don't be lazy and just check it out.




No. You did not read.

We are discussing how to attack a problem without the aid of set-based processing.

Nigel came up with a great solution (a non-cursor-based one) for some admin functions.

The question was....

quote:

How many singleton selects have to be performed until they get outweighed(out-performed) by a cursor?


In other words, when is a cursor more efficient than looping through singleton SELECTs?

I know you're a bright guy, but ya gotta read the whole thread...




Brett

8-)

Edited by - X002548 on 11/05/2003 16:42:39

ehorn
Flowing Fount of Yak Knowledge

USA
1631 Posts

Posted - 11/05/2003 :  16:57:56
AND THE WINNER IS!!

oh, I don't know yet....

I am still waiting for the 250,000 fetches on my P3
Gotta get a new PC

come on... come on...

I'm going to have dinner..

maybe it will be finished when I get back.

Edited by - ehorn on 11/05/2003 16:59:28

robvolk
Most Valuable Yak

USA
15683 Posts

Posted - 11/05/2003 :  18:56:01
quote:
Originally posted by Stoad

LOL, ehorn... don't get hypnotized by the cursor theme. One example for you:
suppose you have two tables. Both tables store coordinates (x, y) of points.
In the 1st table there are, say, 5 million rows, and in the 2nd about 500 rows.

Now you need to find, for each point in the small table, the nearest point in the huge table...

You can't even imagine how many times faster a cursor solution will be than ANY non-cursor one!!

OK. Don't be lazy and just check it out.
Post some DDL and example output and we'll take (another) crack at it using set-based operations. Betcha we'll beat the cursor (again).

You need more faith, Stoad.
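
For the record, the set-based shape of that problem is roughly a cross join plus a MIN over the distance -- a sketch with made-up tables SmallPoints(id, x, y) and BigPoints(id, x, y); it returns only the nearest squared distance per small-table point (joining back to the matching big-table row takes one more step):

SELECT	s.id,
	MIN((s.x - b.x) * (s.x - b.x) + (s.y - b.y) * (s.y - b.y)) AS nearest_dist_sq
FROM SmallPoints s
CROSS JOIN BigPoints b		-- 500 x 5,000,000 comparisons, so indexing matters
GROUP BY s.id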

ehorn
Flowing Fount of Yak Knowledge

USA
1631 Posts

Posted - 11/05/2003 :  19:18:47
Alright, someone will have to see how this scales to 250,000 recs
'cause my POS ain't gonna do it...

I originally had the data column at CHAR(4000) to push out more pages, but
I need more space on my box, so I scaled back to CHAR(400) with 10,000 recs for these numbers.

Would like to see numbers with data >= CHAR(4000) and >= 250,000 recs if anyone can run them....

EDIT: Maybe this should be a new post....
We've gone off the topic a bit...


/*----------------------------------------------------------
 Sample Data (assumes an existing Numbers/tally table with
 at least 10,000 sequential integers in column n)
---------------------------------------------------------*/

CREATE TABLE TestData (n INT IDENTITY(0,1) PRIMARY KEY, data CHAR(400) NOT NULL )
GO

SET NOCOUNT ON
INSERT INTO TestData (data) SELECT REPLICATE('BLAH', 100)
FROM Numbers
WHERE n < 10000
GO

CREATE TABLE OutData (n INT, data CHAR(400) )
GO

/*----------------------------------------------------------
 Cursor Insert
---------------------------------------------------------*/
SET NOCOUNT ON

DECLARE @MaxRows int
SET @MaxRows = 10000

DECLARE @n int,@data char(400)

DECLARE number_cursor CURSOR --FAST_FORWARD
FOR
SELECT n FROM Numbers WHERE n <= @MaxRows

OPEN number_cursor

FETCH NEXT FROM number_cursor INTO @n  

WHILE @@FETCH_STATUS = 0
BEGIN
	INSERT INTO OutData 
	SELECT n,data FROM TestData WHERE n = @n

	FETCH NEXT FROM number_cursor INTO @n
END

CLOSE number_cursor
DEALLOCATE number_cursor
GO

DELETE OutData
GO


/*----------------------------------------------------------
 Singleton Insert
---------------------------------------------------------*/
SET NOCOUNT ON

DECLARE @MaxRows int
SET @MaxRows = 10000

DECLARE @n INT,@data CHAR(400)
SET @n = 0
WHILE @n <= @MaxRows
BEGIN
	INSERT INTO OutData 
	SELECT n,data FROM TestData WHERE n = @n

	SET @n = @n+1
END
GO

DROP TABLE TestData
DROP TABLE OutData

/*
----------------------------------------------------------------------------------------------------------------------------------------
			Estimate Rows	 Estimate IO	Estimate CPU	  AvgRowSize	TTLSubTreCost	    Duration
----------------------------------------------------------------------------------------------------------------------------------------
Cursor			30003.3010000	0.1787737479	0.1101602000	438.0000000000	0.3121736240  		1:03     
----------------------------------------------------------------------------------------------------------------------------------------

  |--Dynamic Cursor(number_cursor, Optimistic)
       |--Fetch Query(Fetch Query)
       |    |--Table Scan(OBJECT:([Common].[dbo].[Numbers]), WHERE:([Numbers].[n]<=10000))
       |--Refresh Query(Refresh Query)
 OPEN number_cursor
 FETCH NEXT FROM number_cursor INTO @n
 WHILE @@FETCH_STATUS = 0
   BEGIN   INSERT INTO OutData    SELECT n,data FROM TestData WHERE n = @n
       |--Table Insert(OBJECT:([Common].[dbo].[OutData]), SET:([OutData].[data]=[TestData].[data], [OutData].[n]=[TestData].[n]))
            |--Clustered Index Seek(OBJECT:([Common].[dbo].[TestData].[PK__TestData__50FB042B]), SEEK:([TestData].[n]=[@n]) ORDERED FORWARD)
   FETCH NEXT FROM number_cursor INTO @n
 END    CLOSE number_cursor
 DEALLOCATE number_cursor

----------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------
			Estimate Rows	 Estimate IO	Estimate CPU	  AvgRowSize	TTLSubTreCost	    Duration
----------------------------------------------------------------------------------------------------------------------------------------
Singleton 		3.0000000000	0.0167507979	0.0000806000	415.0000000000	0.0400710940	      1:01
----------------------------------------------------------------------------------------------------------------------------------------

DECLARE @n INT,@data CHAR(400)  SET @n = 0
 WHILE @n <= 10000
   BEGIN   INSERT INTO OutData    SELECT n,data FROM TestData WHERE n = @n     SET
       |--Table Insert(OBJECT:([Common].[dbo].[OutData]), SET:([OutData].[data]=[TestData].[data], [OutData].[n]=[TestData].[n]))
            |--Clustered Index Seek(OBJECT:([Common].[dbo].[TestData].[PK__TestData__50FB042B]), SEEK:([TestData].[n]=[@n]) ORDERED FORWARD)
   @n = @n+1
 END

*/

Edited by - ehorn on 11/06/2003 07:53:18