Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 count confusion!


jazzym
Yak Posting Veteran

94 Posts

Posted - 2002-11-19 : 07:07:57
Hi,

I'm trying to write a view that will count the number of records per day in my table. Basically it's a log table of when users logged into my site.

I have a time_stamp field which stores both date and time (users can obviously log in more than once a day, so the time is relevant).

The problem is it's grouping records by time and date, so obviously I'm getting lots of rows with a count of 1 in them.

I'm trying to find a way to strip the time_stamp field down to just the date so I can group by that.

Any ideas? Much appreciated.

cheers

Matt Preston

burbakei
Yak Posting Veteran

80 Posts

Posted - 2002-11-19 : 08:18:50
You can group by convert(datetime, time_stamp, 101) to eliminate the time part.
But remember that the timestamp data type has little in common with datetime; see BOL for more
information.


AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2002-11-19 : 08:48:54
Remember to include the "convert(datetime, time_stamp, 101)" expression in your SELECT....the expression you GROUP BY should also appear in the SELECT list so you can see which date each count belongs to....


jazzym
Yak Posting Veteran

94 Posts

Posted - 2002-11-20 : 04:28:49
Sorry, but that code doesn't remove the time. Not on my server, anyway. I even tried

SELECT CONVERT(datetime, getdate(), 101) AS date

in SQL Query Analyzer, and that too gives me both the date and the time.


AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2002-11-20 : 05:13:55
SELECT CONVERT(char(10), getdate(), 103)

103 = UK format for dates
101 = US format for dates


Look up CONVERT in BOL....you'll learn loads.
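
Plugged into the grouping query, that would look something like this (a sketch; the login_log table and time_stamp column are assumed names, so substitute your own):

select convert(char(10), time_stamp, 103) as login_date,
       count(*) as logins
from login_log
group by convert(char(10), time_stamp, 103)

Because converting to char(10) keeps only the first 10 characters of the string form (dd/mm/yyyy with style 103), the time portion is dropped and each day collapses into a single row.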



jazzym
Yak Posting Veteran

94 Posts

Posted - 2002-11-20 : 05:22:14
Still a slight problem: now all my dates are text, not dates, so I can't sort them into order. I now need to convert the conversion back to a date type!

Hence why I called this topic "confusion".

My table holds a log of every member that logged into my site and when. You can log in more than once a day, so the time is relevant. I just want to output a total for each day, newest date first, working back in time.

There must be an easier way?!!?!?


Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2002-11-20 : 07:58:10
DATEADD(d, DATEDIFF(d, 0, time_stamp), 0)
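
Used for the daily totals, it might look like this (login_log and time_stamp are assumed names; substitute your own):

select dateadd(d, datediff(d, 0, time_stamp), 0) as login_day,
       count(*) as logins
from login_log
group by dateadd(d, datediff(d, 0, time_stamp), 0)
order by login_day desc

DATEDIFF(d, 0, time_stamp) counts whole days from day 0 (1900-01-01), and DATEADD adds that many days back onto day 0, which snaps each value to midnight while keeping it a datetime, so the result both groups and sorts correctly.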




AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2002-11-20 : 13:09:55
You don't "need" to change them to a date type...(but you can if you want).


However, change the date format to yyyy-mm-dd....and that will guarantee they sort correctly, because the most significant part of the date comes first....



Again: "Look up CONVERT in BOL....you'll learn loads."
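
As a sketch with CONVERT style 120, which yields yyyy-mm-dd when truncated to 10 characters (login_log and time_stamp are assumed names):

select convert(varchar(10), time_stamp, 120) as login_date,
       count(*) as logins
from login_log
group by convert(varchar(10), time_stamp, 120)
order by login_date desc

The values sort correctly even though they are strings, because the year comes before the month and the month before the day.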


Bambola
Posting Yak Master

103 Posts

Posted - 2002-11-20 : 13:52:11
Actually there is. It's something I found out recently and have been using ever since.

select convert(datetime, cast(cast(date_col as float) as int)),
count(id_col)
from my_table with (nolock)
group by convert(datetime, cast(cast(date_col as float) as int))

The idea is this:
SQL Server holds a datetime as two 4-byte integers. Cast to float, the part left of the dot represents the number of days since 1900-01-01 and the part right of the dot represents the time elapsed since midnight (internally stored as 1/300-second ticks, not milliseconds). To group by day you have to get rid of the time part (what's right of the dot).

It runs much faster than convert(varchar(10), date_field, 120) and gives accurate results. It also leaves you with datetime values. Test it and see for yourself.

HTH


quote:

Still a slight problem: now all my dates are text, not dates, so I can't sort them into order. I now need to convert the conversion back to a date type!

Hence why I called this topic "confusion".

My table holds a log of every member that logged into my site and when. You can log in more than once a day, so the time is relevant. I just want to output a total for each day, newest date first, working back in time.

There must be an easier way?!!?!?





Bambola.

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-11-20 : 13:58:26
quote:
Actually there is. It's something I found out recently and have been using ever since.

select convert(datetime, cast(cast(date_col as float) as int)),
count(id_col)
from my_table with (nolock)
group by convert(datetime, cast(cast(date_col as float) as int))

The idea is this:
SQL Server holds a datetime as two 4-byte integers. Cast to float, the part left of the dot represents the number of days since 1900-01-01 and the part right of the dot represents the time elapsed since midnight (internally stored as 1/300-second ticks, not milliseconds). To group by day you have to get rid of the time part (what's right of the dot).

It runs much faster than convert(varchar(10), date_field, 120) and gives accurate results. It also leaves you with datetime values. Test it and see for yourself.
Arnold's expression is still better:

DATEADD(d, DATEDIFF(d, 0, time_stamp), 0)

It doesn't CONVERT anything. Float is an approximate type, so if you need to maintain a date to a certain precision, you won't be able to do so after CONVERTing to float.


Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2002-11-20 : 15:22:14
Doesn't go wrong before 1900, either:

DECLARE @olddate datetime
SET @olddate = '18990101 10:00:00'

SELECT CONVERT(datetime, CAST(CAST(@olddate AS float) AS int)) AS dt
UNION ALL
SELECT DATEADD(d, DATEDIFF(d, 0, @olddate), 0)

 
Result:

dt
------------------------------------------------------
1899-01-02 00:00:00.000
1899-01-01 00:00:00.000




Bambola
Posting Yak Master

103 Posts

Posted - 2002-11-21 : 03:26:16
First of all, since we were talking about a column that holds the datetime at which a user logged into the site, I assumed we would not be needing dates before 1900-01-01. :-)

Second, I was talking about dealing with dates already stored in SQL Server.

The idea is that if you take a datetime column and cast it to float and then to int, you are left with the number of days since 1900-01-01, and that's all you need to group by day.

What you are avoiding here is the algorithm SQL Server uses to convert the datetime, which is actually held as two 4-byte integers, back to a displayable datetime, something that definitely "costs" more.

Run this and you will see it gives accurate results.

declare @x datetime
declare @t table (base_date datetime, date_as_float datetime)
declare @i int

select @x = '1900-01-01 08:00:10.000'
select @i = 1
while @i <= 10
begin
    -- bump the time by a few milliseconds each pass
    select @x = dateadd(millisecond, @i, @x)
    -- store the original value next to its float round-trip
    insert into @t
    select @x, convert(datetime, cast(@x as float))
    select @i = @i + 1
end
select * from @t

As you can see, there's nothing approximate about it.

Now run the initial query I posted on a big table; the performance improvement is surprising, especially when the column is not indexed.

Check it out guys, it is worth the effort. :-)



quote:

quote:
Actually there is. It's something I found out recently and have been using ever since.

select convert(datetime, cast(cast(date_col as float) as int)),
count(id_col)
from my_table with (nolock)
group by convert(datetime, cast(cast(date_col as float) as int))

The idea is this:
SQL Server holds a datetime as two 4-byte integers. Cast to float, the part left of the dot represents the number of days since 1900-01-01 and the part right of the dot represents the time elapsed since midnight (internally stored as 1/300-second ticks, not milliseconds). To group by day you have to get rid of the time part (what's right of the dot).

It runs much faster than convert(varchar(10), date_field, 120) and gives accurate results. It also leaves you with datetime values. Test it and see for yourself.
Arnold's expression is still better:

DATEADD(d, DATEDIFF(d, 0, time_stamp), 0)

It doesn't CONVERT anything. Float is an approximate type, so if you need to maintain a date to a certain precision, you won't be able to do so after CONVERTing to float.





Bambola.

Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2002-11-21 : 06:37:32
quote:

I assumed we would not be needing dates before 1900-01-01.


Of course you did.

You're right, inasmuch as
(a) it's fast
(b) it works for all values >= 1900-01-01

However, you're wrong inasmuch as there are datetime values dt where dt <> CONVERT(datetime, CAST(dt AS float)):

SELECT dt, CONVERT(datetime,CAST(dt AS float))
FROM (SELECT CONVERT(datetime, '1900-01-02 23:59:59.997') AS dt) a




robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-11-21 : 07:45:33
quote:
First of all, since we were talking about a date that holds the datetime user logged into that site, I assumed we would not be needing dates before 1900-01-01. :-)

Second, I was talking about dealing with dates already stored on SQL...

Check it guys, it is worth the effort.
OK, in the meantime, I'd like to ask that you check something on SQL Team yourself.

Search the forums for '1-1-1900' or 'empty date' and read the posts. Many people mistakenly assume that passing an empty string is the same as a NULL date value; there are probably over 100 posts on how to fix that. So not only can you "assume" that there ARE dates in that range in existing data, you can guarantee it.

No one was criticizing your formula. But there are considerations besides pure performance, and if precision is lost in a conversion, it can compromise the accuracy of the results being returned. If it can be done without conversion and without loss of precision, then it should be done that way.


Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2002-11-21 : 08:01:06
Rob, when I said "it works for all values >= 1900-01-01", I meant it.
I tested it for every one of the 2,958,464 days up to 9999-12-31, with time values of 00:00:00.000 and 23:59:59.997.



Merkin
Funky Drop Bear Fearing SQL Dude!

4970 Posts

Posted - 2002-11-21 : 08:20:45
You have waaaaaaay too much time on your hands

Damian

Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2002-11-21 : 08:29:11
7 seconds.
Actually, we had an amusing conversation yesterday in the office about one of my cow-orker's friends who had to put a quote on the end of 10000 cell values in an Excel spreadsheet and was expected to do it manually -- last we heard she was up to 7000.



Edited by - Arnold Fribble on 11/21/2002 08:32:18

ValterBorges
Master Smack Fu Yak Hacker

1429 Posts

Posted - 2002-11-21 : 08:40:03
Arnold,

I think I've seen you write

DATEADD(d, DATEDIFF(d, 0, time_stamp), 0)

about 10 times in the past week.

Maybe it should be in an article.





Edited by - ValterBorges on 11/21/2002 08:40:55
   
