
 select top ranking record...


dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-23 : 17:34:06
I am using RANK() with PARTITION BY to determine the 'newest' record in each group. Now I want to select only the records where newest = 1. How?

select *, rank() over (partition by [num-idoc] order by extractdate desc) as newest from studentrecords

num-idoc  extractdate  newest
A3211     2/1/2010     1
A3211     1/1/2010     2
A3211     12/1/2009    3
A3122     2/1/2010     1

The resulting set should be just the two records where newest = 1.

dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-23 : 17:39:11
How about this solution?

with rankeddata as
(
    select *,
           rank() over (partition by [num-idoc] order by extractdate desc) as newest
    from studentdirectory
)
select * from rankeddata where newest = 1

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-03-24 : 00:32:20
It will work as long as you don't have multiple records with the same extractdate for the same [num-idoc] value. Are you sure you won't have such a scenario? In case you do have multiples, do you want to return all of them or just one?

Also, you don't require a CTE in the above case; a derived table would be fine:


select *
from
(
    select *,
           rank() over (partition by [num-idoc] order by extractdate desc) as newest
    from prisonadmissionsmerged
) t
where newest = 1
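If ties on extractdate are possible and only one row per group is wanted, a minimal sketch is to switch from RANK() to ROW_NUMBER() with an extra tie-breaker column in the ORDER BY. The table and column names below are the ones used in this thread; the StudentID tie-breaker is an assumption.

```sql
-- Sketch: ROW_NUMBER() guarantees exactly one row per [num-idoc],
-- even when two rows share the same extractdate. The extra tie-breaker
-- column (StudentID here, an assumed column) makes the choice deterministic.
select *
from
(
    select *,
           row_number() over (partition by [num-idoc]
                              order by extractdate desc, StudentID desc) as newest
    from studentdirectory
) t
where newest = 1
```

With RANK(), tied rows all get rank 1 and are all returned; ROW_NUMBER() numbers them 1, 2, 3, ... so the filter keeps exactly one.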


------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-24 : 09:10:11
Thank you for the warning. I will have to research whether there will be more than one record in the update table for each student number. If there is, I really will not know which record to treat as the most current, so it is a bit of a problem in any case.

Your solution also returns the same records. I think yours is a bit more general.

thank you for the help!

Update: by the nature of the source application for the update file, it does NOT keep history. Every week when I get an extract file there will, by definition, be only one record per key ID (student ID). Apparently they keep an updated date as a flag to know whether to include the record in the weekly extract.

again, thanks for the help!

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-03-24 : 11:33:53
You're welcome.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-24 : 14:45:22
I split the primary table into related tables. I think I need a left join to bring them back together in the query? How would I change the ranking script to include a join?

e.g. from StudentDirectory, I split out DormAssignments:

from studentdirectory a left join DormAssignments b on a.[num-idoc] = b.[num-idoc] -- this should give me all records in studentdirectory, adding columns from DormAssignments

.......

select *
from
(
    select *,
           rank() over (partition by [num-idoc] order by extractdate desc) as newest
    from studentdirectory
) t
where newest = 1
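One way to combine the two, sketched below, is to put the LEFT JOIN inside the derived table so the ranking is computed over the joined rows. Table and key names are those from this thread; DormName is a hypothetical column standing in for whatever DormAssignments actually holds.

```sql
-- Sketch: join first, then rank the joined rows.
-- Explicit column list avoids duplicate-name errors from SELECT *.
select *
from
(
    select a.[num-idoc],
           a.extractdate,
           b.DormName,   -- hypothetical column from DormAssignments
           rank() over (partition by a.[num-idoc]
                        order by a.extractdate desc) as newest
    from studentdirectory a
    left join DormAssignments b
        on a.[num-idoc] = b.[num-idoc]
) t
where newest = 1
```

Note that if a student has multiple DormAssignments rows, the join multiplies the rows before ranking, so ties can reappear; the earlier ROW_NUMBER() tie-break discussion applies here too.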

dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-25 : 10:37:05
Got it. My problem was the SELECT *: it brought in columns with the same name from both tables (such as the 'Entered' datetime stamp column).

The solution is to list the column names explicitly in that second SELECT statement, to eliminate any duplicate columns.

The error being thrown by the SELECT * was:

The column 'StudentID' was specified multiple times for 'z'.
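A minimal sketch of the fix: when both tables carry a StudentID or Entered column, alias one of the duplicates (or select only one) so every column in the derived table has a unique name. Column names below follow the thread; DormAssignments.Entered is an assumed duplicate.

```sql
-- Sketch: alias the second copy of each duplicated column so the
-- derived table 'z' has unique column names.
select z.*
from
(
    select a.StudentID,
           a.[num-idoc],
           a.extractdate,
           a.Entered,
           b.Entered as DormEntered,  -- renamed to avoid the duplicate
           rank() over (partition by a.[num-idoc]
                        order by a.extractdate desc) as newest
    from studentdirectory a
    left join DormAssignments b
        on a.[num-idoc] = b.[num-idoc]
) z
where z.newest = 1
```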


visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-03-25 : 12:24:59
quote:
Originally posted by dlorenc

Got it. My problem was the SELECT *: it brought in columns with the same name from both tables (such as the 'Entered' datetime stamp column).

The solution is to list the column names explicitly in that second SELECT statement, to eliminate any duplicate columns.

The error being thrown by the SELECT * was:

The column 'StudentID' was specified multiple times for 'z'.




It's pretty obvious, isn't it? Just rename one of the instances to something else, or include only one of them.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-25 : 12:44:39
That one made me laugh, my friend....

It was easier to use SELECT * than to list the 250 fields explicitly... ah well...

I did include only one of the fields, so no duplicates!

DBA in the making
Aged Yak Warrior

638 Posts

Posted - 2010-03-25 : 12:49:11
If you've got 250 columns spread across 2 tables, then your tables probably need splitting.

There are 10 types of people in the world, those that understand binary, and those that don't.

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-03-25 : 12:51:03
quote:
Originally posted by dlorenc

That one made me laugh, my friend....

It was easier to use SELECT * than to list the 250 fields explicitly... ah well...

I did include only one of the fields, so no duplicates!


Using SELECT * is bad practice, both from a clarity point of view and from a performance point of view.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


dlorenc
Posting Yak Master

172 Posts

Posted - 2010-03-25 : 12:53:56
309 fields over 23 tables... yes, splitting them into functional tables is excellent advice. I use ER diagrams for the initial schema design. As a business-analyst type, the concept of which data is 'owned' by which functional role (structured as a table) makes sense to me.

Thank you all for your help!
   
