I'm writing VB 2010 / .NET 4 code against a SQL Server 2008 database. My program will display mainframe CPU usage for each fifteen-minute interval in the day and allow the user to move from one day to the next or previous, or select a date from a calendar, back to 7/1/2010. I would very much like to go to the database once for ALL the data (96 intervals per day, for almost three years, * 12 computers) and load it into a data table or array, then retrieve data from there each time the user changes the date. Should I be concerned with memory limits? Is there a rule of thumb on how large a data table(s) can get before resorting to frequent database calls? A number of people will use this application; how does physical memory impact this decision? Any references to other reading on the subject would be greatly appreciated. TIA John
You'd have about 421K rows per year's worth of data for 12 machines (96 × 365 × 12 = 420,480), so almost three years is roughly 1.26 million rows. That's not unreasonable for a DataTable object, but it really depends on how big each row will be: at 100 bytes each, one year is about 42 MB and the full range about 126 MB. If each row will be bigger than that, you'll have to test to see if it uses too much memory for your application.
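A quick back-of-envelope sketch of that sizing in VB (the 100-bytes-per-row figure is an assumption you'd want to verify against your actual row layout):

```vb
' Rough sizing estimate: 96 fifteen-minute intervals/day * 12 machines,
' from the 7/1/2010 epoch to today, at an ASSUMED ~100 bytes per DataTable row.
Dim epoch As Date = #7/1/2010#
Dim days As Integer = CInt((Date.Today - epoch).TotalDays)
Dim rows As Long = CLng(days) * 96L * 12L
Dim approxMB As Double = rows * 100.0 / (1024.0 * 1024.0)
Console.WriteLine(String.Format("{0:N0} rows, roughly {1:N0} MB", rows, approxMB))
```

If the real per-row cost (including DataRow overhead, which is not small) comes out much higher, that pushes you toward querying per day instead of caching everything.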
If you index your tables correctly, querying 1152 rows (96 slots x 12 computers) will take very little time. Personally I'd go with that option unless I had hundreds of users hitting the data simultaneously.
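To make the per-day option concrete, here's a minimal ADO.NET sketch of that query. The table and column names (`CpuUsage`, `SampleTime`, `MachineId`, `Utilization`) are placeholders for whatever your schema actually uses:

```vb
' Fetch one day's 1,152 rows on demand. With an index on
' (SampleTime, MachineId) this should return almost instantly.
Function LoadDay(ByVal connectionString As String, ByVal selectedDate As Date) As DataTable
    Using cn As New SqlClient.SqlConnection(connectionString)
        Dim sql As String =
            "SELECT SampleTime, MachineId, Utilization FROM CpuUsage " &
            "WHERE SampleTime >= @day AND SampleTime < DATEADD(day, 1, @day) " &
            "ORDER BY SampleTime, MachineId"
        Using cmd As New SqlClient.SqlCommand(sql, cn)
            cmd.Parameters.AddWithValue("@day", selectedDate.Date)
            Dim dt As New DataTable()
            cn.Open()
            dt.Load(cmd.ExecuteReader())
            Return dt
        End Using
    End Using
End Function
```

The half-open date range (`>= @day AND < next day`) keeps the predicate sargable, so SQL Server can seek on the index rather than scan.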
I'm thinking a DataSet with several DataTables, one of which will contain a row for each interval since 7/1/10 and 12 columns, one per machine, each holding the decimal(5,2) utilization for that machine at that date/time. The row number could be derived from the date/time selected, and the column would indicate the particular machine. Does this sound right? Thanks again.
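A minimal sketch of that pivoted layout, assuming the machine column names are placeholders and the row index is simply the count of whole 15-minute intervals since the 7/1/2010 epoch:

```vb
' One row per 15-minute interval since 7/1/2010; twelve Decimal columns,
' one per machine (names here are placeholders).
Dim usage As New DataTable("Usage")
usage.Columns.Add("IntervalStart", GetType(DateTime))
For m As Integer = 1 To 12
    usage.Columns.Add("Machine" & m.ToString(), GetType(Decimal))
Next

' Row number derived from the selected date/time:
' whole 15-minute intervals elapsed since the epoch.
Dim epoch As Date = #7/1/2010#
Dim selected As DateTime = #8/15/2012 2:30:00 PM#
Dim rowIndex As Integer = CInt(Math.Floor((selected - epoch).TotalMinutes / 15.0))
```

One thing to watch with this scheme: it only works if every interval is present for every machine. Any gap in the collected data shifts the arithmetic, so you'd either need to insert placeholder rows for missing intervals or look rows up by the `IntervalStart` value instead of by position.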