
 Insert 1,000 records in one second


irmorteza
Starting Member

8 Posts

Posted - 2010-06-27 : 15:51:37
Hi

Suppose we have a website with 1,000 visitors per second, and we want to insert each visitor's information (for example) into a SQL Server database. So we must insert 1,000 records per second, which may take a long time.

My question is: how can we speed this process up?
Can we fix it programmatically, or do we need a hardware solution?

Best Regards. Morteza

Angate
Starting Member

24 Posts

Posted - 2010-06-27 : 18:16:25
It would likely take a combination of software and hardware. I personally would create a threaded program that can send batches of insert statements to the database. That way, if you get a spike, the inserts can back up in the queue in the threaded program rather than bogging down the system or losing records. You should also look into a decently fast storage solution for the database.
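A minimal sketch of the batching idea described above, assuming an in-process queue drained by a single worker thread. sqlite3's in-memory database stands in for SQL Server so the sketch runs anywhere without a server; the table name, column, and batch size are made up for illustration.

```python
import queue
import sqlite3
import threading

BATCH_SIZE = 100

def insert_worker(conn, q, stop):
    """Drain the queue and insert rows in batches: one transaction
    and one round trip per batch instead of per row."""
    while not (stop.is_set() and q.empty()):
        batch = []
        try:
            batch.append(q.get(timeout=0.1))
        except queue.Empty:
            continue
        while len(batch) < BATCH_SIZE:
            try:
                batch.append(q.get_nowait())
            except queue.Empty:
                break
        conn.executemany("INSERT INTO page_views (url) VALUES (?)", batch)
        conn.commit()

conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE page_views (url TEXT)")
q = queue.Queue()
stop = threading.Event()
worker = threading.Thread(target=insert_worker, args=(conn, q, stop))
worker.start()

for i in range(1000):          # simulate 1,000 page views arriving
    q.put((f"/page/{i}",))

stop.set()                     # let the worker finish draining, then exit
worker.join()
print(conn.execute("SELECT COUNT(*) FROM page_views").fetchone()[0])  # → 1000
```

During a traffic spike the queue absorbs the burst, so the web tier never blocks on the database and no rows are dropped.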

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2010-06-28 : 03:39:00
It really depends on what kind of data you want to insert and how you insert it. If you're doing a batch insert from a data file, for example, then 1,000 rows per second is no problem whatsoever. But having 1,000 individual transactions per second is quite another ordeal. Are you having a performance problem at the moment, or is this for planning purposes? Regardless, some details would help before making any recommendations...

- Lumbago

My blog (yes, I have a blog now! just not that much content yet)
-> www.thefirstsql.com
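The gap between the two patterns Lumbago describes can be seen in a small self-contained sketch. sqlite3 stands in for SQL Server here, so the absolute timings are meaningless; the point is the relative cost of 1,000 individual commits versus one batched transaction.

```python
import sqlite3
import time

rows = [(f"/page/{i}",) for i in range(1000)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE single_tx (url TEXT)")
conn.execute("CREATE TABLE batched (url TEXT)")

# Pattern 1: one transaction per row (1,000 commits).
t0 = time.perf_counter()
for row in rows:
    conn.execute("INSERT INTO single_tx (url) VALUES (?)", row)
    conn.commit()
t_single = time.perf_counter() - t0

# Pattern 2: one batched insert in a single transaction.
t0 = time.perf_counter()
conn.executemany("INSERT INTO batched (url) VALUES (?)", rows)
conn.commit()
t_batch = time.perf_counter() - t0

print(f"per-row commits: {t_single:.4f}s, single batch: {t_batch:.4f}s")
```

Both tables end up with the same 1,000 rows; committing per row pays the transaction overhead a thousand times, which is exactly why individual inserts per page view scale so much worse than batch loads.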

irmorteza
Starting Member

8 Posts

Posted - 2010-06-28 : 16:01:15
quote:
Originally posted by Lumbago

It really depends on what kind of data you want to insert and how you insert it. If you're doing a batch insert from a data file, for example, then 1,000 rows per second is no problem whatsoever. But having 1,000 individual transactions per second is quite another ordeal. Are you having a performance problem at the moment, or is this for planning purposes? Regardless, some details would help before making any recommendations...

- Lumbago

My blog (yes, I have a blog now! just not that much content yet)
-> www.thefirstsql.com




First, thanks for replying.

We are trying to create a website whose customers are other websites. A website joins our site, and we provide some advantages for its users (these advantages work with SQL). So when a site with 100,000 page views joins us, we must handle its 100,000 page views too. And when 10,000 sites with 100,000 page views each join us, we must handle 1,000,000,000 page views, and so on. So as you can see, there is no limit on page views.

Best regards. Morteza

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2010-06-28 : 16:40:26
1) Are you an online business?
2) Can you make the affiliates send hourly files to you, which you then import?
3) How long are you going to save the page views?



N 56°04'39.26"
E 12°55'05.63"

irmorteza
Starting Member

8 Posts

Posted - 2010-06-29 : 00:50:30
quote:
Originally posted by Peso

1) Are you an online business?
2) Can you make the affiliates send hourly files to you, which you then import?
3) How long are you going to save the page views?



N 56°04'39.26"
E 12°55'05.63"



1) Yes.
2) Please explain more.
3) There is no limit on how long we save them.

thanks. Morteza

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2010-06-29 : 01:59:29
I assume you are talking about something like Google Analytics, right? Installing a piece of JavaScript on every page and then analyzing traffic every so often? If that's the case, then I'm not so sure that databases are your best bet...at least not for doing an individual insert for each page view. I think it would be far more efficient to utilize webserver logging (http://en.wikipedia.org/wiki/Server_log) and load the log files into the database, e.g. every 10 minutes or so. You'd get far better throughput...

- Lumbago

My blog (yes, I have a blog now! just not that much content yet)
-> www.thefirstsql.com
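The log-file approach above could look something like this sketch: the web server appends one line per page view, and a periodic job parses the file and bulk-loads it. The sample lines are made-up entries in Common Log Format, and sqlite3 again stands in for SQL Server (where BULK INSERT or bcp would do the real load).

```python
import sqlite3

SAMPLE_LOG = """\
127.0.0.1 - - [29/Jun/2010:08:00:01 +0000] "GET /index.html HTTP/1.1" 200 512
127.0.0.1 - - [29/Jun/2010:08:00:02 +0000] "GET /about.html HTTP/1.1" 200 734
127.0.0.1 - - [29/Jun/2010:08:00:03 +0000] "GET /index.html HTTP/1.1" 304 0
"""

def parse_line(line):
    """Pull (ip, path, status) out of a Common Log Format line."""
    ip = line.split(" ", 1)[0]
    path = line.split('"')[1].split(" ")[1]    # request path from "GET /x HTTP/1.1"
    status = int(line.rsplit(" ", 2)[1])
    return ip, path, status

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (ip TEXT, path TEXT, status INTEGER)")

# The periodic job: parse the whole file, load it in one transaction.
records = [parse_line(l) for l in SAMPLE_LOG.splitlines()]
conn.executemany("INSERT INTO page_views VALUES (?, ?, ?)", records)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM page_views").fetchone()[0])  # → 3
```

Nothing on the request path touches the database at all; appending to a log file is about as cheap as a write gets, and the database only ever sees large batches.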

irmorteza
Starting Member

8 Posts

Posted - 2010-06-29 : 08:56:11
quote:
Originally posted by Lumbago

I assume you are talking about something like Google Analytics, right? Installing a piece of JavaScript on every page and then analyzing traffic every so often? If that's the case, then I'm not so sure that databases are your best bet...at least not for doing an individual insert for each page view. I think it would be far more efficient to utilize webserver logging (http://en.wikipedia.org/wiki/Server_log) and load the log files into the database, e.g. every 10 minutes or so. You'd get far better throughput...

- Lumbago

My blog (yes, I have a blog now! just not that much content yet)
-> www.thefirstsql.com




Thanks, Lumbago, for replying.
Your example is close to what we do on the writing side, but that is not the end of it: we also want to read some data from the DB and perform some action for each page view.
I think the log-file solution is very useful. That way we get far better throughput on the writing side, but what about the reading side? Is it possible to use something like a cache, an offline DB, a log file, or similar to do this well?
For further info: when we do a "read action", we must analyze some data (for example, 500 records) and then decide on an action. So if we do this in temporary memory, we must have those 500 records there too.

I know my lines may confuse you, but thanks for your attention anyway.

Best regards. Morteza
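One possible answer to the read-side question, as a hedged sketch rather than a definitive design: keep a rolling in-memory window of the most recent records next to the write path, and run the per-page-view analysis against that instead of querying the database each time. The window size of 500 matches the number in the post; the `decide()` rule (most frequently viewed page) is invented purely for illustration.

```python
from collections import deque

WINDOW = 500
recent = deque(maxlen=WINDOW)   # oldest entries fall off automatically

def record_view(page):
    """Called on the write path; O(1) per page view."""
    recent.append(page)

def decide():
    """Example analysis over the window: the most viewed page."""
    counts = {}
    for page in recent:
        counts[page] = counts.get(page, 0) + 1
    return max(counts, key=counts.get)

# Simulate 1,200 page views cycling over 7 pages; only the last 500 are kept.
for i in range(1200):
    record_view(f"/page/{i % 7}")

print(len(recent), decide())
```

The database still gets the full history via the batched writes; the window only exists so the per-page-view decision never has to wait on a round trip.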
   
