imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 12:55:03

I have a scenario where multiple users will be running import processes, but each of them will be working for a different client. I have one core table that gets the most hits whenever the import processes run. I now have two options:

1. Keep one core table and run the imports sequentially by queueing the import processes.
2. Create 300 core tables, one per client, which would allow the users to run their import processes simultaneously without waiting for one another.

Can anyone suggest which one is better, and why?
tkizer
Almighty SQL Goddess
38200 Posts
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 13:15:47

More information about the import process:

1. This table is not going to be used in any reporting.
2. Each import process will insert 20k-30k records into this table.
3. This is going to be one of the most heavily used tables in the application.

Please let me know if that helps, and feel free to ask any specific questions if needed.
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 13:40:47

It'll be BULK INSERT. Each import will add around 20k rows to the table, which has 7 columns, and there will be around 20 imports a day. The clustered index is on the primary key, which is an identity column. Yes, I am considering table partitioning too.
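For concreteness, a minimal sketch of the setup described above. The table name, column names, and file path are hypothetical, not from the thread:

```sql
-- Hypothetical 7-column core table; clustered index on the identity PK, as described
CREATE TABLE dbo.CoreTable
(
    ID          INT IDENTITY(1,1) NOT NULL,
    ClientID    INT NOT NULL,
    SourceID    INT NOT NULL,
    [Count]     INT NULL,
    AnotherCol1 VARCHAR(50) NULL,
    AnotherCol2 VARCHAR(50) NULL,
    AnotherCol3 VARCHAR(50) NULL,
    CONSTRAINT PK_CoreTable PRIMARY KEY CLUSTERED (ID)
);

-- One ~20k-row import. The data file is assumed to have a field for every
-- table column; without the KEEPIDENTITY option, values in the ID position
-- are ignored and fresh identity values are assigned by SQL Server.
BULK INSERT dbo.CoreTable
FROM 'C:\imports\client_feed.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);
```

Note that TABLOCK speeds up a single load (it enables minimal logging under the right recovery model) but serializes concurrent imports into the same table; dropping it lets multiple users load at the same time, which is relevant to the question in this thread.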
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 13:57:41

If we go with a single table, then how will multiple users be able to load their data?
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 14:13:43

What if we use OPENXML instead of BULK INSERT; which option would be better in that case? Secondly, if we go with BULK INSERT, we will also have some records that should be updated rather than inserted. How would those be managed?
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 14:31:17

Basically, what we are doing is this: if we have 100 records in our file, we check for their existence in the database; say there are 60 new records and 40 existing ones. We then build two separate OPENXML statements, one to insert into the table and one to update it. Given that scenario, can we still use BULK INSERT for both the insert and the update operations?

Secondly, if we keep filling data into this single table, won't retrieving data from it become too slow, hurting performance badly?

What do you think about these two questions?
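The insert/update split described above might look roughly like this. The XML shape, element names, and key columns are hypothetical; the thread does not show the actual document:

```sql
DECLARE @h INT,
        @xml NVARCHAR(MAX) = N'<Rows><Row ClientID="1" SourceID="7" Count="5"/></Rows>';
EXEC sys.sp_xml_preparedocument @h OUTPUT, @xml;

-- Rows whose key is not yet in the core table: INSERT
INSERT INTO dbo.CoreTable (ClientID, SourceID, [Count])
SELECT x.ClientID, x.SourceID, x.[Count]
FROM OPENXML(@h, '/Rows/Row', 1)  -- flag 1 = attribute-centric mapping
     WITH (ClientID INT, SourceID INT, [Count] INT) AS x
WHERE NOT EXISTS (SELECT 1 FROM dbo.CoreTable c
                  WHERE c.ClientID = x.ClientID AND c.SourceID = x.SourceID);

-- Rows already present: UPDATE
UPDATE c
SET c.[Count] = x.[Count]
FROM dbo.CoreTable c
JOIN OPENXML(@h, '/Rows/Row', 1)
     WITH (ClientID INT, SourceID INT, [Count] INT) AS x
  ON c.ClientID = x.ClientID AND c.SourceID = x.SourceID;

EXEC sys.sp_xml_removedocument @h;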
tkizer
Almighty SQL Goddess
38200 Posts

Posted - 2011-08-17 : 14:35:07

BULK INSERT is only for inserts. If you have a requirement to update, then you'll need to either update the data before the BULK INSERT, or BULK INSERT into a staging table, update the data there, and then use T-SQL to move the data between tables.

I don't see that this import process is going to cause a problem for you. We have a multi-threaded system which does BULK INSERTs all day long, and it's HUGE amounts of data (millions upon millions of rows per day). We have no issues querying this table.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog
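A sketch of the staging-table pattern described above. Table names, columns, and the file path are hypothetical:

```sql
-- 1. Bulk load the raw file into a staging table that mirrors the file layout
CREATE TABLE dbo.ImportStaging
(
    ClientID INT NOT NULL,
    SourceID INT NOT NULL,
    [Count]  INT NULL
);

BULK INSERT dbo.ImportStaging
FROM 'C:\imports\client_feed.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2. Update core-table rows that already exist
UPDATE c
SET c.[Count] = s.[Count]
FROM dbo.CoreTable c
JOIN dbo.ImportStaging s
  ON c.ClientID = s.ClientID AND c.SourceID = s.SourceID;

-- 3. Insert the rest
INSERT INTO dbo.CoreTable (ClientID, SourceID, [Count])
SELECT s.ClientID, s.SourceID, s.[Count]
FROM dbo.ImportStaging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.CoreTable c
                  WHERE c.ClientID = s.ClientID AND c.SourceID = s.SourceID);

TRUNCATE TABLE dbo.ImportStaging;
```

On SQL Server 2008 and later, steps 2 and 3 can also be combined into a single MERGE statement from the staging table into the core table.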
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 14:57:50

Thanks for making that point clear. One thing: how do you manage the transformations while doing the BULK INSERT?
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 15:03:47

Can you give any specific suggestions regarding transformations while using BULK INSERT in my case?
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 15:10:58

Scrubbing the data before the BULK INSERT is quicker than using a staging table; is that correct?
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-17 : 15:18:46

Thanks for your help, dear.
imadiamond2016
Starting Member
29 Posts

Posted - 2011-08-19 : 14:08:01

Hi Tara,

I have stated my requirement more clearly and in more detail below. Can you please have another close look and provide your comments?

Requirement:

The question is about data modeling for the core functionality of my application. I have a scenario where multiple users will be running import processes, but each of them will be working for a different client. Also, at the same time, a client's data could be shown to a user and modified or inserted, while an import process for the same or a different client is running.

I have 2 core tables which get the most hits whenever import processes run. I now have 2 options:

1. Keep 2 core tables and run the imports sequentially by queueing the import processes.

Table 1: ID, ClientID, SourceID, Count, AnotherCol1, AnotherCol2, AnotherCol3
Table 2: ID, ClientID, OrderID, Count, AnotherCol4, AnotherCol5, AnotherCol6

2. Create 1000 core tables, 2 for each client (I may have a maximum of 500 clients), which would allow the users to run their import processes simultaneously without waiting for one another.

More information about the import process:

1. These tables are not going to be used in any reporting.
2. Each import process will insert 20k-30k records (7 columns) into each of these tables, and there will be around 40-50 such imports a day.
3. While an import process is going on, data could be retrieved from these tables by some other user, and inserted or updated too.
4. These are going to be among the most heavily used tables in the application.
5. BULK INSERT will be used for insertion.
6. The clustered index is on the primary key, which is an identity column.
7. We are considering table partitioning too.

Can you please suggest which option is better, and why? Also, if you suggest going with option 2, wouldn't it be a performance hit to create so many tables in the database? Should we create a separate database for these 1000 tables in that case?
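Since the post above mentions table partitioning as an alternative to per-client tables, here is a rough sketch of range-partitioning the single core table by ClientID. Boundary values and object names are hypothetical; note that the partitioning column must be part of the clustered key:

```sql
-- Hypothetical: range-partition on ClientID, e.g. 100 clients per partition
CREATE PARTITION FUNCTION pfClient (INT)
AS RANGE RIGHT FOR VALUES (101, 201, 301, 401);

CREATE PARTITION SCHEME psClient
AS PARTITION pfClient ALL TO ([PRIMARY]);  -- real deployments often use one filegroup per partition

CREATE TABLE dbo.CoreTable1
(
    ID          INT IDENTITY(1,1) NOT NULL,
    ClientID    INT NOT NULL,
    SourceID    INT NOT NULL,
    [Count]     INT NULL,
    AnotherCol1 VARCHAR(50) NULL,
    AnotherCol2 VARCHAR(50) NULL,
    AnotherCol3 VARCHAR(50) NULL,
    -- ClientID must be included in the clustered key to partition on it
    CONSTRAINT PK_CoreTable1 PRIMARY KEY CLUSTERED (ClientID, ID)
) ON psClient (ClientID);
```

Imports for clients in different ranges then touch different partitions of one table, which captures much of the concurrency benefit of per-client tables without 1000 objects to manage. Keep in mind that on SQL Server 2008/2008 R2, table partitioning requires Enterprise Edition.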