
 SQL Server Maximum Write Performance


fderfder
Starting Member

1 Post

Posted - 2014-04-22 : 02:17:16
Hi SQL Gurus,

I want to know whether we are already getting the fastest SQL Server write performance possible for our application.

We created a sample application that performs a BulkCopy operation to a local SQL Server database. The BulkCopy operation writes 100,000 rows of data from a DataTable in memory. The table being inserted into has no indexes. This is because we just want to get the maximum write speed of SQL Server.

Here is the schema of the table we are inserting into:


CREATE TABLE [dbo].[HistorySampleValues](
[HistoryParameterID] [bigint] NOT NULL,
[SourceTimeStamp] [datetime2](7) NOT NULL,
[ArchiveTimestamp] [datetime2](7) NOT NULL,
[ValueStatus] [int] NOT NULL,
[ArchiveStatus] [int] NOT NULL,
[IntegerValue] [int] SPARSE NULL,
[DoubleValue] [float] SPARSE NULL,
[StringValue] [varchar](100) SPARSE NULL,
[EnumNamedSetName] [varchar](100) SPARSE NULL,
[EnumNumericValue] [int] SPARSE NULL,
[EnumTextualValue] [varchar](256) SPARSE NULL
) ON [PRIMARY]


We measure the performance from our C# code.

public double PerformBulkCopy()
{
    DateTime timeToBulkCopy = DateTime.Now;
    double bulkCopyTimeSpentMs = -1.0;

    DataTable historySampleValuesDataTable = CreateBulkCopyRecords();

    //start the timer here
    timeToBulkCopy = DateTime.Now;
    using (SqlConnection sqlConn = ConnectDatabase())
    {
        sqlConn.Open();

        using (SqlTransaction sqlTransaction = sqlConn.BeginTransaction())
        {
            try
            {
                using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(sqlConn, SqlBulkCopyOptions.KeepIdentity, sqlTransaction))
                {
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_HISTORY_PARMETER_ID, SqlServerDatabaseStrings.SQL_FIELD_HISTORY_PARMETER_ID);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_SOURCE_TIMESTAMP, SqlServerDatabaseStrings.SQL_FIELD_SOURCE_TIMESTAMP);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_VALUE_STATUS, SqlServerDatabaseStrings.SQL_FIELD_VALUE_STATUS);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_ARCHIVE_STATUS, SqlServerDatabaseStrings.SQL_FIELD_ARCHIVE_STATUS);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_INTEGER_VALUE, SqlServerDatabaseStrings.SQL_FIELD_INTEGER_VALUE);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_DOUBLE_VALUE, SqlServerDatabaseStrings.SQL_FIELD_DOUBLE_VALUE);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_STRING_VALUE, SqlServerDatabaseStrings.SQL_FIELD_STRING_VALUE);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_ENUM_NAMEDSET_NAME, SqlServerDatabaseStrings.SQL_FIELD_ENUM_NAMEDSET_NAME);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_ENUM_NUMERIC_VALUE, SqlServerDatabaseStrings.SQL_FIELD_ENUM_NUMERIC_VALUE);
                    sqlBulkCopy.ColumnMappings.Add(SqlServerDatabaseStrings.SQL_FIELD_ENUM_TEXTUAL_VALUE, SqlServerDatabaseStrings.SQL_FIELD_ENUM_TEXTUAL_VALUE);

                    sqlBulkCopy.DestinationTableName = SqlServerDatabaseStrings.SQL_TABLE_HISTORYSAMPLEVALUES;

                    sqlBulkCopy.WriteToServer(historySampleValuesDataTable);
                }

                sqlTransaction.Commit();
                //end the timer here
                bulkCopyTimeSpentMs = DateTime.Now.Subtract(timeToBulkCopy).TotalMilliseconds;
            }
            catch (Exception ex)
            {
                sqlTransaction.Rollback();
            }

            CleanUpDatabase(sqlConn);
        }
        sqlConn.Close();
    }

    return bulkCopyTimeSpentMs;
}


I have tried the different overloads of SqlBulkCopy.WriteToServer(): DataTable, IDataReader, and DataRow[].

On a machine with these specs:
Core i3-2120 CPU @ 3.30GHz
8GB of RAM
Seagate Barracuda 7200.12 ST3500413AS 500GB 7200 RPM

I am getting a throughput of ~150K-160K rows inserted per second using the different overloads.

I am asking now, given our sample data and the sample table, is this the most we can get out of SQL Server SE? Or is there something we can do to make this even faster?

Let me know if there is any more information you need about our setup.

Lamprey
Master Smack Fu Yak Hacker

4614 Posts

Posted - 2014-04-22 : 11:59:01
I've not dealt with bulk inserting from C# in a long time. But if you look at the MS whitepapers on loading a data warehouse from SSIS, you will see that they use parallelism to get the maximum throughput. So you might be able to gain something by separating the inserts into chunks and launching each on its own thread.
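To illustrate, a rough sketch of that idea (untested; the connection string, table name, and degree of parallelism are placeholders, and the payoff depends entirely on how much concurrent IO the disk subsystem can absorb). On a heap with no indexes, the BU locks taken by TableLock are compatible with each other, so concurrent bulk loads into the same table can proceed in parallel:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Threading.Tasks;

public static class ParallelBulkLoader
{
    // Split the source rows into n chunks and bulk copy each chunk
    // on its own task, each with its own connection.
    public static void BulkCopyInParallel(DataTable source, string connectionString,
                                          string destinationTable, int degreeOfParallelism)
    {
        DataTable[] chunks = SplitIntoChunks(source, degreeOfParallelism);

        Parallel.ForEach(chunks, chunk =>
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                // TableLock takes a BU lock; on an index-free heap these
                // are compatible across the concurrent loaders.
                using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
                {
                    bulk.DestinationTableName = destinationTable;
                    bulk.BatchSize = 0;        // send the whole chunk as one batch
                    bulk.BulkCopyTimeout = 0;  // no timeout for large loads
                    foreach (DataColumn col in chunk.Columns)
                        bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                    bulk.WriteToServer(chunk);
                }
            }
        });
    }

    // Round-robin the rows into n clones of the source schema.
    public static DataTable[] SplitIntoChunks(DataTable source, int n)
    {
        var chunks = Enumerable.Range(0, n).Select(_ => source.Clone()).ToArray();
        for (int i = 0; i < source.Rows.Count; i++)
            chunks[i % n].ImportRow(source.Rows[i]);
        return chunks;
    }
}
```

Worth measuring rather than assuming, though: on a single 7200 RPM spindle the parallel writers may just contend for the same disk.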

Out of curiosity, a couple of questions:

1. Can you tell where the bottleneck is? Since you are using a single HD, I'd guess IO is your bottleneck.
2. What are the specs of the server you plan to run this on? I'd assume a real server environment would give you much more disk bandwidth to play with, which might change your approach from what you do on a local/dev machine with lesser hardware.
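For question 1, one standard way to check is to look at the top waits on the instance while the load is running (a generic diagnostic, not specific to this workload):

```sql
-- Top waits since the stats were last cleared. During a write-heavy
-- load, WRITELOG (log flushes) or PAGEIOLATCH_* (data file IO) near
-- the top usually points at a disk bottleneck.
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'XE_TIMER_EVENT', N'REQUEST_FOR_DEADLOCK_SEARCH',
                        N'LOGMGR_QUEUE', N'CHECKPOINT_QUEUE',
                        N'SQLTRACE_INCREMENTAL_FLUSH_SLEEP', N'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```

The excluded wait types are just the usual benign background waits; the list isn't exhaustive.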