How do I do very fast inserts to SQL Server 2008?

I have a project that involves recording data from a device directly into a SQL table.

I do very little processing in code before writing to SQL Server (2008 Express, by the way).

Typically I use the SqlHelper class's ExecuteNonQuery method and pass in a stored proc name and a list of parameters that the SP expects.

This is very convenient, but I need a much faster way of doing this.

Thanks.


Answers

BULK INSERT would be the fastest, since it is minimally logged.

.NET also has the SqlBulkCopy class.

This is typically done by way of a BULK INSERT. Basically, you prepare a file and then issue the BULK INSERT statement, and SQL Server copies all the data from the file to the table using the fastest method possible.

It does have some restrictions (for example, there’s no way to do “update or insert” type of behaviour if you have possibly-existing rows to update), but if you can get around those, then you’re unlikely to find anything much faster.
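For illustration, here is roughly what issuing that statement from .NET might look like. The table name, file path and terminators are made-up placeholders, and the file path is resolved on the SQL Server machine, not the client:

using System.Data.SqlClient;

// Placeholder connection string for a local Express instance.
string connString = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True";

// Hypothetical: the captured data has already been written to a delimited file
// that the SQL Server service can see (the path is interpreted server-side).
using (var conn = new SqlConnection(connString))
using (var cmd = new SqlCommand(
    @"BULK INSERT dbo.DeviceReadings
      FROM 'C:\data\readings.csv'
      WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}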

If you mean from .NET, then use SqlBulkCopy.
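As a sketch of what the SqlBulkCopy route might look like (the destination table and its columns are invented for the example):

using System;
using System.Data;
using System.Data.SqlClient;

// Placeholder connection string and a hypothetical destination table dbo.DeviceReadings.
string connString = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True";

// Build an in-memory DataTable whose columns match the destination table.
var table = new DataTable();
table.Columns.Add("DeviceId", typeof(int));
table.Columns.Add("Value", typeof(double));
table.Columns.Add("ReadAt", typeof(DateTime));

// ...fill it with the rows captured from the device...
table.Rows.Add(1, 98.6, DateTime.UtcNow);

using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    conn.Open();
    bulk.DestinationTableName = "dbo.DeviceReadings";
    bulk.BatchSize = 1000;          // rows per batch sent to the server
    bulk.WriteToServer(table);
}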

Things that can slow inserts include indexes and reads or updates (locks) on the same table. You can speed up situations like yours by avoiding both: write the individual inserts to a separate holding table with no indexes or other activity, then move the holding table's rows into the main table in batches a little less frequently, as in the sketch below.
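A rough sketch of the periodic move, assuming a hypothetical index-free staging table dbo.ReadingsStaging and a real table dbo.Readings. The DELETE ... OUTPUT INTO form moves and clears the rows in one atomic statement, but the destination must not have triggers or foreign keys for that clause to be allowed:

using System.Data.SqlClient;

// Placeholder connection string.
string connString = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True";

// Individual device inserts go to dbo.ReadingsStaging (no indexes, no readers).
// This flush runs on a timer every few seconds, not once per row.
const string flushSql = @"
    DELETE FROM dbo.ReadingsStaging
    OUTPUT deleted.DeviceId, deleted.Value, deleted.ReadAt
    INTO dbo.Readings (DeviceId, Value, ReadAt);";

using (var conn = new SqlConnection(connString))
using (var cmd = new SqlCommand(flushSql, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();   // moves everything accumulated so far in one batch
}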

It can only really go as fast as your SP will run. Ensure that the table(s) are properly indexed and if you have a clustered index, ensure that it has a narrow, unique, increasing key. Ensure that the remaining indexes and constraints (if any) do not have a lot of overhead.

You shouldn't see much overhead in the ADO.NET layer (I wouldn't necessarily use any other .NET library above SqlCommand). You may be able to use the ADO.NET async methods to queue several calls to the stored proc without blocking a single thread in your application; this could potentially free up more throughput than anything else, much like having multiple machines inserting into the database.
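A sketch of that async pattern using the Begin/End pair on SqlCommand (the proc name, parameters and sample data are placeholders; note that "Asynchronous Processing=true" is required in the connection string for this API, and each in-flight command gets its own pooled connection):

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Placeholder connection string; the async keyword is required for Begin/EndExecuteNonQuery.
string connString = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True;Asynchronous Processing=true";

// Hypothetical in-memory readings captured from the device.
var readings = new[] { new { DeviceId = 1, Value = 98.6 }, new { DeviceId = 2, Value = 97.1 } };

var pending = new List<KeyValuePair<SqlCommand, IAsyncResult>>();

foreach (var reading in readings)
{
    var conn = new SqlConnection(connString);                 // one pooled connection per in-flight call
    conn.Open();
    var cmd = new SqlCommand("dbo.usp_InsertReading", conn);  // hypothetical proc name
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@DeviceId", reading.DeviceId);
    cmd.Parameters.AddWithValue("@Value", reading.Value);
    pending.Add(new KeyValuePair<SqlCommand, IAsyncResult>(cmd, cmd.BeginExecuteNonQuery()));
}

// Harvest the completions; no thread was blocked per call while they were queued.
foreach (var item in pending)
{
    item.Key.EndExecuteNonQuery(item.Value);
    item.Key.Connection.Dispose();
}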

Other than that, you really need to tell us more about your requirements.

ExecuteNonQuery with an INSERT statement, or even a stored procedure, will get you into the thousands-of-inserts-per-second range on Express. 4000-5000/sec is easily achievable; I know this for a fact.

What usually slows down individual writes is the wait for the log flush, and you need to account for that. The easiest solution is to simply batch commits, e.g. commit every 1000 inserts, or every second. This will fill up the log pages and will amortize the cost of the log flush wait over all the inserts in a transaction.
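To make this concrete, here is a rough sketch of batch-committing the same stored-proc calls from ADO.NET (the proc name, parameters and sample data are placeholders; the point is one transaction per ~1000 inserts rather than one per row):

using System.Data;
using System.Data.SqlClient;

// Placeholder connection string and hypothetical captured data.
string connString = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True";
var readings = new[] { new { DeviceId = 1, Value = 98.6 }, new { DeviceId = 2, Value = 97.1 } };

using (var conn = new SqlConnection(connString))
{
    conn.Open();
    SqlTransaction tran = conn.BeginTransaction();
    int count = 0;

    foreach (var reading in readings)
    {
        using (var cmd = new SqlCommand("dbo.usp_InsertReading", conn, tran))   // hypothetical proc
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@DeviceId", reading.DeviceId);
            cmd.Parameters.AddWithValue("@Value", reading.Value);
            cmd.ExecuteNonQuery();
        }

        if (++count % 1000 == 0)   // commit every 1000 rows (or on a timer) to amortize the log flush
        {
            tran.Commit();
            tran = conn.BeginTransaction();
        }
    }

    tran.Commit();                 // commit the final partial batch
}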

With batch commits you'll probably bottleneck on disk log write performance, and there is nothing you can do about that short of changing the hardware (e.g. a RAID 0 stripe for the log).

If you hit earlier bottlenecks (unlikely), then you can look into batching statements, i.e. sending one single T-SQL batch with multiple inserts in it, but this seldom pays off.
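If you do go that route, a sketch of one such batch might look like this (the table and column names are invented, and keep an eye on the 2,100-parameter limit per command):

using System.Data.SqlClient;
using System.Text;

// Placeholder connection string and hypothetical rows to send in one round trip.
string connString = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True";
var batch = new[] { new { DeviceId = 1, Value = 98.6 }, new { DeviceId = 2, Value = 97.1 } };

var sql = new StringBuilder();
using (var conn = new SqlConnection(connString))
using (var cmd = new SqlCommand())
{
    // Build one T-SQL batch containing a parameterized INSERT per row.
    for (int i = 0; i < batch.Length; i++)
    {
        sql.AppendFormat("INSERT INTO dbo.Readings (DeviceId, Value) VALUES (@d{0}, @v{0});\n", i);
        cmd.Parameters.AddWithValue("@d" + i, batch[i].DeviceId);
        cmd.Parameters.AddWithValue("@v" + i, batch[i].Value);
    }

    cmd.CommandText = sql.ToString();
    cmd.Connection = conn;
    conn.Open();
    cmd.ExecuteNonQuery();          // all rows inserted in a single round trip
}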

Of course, you'll need to reduce the size of your writes to a minimum: reduce the width of your table to only the columns you actually need, eliminate non-clustered indexes, and eliminate unneeded constraints. If possible, use a heap instead of a clustered index, since heap inserts are significantly faster than clustered index inserts.

There is little need to use the fast insert interface (i.e. SqlBulkCopy). Using ordinary INSERTs and ExecuteNonQuery with batch commits, you'll exhaust the drive's sequential write throughput long before you need to deploy bulk insert. Bulk insert is needed on fast, SAN-connected machines, and you mention Express, so that's probably not the case here. There is a perception to the contrary out there, but that is simply because people don't realize that bulk insert gives them batch commit, and it's the batch commit that speeds things up, not the bulk insert.

As with any performance test, make sure you eliminate randomness, and preallocate the database and the log; you don't want to hit a database or log growth event during test measurements or in production. That is just amateurish.

Here is a good way to insert a lot of records using table variables…

…but it's best to limit it to about 1000 records at a time, because table variables are held "in memory".

In this example I will insert 2 records into a table with 3 fields: CustID, FName and LName.

--first create an in-memory table variable with the same structure
--you could also use a temporary table, but it would be slower
declare @MyTblVar table (CustID int, FName nvarchar(50), LName nvarchar(50))

insert into @MyTblVar values (100, 'Joe', 'Bloggs')
insert into @MyTblVar values (101, 'Mary', 'Smith')

insert into MyCustomerTable
select * from @MyTblVar