.net - Bulk insert via Dapper is slower than inserting rows one-by-one


I'm using Dapper to insert data from a realtime feed into SQL Server, so I care about performance. Recently, I noticed something strange.
Out of the box, if you give Dapper a collection and an insert query, it fires an insert statement for each element. My tests show I can insert 1800 objects with 12 fields each in 1 second this way (counting the connection.Execute(...) running time).
Now, I didn't find batch insert functionality in Dapper, so I implemented my own (constructing the parameter list and the SQL query). After that, I found out I can insert one batch (which is limited to 1000 rows) in 3 seconds (again, counting only the connection.Execute(...) calls).
So, that makes batching 6 times slower than sending each row in a separate query. Can someone explain this to me? I thought people used batch operations to speed up the process.
I would like the insert time to be 1 second at most. I use SQL Server 2012 Standard on a local network. The table I'm inserting into has only a clustered index on the primary key (which is a bigint field), no non-clustered indexes and no triggers.
I can post the code, but there's really nothing special about it.
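For reference, here is a minimal sketch of the two approaches described above, assuming a hypothetical FeedItem class with a few of the twelve fields and an Items destination table (the original poster's actual code and column names are not shown):

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;
using Dapper;

// Hypothetical entity standing in for the real 12-field object.
public class FeedItem
{
    public long Id { get; set; }
    public string Symbol { get; set; }
    public decimal Price { get; set; }
}

public static class InsertExamples
{
    // Approach 1: pass the whole collection to Execute.
    // Dapper unrolls this into one INSERT statement per element.
    public static void InsertOneByOne(SqlConnection connection, List<FeedItem> items)
    {
        connection.Execute(
            "INSERT INTO Items (Id, Symbol, Price) VALUES (@Id, @Symbol, @Price)",
            items);
    }

    // Approach 2: a hand-built batch - one INSERT with many VALUES rows
    // and a numbered parameter per field (SQL Server caps a VALUES list at 1000 rows).
    public static void InsertAsBatch(SqlConnection connection, List<FeedItem> items)
    {
        var sql = new StringBuilder("INSERT INTO Items (Id, Symbol, Price) VALUES ");
        var parameters = new DynamicParameters();

        for (int i = 0; i < items.Count; i++)
        {
            if (i > 0) sql.Append(", ");
            sql.Append($"(@Id{i}, @Symbol{i}, @Price{i})");
            parameters.Add($"Id{i}", items[i].Id);
            parameters.Add($"Symbol{i}", items[i].Symbol);
            parameters.Add($"Price{i}", items[i].Price);
        }

        connection.Execute(sql.ToString(), parameters);
    }
}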

I'm not sure why you are using the Dapper Execute extension method if you want the best performance available.

The best free way to insert with the best performance is to use the SqlBulkCopy class directly.
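A minimal sketch of that approach, reusing the hypothetical FeedItem class and Items table from the question above (column names are assumptions); SqlBulkCopy streams all rows to the server in one bulk operation instead of issuing individual INSERT statements:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class BulkCopyExample
{
    public static void BulkInsert(string connectionString, List<FeedItem> items)
    {
        // Stage the rows in a DataTable whose columns match the destination table.
        var table = new DataTable();
        table.Columns.Add("Id", typeof(long));
        table.Columns.Add("Symbol", typeof(string));
        table.Columns.Add("Price", typeof(decimal));

        foreach (var item in items)
            table.Rows.Add(item.Id, item.Symbol, item.Price);

        using (var connection = new SqlConnection(connectionString))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "Items";
            bulkCopy.ColumnMappings.Add("Id", "Id");
            bulkCopy.ColumnMappings.Add("Symbol", "Symbol");
            bulkCopy.ColumnMappings.Add("Price", "Price");
            bulkCopy.WriteToServer(table);
        }
    }
}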

Disclaimer: I'm the owner of the project Dapper Plus.

This project provides easy support for the following operations:

  • BulkInsert
  • BulkUpdate
  • BulkDelete
  • BulkMerge

Example:

// configure & map entity dapperplusmanager.entity<order>()                  .table("orders")                  .identity(x => x.id);  // chain & save entity connection.bulkinsert(orders)           .alsoinsert(order => order.items);           .include(x => x.thenmerge(order => order.invoice)                          .alsomerge(invoice => invoice.items))           .alsomerge(x => x.shippingaddress);    
