c# - What is the proper way to insert millions of records from a csv file into a SQL Server database?


I have to insert 40 million records from a .csv file into a database. Below is the process I followed:

  • Windows service 1:

    • reads the CSV
    • validates the records
    • inserts the valid records into a success table (an intermediate table) using SqlBulkCopy (see the sketch after this list).
  • Windows service 2:

    • gets 10,000 records from the success table at a time
    • runs a foreach over those 10,000 records
    • sends each record to the database as a single insert into the main table.
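
For reference, the service 1 step looks roughly like the sketch below. It is minimal and makes assumptions: the staging table dbo.SuccessTable and its columns (Id, Name, Amount) are placeholders for the real schema, and validation is reduced to a parse check.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

static class CsvBulkLoader
{
    public static void Load(string csvPath, string connectionString)
    {
        // Buffer rows in a DataTable and flush in chunks so 40 million
        // rows never sit in memory at once.
        var buffer = new DataTable();
        buffer.Columns.Add("Id", typeof(int));
        buffer.Columns.Add("Name", typeof(string));
        buffer.Columns.Add("Amount", typeof(decimal));

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.SuccessTable";
            bulk.BulkCopyTimeout = 0; // large loads exceed the 30 s default

            foreach (var line in File.ReadLines(csvPath))
            {
                var parts = line.Split(',');
                // Validation step: skip rows that do not parse.
                if (parts.Length != 3
                    || !int.TryParse(parts[0], out var id)
                    || !decimal.TryParse(parts[2], out var amount))
                    continue;

                buffer.Rows.Add(id, parts[1], amount);
                if (buffer.Rows.Count == 10000)
                {
                    bulk.WriteToServer(buffer);
                    buffer.Clear();
                }
            }

            if (buffer.Rows.Count > 0)
                bulk.WriteToServer(buffer); // final partial chunk
        }
    }
}
```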

Windows service 1 takes 30-40 minutes, but windows service 2 takes 5 hours to complete its task (minimum time). I have a few options and cannot decide which is better, so I am open to suggestions:

  1. creating 4 separate windows services and processing 40,000 records simultaneously
  2. using a job we can watch, in which we can use a while loop
  3. calling the procedure asynchronously from the windows service

My biggest doubt is how transactions in the procedure and async calls would work together, because my assumption is that a transaction locks the table while the other processes need to work on it. Roughly, option 3 would look like the sketch below.
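
A simplified sketch of that idea, where dbo.usp_MoveBatch is a placeholder name for a procedure that moves one batch inside a transaction:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

static class AsyncCaller
{
    static async Task MoveBatchAsync(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.usp_MoveBatch", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            await conn.OpenAsync();
            await cmd.ExecuteNonQueryAsync();
        }
    }

    public static Task RunWorkersAsync(string connectionString, int workers)
    {
        // Several concurrent callers; the worry is that each call's
        // transaction will lock the table the other callers need.
        var tasks = new Task[workers];
        for (int i = 0; i < workers; i++)
            tasks[i] = MoveBatchAsync(connectionString);
        return Task.WhenAll(tasks);
    }
}
```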

I think you are using the wrong tool for this job. C# apps might do the trick, but there is a far more powerful way to do this using Integration Services.

I am guessing here, but are these tables in the middle there to transform or check the data, or maybe to batch the import down?

SSIS can do all of these things, using logging limits and the SQL bulk import tools. I do hospital data imports of around 8,000,000 records each night, and it takes me a matter of minutes, not hours, to do.
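
If you have to stay in C#, the single biggest win is replacing the per-row insert in service 2 with a set-based move per batch. The sketch below (table and column names are assumptions again) moves 10,000 rows per statement; each statement is its own implicit transaction, so locks are held for one chunk at a time rather than the whole run, which also addresses the locking worry above.

```csharp
using System.Data.SqlClient;

static class BatchMover
{
    // DELETE ... OUTPUT INTO moves each chunk atomically: rows leave the
    // staging table and land in the main table in one statement.
    const string MoveBatch = @"
DELETE TOP (10000) FROM dbo.SuccessTable
OUTPUT DELETED.Id, DELETED.Name, DELETED.Amount
INTO dbo.MainTable (Id, Name, Amount);";

    public static void MoveAll(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            int moved;
            do
            {
                using (var cmd = new SqlCommand(MoveBatch, conn))
                {
                    cmd.CommandTimeout = 0;
                    moved = cmd.ExecuteNonQuery(); // rows moved this chunk
                }
            } while (moved > 0);
        }
    }
}
```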

Have a read around how SQL deals with such large data inputs; this article is a good place to start.
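
As a taste of the bulk import route, the server can also ingest the file itself with BULK INSERT. This is a sketch, not your exact setup: the path and terminators are placeholders, and the file must be readable by the SQL Server service account.

```csharp
using System.Data.SqlClient;

static class ServerSideImport
{
    public static void Run(string connectionString)
    {
        // BULK INSERT runs on the server, so the path is resolved there.
        const string sql = @"
BULK INSERT dbo.SuccessTable
FROM 'C:\imports\records.csv'
WITH (FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      BATCHSIZE = 100000,
      TABLOCK);";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.CommandTimeout = 0; // large imports can run long
            cmd.ExecuteNonQuery();
        }
    }
}
```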

