Import: Large Excel data to MySQL database

List of critical points to speed up the Excel import:

1. Use LOAD DATA INFILE. This is a MySQL-specific statement that directly
inserts data into a table from a CSV/TSV file. It is the most efficient way to insert
large amounts of data into a MySQL table (see the example below).

2. Use bulk inserts. This involves creating a single SQL statement that inserts
multiple rows into a table at once, which can be much faster than inserting rows
one at a time (see the sketch after the example below).

3. Optimize your queries. Make sure your queries are well-optimized and do not
perform unnecessary operations; this can make a big difference in the speed of
your inserts (see the EXPLAIN sketch below).

4. Use smaller Excel files. If your Excel file is very large, it may be faster to break
it up into smaller files and then import each file individually.

5. Use a temporary table. You can use a temporary table to stage the data from
your Excel file before inserting it into your main table. This can help improve
performance if your main table is very large (see the sketch after the To-Do process below).

6. Test your queries. Once you have created your queries, test them to make sure
they work correctly and perform as expected.

7. Remove existing indexes. Inserting data into a MySQL table slows down as you
add more and more indexes. Therefore, if you are loading data into a new table, it is
best to load it into a table without any indexes and create the indexes only once the
data has been loaded (see the sketch below).

Example:

LOAD DATA LOCAL INFILE '/path/to/employees.csv'
INTO TABLE employee_info
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name, age, department);
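
For point 2, a minimal sketch of a bulk insert into the same employee_info table; the sample values are made up purely for illustration:

-- One statement inserts several rows at once instead of one INSERT per row.
INSERT INTO employee_info (id, name, age, department) VALUES
  (1, 'Asha', 29, 'Finance'),
  (2, 'Ravi', 34, 'HR'),
  (3, 'Meena', 41, 'IT');

In practice, batching a few hundred to a few thousand rows per statement is typical, while keeping each statement below MySQL's max_allowed_packet limit.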
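
For point 3, one simple check is to run EXPLAIN on the SELECT side of an INSERT ... SELECT before running it for real; the staging table tmp_employee_info and the department filter below are assumptions used only for illustration:

-- Inspect the query plan: look for a full table scan or a missing index
-- before using this SELECT to feed an INSERT ... SELECT.
EXPLAIN
SELECT id, name, age, department
FROM tmp_employee_info
WHERE department = 'Finance';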
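
For point 7, a rough sketch of dropping a secondary index before the load and recreating it afterwards; the index name idx_department is hypothetical:

-- Drop the secondary index before the bulk load so inserts are not slowed down.
ALTER TABLE employee_info DROP INDEX idx_department;

-- ... run LOAD DATA / bulk inserts here ...

-- Recreate the index once the data has been loaded.
ALTER TABLE employee_info ADD INDEX idx_department (department);
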
To-Do Process:

During the Proof of Concept (POC) with 5 lakh (500,000) records of data, we will follow a two-part process:

i. Dumping Data into Temporary Table:

ii. Reading Data from Temporary Table and Data Transformation:

Throughout the POC, we will carefully analyze the performance and efficiency of the
process.
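
A minimal sketch of the two parts, and of point 5 above, assuming a hypothetical staging table tmp_employee_info and a purely illustrative transformation (trimming names, casting age, upper-casing department) while copying into the main table:

-- Part i: dump the CSV data into a temporary (staging) table.
CREATE TEMPORARY TABLE tmp_employee_info (
  id         INT,
  name       VARCHAR(100),
  age        VARCHAR(10),
  department VARCHAR(50)
);

LOAD DATA LOCAL INFILE '/path/to/employees.csv'
INTO TABLE tmp_employee_info
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name, age, department);

-- Part ii: read from the temporary table, transform, and insert into the main table.
INSERT INTO employee_info (id, name, age, department)
SELECT id, TRIM(name), CAST(age AS UNSIGNED), UPPER(department)
FROM tmp_employee_info;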

Prerequisites for File Processing and Data Migration:

1. File Type: Only CSV files are accepted for data processing.

2. File Size: The file size must be below 2 MB.

3. Temporary Tables: Temporary tables will be created during the data migration process,
similar to how Excel creates temporary tables when processing data.

4. Data Loading: Data will be loaded into the temporary tables for computation.

5. Computation: Computation of data will take place during the data migration from the
temporary tables to the main table.

6. Batch Processing: Batch processing of data is required, with batches of 10, 20, or 50
records processed together (see the sketch after this list).

7. Database Connections: Ensure that multiple database connections are not created
during the process to avoid any potential issues or conflicts.
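
Prerequisite 6 asks for batched processing. A rough sketch, assuming the hypothetical tmp_employee_info staging table from the sketch above and a batch size of 50; the keyset pagination on id is also an assumption, not something stated in this document:

-- Copy one batch of 50 rows whose id is greater than the last id already copied.
-- In application code, repeat the INSERT with @last_id advanced to the largest
-- id copied so far, until the SELECT returns no rows.
SET @last_id = 0;

INSERT INTO employee_info (id, name, age, department)
SELECT id, name, age, department
FROM tmp_employee_info
WHERE id > @last_id
ORDER BY id
LIMIT 50;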

Reports:

1.
