Best practice of bulk_create for massive records


While bulk_create is useful for saving a small number of records, such as when processing an HTML form, it’s not ideally suited for saving thousands of records. As you have already found out, it’s slow because it needs a lot of memory and sends a very large query to the database. Fortunately, LOAD DATA INFILE comes to the rescue.

The LOAD DATA INFILE statement reads rows from a text file into a
table at a very high speed. LOAD DATA INFILE is the complement of
SELECT … INTO OUTFILE.
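
A minimal sketch of what such a statement looks like, built as a string in Python. The table name `myapp_measurement`, the column list, and the file path are assumptions for illustration only; substitute your own model's table and columns.

```python
# Sketch: build a LOAD DATA INFILE statement for a hypothetical table
# `myapp_measurement` with columns (value, recorded_at).
def load_data_sql(csv_path, table, columns):
    cols = ", ".join(columns)
    return (
        f"LOAD DATA INFILE '{csv_path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' "
        "LINES TERMINATED BY '\\n' "
        f"({cols})"
    )

sql = load_data_sql('/tmp/some.csv', 'myapp_measurement',
                    ['value', 'recorded_at'])
print(sql)
```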

We can produce a suitable file using Python’s csv writer; the following example is adapted from the documentation.

    import csv

    # Note: the documentation example used 'wb' (Python 2);
    # in Python 3, open in text mode with newline='' instead.
    with open('some.csv', 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerows(someiterable)

Finally, as you have already found out, the LOCAL option to LOAD DATA INFILE can sometimes be used for convenience.

LOCAL works only if your server and your client both have been
configured to permit it.

When using this option, the file does not need to be manually transferred to the server. You can generate the CSV file on the client side, and the LOCAL option will cause the mysql client to automatically transfer the file to the server.

If LOCAL is not specified, the file must be located on the server host
and is read directly by the server.
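
Putting the two pieces together, here is a rough end-to-end sketch: write the rows to a CSV on the client, then hand that file to the server with LOAD DATA LOCAL INFILE. The sample rows, the table name `myapp_item`, and its columns are assumptions for illustration; the Django cursor part is shown commented because it requires a live MySQL connection with `local_infile` enabled on both client and server.

```python
import csv
import os
import tempfile

# Hypothetical rows to bulk-load; replace with your own data source.
rows = [(1, 'alpha'), (2, 'beta'), (3, 'gamma')]

# Write the rows to a temporary CSV file on the client.
fd, path = tempfile.mkstemp(suffix='.csv')
os.close(fd)
with open(path, 'w', newline='') as f:
    csv.writer(f).writerows(rows)

# With Django the load itself can be run through a raw cursor
# (assuming a table `myapp_item` with columns id, name):
#
#     from django.db import connection
#     with connection.cursor() as cursor:
#         cursor.execute(
#             "LOAD DATA LOCAL INFILE %s INTO TABLE myapp_item "
#             "FIELDS TERMINATED BY ',' (id, name)", [path])

# Read the file back to show what the server would receive.
with open(path, newline='') as f:
    loaded = list(csv.reader(f))
print(loaded)
os.remove(path)
```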

👤e4c5
