Save data into a MySQL database in batch mode on a transaction basis

I am trying to load data (from a DataFrame) into a MySQL database in batch mode. The load should happen in a single transaction, and the data should be rolled back if the load fails at any stage during batch processing. I've currently coded this using the transaction support in the MySQL package, and would like to check whether this is an efficient approach, or whether other Julia packages are readily available for it.
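For reference, a minimal sketch of what I mean, assuming MySQL.jl with DBInterface.jl: a prepared statement executed batch by batch inside one explicit transaction, with a rollback on any failure. The table name `mytable` and columns `id`/`val` are placeholders, not from my actual schema.

```julia
using MySQL, DBInterface, DataFrames

# Split rows 1:n into index ranges of at most `batchsize` rows each.
batch_ranges(n, batchsize) = [i:min(i + batchsize - 1, n) for i in 1:batchsize:n]

# Load `df` into `mytable` (hypothetical table/column names) inside a single
# transaction; any failure rolls back every batch inserted so far.
function load_in_transaction(conn, df; batchsize = 6_000)
    stmt = DBInterface.prepare(conn, "INSERT INTO mytable (id, val) VALUES (?, ?)")
    DBInterface.execute(conn, "START TRANSACTION")
    try
        for r in batch_ranges(nrow(df), batchsize)
            chunk = view(df, r, :)
            # executemany takes a collection of parameter columns and runs
            # the prepared statement once per row in the batch
            DBInterface.executemany(stmt, (chunk.id, chunk.val))
        end
        DBInterface.execute(conn, "COMMIT")
    catch
        DBInterface.execute(conn, "ROLLBACK")  # undo all batches on failure
        rethrow()
    end
end

# Usage (connection parameters are placeholders):
# conn = DBInterface.connect(MySQL.Connection, "host", "user", "password"; db = "mydb")
# load_in_transaction(conn, df; batchsize = 6_000)
```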

My df consists of around 300k records, and I'm loading batches of 6k records each. I would also like to know whether there is a way to do a multi-insert query (I have another df to load into a different table).
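On the multi-insert idea, one common pattern (not from the thread; a sketch assuming numeric columns and the hypothetical table `mytable`) is to build a single multi-row `INSERT ... VALUES (...), (...)` statement per batch, so each 6k-row batch is one round trip instead of 6k statement executions. String columns would need proper escaping before being interpolated into the SQL.

```julia
using MySQL, DBInterface, DataFrames

# Build the VALUES clause for one batch. This is only safe because the
# columns are assumed numeric; string data must be escaped before being
# interpolated into SQL.
values_clause(ids, vals) = join(("($i, $v)" for (i, v) in zip(ids, vals)), ", ")

# Insert `df` into `mytable` (hypothetical name), one multi-row INSERT
# statement per batch of `batchsize` rows.
function multirow_insert(conn, df; batchsize = 6_000)
    for first in 1:batchsize:nrow(df)
        last = min(first + batchsize - 1, nrow(df))
        sql = "INSERT INTO mytable (id, val) VALUES " *
              values_clause(df.id[first:last], df.val[first:last])
        DBInterface.execute(conn, sql)  # one round trip per batch
    end
end
```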

There is no support in ODBC.jl for batch inserts. I coded it myself, but haven't submitted it to ODBC.jl yet.

@stene Thanks for replying. Sure, I would like to know what method you used to load the data. Did you insert the data using the load() method, or do bulk inserts using execute() or executemany()?

I've also written a while loop that uses indexing to run insert statements in batches. If you don't mind, could I see your code?