I am trying to load data (from a DataFrame) into a MySQL database in batch mode. The load should happen in a single transaction, and the data should be rolled back if the load fails at any stage during batch processing. I've currently coded this using the transaction support in the MySQL.jl package, and I'd like to check whether this is an efficient approach or whether other Julia packages are readily available for this.
My df has around 300k records, and I'm trying to load it in batches of 6k records each. I would also like to know whether there is a way to do a multi-insert query (I have another df whose data needs to be loaded into a different table). A sketch of my current approach is below.
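This is roughly what I have so far: batches of 6k rows inserted via a prepared statement, all wrapped in one explicit `START TRANSACTION` / `COMMIT` / `ROLLBACK`. The connection parameters, table name, and column names are placeholders, and I'm assuming `DBInterface.executemany` accepts parameters as a NamedTuple of column vectors (please correct me if the parameter format is different):

```julia
using DataFrames, MySQL, DBInterface

# Placeholder connection details -- replace with your own.
conn = DBInterface.connect(MySQL.Connection, "localhost", "user", "password"; db = "mydb")

# Stand-in for the real ~300k-row df from the question.
df = DataFrame(id = 1:300_000, value = rand(300_000))

batchsize = 6_000

# Prepared statement matching the target table's columns (table/column names are made up here).
stmt = DBInterface.prepare(conn, "INSERT INTO mytable (id, value) VALUES (?, ?)")

DBInterface.execute(conn, "START TRANSACTION")
try
    for first in 1:batchsize:nrow(df)
        last = min(first + batchsize - 1, nrow(df))
        chunk = df[first:last, :]
        # Run the prepared statement once per row of this 6k-row chunk;
        # parameters passed column-wise as a NamedTuple of vectors.
        DBInterface.executemany(stmt, (id = chunk.id, value = chunk.value))
    end
    DBInterface.execute(conn, "COMMIT")
catch e
    # A failure in any batch rolls back the entire load.
    DBInterface.execute(conn, "ROLLBACK")
    rethrow(e)
finally
    DBInterface.close!(stmt)
end
```

For the second df, I assume I could prepare a second statement against the other table and run it inside the same transaction, but I'm not sure whether that's the recommended way or whether a single multi-table insert is possible.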