Hi, I’m trying to use Liquibase to import data into a large database table. I’m running with this command:
liquibase --driver=oracle.jdbc.OracleDriver ^
The changeset uses loadData to import a CSV file. The CSV I’m importing is 650 MB. Whenever I run the update, I get an OutOfMemoryError (after a couple of hours of waiting).
Both the changeset and the CSV were generated with generateChangeLog. So full marks to Liquibase for at least being able to extract the large table!
I’m running with -Xmx2048m, and Liquibase 3.3.2.
I’m considering breaking the CSV into multiple smaller files and multiple changesets, but I’m unsure how small the CSVs would need to be. Does anyone have any info on the largest CSV file size Liquibase’s loadData can handle?
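In case it helps frame the question, here is a rough sketch of the splitting approach I have in mind. This is just illustrative Python, not anything Liquibase provides; the function name `split_csv` and the `chunk_###.csv` naming are my own invention. The key detail is repeating the header row in every chunk, since loadData reads column names from the header; each resulting file would then get its own loadData changeset.

```python
import csv
import os

def split_csv(src_path, rows_per_chunk, out_dir):
    """Split a large CSV into smaller files, repeating the header row in
    each chunk so every file remains a valid input for loadData."""
    os.makedirs(out_dir, exist_ok=True)
    chunks = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # loadData needs the column names in every file
        rows = []
        for row in reader:
            rows.append(row)
            if len(rows) == rows_per_chunk:
                chunks.append(_write_chunk(out_dir, len(chunks), header, rows))
                rows = []
        if rows:  # final partial chunk
            chunks.append(_write_chunk(out_dir, len(chunks), header, rows))
    return chunks

def _write_chunk(out_dir, idx, header, rows):
    # Number chunks so the generated changesets sort predictably.
    path = os.path.join(out_dir, f"chunk_{idx + 1:03d}.csv")
    with open(path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
    return path
```

Rows are buffered per chunk rather than streamed row-by-row, which keeps the code simple; with a sensible `rows_per_chunk` the buffer stays far smaller than the 650 MB source file.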
Has anyone else encountered and resolved this kind of problem?
Alternatively, I could exclude this table from my Liquibase scripts, as it’s the only one causing issues.
Thanks in advance,