It sounds reproducible - I have created https://liquibase.jira.com/browse/CORE-2510 to track it.
Steve Donie
Principal Software Engineer
Datical, Inc. http://www.datical.com/
Hello,
I have a problem loading data from a file into a MySQL table in 3.4.1.
I am trying to upgrade from 3.0.8; it works fine in all previous versions.
The same problem occurs whether I run it through Spring or the Liquibase command-line tool.
Here is a changeset to create a simple table and populate it with
reference data.
<changeSet author="xxx" id="1429175025865-3" context="run">
    <validCheckSum>7:5bbcdefbcd53b3314f671dcd1e673fa9</validCheckSum>
    <preConditions onFail="MARK_RAN">
        <sqlCheck expectedResult="0">select count(*) from Organisation</sqlCheck>
    </preConditions>
    <comment>Add Organisation table</comment>
    ...
</changeSet>
(I had to put in a 'validCheckSum' tag as the checksum calculation was different from my original checksum generated in liquibase 3.0.8.)
and here is a sample of our data set:

id,approved,title

which continues for 3824 rows.
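For illustration, rows in that format would look something like the following (the values here are invented placeholders, not from the actual data set):

```csv
id,approved,title
1,true,Example Organisation A
2,false,Example Organisation B
```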
The loadData changeset doesn't seem able to cope with more than 50 rows. Here is the error:
There is nothing wrong with the row data itself; it is always the
51st row that is the offender.
If I truncate the data set to 50 rows or fewer, it works fine.
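For context, the loadData change in question would have roughly this shape (the file path, changeset id, and column list below are assumptions for illustration; the original post does not show them):

```xml
<changeSet author="xxx" id="1429175025865-4" context="run">
    <!-- Loads the 3824-row CSV into the Organisation table -->
    <loadData tableName="Organisation" file="data/organisation.csv">
        <column name="id" type="NUMERIC"/>
        <column name="approved" type="BOOLEAN"/>
        <column name="title" type="STRING"/>
    </loadData>
</changeSet>
```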
For now I've just downgraded to 3.3.5, as that is the latest version that loads the data in properly.
Thanks
Richard