Validation Failed running updateSQL from 2 different machines

Hello everybody,
in my current project we need to run updateSQL from IntelliJ (via the plugin) and from a Jenkins process (both, of course, against the same target Oracle DB)

Unfortunately I get “Validation Failed” … I have already set the logicalFilePath to be the same …

Here is what I did.

In the table.xml file:

> <databaseChangeLog xmlns=""
>                    xmlns:xsi=""
>                    xsi:schemaLocation=""
>                    logicalFilePath="src/main/resources/db/scripts/tables/attribute_value.xml">

I ran updateSQL from IntelliJ and got this:

VALUES ('1439890345106-1', 'xxxxx', 'src/main/resources/db/scripts/tables/attribute_value.xml', SYSTIMESTAMP, 92, 
'7:1af9602de533591c505295bb7acf4820', 'createTable', '', 'EXECUTED', NULL, NULL, '3.4.1');

Then I ran the insert statement against the Oracle DB … up to this point everything is OK.

Then I ran the Jenkins build process and got:

Validation Failed:
     1 change sets check sum
          src/main/resources/db/scripts/tables/attribute_value.xml::1439890345106-1::xxxxx was: 7:1af9602de533591c505295bb7acf4820 but is now: 8:64800df46380c744bd812f9c7261406c

Reading around on the internet, I erased the checksum in the DB so it would be recalculated … and the build was OK … but then I ran IntelliJ again and got the same error …
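For reference, rather than editing the stored checksum by hand, Liquibase has a command that nulls out all stored checksums so they are recalculated on the next update run. A sketch of the invocation (the connection details below are placeholders, not from this thread):

```shell
# Null out every MD5SUM in DATABASECHANGELOG; the next update/updateSQL
# run recomputes and stores them. URL/credentials are placeholders.
liquibase \
  --url="jdbc:oracle:thin:@//dbhost:1521/ORCL" \
  --username=scott \
  --password=tiger \
  --changeLogFile=src/main/resources/db/scripts/tables/attribute_value.xml \
  clearCheckSums
```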

Any suggestion?

Hi @clxt1

Sorry if my question sounds very basic, but I am unclear on why you want to manually run updateSQL first and then run the INSERT query. After this step your file is already considered executed, so when you try to run it through your Jenkins process, it results in an error.

What if you run the Jenkins process directly and skip the manual steps?

Rakhi Agrawal

@clxt1 A checksum error indicates that a changeset that has already been deployed to your database was modified. Liquibase uses the MD5SUM column of the DATABASECHANGELOG table to determine whether a modification was made.

Did you modify that changeset after you manually inserted the row into the databasechangelog table?

I know that I’m following a non-conventional approach … but here is the situation …

We are developing our application and continuously changing the DB structure, which means adding changesets. Our code is already in test and acceptance, so now we want to group ALL the changesets (per table) into one big changeset. Normally you would do that by dropping the object and recreating it via Liquibase … but we cannot drop the object.

so the idea was:

  • remove all the changesets from the DATABASECHANGELOG
  • insert the new (single) changeset row into the DATABASECHANGELOG (generated by updateSQL)
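In SQL terms, the idea was roughly the following (a sketch only: the column list is trimmed for the example, and the real INSERT is the one updateSQL generates):

```sql
-- Step 1: remove the old per-change rows for this file.
DELETE FROM DATABASECHANGELOG
 WHERE FILENAME = 'src/main/resources/db/scripts/tables/attribute_value.xml';

-- Step 2: insert the single consolidated row generated by updateSQL.
INSERT INTO DATABASECHANGELOG
    (ID, AUTHOR, FILENAME, DATEEXECUTED, ORDEREXECUTED,
     MD5SUM, DESCRIPTION, COMMENTS, EXECTYPE, LIQUIBASE)
VALUES
    ('1439890345106-1', 'xxxxx',
     'src/main/resources/db/scripts/tables/attribute_value.xml',
     SYSTIMESTAMP, 92, '7:1af9602de533591c505295bb7acf4820',
     'createTable', '', 'EXECUTED', '3.4.1');
```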

I cannot run updateSQL in Jenkins to obtain the changeset signature … so I was thinking of running it with Liquibase under IntelliJ … but the checksum generated in IntelliJ is not “recognized” by the Jenkins update command … so it tries to create the objects again :frowning:

I hope that is clearer now.


I would recommend one of these approaches:

(Note: Liquibase best practice is only 1 DDL per changeset.)

  1. Add a precondition on each changeset that contains an object that may already exist. If it already exists, have the changeset “MARK_RAN”.
  2. Use the markNextChangeSetRan command to mark the next pending changeset as ran in the database that already contains the object.
  3. Use the changelogSync command to mark all pending changesets as ran.
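As a sketch of option 1 (the column definition below is made up for the example; id and author are taken from the generated INSERT above), a `tableExists` precondition with `onFail="MARK_RAN"` records the changeset as ran instead of failing the update when the object is already in place:

```xml
<changeSet id="1439890345106-1" author="xxxxx">
    <!-- If the table already exists, mark this changeset as ran
         instead of failing the update. -->
    <preConditions onFail="MARK_RAN">
        <not>
            <tableExists tableName="ATTRIBUTE_VALUE"/>
        </not>
    </preConditions>
    <createTable tableName="ATTRIBUTE_VALUE">
        <column name="ID" type="NUMBER(19,0)"/>
    </createTable>
</changeSet>
```

Options 2 and 3 are plain CLI calls (`liquibase markNextChangeSetRan` and `liquibase changelogSync`); changelogSync writes the DATABASECHANGELOG rows for you, with checksums computed the same way update would compute them, so there is no need to craft the INSERT by hand.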

I thought there should be a way to have the checksum calculated based only on the id and the “logicalpath/file” … so that it would not matter whether I run it from Liquibase or from Jenkins/Bamboo

The checksum is calculated from the changeset content itself. It does not matter how the changeset is executed.
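As a rough illustration of why the id and path alone are not enough (plain md5 here, not Liquibase’s exact normalization, which also ignores whitespace and the like): hashing the changeset body means any edit to the body yields a different sum.

```shell
# Plain md5 of two slightly different changeset bodies (illustration only;
# Liquibase's real checksum algorithm also normalizes the content).
a=$(printf '%s' '<createTable tableName="attribute_value"/>' | md5sum | cut -d' ' -f1)
b=$(printf '%s' '<createTable tableName="attribute_values"/>' | md5sum | cut -d' ' -f1)
echo "before edit: $a"
echo "after edit:  $b"
```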

well … then explain to me why, if you run the “insert databasechangelog” generated by the Liquibase plugin and then run the Jenkins deployment (on the same DB), the Jenkins one complains about the checksum …