Hello
I can’t determine whether this is a SQL Server bug or a Liquibase 2.0.5 bug…
We have two schemas in our SQL Server (2008 R2) database: one is dbo, the other is L1 (as an example), with the same tables in each.
I have a changelog that tries to add a column to a table if it does not already exist (a not/columnExists precondition).
Running the changelog for the dbo schema works fine: the column is added to the table and the dbo.DATABASECHANGELOG record is marked as EXECUTED.
Running the changelog for the L1 schema does not: the changeset ends up as MARK_RAN in L1.DATABASECHANGELOG instead of EXECUTED, and the column is not added to the table.
To be precise: each update is run as a different user, and each user has his respective schema (dbo or L1) as his default schema in SQL Server.
Digging in with the debugger, I found that when Liquibase tests for the column’s existence (ColumnExistsPrecondition.check()) it passes a null schema to the underlying JDBC driver (the same with the Microsoft and jTDS drivers), and the driver then checks for the column in the dbo schema anyway instead of in the user’s default schema (L1 in my case).
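To make that concrete, here is a minimal standalone sketch of the kind of metadata lookup the precondition ends up doing; the connection URL, credentials and the MYDB/MY_TABLE/MY_COLUMN names are placeholders, not values from my actual setup:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class NullSchemaLookup {
    public static void main(String[] args) throws Exception {
        // Connect as the L1 user (URL, credentials and object names are placeholders).
        try (Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://localhost;databaseName=MYDB", "l1_user", "***")) {
            DatabaseMetaData md = con.getMetaData();
            // The schema argument is null, which is what Liquibase ends up passing;
            // in my case the lookup behaves as if only dbo were checked, rather than
            // the connected user's default schema (L1).
            try (ResultSet rs = md.getColumns(null, null, "MY_TABLE", "MY_COLUMN")) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM") + "."
                            + rs.getString("TABLE_NAME") + "." + rs.getString("COLUMN_NAME"));
                }
            }
        }
    }
}
```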
I found a workaround by modifying the Liquibase MSSQLDatabase implementation, forcing getDefaultDatabaseSchemaName() to return super.getDefaultSchemaName() instead of null (…).
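Roughly, my local patch looks like this (a sketch only, not part of the official sources; the exact signature may differ in your Liquibase version):

```java
// Sketch of the local patch to liquibase.database.core.MSSQLDatabase (Liquibase 2.0.5)
@Override
public String getDefaultDatabaseSchemaName() throws DatabaseException {
    // The stock implementation returned null here, which is what ends up being
    // handed to the JDBC driver as the schema. Returning the configured default
    // schema instead makes the columnExists precondition look in the schema of
    // the user running the update (L1 in my case) rather than in dbo.
    return super.getDefaultSchemaName();
}
```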
By the way, does anyone know whether it is normal SQL Server behaviour to fall back to dbo when a null schema is passed?
IMO: in any case, if a schema name is not provided in a changelog, Liquibase should ensure that it resolves to the user’s default schema (regardless of the underlying database); that would give us a normalized way of accessing database objects.