Troubleshoot matching specified properties
As a Liquibase user who also uses the Databricks Delta Lake tool, you may encounter the error code [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties when you are using the Delta Lake allowColumnDefaults parameter. This parameter is specifically used with the Databricks ALTER TABLE command when managing tables. It lets you set default values for columns in a Delta table schema and ensures that new rows inserted into the table receive default values for any columns that aren't explicitly given a value. This is helpful when you want certain fields in your table to automatically receive a specific value when one is not provided.
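For example, outside Liquibase this is typically done in Databricks SQL in two steps. The following is a minimal sketch, assuming a hypothetical Delta table my_table with a string column status:
-- Enable the table feature that permits column defaults
ALTER TABLE my_table SET TBLPROPERTIES('delta.feature.allowColumnDefaults' = 'supported');
-- Give the column a default; rows inserted without a value for status receive 'active'
ALTER TABLE my_table ALTER COLUMN status SET DEFAULT 'active';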
Cause
When Liquibase Pro creates an external table from a modeled changeset, it triggers the creation of these three table properties:
delta.feature.allowColumnDefaults=supported
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
Databricks creates metadata for two of these properties in the external location, but not for allowColumnDefaults. If you delete the external table, the files for that table in the external location are not deleted, so the metadata continues to exist. When you recreate the table after deleting it, the error occurs and the properties do not match, because Databricks keeps looking at the old, persisting metadata rather than at the newly created table.
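To illustrate the sequence, here is a sketch in Databricks SQL that reuses the table name and S3 location from the example in the Remedy section below:
-- Create an external Delta table with all three properties
CREATE TABLE main.testSchema.test_table_properties (test_id INT)
USING delta TBLPROPERTIES('delta.feature.allowColumnDefaults' = 'supported', 'delta.columnMapping.mode' = 'name', 'delta.enableDeletionVectors' = true)
LOCATION 's3://databricks-test/test_table_properties';
-- Dropping the external table removes only the metastore entry;
-- the Delta log files at the S3 location remain
DROP TABLE main.testSchema.test_table_properties;
-- Re-running the same CREATE TABLE fails with DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY,
-- because the existing Delta log at that location does not record allowColumnDefaults
CREATE TABLE main.testSchema.test_table_properties (test_id INT)
USING delta TBLPROPERTIES('delta.feature.allowColumnDefaults' = 'supported', 'delta.columnMapping.mode' = 'name', 'delta.enableDeletionVectors' = true)
LOCATION 's3://databricks-test/test_table_properties';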
Error Message
Caused by: com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties at s3://databricks-test/test_table_properties.
== Specified ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
delta.feature.allowcolumndefaults=supported
== Existing ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
This becomes an issue when you run liquibase update, which creates an external table in the CI/CD pipeline. If a failure occurs at any point in the pipeline and you drop the table reference rather than deleting the source files containing the metadata, the files in the external location remain. In this situation, Liquibase Pro can't create a new table because Databricks is still referencing the old metadata properties.
Remedy
Option 1: Manual deletion
Manually delete the table files in the external location.
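Before and after deleting the files, you can check what Databricks still sees at the path by describing the Delta location directly. This is a sketch, assuming your cluster has access to the S3 path from the error message; the properties column of the output shows which table properties are recorded in the existing Delta log:
DESCRIBE DETAIL delta.`s3://databricks-test/test_table_properties`;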
Or
Option 2: Use a default value instead of the allowColumnDefaults property
If you can't delete the table files directly, create the external table using a SQL changetype without specifying the allowColumnDefaults property, and use a default value for the column instead.
1. In this example, you commit this changeset, which contains your Databricks table properties, to your changelog:
<changeSet id="1" author="test">
<createTable tableName="test_table_properties">
<column name="test_id" type="int"/>
<column name="test_column" type="varchar(50)" defaultValue="42"/>
<databricks:extendedTableProperties tableLocation="s3://databricks-test/test_table_properties"/>
</createTable>
</changeSet>
Liquibase generates this SQL and deploys the change successfully. The table and corresponding files are created in the S3 external location as expected:
CREATE TABLE main.testSchema.test_table_properties (test_id INT, test_column VARCHAR(50) DEFAULT '42')
USING delta TBLPROPERTIES('delta.feature.allowColumnDefaults' = 'supported', 'delta.columnMapping.mode' = 'name', 'delta.enableDeletionVectors' = true)
LOCATION 's3://databricks-test/test_table_properties';
2. A rollback procedure drops the table created in step 1. The table reference is deleted, but the files in the S3 external location remain, and Liquibase and Databricks still reference that location and its old metadata. If you try to deploy this changeset to Databricks again, you will see this error message:
Caused by: com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties at s3://databricks-test/test_table_properties.
== Specified ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
delta.feature.allowcolumndefaults=supported
== Existing ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
3. Rewrite this changeset into a SQL change without specifying allowColumnDefaults:
<changeSet id="1" author="test">
<sql>CREATE TABLE main.testSchema.test_table_properties (test_id INT, test_column VARCHAR(50) DEFAULT '42')
USING delta TBLPROPERTIES('delta.columnMapping.mode' = 'name', 'delta.enableDeletionVectors' = true)
LOCATION 's3://databricks-test/test_table_properties';
</sql>
</changeSet>
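After the rewritten changeset deploys, you can optionally confirm that the table properties now match what is stored at the external location, for example:
-- Lists the table properties recorded for the redeployed table
SHOW TBLPROPERTIES main.testSchema.test_table_properties;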