
A Karaf feature for liquibase-core

NOTE! Version 4.15.0 is the first release of the liquibase karaf feature in nearly three years. If you are coming from version 3.8.0 of this feature, see Using liquibase from OSGi bundles for the changes that need to be made.

This project contains a karaf feature for easily using Liquibase from OSGi-based applications running in apache karaf.

If you haven't yet encountered it: liquibase is a really smooth solution for handling your RDBMS schemas. Smooth initial startup, and smooth evolution of schemas (adding columns, adding tables, dropping columns and dropping tables).
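
To give a flavour of what this looks like, here is a minimal sketch of a liquibase changelog with an initial table and a later schema evolution step (the table and column names are invented for illustration, they don't come from any particular project):
#+BEGIN_SRC xml
  <?xml version="1.0" encoding="UTF-8"?>
  <databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
                     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">
      <changeSet id="example-1" author="example">
          <createTable tableName="accounts">
              <column name="account_id" type="integer" autoIncrement="true">
                  <constraints primaryKey="true" nullable="false"/>
              </column>
              <column name="username" type="varchar(64)">
                  <constraints nullable="false" unique="true"/>
              </column>
          </createTable>
      </changeSet>
      <!-- schema evolution: a later changeSet adds a column, the existing changeSet is never edited -->
      <changeSet id="example-2" author="example">
          <addColumn tableName="accounts">
              <column name="email" type="varchar(128)"/>
          </addColumn>
      </changeSet>
  </databaseChangeLog>
#+END_SRC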

Liquibase does the same job as ad-hoc delta script solutions, but liquibase does the job in a clean and robust way, tested and refined over the 11 years of its existence.

Liquibase does pretty much the same thing as flyway but in a different way that fits my programmer's mind better. And liquibase is cross-database capable, i.e. done right it's possible to write schema migrations in ways that will make them work on all databases with a JDBC driver.

Project status

file:https://maven-badges.herokuapp.com/maven-central/no.priv.bang.karaf/liquibase-core-karaf/badge.svg file:https://github.com/steinarb/liquibase-karaf-feature/actions/workflows/liquibase-karaf-feature-maven-ci-build.yml/badge.svg

Release history

| Date                   | Version | Liquibase version | Liquibase slf4j version | Comment                                                      |
|------------------------+---------+-------------------+-------------------------+--------------------------------------------------------------|
| <2023-12-11 Mon 22:33> | 4.24.0  | 4.24.0            |                         |                                                              |
| <2023-12-11 Mon 22:23> | 4.23.2  | 4.23.2            |                         |                                                              |
| <2023-12-11 Mon 21:07> | 4.25.0  | 4.23.1            |                         | Mistaken release! Sorry!                                     |
| <2023-12-11 Mon 20:44> | 4.23.1  | 4.23.1            |                         | Integration test had to replace derby with h2                |
| <2023-06-28 Wed 23:53> | 4.23.0  | 4.23.0            |                         | First OSGi compatible version since 4.19.0                   |
| <2023-03-05 Sun 21:10> | 4.19.0  | 4.19.0            |                         |                                                              |
| <2022-10-30 Sun 15:48> | 4.17.1  | 4.17.1            |                         |                                                              |
| <2022-08-20 Sat 19:27> | 4.15.0  | 4.15.0            |                         | First liquibase 4.x release of the feature                   |
| <2019-11-18 Mon 21:25> | 3.8.0   | 3.8.0             | 2.0.0                   |                                                              |
| <2019-11-18 Mon 20:42> | 3.7.0   | 3.7.0             | 2.0.0                   | Use snakeyaml 1.23                                           |
| <2019-11-18 Mon 19:33> | 3.6.3   | 3.6.3             | 2.0.0                   |                                                              |
| <2019-11-17 Sun 22:58> | 3.6.2   | 3.6.2             | 2.0.0                   |                                                              |
| <2019-11-17 Sun 22:09> | 3.6.1.1 | 3.6.1             | 2.0.0                   | Loads snakeyaml 1.18 instead of 1.17                         |
| <2019-11-17 Sun 17:35> | 3.6.1   | 3.6.1             | 2.0.0                   | Broken because of wrong snakeyaml version                    |
| <2019-11-17 Sun 21:27> | 3.6.0.1 | 3.6.0             | 2.0.0                   | Loads snakeyaml 1.18 instead of 1.17                         |
| <2019-11-17 Sun 16:01> | 3.6.0   | 3.6.0             | 2.0.0                   | Broken because of wrong snakeyaml version                    |
| <2019-11-16 Sat 23:09> | 3.5.5   | 3.5.5             | 2.0.0                   | Use version 3.5.1 of maven-bundle-plugin                     |
| <2019-11-16 Sat 11:28> | 3.5.4   | 3.5.4             | 2.0.0                   | Updated pom.xml release config, update karaf to 4.2.7        |
| <2017-08-06 Sun 18:48> | 3.5.3   | 3.5.3             | 2.0.0                   | First release with the same version as the liquibase version |
| <2017-08-06 Sun 15:18> | 1.0.2   | 3.5.3             | 2.0.0                   | First successful release                                     |
| <2017-08-06 Sun 12:03> | 1.0.1   | 3.5.3             | 2.0.0                   | Failed release                                               |
| <2017-08-05 Sat 21:37> | 1.0.0   | 3.5.3             | 2.0.0                   | Failed release                                               |
Installing the liquibase feature in karaf

To install this feature:
 1. start karaf and give the following commands to the karaf console:
    #+BEGIN_EXAMPLE
      feature:repo-add mvn:no.priv.bang.karaf/liquibase-core-karaf/4.24.0/xml/features
      feature:install liquibase-core
    #+END_EXAMPLE

After this, the liquibase Java API is available to your OSGi applications and the liquibase logging will go to the karaf log.

Using liquibase from a karaf feature

To use liquibase from your own, manually edited, karaf feature, include this project's feature repository and depend on the liquibase-core feature:


#+BEGIN_SRC xml
  <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
  <features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0" name="ukelonn.bundle.db.liquibase">
      <repository>mvn:no.priv.bang.karaf/liquibase-core-karaf/4.24.0/xml/features</repository>
      <feature name="ukelonn-db-liquibase">
          <feature>liquibase-core</feature>
      </feature>
  </features>
#+END_SRC

Using liquibase from a generated karaf feature

If you generate your karaf feature repository using the karaf-maven-plugin, you can include the liquibase-core feature in your generated feature repository by adding this dependency to the maven project building the feature repository:


#+BEGIN_SRC xml
  <dependency>
      <groupId>no.priv.bang.karaf</groupId>
      <artifactId>liquibase-core-karaf</artifactId>
      <version>4.24.0</version>
      <type>xml</type>
      <classifier>features</classifier>
  </dependency>
#+END_SRC

Building the feature for a different version of Liquibase

The version number of this karaf feature is intended to be the same as the Liquibase version it is a feature for.

This makes it simple for me to roll and release a new version of the feature when a new version of Liquibase is out.

But unfortunately this means that a SNAPSHOT version of the feature won't be able to refer to a real Liquibase version... at least not without a little edit:
 1. clone this project:
    #+BEGIN_EXAMPLE
      mkdir -p ~/git
      cd ~/git
      git clone https://github.com/steinarb/liquibase-karaf-feature/
    #+END_EXAMPLE
 2. edit the pom, changing the liquibase.version property
    #+BEGIN_SRC xml
      <liquibase.version>${project.version}</liquibase.version>
    #+END_SRC
    into an actual version
    #+BEGIN_SRC xml
      <liquibase.version>3.5.4</liquibase.version>
    #+END_SRC
 3. then build the project with maven:
    #+BEGIN_EXAMPLE
      cd liquibase-karaf-feature
      mvn clean install
    #+END_EXAMPLE

Test a new version

I have created the project liquibase-sample to test new versions of this karaf feature.

The liquibase-sample has a minimal OSGi component that loads and creates a schema in a derby in-memory database from a liquibase changelog file.

The liquibase-sample application can also be used to verify that the liquibase logs are redirected to the karaf logs.

Using liquibase from OSGi bundles

Liquibase 4 is built internally with an inversion-of-control architecture, and uses java.util.ServiceLoader to find the implementations of its services.

The ServiceLoader doesn't work well with OSGi: it expects a single, flat classloader and a single thread, and this is not what OSGi provides.

It is possible to make the ServiceLoader work in OSGi, using the Service Loader Mediator.

A single implementation of the service loader mediator exists: Apache Aries SPI Fly.

The liquibase karaf feature created from this project will load SPI Fly at the same start-level as the liquibase-core bundle.

But to make liquibase usage work, the bundles that instantiate the Liquibase classes and load and parse the changelog files need to do two things:
 1. Add a Require-Capability header that requires all capabilities provided by the liquibase-core bundle (this must be done for all bundles that need to instantiate Liquibase classes)
 2. Extract the XSD file(s) for the changelog schema version(s) used and add them to the bundle doing the parsing (this must be done for all bundles that XML files are loaded from)

In a maven build, both can be handled with plugin configuration along these lines:

#+begin_src xml
  <plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <version>5.1.8</version>
    <configuration>
      <instructions>
        <Require-Capability>
          osgi.extender; filter:="(osgi.extender=osgi.serviceloader.processor)",
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.serializer.ChangeLogSerializer)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.parser.NamespaceDetails)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.database.Database)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.change.Change)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.database.DatabaseConnection)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.precondition.Precondition)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.serializer.SnapshotSerializer)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.configuration.AutoloadedConfigurations)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.diff.DiffGenerator)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.lockservice.LockService)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.changelog.ChangeLogHistoryService)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.datatype.LiquibaseDataType)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.configuration.ConfigurationValueProvider)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.logging.LogService)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.snapshot.SnapshotGenerator)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.parser.ChangeLogParser)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.servicelocator.ServiceLocator)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.diff.compare.DatabaseObjectComparator)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.command.LiquibaseCommand)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.license.LicenseService)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.diff.output.changelog.ChangeGenerator)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.executor.Executor)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.structure.DatabaseObject)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.parser.SnapshotParser)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.hub.HubService)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.command.CommandStep)"; cardinality:=multiple,
          osgi.serviceloader; filter:="(osgi.serviceloader=liquibase.sqlgenerator.SqlGenerator)"; cardinality:=multiple
        </Require-Capability>
        <Include-Resource>
          =target/classes,
          /www.liquibase.org/=target/dependency/www.liquibase.org/
        </Include-Resource>
      </instructions>
    </configuration>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
      <execution>
        <id>copy-liquibase-xsd</id>
        <phase>validate</phase>
        <goals>
          <goal>unpack</goal>
        </goals>
        <configuration>
          <artifactItems>
            <artifactItem>
              <groupId>org.liquibase</groupId>
              <artifactId>liquibase-core</artifactId>
            </artifactItem>
          </artifactItems>
          <includes>**/dbchangelog-3.5.xsd</includes>
        </configuration>
      </execution>
    </executions>
  </plugin>
#+end_src

In the above example only dbchangelog-3.5 is copied. If a different schema version is used, that version must be copied instead.

To copy all schemas, change includes to this (Disclaimer: not tested):
#+begin_src xml
  <includes>**/*.xsd</includes>
#+end_src

Current problems under OSGi

Apart from the issues worked around in the previous section I see the following problems:
 1. These messages are still present:
    #+begin_example
      2023-12-12T07:45:21,915 | INFO | features-3-thread-1 | servicelocator | 93 - org.liquibase.core - 4.24.0 | Cannot load service: liquibase.change.Change: liquibase.change.core.LoadDataChange Unable to get public no-arg constructor
      2023-12-12T07:45:21,916 | INFO | features-3-thread-1 | servicelocator | 93 - org.liquibase.core - 4.24.0 | Cannot load service: liquibase.change.Change: liquibase.change.core.LoadUpdateDataChange Unable to get public no-arg constructor
    #+end_example
 2. Starting with liquibase 4.19.1, and fixed in 4.21.0, using liquibase in OSGi failed with the error message
    #+begin_example
      java.lang.NullPointerException: Cannot invoke "liquibase.logging.mdc.MdcManager.put(String, String)" because the return value of "liquibase.Scope.getMdcManager()" is null
          at liquibase.Scope.addMdcValue(Scope.java:416)
    #+end_example
    This was reported as https://github.com/liquibase/liquibase/issues/3910
 3. Starting with version 4.21.0 the Liquibase facade stopped working for me. To avoid messages like this:
    #+begin_example
      2022-09-10T13:47:54,302 | ERROR | CM Configuration Updater (ManagedServiceFactory Update: factoryPid=[org.ops4j.datasource]) | HandleregProductionDbLiquibaseRunner | 125 - no.priv.bang.handlereg.db.liquibase.production - 1.0.0.SNAPSHOT | Failed to create handlereg derby test database
      liquibase.exception.LiquibaseException: java.lang.RuntimeException: Cannot end scope cpkebkpkfa when currently at scope bbldyrztji
          at liquibase.Liquibase.runInScope(Liquibase.java:2419) ~[?:?]
          at liquibase.Liquibase.update(Liquibase.java:209) ~[?:?]
          at liquibase.Liquibase.update(Liquibase.java:195) ~[?:?]
          ...
    #+end_example
    the Liquibase facade has to be replaced with a ScopedRunner using ThreadLocalScopeManager, i.e. something like this
    #+begin_src java
      @Component(immediate=true, property = "name=sampledb")
      public class SampleDbLiquibaseRunner implements PreHook {

          private Bundle bundle;

          @Activate
          public void activate(BundleContext bundlecontext) {
              this.bundle = bundlecontext.getBundle();
          }

          @Override
          public void prepare(DataSource datasource) throws SQLException {
              try (Connection connection = datasource.getConnection()) {
                  applyLiquibaseChangelist(connection, "sample-db-changelog/db-changelog-1.0.0.xml");
              } catch (LiquibaseException e) {
                  throw new RuntimeException("Error creating sampleapp test database schema", e);
              }
          }

          private void applyLiquibaseChangelist(Connection connection, String changelistClasspathResource) throws LiquibaseException {
              try (Liquibase liquibase = createLiquibaseInstance(connection, changelistClasspathResource)) {
                  liquibase.update("");
              }
          }

          private Liquibase createLiquibaseInstance(Connection connection, String changelistClasspathResource) throws LiquibaseException {
              DatabaseConnection databaseConnection = new JdbcConnection(connection);
              var resourceAccessor = new OSGiResourceAccessor(bundle);
              return new Liquibase(changelistClasspathResource, resourceAccessor, databaseConnection);
          }
      }
    #+end_src
    has to be replaced with something like this:
    #+begin_src java
      @Component(immediate=true, property = "name=sampledb")
      public class SampleDbLiquibaseRunner implements PreHook {

          private Bundle bundle;

          @Activate
          public void activate(BundleContext bundlecontext) {
              Scope.setScopeManager(new ThreadLocalScopeManager());
              this.bundle = bundlecontext.getBundle();
          }

          @Override
          public void prepare(DataSource datasource) throws SQLException {
              try (Connection connection = datasource.getConnection()) {
                  applyLiquibaseChangelist(connection, "sample-db-changelog/db-changelog-1.0.0.xml");
              } catch (Exception e) {
                  throw new RuntimeException("Error creating sampleapp test database schema", e);
              }
          }

          private void applyLiquibaseChangelist(Connection connection, String changelistClasspathResource) throws Exception {
              var database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(new JdbcConnection(connection));
              Map<String, Object> scopeObjects = Map.of(
                  Scope.Attr.database.name(), database,
                  Scope.Attr.resourceAccessor.name(), new OSGiResourceAccessor(bundle));

              Scope.child(scopeObjects, (ScopedRunner) () -> new CommandScope("update")
                  .addArgumentValue(DbUrlConnectionCommandStep.DATABASE_ARG, database)
                  .addArgumentValue(UpdateCommandStep.CHANGELOG_FILE_ARG, changelistClasspathResource)
                  .addArgumentValue(DatabaseChangelogCommandStep.CHANGELOG_PARAMETERS, new ChangeLogParameters(database))
                  .execute());
          }
      }
    #+end_src

Testing and debugging in karaf

If the integration test fails in the schema setup, I haven't yet found a way to debug inside the integration test itself.

But it is possible to start a karaf process locally, attach an IDE to that karaf process for remote debugging, and then load the same feature as the integration tests.

The procedure is:
 1. [[https://karaf.apache.org/get-started][Download a tar-ball or zip file from the newest binary release, and unpack it]]
 2. cd into the unpacked karaf distro, and start karaf in debug mode:
    #+begin_example
      ./bin/karaf debug
    #+end_example
 3. In the IDE create a remote debug configuration attaching to localhost port 5005 and start the debug configuration
 4. Also in the IDE, set a breakpoint where you want the debugger to stop (for me it's a point in my own code that occurs in the stack trace I'm trying to debug)
 5. Load the same karaf feature as the integration test:
    #+begin_example
      feature:repo-add mvn:no.priv.bang.karaf/karaf.liquibase.sample.datasource.receiver/LATEST/xml/features
      feature:install karaf-liquibase-sample-datasource-receiver
    #+end_example
 6. The IDE will stop on the breakpoint in the debugger and it's possible to step through the code
 7. If you want to restart:
    1. Disconnect the debugger
    2. Stop karaf with Ctrl-D in the console
    3. delete the data directory in karaf:
       #+begin_example
         rm -rf data
       #+end_example
    4. Start karaf again in debug mode:
       #+begin_example
         ./bin/karaf debug
       #+end_example
    5. Start a remote debug session from the IDE
    6. Use arrow up in the karaf console to rerun the feature:repo-add and feature:install commands for the feature

License

This maven project is licensed with the Apache v 2.0 license.

The details of the license can be found in the LICENSE file.

The liquibase-slf4j jar is covered by the MIT license, copyright 2012-2015 Matt Bertolini. This license and copyright also cover the rebundled version of the jar that results from the "com.mattbertolini.liquibase-slf4j-osgi" maven module.