An apache karaf feature for the liquibase RDBMS schema management library, including slf4j logging.


A Karaf feature for liquibase-core

NOTE! Version 4.15.0 is the first release of the liquibase karaf feature in nearly three years. See Using liquibase from OSGi bundles if you come from version 3.8.0 of this feature, for changes that need to be made.

This project contains a karaf feature for easily using Liquibase from OSGi-based applications running in apache karaf.

If you haven't yet encountered it: liquibase is a really smooth solution for handling your RDBMS schemas. Smooth initial startup, and smooth evolution of schemas (adding columns, adding tables, dropping columns and dropping tables).

Liquibase does the same job as ad-hoc delta script solutions, but liquibase does the job in a clean and robust way, tested and refined over the 11 years of its existence.

Liquibase does pretty much the same thing as flyway but in a different way that fits my programmer's mind better. And liquibase is cross-database capable, i.e. done right it's possible to write schema migrations in ways that will make them work on all databases with a JDBC driver.

Project status

file:https://maven-badges.herokuapp.com/maven-central/no.priv.bang.karaf/liquibase-core-karaf/badge.svg file:https://github.com/steinarb/liquibase-karaf-feature/actions/workflows/liquibase-karaf-feature-maven-ci-build.yml/badge.svg

Release history

| Date | Version | Liquibase version | Liquibase slf4j version | Comment |
|------+---------+-------------------+-------------------------+---------|
| <2024-12-10 Tue 20:31> | 4.30.0 | 4.30.0 | | |
| <2024-07-31 Wed 18:13> | 4.29.0 | 4.29.0 | | |
| <2024-07-03 Wed 22:07> | 4.28.0.1 | 4.28.0 | | Bugfix for 4.28.0 release |
| <2024-07-03 Wed 21:33> | 4.28.0 | 4.28.0 | | Adds new liquibase-runner library |
| <2024-04-05 Fri 22:27> | 4.27.0 | 4.27.0 | | |
| <2023-12-11 Mon 22:33> | 4.24.0 | 4.24.0 | | |
| <2023-12-12 Tue 22:33> | 4.23.2 | 4.23.2 | | |
| <2023-12-11 Mon 21:07> | 4.25.0 | 4.23.1 | | Mistaken release! Sorry! |
| <2023-12-11 Mon 20:44> | 4.23.1 | 4.23.1 | | Integration test had to replace derby with h2 |
| <2023-06-28 Wed 23:53> | 4.23.0 | 4.23.0 | | First OSGi compatible version since 4.19.0 |
| <2023-03-05 Sun 21:10> | 4.19.0 | 4.19.0 | | |
| <2022-10-30 Sun 15:48> | 4.17.1 | 4.17.1 | | |
| <2022-08-20 Sat 19:27> | 4.15.0 | 4.15.0 | | First liquibase 4.x release of the feature |
| <2019-11-18 Mon 21:25> | 3.8.0 | 3.8.0 | 2.0.0 | |
| <2019-11-18 Mon 20:42> | 3.7.0 | 3.7.0 | 2.0.0 | Use snakeyaml 1.23 |
| <2019-11-18 Mon 19:33> | 3.6.3 | 3.6.3 | 2.0.0 | |
| <2019-11-17 Sun 22:58> | 3.6.2 | 3.6.2 | 2.0.0 | |
| <2019-11-17 Sun 22:09> | 3.6.1.1 | 3.6.1 | 2.0.0 | Loads snakeyaml 1.18 instead of 1.17 |
| <2019-11-17 Sun 17:35> | 3.6.1 | 3.6.1 | 2.0.0 | Broken because of wrong snakeyaml version |
| <2019-11-17 Sun 21:27> | 3.6.0.1 | 3.6.0 | 2.0.0 | Loads snakeyaml 1.18 instead of 1.17 |
| <2019-11-17 Sun 16:01> | 3.6.0 | 3.6.0 | 2.0.0 | Broken because of wrong snakeyaml version |
| <2019-11-16 Sat 23:09> | 3.5.5 | 3.5.5 | 2.0.0 | Use version 3.5.1 of maven-bundle-plugin |
| <2019-11-16 Sat 11:28> | 3.5.4 | 3.5.4 | 2.0.0 | Updated pom.xml release config, update karaf to 4.2.7 |
| <2017-08-06 Sun 18:48> | 3.5.3 | 3.5.3 | 2.0.0 | First release with the same version as the liquibase version |
| <2017-08-06 Sun 15:18> | 1.0.2 | 3.5.3 | 2.0.0 | First successful release |
| <2017-08-06 Sun 12:03> | 1.0.1 | 3.5.3 | 2.0.0 | Failed release |
| <2017-08-05 Sat 21:37> | 1.0.0 | 3.5.3 | 2.0.0 | Failed release |

Installing the liquibase feature in karaf

To install this feature:
1. start karaf and give the following commands to the karaf console:
   #+BEGIN_EXAMPLE
     feature:repo-add mvn:no.priv.bang.karaf/liquibase-core-karaf/4.30.0/xml/features
     feature:install liquibase-core
   #+END_EXAMPLE

After this, the liquibase Java API is available to your OSGi applications and the liquibase logging will go to the karaf log.
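
To check that the feature actually came up, you can for instance list the installed features and inspect the log from the karaf console:
#+BEGIN_EXAMPLE
  feature:list | grep liquibase
  log:display
#+END_EXAMPLE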

Using liquibase from a karaf feature

To use liquibase from your own, manually edited, karaf feature, include the feature's feature repository and depend on the liquibase-core feature:


<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0" name="ukelonn.bundle.db.liquibase">
    <repository>mvn:no.priv.bang.karaf/liquibase-core-karaf/4.30.0/xml/features</repository>
    <feature name="ukelonn-db-liquibase">
        <feature>liquibase-core</feature>
    </feature>
</features>

Using liquibase from a generated karaf feature

If you generate your karaf feature repository using the karaf-maven-plugin, you can include the liquibase-core feature into your generated feature repository, by adding this dependency to the maven project building the feature repository:


#+BEGIN_SRC xml
  <dependency>
      <groupId>no.priv.bang.karaf</groupId>
      <artifactId>liquibase-core-karaf</artifactId>
      <version>4.30.0</version>
      <type>xml</type>
      <classifier>features</classifier>
  </dependency>
#+END_SRC

Building the feature for a different version of Liquibase

The version number of this karaf feature is intended to be the same as the Liquibase version it is a feature for.

This makes it simple for me to roll and release a new version of the feature when a new version of Liquibase is out.

But unfortunately this means that a SNAPSHOT version of the feature won't be able to refer to a real Liquibase version... at least not without a little edit:
1. clone this project:
   #+BEGIN_EXAMPLE
     mkdir -p ~/git
     cd ~/git
     git clone https://github.com/steinarb/liquibase-karaf-feature/
   #+END_EXAMPLE
2. edit the pom, changing the liquibase.version property
   #+BEGIN_SRC xml
     <liquibase.version>${project.version}</liquibase.version>
   #+END_SRC
   change it into an actual version
   #+BEGIN_SRC xml
     <liquibase.version>3.5.4</liquibase.version>
   #+END_SRC
3. then build the project with maven (see the sketch after this list for loading the result into karaf):
   #+BEGIN_EXAMPLE
     cd liquibase-karaf-feature
     mvn clean install
   #+END_EXAMPLE
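
The locally built feature repository can then be loaded into a running karaf from the local maven repository, roughly like this (the SNAPSHOT version below is a placeholder; use the version declared in the pom of your checkout):
#+BEGIN_EXAMPLE
  feature:repo-add mvn:no.priv.bang.karaf/liquibase-core-karaf/4.31.0-SNAPSHOT/xml/features
  feature:install liquibase-core
#+END_EXAMPLE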

Test a new version

I have created the project liquibase-sample to test new versions of this karaf feature.

The liquibase-sample has a minimal OSGi component that loads and creates a schema in a derby in-memory database from a liquibase changelog file.

The liquibase-sample application can also be used to verify that the liquibase logs are redirected to the karaf logs.

Using liquibase from OSGi bundles

Liquibase 4 is built internally with an inversion-of-control architecture, and uses java.util.ServiceLoader to find the implementations of its services.

The ServiceLoader doesn't work well with OSGi: it expects a single, flat classloader and a single thread, which is not what OSGi provides.
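
To make the problem concrete, here is the plain-Java ServiceLoader idiom Liquibase builds on, as a minimal sketch (the GreetingService interface is just a made-up stand-in for Liquibase's internal service interfaces):
#+begin_src java
  import java.util.ServiceLoader;

  public class ServiceLoaderIllustration {
      // Made-up SPI interface, standing in for the many internal service
      // interfaces liquibase discovers the same way.
      public interface GreetingService {
          String greet();
      }

      public static void main(String[] args) {
          // ServiceLoader scans META-INF/services/<interface name> entries visible
          // to the classloader it uses (by default the thread context classloader).
          // On a single flat classpath this finds every provider jar; in OSGi each
          // bundle only sees its own classloader, so without a mediator nothing is found.
          ServiceLoader<GreetingService> loader = ServiceLoader.load(GreetingService.class);
          loader.forEach(service -> System.out.println(service.greet()));
      }
  }
#+end_src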

It is possible to make the ServiceLoader work in OSGi, using the Service Loader Mediator.

A single implementation of the service loader mediator exists: Apache Aries SPI Fly.

The liquibase karaf feature created from this project will load SPI Fly at the same start-level as the liquibase-core bundle.

The Require-Capability header of OSGi bundle manifests can be used to start available SPI services, and once started they will behave as regular OSGi services.

The liquibase-core bundle declares all the SPI services it requires in its manifest, so you, as a user, don't have to think about SPI or Apache Aries SPI Fly (but it can be helpful to know what's going on).
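
For the curious, such a Require-Capability header typically looks something like the sketch below (the service interface name is a made-up placeholder; liquibase-core already carries the headers it needs):
#+begin_example
  Require-Capability: osgi.extender;filter:="(osgi.extender=osgi.serviceloader.processor)",
   osgi.serviceloader;filter:="(osgi.serviceloader=com.example.spi.GreetingService)";cardinality:=multiple
#+end_example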

Note! One thing you will need to think about if you use XML formatted Liquibase change logs: Liquibase will need to find the XSD schema files when parsing the change logs.

The XSD files are provided as classpath resources in the liquibase-core OSGi bundle.

But these resources aren't available to other OSGi bundles (classpath resources are local to the bundle they reside in).

This means that all OSGi bundles that parse XML liquibase change logs need to copy the XSD schemas of liquibase-core into their own classpath.

The Include-Resource instruction of the maven-bundle-plugin config below will copy the XSD schemas into the place where liquibase will look for them.

Copy this config into all OSGi bundles that load liquibase XML change logs.

#+BEGIN_SRC xml
  <plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <version>5.1.9</version>
    <configuration>
      <instructions>
        <Include-Resource>=target/classes, /www.liquibase.org/=target/dependency/www.liquibase.org/</Include-Resource>
      </instructions>
    </configuration>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
      <execution>
        <id>copy-liquibase-xsd</id>
        <phase>validate</phase>
        <goals>
          <goal>unpack</goal>
        </goals>
        <configuration>
          <artifactItems>
            <artifactItem>
              <groupId>org.liquibase</groupId>
              <artifactId>liquibase-core</artifactId>
              <includes>**/dbchangelog-3.5.xsd</includes>
            </artifactItem>
          </artifactItems>
        </configuration>
      </execution>
    </executions>
  </plugin>
#+END_SRC

In the above example only dbchangelog-3.5 is copied. If a different schema version is used, that version must be copied instead.

To copy all schemas, change includes to this (Disclaimer: not tested):

#+BEGIN_SRC xml
  <includes>**/*.xsd</includes>
#+END_SRC

Current problems under OSGi

Apart from the issues worked around in the previous section I see the following problems:
1. Starting with liquibase 4.19.1, and fixed in 4.21.0, using liquibase in OSGi failed with the error message
   #+begin_example
     java.lang.NullPointerException: Cannot invoke "liquibase.logging.mdc.MdcManager.put(String, String)" because the return value of "liquibase.Scope.getMdcManager()" is null
         at liquibase.Scope.addMdcValue(Scope.java:416)
   #+end_example
   This was reported as https://github.com/liquibase/liquibase/issues/3910
2. Starting with version 4.21.0 the Liquibase facade stopped working for me. To avoid messages like this:
   #+begin_example
     2022-09-10T13:47:54,302 | ERROR | CM Configuration Updater (ManagedServiceFactory Update: factoryPid=[org.ops4j.datasource]) | HandleregProductionDbLiquibaseRunner | 125 - no.priv.bang.handlereg.db.liquibase.production - 1.0.0.SNAPSHOT | Failed to create handlereg derby test database
     liquibase.exception.LiquibaseException: java.lang.RuntimeException: Cannot end scope cpkebkpkfa when currently at scope bbldyrztji
         at liquibase.Liquibase.runInScope(Liquibase.java:2419) ~[?:?]
         at liquibase.Liquibase.update(Liquibase.java:209) ~[?:?]
         at liquibase.Liquibase.update(Liquibase.java:195) ~[?:?]
         ...
   #+end_example
   the Liquibase facade has to be replaced with ScopeRunner using ThreadLocalScopeManager. I.e. something like this
   #+begin_src java
     @Component(immediate=true, property = "name=sampledb")
     public class SampleDbLiquibaseRunner implements PreHook {

         private Bundle bundle;

         @Activate
         public void activate(BundleContext bundlecontext) {
             this.bundle = bundlecontext.getBundle();
         }

         @Override
         public void prepare(DataSource datasource) throws SQLException {
             try (Connection connection = datasource.getConnection()) {
                 applyLiquibaseChangelist(connection, "sample-db-changelog/db-changelog-1.0.0.xml");
             } catch (LiquibaseException e) {
                 throw new RuntimeException("Error creating sampleapp test database schema", e);
             }
         }

         private void applyLiquibaseChangelist(Connection connection, String changelistClasspathResource) throws LiquibaseException {
             try (Liquibase liquibase = createLiquibaseInstance(connection, changelistClasspathResource)) {
                 liquibase.update("");
             }
         }

         private Liquibase createLiquibaseInstance(Connection connection, String changelistClasspathResource) throws LiquibaseException {
             DatabaseConnection databaseConnection = new JdbcConnection(connection);
             var resourceAccessor = new OSGiResourceAccessor(bundle);
             return new Liquibase(changelistClasspathResource, resourceAccessor, databaseConnection);
         }

     }
   #+end_src
   has to be replaced with something like this:
   #+begin_src java
     @Component(immediate=true, property = "name=sampledb")
     public class SampleDbLiquibaseRunner implements PreHook {

         @Activate
         public void activate(BundleContext bundlecontext) {
             Scope.setScopeManager(new ThreadLocalScopeManager());
         }

         @Override
         public void prepare(DataSource datasource) throws SQLException {
             try (var connection = datasource.getConnection()) {
                 applyLiquibaseChangelist(connection, "sample-db-changelog/db-changelog-1.0.0.xml");
             } catch (Exception e) {
                 throw new RuntimeException("Error creating sampleapp test database schema", e);
             }
         }

         private void applyLiquibaseChangelist(Connection connect, String liquibaseChangeLogClassPathResource) throws Exception, DatabaseException {
             applyLiquibaseChangelist(connect, liquibaseChangeLogClassPathResource, getClass().getClassLoader());
         }

         public void applyLiquibaseChangelist(Connection connect, String liquibaseChangeLogClassPathResource, ClassLoader classLoader) throws Exception {
             try (var database = findCorrectDatabaseImplementation(connect)) {
                 Scope.child(scopeObjectsWithClassPathResourceAccessor(classLoader), () -> new CommandScope(UPDATE)
                     .addArgumentValue(DATABASE_ARG, database)
                     .addArgumentValue(CHANGELOG_FILE_ARG, liquibaseChangeLogClassPathResource)
                     .execute());
             }
         }

         private Database findCorrectDatabaseImplementation(Connection connect) throws DatabaseException {
             return DatabaseFactory.getInstance().findCorrectDatabaseImplementation(new JdbcConnection(connect));
         }

         private Map scopeObjectsWithClassPathResourceAccessor(ClassLoader classLoader) {
             return Map.of(resourceAccessor.name(), new ClassLoaderResourceAccessor(classLoader));
         }

     }
   #+end_src

Testing and debugging in karaf

If the integration test fails in the schema setup, I haven't yet found a way to debug in the integration test itself.

But it is possible to start a karaf process locally, attach an IDE to that karaf process for remote debugging, and then load the same feature as the integration tests.

The procedure is:
1. [[https://karaf.apache.org/get-started][Download a tar-ball or zip file from the newest binary release]], and unpack it
2. cd into the unpacked karaf distro, and start karaf in debug mode:
   #+begin_example
     ./bin/karaf debug
   #+end_example
3. In the IDE create a remote debug configuration attaching to localhost port 5005 and start the debug configuration
4. Also in the IDE, set a breakpoint where you want the debugger to stop (for me it's a point in my own code that occurs in the stack trace I'm trying to debug)
5. Load the same karaf feature as the integration test:
   #+begin_example
     feature:repo-add mvn:no.priv.bang.karaf/karaf.liquibase.sample.datasource.receiver/LATEST/xml/features
     feature:install karaf-liquibase-sample-datasource-receiver
   #+end_example
6. The IDE will stop on the breakpoint in the debugger and it's possible to step through the code
7. If you want to restart:
   1. Disconnect the debugger
   2. Stop karaf with Ctrl-D in the console
   3. delete the data directory in karaf:
      #+begin_example
        rm -rf data
      #+end_example
   4. Start karaf again in debug mode:
      #+begin_example
        ./bin/karaf debug
      #+end_example
   5. Start a remote debug session from the IDE
   6. Use arrow up in the karaf console to rerun the feature:repo-add and feature:install commands for the feature

License

This maven project is licensed with the Apache v 2.0 license.

The details of the license can be found in the LICENSE file.

The liquibase-slf4j jar is covered by the MIT license, copyright 2012-2015 Matt Bertolini. This license and copyright also cover the rebundled version of the jar that results from the "com.mattbertolini.liquibase-slf4j-osgi" maven module.