In-application JRebel support

JRebel is very useful during development. Sometimes, however, you know you are doing things which are not compatible with live class reloading (like caching the results of reflection). In that case it is very useful to be able to hook into JRebel and be notified of class changes.
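As a minimal sketch of the kind of caching that breaks under class reloading (the names here are hypothetical, not from the application):

```java
import java.lang.reflect.Method;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical example: a static cache of reflection lookups. After JRebel
// reloads a class, the cached Method objects no longer match the new class
// version, so the cache must be cleared.
public class MethodCache {

    private static final Map<Class<?>, Method[]> CACHE = new ConcurrentHashMap<>();

    static Method[] methodsOf(Class<?> type) {
        return CACHE.computeIfAbsent(type, Class::getDeclaredMethods);
    }

    /** Called (for example from a JRebel plugin) when a class was reloaded. */
    static void clear() {
        CACHE.clear();
    }

    public static void main(String[] args) {
        // cached: both calls return the same array instance
        System.out.println(methodsOf(String.class) == methodsOf(String.class)); // true
        clear();
    }
}
```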

Unfortunately the guide for writing a plugin does not work entirely: the plugin metadata is not picked up when JRebel loads (probably because the application is not yet on the classpath at that point).
I worked around this using Spring (as Spring is used in this application, it would work exactly the same using CDI).
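For reference, the standard discovery mechanism (the one that did not work here) is, as far as I know, a manifest attribute in the plugin jar pointing at the Plugin implementation class:

```
JRebel-Plugin: my.package.ReflectionDataServiceJrebelPlugin
```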

First I have my service which caches reflection data in static variables. I included a package-local clear() method to allow clearing the caches.

@Component
public class ReflectionDataService {
 
    private static final Map<X,Y> MY_CACHED_DATA = ...;
 
    /**
     * Allow cached data to be cleared by JRebel plugin.
     */
    static void clear() {
        MY_CACHED_DATA.clear();
    }
 
    // ... normal implementation ...
}

The JRebel plugin itself is quite simple. In this case it has to react to any class which is annotated with @Path.

public class ReflectionDataServiceJrebelPlugin implements Plugin {
 
    @Override
    public void preinit() {
        registerListener();
    }
 
    private void registerListener() {
        // Set up the reload listener
        ReloaderFactory.getInstance().addClassReloadListener(
                new ClassEventListener() {
                    public void onClassEvent(int eventType, Class klass) {
 
                        try {
                            if (klass.isAnnotationPresent(Path.class)) {
                                ReflectionDataService.clear();
                            }
                        } catch (Exception e) {
                            LoggerFactory.getInstance().error(e);
                        }
                    }
 
                    public int priority() {
                        return 0;
                    }
                }
        );
 
    }
 
    @Override
    public boolean checkDependencies(ClassLoader classLoader, ClassResourceSource classResourceSource) {
        return classResourceSource.getClassResource("my.package.ReflectionDataService") != null;
    }
 
    @Override
    public String getId() {
        return "MyJrebelPlugin";
    }
 
    @Override
    public String getName() {
        return "My JRebel Plugin";
    }
 
    @Override
    public String getDescription() {
        return "Reload ReflectionDataService.";
    }
 
    @Override
    public String getAuthor() {
        return null;
    }
 
    @Override
    public String getWebsite() {
        return null;
    }
 
}

Now I have to make sure that this plugin is loaded. The plugin discovery system cannot find it, so I have to register it later. I provided a bean to register the JRebel plugin (specifically the class reload listener).

@Component
public class ReflectionDataServiceJrebelLoader {
 
    private static final Logger LOG = LoggerFactory.getLogger(ReflectionDataServiceJrebelLoader.class);
 
    /**
     * Load the JRebel plugin when JRebel is available.
     */
    @PostConstruct
    public void preinit() {
        try {
            ClassLoader cl = Thread.currentThread().getContextClassLoader();
            Class<?> plugin = cl.loadClass("my.package.ReflectionDataServiceJrebelPlugin");
            Method method = plugin.getMethod("preinit"); // throws NoSuchMethodException when absent
            method.invoke(plugin.newInstance());
        } catch (Throwable ex) {
            LOG.debug("JRebel not found - will not clear the reflection data cache automatically.");
        }
    }
}

EE integration testing using Arquillian

Writing tests in an EE environment is a challenge. Compared to the test support in the Spring framework, some extra work is needed.

Fortunately though, thanks to Arquillian, there is hope.

A lot of the complications are related to the difference in environment. In the Spring framework, you include all the libraries in your application, which means everything is readily available. In EE you need a container which provides facilities that are not directly included. Arquillian fixes this by allowing you to run your tests inside the container.

An Arquillian test is almost the same as a non-Arquillian test, except that you also have to build the deployment package to run the test. While the general recommendation seems to be to make such a package as small as possible, I believe this can be a challenge as you do need all dependencies of the minimal classes needed for the test. To my mind it seems best to simply deploy the full package under development (jar or war) to run your tests.

The situation in Spring is similar. While it is easy to have a different context for each test, in practice it is better to reduce the number of contexts used for running tests, partly to make sure there are no dangling dependencies and partly to reduce the run time of your test suite.

In Arquillian the complexity is again slightly increased as you also need to include all project dependencies (as in “other” jars). You have to specify the project classes to include and the location of all dependencies.

In my application I use a base class for all integration tests. The dependency details are picked up from the Maven pom (including sub-dependencies – unfortunately just including all dependencies at once did not work). For the project classes themselves, I use a helper which simply includes all classes of a package, including the sub-packages found in the source. You also need to make sure all required resources are included; here again they are scanned from the source.

@RunWith(Arquillian.class)
public abstract class AbstractIT {
 
    private static final String SOURCES_DIR = "src/main/java";
    private static final String RESOURCES_DIR = "src/main/resources";
 
    @Deployment
    public static Archive<?> createDeployment() throws Exception {
        // trick to make this work from both the main and the module root directory
        String pathPrefix = "";
        File currentDir = new File(".");
        if (!currentDir.getCanonicalPath().endsWith("back-impl")) {
            pathPrefix = "back-impl/";
        }
        PomEquippedResolveStage pom = Maven.resolver().loadPomFromFile(pathPrefix + "pom.xml");
        File[] commonsLang = pom.resolve("org.apache.commons:commons-lang3").withTransitivity().asFile();
        File[] appApi = pom.resolve("be.myapp:myapp-api").withTransitivity().asFile();
        File[] jTransfoCore = pom.resolve("org.jtransfo:jtransfo-core").withTransitivity().asFile();
        File[] jTransfoCdi = pom.resolve("org.jtransfo:jtransfo-cdi").withTransitivity().asFile();
        File[] queryDsl = pom.resolve("com.mysema.querydsl:querydsl-jpa").withTransitivity().asFile();
        File[] flyway = pom.resolve("com.googlecode.flyway:flyway-core").withTransitivity().asFile();
        File[] festAssert = pom.resolve("org.easytesting:fest-assert").withTransitivity().asFile();
        File[] httpclient = pom.resolve("org.apache.httpcomponents:httpclient").withTransitivity().asFile();
        File[] deltaspikeCore = pom.resolve("org.apache.deltaspike.core:deltaspike-core-impl").withTransitivity().asFile();
        File[] deltaspikeData = pom.resolve("org.apache.deltaspike.modules:deltaspike-data-module-impl").withTransitivity().asFile();
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war").
                addAsLibraries(commonsLang).
                addAsLibraries(appApi).
                addAsLibraries(jTransfoCore).
                addAsLibraries(jTransfoCdi).
                addAsLibraries(queryDsl).
                addAsLibraries(flyway).
                addAsLibraries(festAssert).
                addAsLibraries(httpclient).
                addAsLibraries(deltaspikeCore).
                addAsLibraries(deltaspikeData).
                addAsResource("META-INF/persistence.xml").
                addAsResource("META-INF/beans.xml");
        addAllPackages(war, "be.fluxtock.back.impl", new File(pathPrefix + SOURCES_DIR + "/be/fluxtock/back/impl"));
        addAllResources(war, pathPrefix + RESOURCES_DIR);
        return war;
    }
 
    /**
     * Add all packages starting with given prefix from given path.
     *
     * @param war war archive to add packages to
     * @param prefix base package
     * @param dir directory for the base package
     */
    private static void addAllPackages(WebArchive war, String prefix, File dir) {
        war.addPackage(prefix);
        for (File file : dir.listFiles(File::isDirectory)) {
            addAllPackages(war, prefix + "." + file.getName(), file);
        }
    }
 
    /**
     * Add all resources from the given directory, recursively. Only adds
     * subdirectories when they start with a lower case letter
     *
     * @param war war archive to add packages to
     * @param directory directory with resources to add
     */
    private static void addAllResources(WebArchive war, String directory) {
        for (File file : new File(directory).listFiles(pathname -> pathname.isFile() || Character.isLowerCase(pathname.getName().charAt(0)))) {
            addAllResources(war, "", file);
        }
    }
 
    private static void addAllResources(WebArchive war, String prefix, File dir) {
        if (dir.isDirectory()) {
            prefix += dir.getName() + "/";
            for (File file : dir.listFiles()) {
                addAllResources(war, prefix, file);
            }
        } else {
            war.addAsResource(dir, prefix + dir.getName());
        }
    }
 
}

To allow the dependencies to be resolved from the pom, you need an extra dependency.

<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-impl-maven</artifactId>
    <version>2.0.0</version>
    <scope>test</scope>
</dependency>

When using a database to run your tests, you also need to make sure your database is not polluted by your tests. Something similar to Spring’s AbstractTransactionalJUnit4SpringContextTests can be built quite easily.

/**
 * Transactional integration test. Transaction is rolled back at the end.
 */
public abstract class AbstractRollbackIT extends AbstractIT {
 
    @PersistenceContext
    EntityManager em;
 
    @Inject
    UserTransaction utx;
 
    @Before
    public void setUp() throws Exception {
        utx.begin();
        em.joinTransaction();
    }
 
    @After
    public void tearDown() throws Exception {
        utx.rollback();
    }
 
}

With these base classes the tests run fine. Unfortunately it does seem that a new deployment is created for each test. It would be nice (and faster) if tests which share the same deployment could reuse that deployment, reducing the number of times the deployment is started in the container. Fortunately – compared to context initialization in the Spring framework – this initialization is very fast, but reuse would probably make a big difference when there are many integration tests.

Activiti native query search on candidates using PostgreSQL

I had a need to find all Activiti tasks which are available to a user who has a set of roles (or candidate groups in Activiti speak). This cannot be done using the normal task queries, so I thought I could use a native query with an expression like

i.group_id_ IN (#{candidates})

Unfortunately this does not work. The cross-database solution is to adjust your query to the number of candidates (building something like “IN (?,?,?)”).
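A quick sketch of that cross-database workaround (a hypothetical helper, not part of the application): size the placeholder list to the number of candidates and set each parameter individually.

```java
import java.util.Collections;

// Hypothetical helper illustrating the cross-database workaround:
// build an "IN (?,?,?)" fragment sized to the candidate count.
public class InClauseBuilder {

    static String inClause(String column, int candidateCount) {
        return column + " IN (" + String.join(",", Collections.nCopies(candidateCount, "?")) + ")";
    }

    public static void main(String[] args) {
        System.out.println(inClause("i.group_id_", 3)); // i.group_id_ IN (?,?,?)
    }
}
```

The obvious drawback is that every distinct candidate count produces a different SQL string, defeating statement caching.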
Fortunately we use PostgreSQL in this project, which allows a solution using arrays: change the expression as below and pass an array instead of just setting the object.

i.group_id_ = any (#{candidates})

Unfortunately, Activiti and the underlying MyBatis have no direct support for this combination, so some patching is needed.

For starters, a customized Activiti configuration class for use in the Spring context.

public class DcSpringProcessEngineConfiguration extends SpringProcessEngineConfiguration {
 
    @Override
    protected InputStream getMyBatisXmlConfigurationSteam() {
        return ReflectUtil.getResourceAsStream("my/app/location/for/activitiMybatisMappings.xml");
    }
 
}

Now copy the original mappings.xml file from Activiti and make some changes, resulting in the new activitiMybatisMappings.xml file. Two type aliases are added and registered as type handlers.

<!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD Config 3.0//EN" "http://mybatis.org/dtd/mybatis-3-config.dtd">
 
<configuration>
    <settings>
        <setting name="lazyLoadingEnabled" value="false" />
    </settings>
    <typeAliases>
        <typeAlias type="org.activiti.engine.impl.persistence.ByteArrayRefTypeHandler" alias="ByteArrayRefTypeHandler"/>
        <typeAlias type="my.app.PostgresStringArrayTypeHandler" alias="PostgresStringArrayTypeHandler"/>
    </typeAliases>
    <typeHandlers>
        <typeHandler handler="ByteArrayRefTypeHandler"
                     javaType="org.activiti.engine.impl.persistence.entity.ByteArrayRef"
                     jdbcType="VARCHAR"/>
        <typeHandler handler="PostgresStringArrayTypeHandler"
                     javaType="[Ljava.lang.String;"/>
    </typeHandlers>
    <mappers>
        <mapper resource="org/activiti/db/mapping/entity/Attachment.xml" />
        <mapper resource="org/activiti/db/mapping/entity/ByteArray.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Comment.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Deployment.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Execution.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Group.xml" />
        <mapper resource="org/activiti/db/mapping/entity/HistoricActivityInstance.xml" />
        <mapper resource="org/activiti/db/mapping/entity/HistoricDetail.xml" />
        <mapper resource="org/activiti/db/mapping/entity/HistoricProcessInstance.xml" />
        <mapper resource="org/activiti/db/mapping/entity/HistoricVariableInstance.xml" />
        <mapper resource="org/activiti/db/mapping/entity/HistoricTaskInstance.xml" />
        <mapper resource="org/activiti/db/mapping/entity/HistoricIdentityLink.xml" />
        <mapper resource="org/activiti/db/mapping/entity/IdentityInfo.xml" />
        <mapper resource="org/activiti/db/mapping/entity/IdentityLink.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Job.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Membership.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Model.xml" />
        <mapper resource="org/activiti/db/mapping/entity/ProcessDefinition.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Property.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Resource.xml" />
        <mapper resource="org/activiti/db/mapping/entity/TableData.xml" />
        <mapper resource="org/activiti/db/mapping/entity/Task.xml" />
        <mapper resource="org/activiti/db/mapping/entity/User.xml" />
        <mapper resource="org/activiti/db/mapping/entity/VariableInstance.xml" />
        <mapper resource="org/activiti/db/mapping/entity/EventSubscription.xml" />
    </mappers>
</configuration>

The PostgresStringArrayTypeHandler uses setArray() instead of setObject() to pass the parameter to the prepared statement. Similar type handlers are needed for other array types.

public class PostgresStringArrayTypeHandler extends BaseTypeHandler<String[]> {
 
    @Override
    public void setNonNullParameter(PreparedStatement ps, int i, String[] parameter, JdbcType jdbcType)
            throws SQLException {
        Array array = ps.getConnection().createArrayOf("varchar", parameter);
        ps.setArray(i, array);
    }
 
    @Override
    public String[] getNullableResult(ResultSet rs, String columnName)
            throws SQLException {
        return (String[]) rs.getArray(columnName).getArray();
    }
 
    @Override
    public String[] getNullableResult(ResultSet rs, int columnIndex)
            throws SQLException {
        return (String[]) rs.getArray(columnIndex).getArray();
    }
 
    @Override
    public String[] getNullableResult(CallableStatement cs, int columnIndex)
            throws SQLException {
        return (String[]) cs.getArray(columnIndex).getArray();
    }
}