Storing Activiti process variables as JSON text

By default, Activiti uses Java serialization to store non-primitive process variables in the database.
This mostly works, but it has a few disadvantages:

  • You have to be careful about the serialVersionUID field on the object (you need it).
  • It is not refactoring friendly. Changing a class or field name creates havoc.
  • You cannot use a database query to search on a property value.

Here is some code to use JSON as the serialization format. The result is put in the text_ field to allow database searches. However, this limits the maximum length of the string (you can increase the field length if you want; by default Activiti defines it as 4000 characters).
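With the JSON in the text_ field, a plain database query can match on a property value. As a hypothetical example (PostgreSQL, assuming a Sample object with a status property), you could do something like:

```sql
-- find runtime variables whose JSON contains status "OPEN"
SELECT proc_inst_id_, name_, text_
FROM act_ru_variable
WHERE type_ = 'sample'
  AND text_ LIKE '%"status":"OPEN"%';
```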

You have to create a VariableType implementation and pass it to your configuration. When initializing using Spring, this can be done with something like:

<bean id="processEngineConfiguration">
    <!-- your usual configuration -->
 
    <property name="customPreVariableTypes">
        <list>
            <bean class="mypackage.SampleAsJsontype" />
            <bean class="mypackage.SampleListAsJsontype" />
            <bean class="mypackage.SerializeAsJsonType" />
        </list>
    </property>
 
</bean>

Here is a VariableType implementation for a specific object type:

public class SampleAsJsontype implements VariableType {
 
    private static final Logger LOG = LoggerFactory.getLogger(SampleAsJsontype.class);
 
    private DcJsonMapper mapper = new DcJsonMapper();
 
    @Override
    public String getTypeName() {
        return "sample";
    }
 
    @Override
    public boolean isCachable() {
        return true;
    }
 
    @Override
    public Object getValue(ValueFields valueFields) {
        try {
            return mapper.readValue(valueFields.getTextValue(), Sample.class);
        } catch (IOException ioe) {
            LOG.error("Cannot convert value to Sample: " + valueFields.getTextValue(), ioe);
            return null;
        }
    }
 
    @Override
    public void setValue(Object value, ValueFields valueFields) {
        if (null == value) {
            valueFields.setTextValue(""); // needed to allow removing variables
        } else {
            try {
                valueFields.setTextValue(mapper.writeValueAsString(value));
            } catch (IOException ioe) {
                LOG.error("Cannot convert Sample to (JSON) string: " + value, ioe);
            }
        }
    }
 
    @Override
    public boolean isAbleToStore(Object value) {
        return value instanceof Sample;
    }
 
}

For a list of this type you can use something like:

public class SampleListAsJsontype implements VariableType {
 
    private static final Logger LOG = LoggerFactory.getLogger(SampleListAsJsontype.class);
 
    private DcJsonMapper mapper = new DcJsonMapper();
 
    @Override
    public String getTypeName() {
        return "sampleList";
    }
 
    @Override
    public boolean isCachable() {
        return true;
    }
 
    @Override
    public Object getValue(ValueFields valueFields) {
        try {
            return mapper.readValue(valueFields.getTextValue(), new TypeReference<List<Sample>>() { });
        } catch (IOException ioe) {
            LOG.error("Cannot convert value to Sample list: " + valueFields.getTextValue(), ioe);
            return null;
        }
    }
 
    @Override
    public void setValue(Object value, ValueFields valueFields) {
        if (null == value) {
            valueFields.setTextValue(""); // needed to allow removing variables
        } else {
            try {
                valueFields.setTextValue(mapper.writeValueAsString(value));
            } catch (IOException ioe) {
                LOG.error("Cannot convert Sample list to (JSON) string: " + value, ioe);
            }
        }
    }
 
    @Override
    public boolean isAbleToStore(Object value) {
        return value instanceof Collection && containsAllSamples((Collection) value);
    }

    private boolean containsAllSamples(Collection collection) {
        for (Object object : collection) {
            if (!(object instanceof Sample)) {
                return false;
            }
        }
        return true;
    }
 
}

A more practical approach is a generic solution which can convert many object types. Here is the marker interface that triggers the serialization:

public interface SerializeAsJson {
}

In this case, the JSON data is prefixed with the Java type name to ensure the JSON mapper knows what to do. This code also ensures the serialized string fits the field.

public class SerializeAsJsonType implements VariableType {
 
    private static final Logger LOG = LoggerFactory.getLogger(SerializeAsJsonType.class);
 
    private static final String SEPARATOR = "=>";
    private static final int SERIALIZED_MAX_LENGTH = 4000; // Activiti default field length
 
    private DcJsonMapper mapper = new DcJsonMapper();
 
    @Override
    public String getTypeName() {
        return "serializeAsJson";
    }
 
    @Override
    public boolean isCachable() {
        return true;
    }
 
    @Override
    public Object getValue(ValueFields valueFields) {
        try {
            String value = valueFields.getTextValue();
            int pos = value.indexOf(SEPARATOR);
            if (pos < 1) {
                throw new IllegalArgumentException("Value for serializeAsJson does not contain type indicator.");
            }
            String className = value.substring(0, pos);
            String json = value.substring(pos + SEPARATOR.length());
            ClassLoader classLoader = Thread.currentThread().getContextClassLoader();
            if (null == classLoader) {
                classLoader = this.getClass().getClassLoader();
            }
            try {
                Class clazz = classLoader.loadClass(className);
                return mapper.readValue(json, clazz);
            } catch (ClassNotFoundException cnfe) {
                throw new IllegalArgumentException("Cannot find class " + className + ".", cnfe);
            }
        } catch (IOException ioe) {
            LOG.error("Cannot convert JSON to object: " + valueFields.getTextValue(), ioe);
            return null;
        }
    }
 
    @Override
    public void setValue(Object value, ValueFields valueFields) {
        if (null == value) {
            valueFields.setTextValue(""); // needed to allow removing variables
        } else {
            try {
                String serialized = value.getClass().getName() + SEPARATOR + mapper.writeValueAsString(value);
                if (serialized.length() > SERIALIZED_MAX_LENGTH) {
                    throw new IllegalArgumentException("Serialized value for object of type " +
                            value.getClass().getName() + " is too long to store as JSON object.");
                }
                valueFields.setTextValue(serialized);
            } catch (IOException ioe) {
                LOG.error("Cannot convert " + value.getClass().getName() + " to (JSON) string: " + value, ioe);
            }
        }
    }
 
    @Override
    public boolean isAbleToStore(Object value) {
        return value instanceof SerializeAsJson;
    }
 
}
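The type-prefix handling in getValue and setValue is plain string manipulation; here is a minimal, self-contained sketch of the round trip (the class and helper names are hypothetical, not part of the Activiti API):

```java
// Sketch of the "className=>json" encoding used by SerializeAsJsonType.
public class TypePrefixCodec {

    private static final String SEPARATOR = "=>";

    /** Prepend the concrete class name so the type survives serialization. */
    public static String encode(Object value, String json) {
        return value.getClass().getName() + SEPARATOR + json;
    }

    /** Split a stored value into [className, json]. */
    public static String[] decode(String stored) {
        int pos = stored.indexOf(SEPARATOR);
        if (pos < 1) {
            throw new IllegalArgumentException("Value does not contain type indicator: " + stored);
        }
        return new String[] {
                stored.substring(0, pos),
                stored.substring(pos + SEPARATOR.length())
        };
    }
}
```

The decoded class name can then be passed to the class loader, exactly as getValue above does.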

If you want to increase the JSON length limit, alter the SERIALIZED_MAX_LENGTH constant in the code above to match and increase the field size in your database using something like (PostgreSQL sample):

ALTER TABLE act_ru_variable ALTER text_ TYPE VARCHAR(16000);
ALTER TABLE act_hi_varinst ALTER text_ TYPE VARCHAR(16000);
ALTER TABLE act_hi_detail ALTER text_ TYPE VARCHAR(16000);

If you already have process variables in your database which were not stored using these serializers, you will need some migration code.
The big trick is to delete the process variable before storing it again. If you don't delete it first, Activiti will try to store the updated value using the same serializer as before (if that serializer can still handle the updated value).

This code handles the migration when the system starts. It is implemented as a Spring service.

@Component
public class MigrateProcessVariables {
 
    @Autowired
    private TaskService taskService;
 
    /**
     * Migrate existing variables to store again using the JSON serializers.
     */
    @PostConstruct
    public void fixSerializedVariables() {
        List<Task> tasks = taskService.createTaskQuery().list();
        for (Task task : tasks) {
            Map<String, Object> vars = taskService.getVariables(task.getId());
            for (Map.Entry<String, Object> entry : vars.entrySet()) {
                Object value = entry.getValue();
                if (value instanceof Sample || value instanceof SerializeAsJson) {
                    taskService.removeVariable(task.getId(), entry.getKey());
                    taskService.setVariable(task.getId(), entry.getKey(), value);
                }
                if (value instanceof List) {
                    // convert non-empty lists which contain only Sample objects
                    List list = (List) value;
                    boolean allSamples = !list.isEmpty();
                    for (Object item : list) {
                        if (!(item instanceof Sample)) {
                            allSamples = false;
                            break;
                        }
                    }
                    if (allSamples) {
                        taskService.removeVariable(task.getId(), entry.getKey());
                        taskService.setVariable(task.getId(), entry.getKey(), list);
                    }
                }
            }
        }
    }
 
}

In-application JRebel support

JRebel is very useful during development. However, sometimes you know you are doing things which are not compatible with live class reloading (like caching the results of reflection). In that case it is very useful to be able to hook into JRebel to be notified of class changes.

Unfortunately the guide for writing a plugin does not work entirely. The plugin metadata is not picked up when JRebel loads (probably because the application is not yet on the classpath at that point).
I worked around this using Spring (as Spring is used in this application; it would work exactly the same using CDI).

First, there is the service which caches reflection data in static variables. I included a package-local clear method to allow clearing the caches.

@Component
public class ReflectionDataService {
 
    private static final Map<X,Y> MY_CACHED_DATA = ...;
 
    /**
     * Allow cached data to be cleared by JRebel plugin.
     */
    static void clear() {
        MY_CACHED_DATA.clear();
    }
 
    ... normal implementation ...
}
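As a concrete (hypothetical) example of what such a service could look like, here is a minimal cache of declared fields; only the package-local clear hook matters for the JRebel integration:

```java
import java.lang.reflect.Field;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of a reflection cache with a clear() hook for JRebel.
public class FieldCache {

    private static final Map<Class<?>, Field[]> MY_CACHED_DATA = new ConcurrentHashMap<>();

    /** Return the declared fields of a class, cached after the first lookup. */
    public static Field[] fieldsOf(Class<?> type) {
        return MY_CACHED_DATA.computeIfAbsent(type, Class::getDeclaredFields);
    }

    /** Allow cached data to be cleared by the JRebel plugin. */
    static void clear() {
        MY_CACHED_DATA.clear();
    }

    /** Number of cached classes, useful for tests. */
    static int size() {
        return MY_CACHED_DATA.size();
    }
}
```

After a reload, the next fieldsOf call repopulates the cache with the new class definition.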

The JRebel plugin is quite simple. In this case, it has to react to any class annotated with @Path.

public class ReflectionDataServiceJrebelPlugin implements Plugin {
 
    @Override
    public void preinit() {
        registerListener();
    }
 
    private void registerListener() {
        // Set up the reload listener
        ReloaderFactory.getInstance().addClassReloadListener(
                new ClassEventListener() {
                    public void onClassEvent(int eventType, Class klass) {
 
                        try {
                            if (klass.isAnnotationPresent(Path.class)) {
                                ReflectionDataService.clear();
                            }
                        } catch (Exception e) {
                            LoggerFactory.getInstance().error(e);
                            System.out.println(e);
                        }
                    }
 
                    public int priority() {
                        return 0;
                    }
                }
        );
 
    }
 
    @Override
    public boolean checkDependencies(ClassLoader classLoader, ClassResourceSource classResourceSource) {
        return classResourceSource.getClassResource("my.package.ReflectionDataService") != null;
    }
 
    @Override
    public String getId() {
        return "MyJrebelPlugin";
    }
 
    @Override
    public String getName() {
        return "My JRebel Plugin";
    }
 
    @Override
    public String getDescription() {
        return "Reload ReflectionDataService.";
    }
 
    @Override
    public String getAuthor() {
        return null;
    }
 
    @Override
    public String getWebsite() {
        return null;
    }
 
}

Now I have to make sure this plugin is loaded. The plugin discovery system cannot find it, so I have to register it later. I provided a bean which registers the JRebel plugin (specifically the class reload listener).

@Component
public class ReflectionDataServiceJrebelLoader {
 
    private static final Logger LOG = LoggerFactory.getLogger(ReflectionDataServiceJrebelLoader.class);
 
    /**
     * Load the JRebel plugin when JRebel is available.
     */
    @PostConstruct
    public void preinit() {
        try {
            ClassLoader cl = Thread.currentThread().getContextClassLoader();
            Class<?> plugin = cl.loadClass("my.package.ReflectionDataServiceJrebelPlugin");
            Method method = plugin.getMethod("preinit"); // throws if not found
            Object instance = plugin.newInstance();
            method.invoke(instance);
        } catch (Throwable ex) {
            LOG.debug("JRebel not found - will not reload ReflectionDataService automatically.");
        }
    }
}

EE integration testing using Arquillian

Writing tests in an EE environment is a challenge. Compared to the test support in Spring Framework, there is some work to do.

Fortunately though, thanks to Arquillian, there is hope.

A lot of the complications are related with the difference in environment. In spring framework, you include all the libraries in your application, which means everything is readily available. In EE you need a container which provides facilities which are not directly included. Arquillian fixes this by allowing you to run your tests inside the container.

An Arquillian test is almost the same as a non-Arquillian test, except that you also have to build the deployment package which runs the test. While the general recommendation seems to be to make such a package as small as possible, I believe this can be a challenge as you need all the dependencies of the minimal set of classes required by the test. To my mind it is best to simply deploy the full package under development (jar or war) to run your tests.

The situation in Spring is similar. While it is easy to have a different context for each test, in practice it is better to reduce the number of contexts used for running tests. Partly to make sure there are no dangling dependencies, and partly to reduce the run time of your test suite.

In Arquillian the complexity is slightly higher again, as you also need to include all project dependencies (as in "other" jars). You have to specify the project classes to include and the location of all dependencies.

In my application I use a base class for all integration tests. The dependency details are picked up from the Maven pom (including transitive dependencies; unfortunately just including all dependencies at once did not work). For the project classes themselves, I use a helper which simply includes all classes of a package, including the sub-packages found in the source tree. You also need to make sure all required resources are included. Here again they are scanned from the source tree.

@RunWith(Arquillian.class)
public abstract class AbstractIT {
 
    private static final String SOURCES_DIR = "src/main/java";
    private static final String RESOURCES_DIR = "src/main/resources";
 
    @Deployment
    public static Archive<?> createDeployment() throws Exception {
        String pathPrefix = ""; // trick to make this work from both the main
        // and the module root directory
        File currentDir = new File(".");
        if (!currentDir.getCanonicalPath().endsWith("back-impl")) {
            pathPrefix = "back-impl/";
        }
        PomEquippedResolveStage pom = Maven.resolver().loadPomFromFile(pathPrefix + "pom.xml");
        File[] commonsLang = pom.resolve("org.apache.commons:commons-lang3").withTransitivity().asFile();
        File[] appApi = pom.resolve("be.myapp:myapp-api").withTransitivity().asFile();
        File[] jTransfoCore = pom.resolve("org.jtransfo:jtransfo-core").withTransitivity().asFile();
        File[] jTransfoCdi = pom.resolve("org.jtransfo:jtransfo-cdi").withTransitivity().asFile();
        File[] queryDsl = pom.resolve("com.mysema.querydsl:querydsl-jpa").withTransitivity().asFile();
        File[] flyway = pom.resolve("com.googlecode.flyway:flyway-core").withTransitivity().asFile();
        File[] festAssert = pom.resolve("org.easytesting:fest-assert").withTransitivity().asFile();
        File[] httpclient = pom.resolve("org.apache.httpcomponents:httpclient").withTransitivity().asFile();
        File[] deltaspikeCore = pom.resolve("org.apache.deltaspike.core:deltaspike-core-impl").withTransitivity().asFile();
        File[] deltaspikeData = pom.resolve("org.apache.deltaspike.modules:deltaspike-data-module-impl").withTransitivity().asFile();
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war").
                addAsLibraries(commonsLang).
                addAsLibraries(appApi).
                addAsLibraries(jTransfoCore).
                addAsLibraries(jTransfoCdi).
                addAsLibraries(queryDsl).
                addAsLibraries(flyway).
                addAsLibraries(festAssert).
                addAsLibraries(httpclient).
                addAsLibraries(deltaspikeCore).
                addAsLibraries(deltaspikeData).
                addAsResource("META-INF/persistence.xml").
                addAsResource("META-INF/beans.xml");
        addAllPackages(war, "be.fluxtock.back.impl", new File(pathPrefix + SOURCES_DIR + "/be/fluxtock/back/impl"));
        addAllResources(war, pathPrefix + RESOURCES_DIR);
        return war;
    }
 
    /**
     * Add all packages starting with given prefix from given path.
     *
     * @param war war archive to add packages to
     * @param prefix base package
     * @param dir directory for the base package
     */
    private static void addAllPackages(WebArchive war, String prefix, File dir) {
        war.addPackage(prefix);
        for (File file : dir.listFiles(File::isDirectory)) {
            addAllPackages(war, prefix + "." + file.getName(), file);
        }
    }
 
    /**
     * Add all resources from the given directory, recursively. Only adds
     * subdirectories when they start with a lower case letter.
     *
     * @param war war archive to add packages to
     * @param directory directory with resources to add
     */
    private static void addAllResources(WebArchive war, String directory) {
        for (File file : new File(directory).listFiles(pathname -> pathname.isFile() || Character.isLowerCase(pathname.getName().charAt(0)))) {
            addAllResources(war, "", file);
        }
    }
 
    private static void addAllResources(WebArchive war, String prefix, File dir) {
        if (dir.isDirectory()) {
            prefix += dir.getName() + "/";
            for (File file : dir.listFiles()) {
                addAllResources(war, prefix, file);
            }
        } else {
            war.addAsResource(dir, prefix + dir.getName());
        }
    }
 
}

To allow the dependencies to be resolved from the pom, you need an extra dependency.

<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-impl-maven</artifactId>
    <version>2.0.0</version>
    <scope>test</scope>
</dependency>

When using a database to run your tests, you also need to make sure your database is not polluted by your tests. Something similar to Spring’s AbstractTransactionalJUnit4SpringContextTests can be built quite easily.

/**
 * Transactional integration test. Transaction is rolled back at the end.
 */
public abstract class AbstractRollbackIT extends AbstractIT {
 
    @PersistenceContext
    EntityManager em;
 
    @Inject
    UserTransaction utx;
 
    @Before
    public void setUp() throws Exception {
        utx.begin();
        em.joinTransaction();
    }
 
    @After
    public void tearDown() throws Exception {
        utx.rollback();
    }
 
}

With these base classes, tests run fine. Unfortunately it seems that a new deployment is created for each test. It would be nice (and faster) if tests which share the same deployment could reuse that deployment, reducing the number of times it is started in the container. Fortunately, compared to context initialization in Spring Framework, this initialization is very fast, but reuse would probably make a big difference when there are many integration tests.