Event and Channel Models

Introduction

In many modern use cases, data resides in and moves through different services and systems. These systems often communicate with each other through 'events', which indicate, for example, that new data has been created, something has changed or some functionality was triggered. With these events, the pattern of

if this happens --> do this

can be implemented in a scalable way (both technically and functionally), giving rise to so-called event-driven architectures (or at least, event-capable architectures). In these kinds of architectures, the events are often exchanged using a queueing or event streaming solution. Flowable Work (and thus Flowable Engage too) supports the following out of the box:

  • Apache Kafka
  • RabbitMQ
  • Any JMS compatible solution (e.g. Apache ActiveMQ)

Other solutions can be plugged in via extension mechanisms of Flowable.

Continuing with the 'if event happens -> do this' pattern, one fundamental problem to solve is the context in which the 'do this' part is applied. The event originates from some system and is then propagated to other systems that are interested in it. However, to be useful, the consequence of the event always needs to be applied in a certain context on the receiving side.

Process and case models are a natural fit when it comes to providing this context. If a process or case instances are waiting for a certain event to happen, the actual process or case instance is the context (data variables, documents, people, etc.) in which the event can be applied without having to worry about gathering everything that's relevant to handle the event.

Concepts

Flowable ships with an "event registry" which handles and manages two types of models of importance for this guide:

  • event model: a model describing the structure of the event data.
  • channel model: an abstraction of receiving or sending events using a transport mechanism.

Like any model in Flowable, an event or channel model gets deployed (and is queryable) as a definition. Like any other definition type (process, case, action, template, etc.), it is versionable.

This is especially important for channel definitions, which are linked with a real listener that needs to be instantiated or removed. For example, a channel model linked to a JMS queue will create the listener at runtime when the channel model is deployed. Likewise, it will remove the listener when the definition is removed or a new version of the definition is deployed.

Event registry concepts

Channels

Channel models are used to define a way through which event data is received or sent to other systems, using a transport mechanism (like a Kafka topic, JMS queue, etc.).

When a channel is used to receive events, it's called an inbound channel. Likewise, a channel that sends events is an outbound channel.

Let's look at inbound channels first, which consist of two parts:

  • The adapter: The adapter is a technical implementation and listens for incoming events. Any event that is received is passed on to the channel pipeline.
  • The pipeline: The adapter receives the "raw" event, pretty much a collection of bytes. The channel pipeline will process this data to make it usable.

Inbound channel pipeline

Each of the dark blue boxes in the picture above is pluggable and extensible. This means that any such box can be swapped out for a custom implementation if the out-of-the-box implementations are insufficient.

The consumer in this picture is typically a BPMN or CMMN waiting activity.

The default pipeline consists of the following steps:

  1. Deserialization: transforms the raw data to a more useful format (e.g. String to JSON)
  2. Event key detection: the channel can be used to receive multiple types of events. In that case, this step defines how the actual event type (indicated by the event key) is determined. If the channel receives only one type of event, this can be a fixed value.
  3. Payload extraction: events typically carry data. This step extracts that data, called the event payload, from the event.
  4. (Optional) Tenant detection: In a multi-tenant setup, a channel can be used to receive events from different tenants. This step defines how the tenant is determined from the event data. It's also possible to have one channel per tenant. In that situation, this step isn't needed.
  5. Event transformation: The event data is transformed to an Event Instance, which is a special type of variable that can be used in process or case instances.

Whenever an event is received by the adapter, it is passed into the pipeline. Any consumer (implemented by various process and case constructs) now receives the event.
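
To make these steps more tangible, here is a conceptual sketch of what the default inbound pipeline does with a raw JSON message. This is an illustration only: the class, the JSON Pointer locations and the field names are assumptions made for this sketch, not Flowable classes or a fixed wire format.

    // Conceptual sketch only -- not Flowable's real pipeline classes.
    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.nio.charset.StandardCharsets;
    import java.util.HashMap;
    import java.util.Map;

    public class InboundPipelineSketch {

        private final ObjectMapper objectMapper = new ObjectMapper();

        public Map<String, Object> process(byte[] rawEvent) throws Exception {
            // 1. Deserialization: raw bytes -> JSON tree
            JsonNode json = objectMapper.readTree(new String(rawEvent, StandardCharsets.UTF_8));

            // 2. Event key detection: here via a JSON Pointer expression to a field holding the key
            String eventKey = json.at("/eventKey").asText();

            // 3. Payload extraction: pick out the fields defined in the event model
            Map<String, Object> payload = new HashMap<>();
            payload.put("customerId", json.at("/payload/customerId").asText());
            payload.put("address", json.at("/payload/address").asText());

            // 4. (Optional) Tenant detection: read the tenant id from the event data
            String tenantId = json.at("/tenantId").asText("");

            // 5. Event transformation: wrap everything into an 'event instance' handed to consumers
            Map<String, Object> eventInstance = new HashMap<>();
            eventInstance.put("eventKey", eventKey);
            eventInstance.put("tenantId", tenantId);
            eventInstance.put("payload", payload);
            return eventInstance;
        }
    }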

An outbound channel, compared to its inbound counterpart, is a bit simpler:

Outbound channel pipeline

The producer in this picture is typically a BPMN or CMMN automatic service task activity. Note that the external system can be the very same Flowable installation, listening to the same transport mechanism with an inbound channel.

The event data now follows the reverse trajectory. Starting from the producer, an outbound pipeline is followed which processes the event and eventually hands it to the adapter. The adapter will then do the actual sending. An outbound pipeline is (by default) quite simple:

  1. Event transformation: The event instance is transformed to something any adapter understands.
  2. Serialization: the event is serialized to the format that is expected by the other side. For example, it could be a JMS queue that is only capable of receiving XML messages.
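
Conceptually (again as a sketch only, not Flowable's actual classes), these two steps boil down to something like the following, assuming JSON serialization:

    // Conceptual sketch only -- not Flowable's real pipeline classes.
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class OutboundPipelineSketch {

        private final ObjectMapper objectMapper = new ObjectMapper();

        public String process(String eventKey, Map<String, Object> eventFields) throws Exception {
            // 1. Event transformation: reduce the event instance to a simple structure
            Map<String, Object> outgoing = new LinkedHashMap<>();
            outgoing.put("eventKey", eventKey);
            outgoing.put("payload", eventFields);

            // 2. Serialization: produce the format the receiving side expects (JSON here)
            return objectMapper.writeValueAsString(outgoing);
        }
    }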

Events

An event model defines the structure of the data that is exchanged between systems. This structure defines which data fields (and their data types) are available to users of this event.

For example, a Customer onboarded event might consist of

  • a name (type string)
  • a legal address (type string)
  • an expected deal size (type double)

Process and case models will only work with the data from such an event model. For example, the customer name might be stored as a case variable or the deal size might trigger a few sentries.
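
Purely as an illustration (modelers work with the event definition in Flowable Design, never with Java types), the 'Customer onboarded' structure above can be thought of as a small typed record; the class name is made up for this sketch:

    // Hypothetical representation of the 'Customer onboarded' event model:
    // three fields with fixed names and types that consumers of the event can rely on.
    public record CustomerOnboardedEvent(String name, String legalAddress, double expectedDealSize) { }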

Similar to all Flowable models, an event model will be deployed as an event definition. Instances of this definition, for example when an event is received, are passed to a process or case instance as an event instance. This is consistent with for example a process model that gets deployed as a process definition and from which process instances are started.

The point is that these process/case models don't (need to) know that there is a Kafka, JMS or RabbitMQ listener, that the data is serialized from XML (or something else), or what the data actually looks like. This is all abstracted away for the modeler, who can focus on the important part of the model: how to handle the business change that is expressed by the event.

Channel Model

Now that all concepts are explained, it's time to take a look at how the models for a channel and an event are created.

A channel model can be created in Flowable Design by navigating to the Others section and then clicking on the Channel entry in the left-hand menu. The regular model creation popup is then shown where a model name and key can be set. A channel model can also be created from a process or case model when selecting a channel property in the property panel of event-capable activities.

At the top of the page, the type of channel needs to be chosen: Inbound or Outbound. As described above, an inbound channel is used to receive events, while an outbound channel sends them out.

The channel model UI looks as follows:

Channel Model UI

It consists of three distinct sections, described in the next subsections.

Inbound: Technical configuration

In this section, one of the supported adapter implementations is chosen: Kafka, JMS or RabbitMQ.

There is also a Custom option, which can be chosen when the out-of-the-box options are insufficient. Hover over the info icon to get more information about which Spring bean needs to be added to the Flowable configuration in that case.

The properties below the dropdown change when selecting a different value, because each adapter implementation has its own specific configuration settings. For example, Kafka has Topics, while JMS uses a Destination.

The configuration options in this section are mandatory to configure the adapter. All other (optional) options are in the next section.

Inbound: Advanced

All options here are optional and depend on the implementation type chosen in the Technical Configuration. Hover over the information icons to get more details on each configuration setting.

Inbound: Pipeline

The third section configures the pipeline of the channel, as explained above. Every event that is received passes through this pipeline, which is also indicated in the UI:

Pipeline

The serialization format of the event (XML, JSON or Custom) can be selected in the dropdown above the pipeline steps. Switching this format changes the possible options in certain steps of the pipeline. For example, when selecting JSON, the Event key detection step can be configured to use a JSON Pointer expression while for XML an XPath option becomes available.

The small option Use default pipeline steps at the top is checked by default. This means that, by default, the pipeline consists of the five steps that are typically used by Flowable users. It's possible to have a fully custom pipeline implementation by unchecking the checkbox and providing the proper Spring bean in the Flowable configuration (see the info icon for more details).

In the UI above, each gray rectangle is clickable and shows the details of that step when clicked. A pencil icon next to the step shows which step is currently being edited.

Pipeline

Outbound

An outbound channel model is similar to an inbound one, but simpler. The adapter configuration is similar to the one above, but there are no pipeline steps to configure. This is because the default pipeline consists of only two steps: the serialization and the event transformation. Both are fully determined by the serialization format, which can be selected in the dropdown:

Pipeline

Event Model

An event model can be created in Flowable Design by navigating to the Others section and then clicking on the Event entry in the left-hand menu. The regular model creation popup is then shown where a model name and key can be set. An event model can also be created from a process or case model when selecting an event property in the property panel.

An event model consists of a number of fields, representing the data contained in the event. When creating a new event model, no fields exist and the Add field button must be clicked. It's now possible to provide a name and type for the field. We'll talk about the Correlation parameter later in this guide.

Event Model Field

Note that we're not configuring how this field is obtained from the data that we receive. That is defined in the channel, which has a step to extract the payload from the raw data and put it into the structure we're defining here.

When more fields are added, each field is shown as a row:

Event Model

The icons in the Actions column allow you to edit, remove, or move the field up or down.

Event Mapping

With the channel and event models having been discussed above, we've covered:

  • How events are received from and sent out to the world outside of Flowable.
  • How the incoming data is mapped (this mapping is defined in the channel) onto a fixed data structure (the fields of the event model).

There is one missing link to understand how this fits into a case or process model: event mapping. Let's look at the following simple process model:

Process with event start

The idea here is that we want to start a new process instance whenever a certain event is received.

The start event here is an 'event registry start event' and allows choosing both an event and a channel model:

Event start properties

When an event model is selected, a new property appears that allows configuring the event mapping:

Event start properties

When clicking on this property, a popup is shown that looks like this:

Event mapping

The fields defined in the event model are now shown here and the UI allows configuring how these event fields map to runtime process or case instance data.

To summarize: whenever this event is received on the channel, a new process instance will be started (because the event type matches the type configured in the property panel). The data will have been processed in the pipeline and converted to the fields as configured in the event model. This way, the case or process modeler can configure how this data needs to be used at runtime without having to know any of the (rather) complex technical details behind the scenes.

Correlation

An important concept when it comes to building process or case models with events is that of Correlation. In this context, correlation means how to relate the incoming event data to a running process or case instance. Let's look at a snippet of a case model to explain this.

Case with event listener

Every case instance based on this model will be waiting for a customer address changed event that then triggers a stage with multiple steps (in this example a process and user task are shown).

Let's make it a bit more explicit. Imagine that we have a Customer case for each of the customers our company has. Also imagine that the address change event originates from some sort of CRM system and that this system puts events containing the customerId and the new address on a channel:

Case with event listener

The question now is: how does Flowable know which instance exactly needs to be triggered? It would be extremely inefficient if all running instances had to be checked to see whether they are listening for a certain event. This is where correlation comes in.

In the event model section it was described that some fields can be marked as correlation parameters; in this example, the customerId is marked as such:

Customer Event Model

When certain fields of an event model are marked as correlation parameters, they are shown in a special way in the event mapping (the section at the top):

Customer correlation parameter

Suppose now that the Customer case has stored a variable named myCustomerId. As shown in the screenshot above, we've configured the event listener to listen for these customer address change events, but a particular instance only receives the event when the correlation parameter matches the value produced by ${myCustomerId}. If the variable myCustomerId has the value 12345, for example, the event listener is triggered for the case instance that has 12345 as its value.

Note that an expression is used here to resolve the variable. Without an expression, the matching would happen on the static text myCustomerId, which is not what we want.

Without getting too technical, let's describe what happens behind the scenes. When the Flowable engine encounters such a correlation parameter definition, it will generate a unique hash key based on all permutations of the correlation parameters (for this reason, it's wise to keep the number of correlation parameters to a minimum). When an event is received on a channel, a similar hash key calculation is made for the incoming event data, using the same correlation parameter definitions. Looking up which instances have matching hash keys is very fast. This way Flowable passes the event to the relevant instance(s) in an efficient and scalable way.
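
The following sketch illustrates the idea (it is not Flowable's internal code, and the hashing scheme shown is an assumption): both sides derive the same key from the correlation parameter values, so finding the matching instances becomes an indexed lookup instead of a scan over all running instances.

    // Illustration of correlation matching -- not Flowable's internal implementation.
    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.HexFormat;
    import java.util.Map;
    import java.util.TreeMap;

    public class CorrelationKeySketch {

        public static String correlationKey(Map<String, Object> correlationParameters) throws Exception {
            // Sort by parameter name so both sides build exactly the same string
            StringBuilder builder = new StringBuilder();
            new TreeMap<>(correlationParameters)
                .forEach((name, value) -> builder.append(name).append('=').append(value).append(';'));
            byte[] hash = MessageDigest.getInstance("SHA-256")
                .digest(builder.toString().getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(hash);
        }

        public static void main(String[] args) throws Exception {
            // Case instance side: ${myCustomerId} resolves to "12345"
            String waitingKey = correlationKey(Map.of("customerId", "12345"));
            // Incoming event side: the customerId correlation field of the event is "12345"
            String incomingKey = correlationKey(Map.of("customerId", "12345"));
            System.out.println(waitingKey.equals(incomingKey)); // true -> trigger this instance
        }
    }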

Example

Let's build a simple example to get familiar with channel and event model concepts. We'll use the "customer address change" event that's been used throughout the previous sections. To keep things simple, we'll model the "event sending" part in a BPMN process model and the "event receiving" part in a CMMN model, as shown in the picture below.

(In a realistic setup, the event would probably come from an external system like a CRM application.)

Example Setup

Note

We'll be using Apache ActiveMQ in this example. Of course, this can be changed to Apache Kafka or RabbitMQ easily.

If you want to follow along, the Flowable Engage Trial is configured out of the box to have ActiveMQ enabled and Flowable able to connect to it.

The App Model

Let's create the app model first. In Flowable Design go to Apps, click the large plus icon and then select Create a new app. Give it a name and key. We'll be creating the process and case model from here. That way, all referenced models (like the channel and event model) will be part of the same app automatically.

The Process Model

Create a new BPMN process model. In the app view, click on Add model and select Create a new model for the app and then Process. Give it a name and key.

We're going to keep it simple:

  • Add a Send event task, following the start event
  • Add a none end event, following the send event task

Select the start event, and create the following start form, containing a customerId and newAddress form field:

Example Setup

Now select the Send event task and give it a new name like Send address changed event. In the property panel on the right-hand side, click on the Outbound event property and create a new event model we'll name Address changed:

Example Setup

Add two fields to the event model:

  • id, which we'll also mark as a correlation parameter
  • address

It should look like this:

Customer Event Model

Save all models (the second button on the toolbar). Switch back to the process model (select the tab above in the editor). A new property saying no outbound event configuration has appeared. Click it.

Now we need to map the fields from the start form to the event data structure. Note that typing ${ will show the variable assistant which makes this an easy job:

Outbound Event Mapping

The last thing we need to do is configure the channel over which this event is sent out. Select the Outbound channel property and create a new channel model, like we did above for the event model.

Configuring this channel is easy:

  • Change the type to Outbound.
  • Select JMS for the implementation.
  • Pick a name for the destination (ActiveMQ creates a new queue automatically if it doesn't exist yet)

It should look like this:

Outbound Channel

Save all models (second button from the left on the toolbar) and switch back to the process model.

That's it! Let's have a look at the receiving CMMN side of things.

The Case Model

Let's create the case model now. From the current view, click the plus icon just below the toolbar and select Case. Give it a name and key as usual.

There's a plethora of things possible now, but for the sake of simplicity, let's only listen to the event and create a user task when it is received:

  • Create a start form for the case, with only one text field customerId
  • Add an event listener to the case
  • Add a user task to the case
  • Add a form to the user task with one text field customerAddress
  • Connect the event listener with an entry sentry to the user task

It should now look like this:

Example Case Model

Note We'll create the case instance 'manually', passing in the customer ID in the form. It's also possible to have an event start the case instance. Select the case plan model and then select the Inbound event property to do this.

Select the event listener. In the property panel, select the Inbound event property and choose (on the second tab) the same event that was created for the send event task in the previous section.

Now No inbound event configuration appears in the property panel. Click it and fill in the event mapping as follows:

  • We want to correlate on the customer identifier. Fill in ${customerId} for the id.
  • No need to map the id event field (leave it blank), as we already have it stored in the case.
  • Add customerAddress as the variable name to store the new address. Note that this needs to match the name of the variable we added to the task form, as we want to display the new address in this form.

Example Case Model

The last thing to do here is to create a channel model to receive the incoming events.

Note Channel and event models can be reused over multiple process and case models.

Click on the Inbound channel property, and create a new channel model. Do the following steps:

  • Change the implementation to JMS
  • Fill in the same name for the Destination as the name used in the send event task above.
  • In the pipeline, select the Event key detection step, keep the first selection option and fill in customerAddressChanged (the key of the event model created earlier).

This should look like this:

Example Inbound Channel Model

And for the pipeline:

Example Inbound Channel Model Pipeline

Save all models. That's it. It's now time to see it in action!

Run it

Deploy the app model. Switch to Flowable Work or Engage.

We need to start the case instance first, mimicking that we've got a long-running case instance per customer:

Example Start Case Instance

The start form will be shown. Fill in a customer ID value, for example 123. The case instance will have nothing open right now (because the only thing it is doing is waiting for the event).

Now start the process from the same app. Fill in the start form and make sure the same 123 customer id is used:

Example Start Process Instance

What will happen now:

  1. The form fields are stored as process variables
  2. The process variables are mapped into the event fields in the send event task
  3. The event is passed to the channel, which will send it to the customers JMS destination
  4. The inbound channel is listening to the same customers queue. It receives the event, which is processed in the pipeline. The event key is extracted, so it is now known which event type was received.
  5. The case instance we started earlier is waiting to receive an event of that type, correlating on the 123 value of the customer id. The event is received and the task is created.

Now refresh the case instance view. Before, there was no user task. But now there is a Verify new address task, showing the new address:

Example Case Task

Make It Repeatable

There's a downside to our current model: it only captures the event once. Let's fix that.

Switch back to Flowable Design and open the case model. Select the event listener. In the property panel, check the repetition checkbox under Execution. Do the same thing for the user task. This is visually indicated with a small marker on both elements:

Repetition

Deploy the app model again.

Note This actually creates a new channel definition version which gets deployed. When this happens, the previous JMS listener is removed and a new one, with the latest configuration properties, is instantiated.

Create a new case instance, giving it another customer id. Start a few process instances with the same customer id. For each address change, there will now be a user task created:

Example Multiple Case Tasks

Advanced Example

An advanced example using Apache Kafka can be found on our blog:

https://blog.flowable.org/2020/03/24/flowable-business-processing-from-kafka-events/

Supported Model Elements

The BPMN palette in Flowable Design has the following elements available:

  • Event registry start: Starts a process instance when an event of a certain type is received.
  • Event boundary event: Can be placed on the boundary of any activity (regular user tasks, subprocesses, etc.). Has the regular boundary event semantics, so it can be used to interrupt the activity (or keep it active) when a certain event is received. Supports correlation parameter mapping to specify exactly which event should trigger which instance.
  • Send event task: Used to send out an event over an outbound channel.
  • Send and receive task: Used to send out an event and wait for another event in one step. This is an important construct documented below.

The CMMN palette in Flowable Design has the following elements available:

  • Case instance start: This option is a bit hidden away, because CMMN does not have a clear way of defining a start event like BPMN does. For this reason, it can be found by clicking on the case plan model. Starts a case instance when an event of a certain type is received.
  • Event listener: Waits for an event to be received; the listener occurs when the event arrives.
  • Send task: Used to send out an event over an outbound channel.

Send and Receive Task

A pattern that is used a lot is to send out an event, let some external service do something with it and wait for a response back:

Send and receive problem

The problem is that a race condition can happen when the service is very fast or when the database that Flowable uses is very slow: Flowable could still be processing the process or case instance, which is then not yet in the state where the response event can be received (technically: the database transaction hasn't been committed yet).

The Send and receive task solves exactly that problem: it guarantees that no race conditions can happen when implementing this pattern. (Technically, it uses a combination of post-commit transaction listeners and async jobs on receipt to achieve this.)
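
As an illustration of the 'post-commit' part (this is not Flowable's internal code; the helper class and Runnable below are made up for this sketch), deferring the send until the engine's transaction has committed looks roughly like this with Spring's transaction synchronization:

    // Illustration only -- not Flowable's internal implementation.
    import org.springframework.transaction.support.TransactionSynchronization;
    import org.springframework.transaction.support.TransactionSynchronizationManager;

    public class SendAfterCommitSketch {

        private final Runnable sendEvent; // e.g. hands the event to the outbound adapter

        public SendAfterCommitSketch(Runnable sendEvent) {
            this.sendEvent = sendEvent;
        }

        public void scheduleSend() {
            if (TransactionSynchronizationManager.isSynchronizationActive()) {
                // Only send once the instance is committed in its waiting state,
                // so a fast response can never arrive 'too early'.
                TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
                    @Override
                    public void afterCommit() {
                        sendEvent.run();
                    }
                });
            } else {
                sendEvent.run();
            }
        }
    }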

Send and receive

Versioning

When it comes to versioning channel and event definitions, things are slightly different from the standard way deployments and definition versions are handled.

When it comes to determining which channel definition is used, the latest deployed version (for the same key) always wins. The reasons are the following:

  • It's highly unlikely that multiple queues (or topics, etc.) would be set up for handling the same use case.
  • For the receiving side, it would be inefficient to keep an instance of a 'listener' running for all versions of the inbound channel definition.

When it comes to event definitions, the same rule applies: the event key detection step of the channel pipeline will always use the latest version of a deployed event definition.

Configuration properties

See the admin guide for details on how to enable and configure the connection with a queueing solution.
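
As a rough illustration only (the admin guide remains the authoritative reference), a plain Spring Boot setup for the ActiveMQ example in this guide needs a JMS ConnectionFactory available to the application; the bean name and broker URL below are assumptions for this sketch:

    // Sketch of a JMS connection configuration -- check the admin guide for the supported options.
    import javax.jms.ConnectionFactory;

    import org.apache.activemq.ActiveMQConnectionFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class JmsConnectionConfiguration {

        @Bean
        public ConnectionFactory jmsConnectionFactory() {
            // Local ActiveMQ broker on its default port; adjust for your environment
            return new ActiveMQConnectionFactory("tcp://localhost:61616");
        }
    }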
