Event-Based Architecture Overview

This feature was introduced in Journey Manager 22.10.

Manager comes with the Event-Based Architecture (EBA) that allows you to publish various events to event topics run on Apache Kafka, a distributed event store and stream-processing platform that provides a unified, high-throughput, low-latency platform for handling real-time data feeds. For more information, see https://kafka.apache.org/intro.

This makes it possible to decouple some of Manager's core functionality so it can be externalized. For example, you may want to replace the collaboration jobs workflow engine with an external product of your choice. This, in turn, improves integration with various third-party systems.

Why have we introduced it? The Event-Based Architecture is based on the concept of events, an approach that encourages you to think of events first, rather than of things and their states. An event is something that happens, and a change to data can be an event. In Manager, most events represent some sort of change to data, for example, a form submission.
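To make the "events, not states" idea concrete, here is a minimal sketch of how a form submission might be represented as an event. The field names and event type strings are illustrative only, not Manager's actual payload format:

```python
import json
import uuid
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class SubmissionEvent:
    """An event records something that happened, not the current state."""
    event_type: str        # hypothetical name, e.g. "submission.created"
    submission_id: str
    occurred_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))

event = SubmissionEvent("submission.created", "sub-1001")
print(json.dumps(asdict(event), indent=2))
```

Because each event captures a single occurrence with its own identity and timestamp, consumers can replay or audit the history of changes rather than only seeing the latest state.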

Our event-based architecture uses the Outbox pattern. We've implemented this pattern using the Kafka Producer, a Kafka client that publishes events to different topics in a Kafka cluster.
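The Outbox pattern can be sketched as follows. This is an illustrative implementation using an in-memory "table" and a pluggable publish callback, not Manager's actual code; in production the table would be a database and the callback a Kafka producer's send:

```python
class EventOutbox:
    """Outbox pattern: persist the event alongside the business change,
    then publish it asynchronously; delete it only after success."""

    def __init__(self, publish):
        self.table = {}          # stands in for a database outbox table
        self.publish = publish   # e.g. a Kafka producer's send function
        self.next_id = 0

    def store(self, event):
        # Step 1: persist the event together with the data change.
        self.next_id += 1
        self.table[self.next_id] = event
        return self.next_id

    def flush(self):
        # Step 2: publish pending events; remove each only on success.
        for event_id in list(self.table):
            try:
                self.publish(self.table[event_id])
                del self.table[event_id]
            except ConnectionError:
                pass  # keep the row; a later flush will retry it

published = []
outbox = EventOutbox(published.append)
outbox.store({"type": "submission.created", "id": "sub-1001"})
outbox.flush()
print(published)
```

The design choice here is that the outbox row, not the broker call, is the source of truth: if publishing fails, the event survives in the table and is retried later, which is what gives the pattern its at-least-once delivery guarantee.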

The simplified Kafka event flow is illustrated below:

Kafka event flow

In terms of the event-based architecture, Journey Manager is a producer and any external system is a consumer of various events. This is illustrated below:

Event-Based Architecture overview diagram
Note

Currently, Manager doesn't support subscribing (listening) to event topics.

The event publishing mechanism kicks in when submission data is created or modified. For example, when a form user clicks the Submit button on an onboarding application form, the Submission Service raises an event. A similar event is raised if a service or a task updates or deletes submission data. This is illustrated in the diagram below:

Event-Based Architecture overview diagram

The Submission Service passes the new event to the Event Outbox component to publish it to a Kafka topic. The Event Outbox also persists the event in a local database to ensure it is eventually published even if the first publishing attempt fails. Several other components are involved to make sure event processing is audited and all data is purged according to your data retention management settings.
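The failure path can be sketched as follows, using a simulated broker that rejects the first delivery attempt. This is an illustrative stub, not Manager's implementation; it shows why persisting the event locally matters:

```python
class FlakyBroker:
    """Simulates a broker that rejects the first delivery attempt."""
    def __init__(self):
        self.calls = 0
        self.received = []

    def send(self, event):
        self.calls += 1
        if self.calls == 1:
            raise ConnectionError("broker unavailable")
        self.received.append(event)

outbox = []          # stands in for the locally persisted outbox table
outbox.append({"type": "submission.updated", "id": "sub-1001"})

broker = FlakyBroker()
while outbox:
    try:
        broker.send(outbox[0])
        outbox.pop(0)            # remove only after successful delivery
    except ConnectionError:
        continue                 # event stays persisted; try again

print(broker.received)
```

Because the event is removed only after the broker acknowledges it, a failed first attempt loses nothing: the retry simply reads the same persisted row.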

Let's have a detailed look at the sequence of events when a form user submits a form:

  1. A customer fills in an onboarding form and submits details, as defined in the onboarding process, to Journey Manager's Submission Service.
  2. Various events are triggered depending on the custom event configuration. Events are persisted in the database and sent to the Event Outbox for publishing.
  3. Auditing is also performed on the event generation.
  4. Further changes to the submission may result in additional events being triggered, with additional entries persisted in the database by the Event Outbox.
  5. Journey Manager publishes the event on the topic defined in the Event Configuration. If a connection error occurs, retries happen within a defined period of time and up to a defined number of attempts.
  6. The event is removed from the database on successful delivery to the defined service connection, in this case the Kafka service connection.
  7. Details of the triggered event are stored as part of the Submission details in the Event Log.
  8. External systems, which have subscribed to the same Kafka topics that are defined in the Event Configuration, are notified when the trigger occurs.
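The retry behavior in steps 5 and 6 above can be sketched as follows. The retry limit here is a placeholder; in Manager the retry period and count come from your configuration, not from code like this:

```python
MAX_RETRIES = 3   # illustrative; the real limit is configuration-driven

def publish_with_retry(event, send, max_retries=MAX_RETRIES):
    """Publish an event, retrying on connection errors; the caller
    removes the event from the outbox only when this succeeds."""
    for attempt in range(1, max_retries + 1):
        try:
            send(event)
            return attempt       # delivered; safe to delete from outbox
        except ConnectionError:
            continue             # transient failure; try again
    return None                  # still pending; stays in the outbox

failures = {"left": 1}
def flaky_send(event):
    if failures["left"] > 0:
        failures["left"] -= 1
        raise ConnectionError("broker unavailable")

print(publish_with_retry({"type": "submission.updated"}, flaky_send))  # 2
```

Returning the attempt count (or None when the limit is exhausted) lets the caller decide whether to delete the outbox row or leave it for a later retry cycle.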

For the event-based architecture to work correctly, you need to configure Manager so that certain events are published to the corresponding topics in the Kafka cluster. This is illustrated in the diagram below:

Event-Based Architecture overview diagram
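Conceptually, this configuration is a mapping from event types to Kafka topics. The sketch below uses hypothetical event type and topic names; the real mapping lives in Manager's Event Configuration, not in code:

```python
# Illustrative names only; the actual mapping is defined in the
# Event Configuration within the Manager UI.
EVENT_TOPICS = {
    "submission.created": "jm-submissions",
    "submission.updated": "jm-submissions",
    "submission.deleted": "jm-submissions",
}

def topic_for(event_type):
    """Resolve which Kafka topic an event should be published to."""
    return EVENT_TOPICS.get(event_type)

print(topic_for("submission.created"))
```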

Let's look at what an Administrator with access to the Manager UI has to do to configure a Kafka-based event flow:

  1. Create and configure a Kafka Service Connection.
  2. Create a custom Event Configuration Storage service and link it to the Kafka Service Connection.
  3. Enable the Eventing Feature Enabled deployment property to activate the eventing framework, so all triggered events are stored in the Event Outbox.
  4. The Event Outbox publishes the events to Kafka on a topic as configured in the Kafka Service Connection.
  5. Any external system, subscribed to Kafka on the topics as defined in the Event Configuration, is notified when an event is triggered and published to Kafka by Manager.
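On the external-system side, a consumer subscribed to the configured topics processes each message as it arrives. The handler below is an illustrative sketch with hypothetical field names; the commented-out line shows where a real Kafka consumer client would plug in:

```python
import json

def handle_event(raw_message):
    """What an external consumer might do with each message (illustrative)."""
    event = json.loads(raw_message)
    return f"processing {event['type']} for {event['id']}"

# In production this would iterate over a Kafka consumer subscribed to
# the topics defined in the Event Configuration, for example:
#   for msg in consumer: handle_event(msg.value)
messages = ['{"type": "submission.created", "id": "sub-1001"}']
results = [handle_event(m) for m in messages]
print(results[0])
```

Note that, as stated above, Manager itself is only a producer; subscribing to topics is entirely the external system's responsibility.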

Next, learn about the event definition.