Sadhana Nandakumar · Jul 12, 2021 · 5 min read

Integrating Red Hat Process Automation Manager and Red Hat AMQ Streams on OpenShift in 4 steps

An event-driven architecture is a model that allows communication between services in a decoupled fashion. This pattern has evolved into a powerful software paradigm and covers a wide array of use cases. Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. With this industry evolution, we can now create event-driven business processes which can work seamlessly in a microservices-based environment. With the latest release of Red Hat Process Automation Manager (7.11), you can work with business processes capable of interacting with external services via events, either by emitting or consuming them.

Earlier this year, Karina Varela wrote a detailed article about the integration between Red Hat Process Automation Manager and Kafka. In this article, we will look at the integration of Red Hat Process Automation Manager with Red Hat AMQ Streams on OpenShift. Red Hat AMQ Streams is an enterprise-grade, Kubernetes-native Kafka solution.

In IT today, it can be both challenging and time-consuming for operations and development teams to be experts in many different technologies: they need to know how to use them, but also how to install, configure, and maintain them. Kubernetes operators help streamline this installation, configuration, and maintenance complexity. We will be using the AMQ Streams Operator and the Business Automation Operator to simplify the process.

Next, you’ll see how to deliver and test an event-driven process application on OpenShift in four steps:

  1. Kafka deployment on OpenShift
  2. Red Hat Process Automation Manager deployment on OpenShift
  3. Creation and deployment of the business application
  4. Test of the business application using events

Step 1: Deploy the AMQ Streams operator

The Operator Hub is a collection of Operators from the Kubernetes community and Red Hat partners, curated by Red Hat. Let us first install the AMQ Streams operator.

We will install it in a namespace we have created (pam-kafka in the example).
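
If you prefer to work from the command line, the namespace can be created with a single command (using the pam-kafka name assumed throughout this article):

oc new-project pam-kafka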

Now that the operator is installed, we can create a simple three-node Kafka cluster. To do this, click on the Kafka tab and then on the Create Kafka button.

We will accept the defaults and create the cluster.
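
For reference, the Kafka custom resource created by the wizard looks roughly like the sketch below. The cluster name my-cluster matches the broker service used later in this article, but the listener and storage defaults depend on the operator version, so treat the values as illustrative rather than exact.

apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster        # illustrative; must match the service names used later
  namespace: pam-kafka
spec:
  kafka:
    replicas: 3
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: ephemeral      # wizard default; use persistent storage for real workloads
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    topicOperator: {}
    userOperator: {}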

Step 2: Deploy the Business Automation Operator

Let us now go back to Operator Hub and search for the Business Automation Operator.

We will install it in the same namespace as before. The operator allows you to create a Red Hat Process Automation Manager environment with authoring and deployment capabilities. Once the operator is deployed, switch over to the KieApp tab and click on Create KieApp.

In this wizard, click on the Objects section and open up Servers. Here we will create a KIE Server definition with the environment properties for the Kafka connectivity settings.

We will start with a minimum configuration for this connectivity. Check out the documentation to explore the complete list of properties.

KIE_SERVER_KAFKA_EXT_ENABLED: TRUE

KIE_SERVER_KAFKA_EXT_BOOTSTRAP_SERVERS: my-cluster-kafka-brokers:9092

KIE_SERVER_KAFKA_EXT_GROUP_ID: kafkaintegration
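
In the resulting KieApp custom resource, these properties appear as environment variables on the server object. The snippet below is a minimal sketch: the KieApp name rhpam-kafka, the rhpam-authoring environment, and the server name kieserver-kafka are illustrative choices, not values required by the operator.

apiVersion: app.kiegroup.org/v2
kind: KieApp
metadata:
  name: rhpam-kafka        # illustrative name
  namespace: pam-kafka
spec:
  environment: rhpam-authoring
  objects:
    servers:
      - name: kieserver-kafka   # illustrative server name
        env:
          - name: KIE_SERVER_KAFKA_EXT_ENABLED
            value: "true"
          - name: KIE_SERVER_KAFKA_EXT_BOOTSTRAP_SERVERS
            value: my-cluster-kafka-brokers:9092
          - name: KIE_SERVER_KAFKA_EXT_GROUP_ID
            value: kafkaintegration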

This should now deploy an instance of Business Central and an instance of KIE Server.
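
You can follow the rollout from the command line; the pod names depend on your KieApp definition, so the output is only indicative of what to look for:

oc get kieapp -n pam-kafka
oc get pods -n pam-kafka -w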

Step 3: Create a business process that reacts to events

Let us now create a simple business process that starts based on a message in a Kafka topic.

Let us open up Business Central by clicking on its route. The routes exposed by the KieApp configuration we created with the operator are listed under Networking > Routes in the OpenShift console.
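
The same information is available from the command line:

oc get routes -n pam-kafka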

Log in with the credentials defined in the KieApp yaml and create a project.

Click on Add Asset and create a Business Process.

Create a simple workflow definition with a Start Message Event node. This node allows the process to be started by a message read from a Kafka topic: the Message property of the start event is mapped to the input topic the business process subscribes to (event-input-stream in this example). The topic will be created automatically once the producer starts pushing messages to it.

Save the changes, then build and deploy the project.
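
Once the build finishes, you can confirm that the container was deployed by querying the KIE Server REST API. This is a sketch: replace the route and the credentials with the values from your environment.

curl -u adminUser:password http://<kie-server-route>/services/rest/server/containers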

Step 4: Create an event emitter

Let us now produce an event on the event-input-stream topic. For this we will use a simple Python utility that generates events every few minutes. Assuming we are logged in to the OpenShift cluster on the command line, we can run the following command:

oc new-app centos/python-36-centos7~https://github.com/snandakumar87/txn-event-emitter -e KAFKA_BROKERS=my-cluster-kafka-brokers:9092 -e KAFKA_TOPIC=event-input-stream -e RATE=1 --name=emitter
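
If you would rather push a single test message by hand, you can run the Kafka console producer from one of the broker pods. This is a sketch under assumptions: the pod name my-cluster-kafka-0 follows the default naming for a cluster called my-cluster, and the exact JSON payload expected by the KIE Server Kafka extension (a cloud-event style document whose data field carries the process variables) is described in the Process Automation Manager documentation.

# pod name assumes the default cluster name my-cluster
oc exec -it my-cluster-kafka-0 -- /opt/kafka/bin/kafka-console-producer.sh --broker-list my-cluster-kafka-brokers:9092 --topic event-input-stream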

You can check the logs on the KIE Server pod.
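
For example, assuming the pam-kafka namespace used throughout this article:

# find the KIE Server pod, then follow its logs
oc get pods -n pam-kafka
oc logs -f <kie-server-pod-name> -n pam-kafka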

Using Business Central's management perspective, we can also see the process instances being created from the events.

In addition to this, it is also possible to consume messages or publish every committed transaction to Kafka topics. Check out the documentation to learn more.
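
If you enable that behaviour, a quick way to inspect what is being published is the Kafka console consumer, again run from a broker pod. The topic name below is a placeholder; the topics used for published events are listed in the documentation.

# <emitter-topic> is a placeholder for the configured output topic
oc exec -it my-cluster-kafka-0 -- /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server my-cluster-kafka-brokers:9092 --topic <emitter-topic> --from-beginning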
