Auditing case management executions with Kafka and Red Hat Process Automation Manager — KIE Community

Sadhana Nandakumar
4 min read · Nov 16, 2021


Case management provides problem resolution for dynamic processes, as opposed to the efficiency-oriented approach of BPM for routine, predictable tasks. It handles one-off situations where the process flow is determined by the incoming request rather than modeled up front. Red Hat Process Automation Manager provides the ability to define case management applications using the BPMN 2.0 notation. In this article, we will explore how you can capture audit metrics for a running case instance.

Using Case Listeners for fine-grained and async auditing

A case event listener can be used to capture notifications for case-related events and operations invoked on a case instance. These notifications can then be sent downstream to analytical tools. The listener is implemented by overriding any of the methods defined by the CaseEventListener interface.

In our example, we will set up a listener to capture the following events: case started, case closed, case reopened, case data added, case data removed, and comments added.

We will then send them over to a Kafka topic from which this data can be visualized or analyzed.
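The listener methods below publish through a Kafka producer whose setup is not shown in the excerpt. As a rough sketch, the producer configuration might look like the following (the broker address and helper class are assumptions, not part of the demo project):

```java
import java.util.Properties;

// Hypothetical helper for building the Kafka producer configuration.
public class ProducerConfigSketch {

    static Properties producerProps() {
        Properties props = new Properties();
        // Broker address is an assumption; point this at your Kafka cluster.
        props.put("bootstrap.servers", "localhost:9092");
        // The listener sends String keys (case id) and String values (JSON payload).
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // With kafka-clients on the classpath, the producer is then created with:
        // Producer<String, String> producer = new KafkaProducer<>(props);
        return props;
    }
}
```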

import java.util.Date;
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.jbpm.casemgmt.api.event.*;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

// Class name is illustrative; the full source, including the producer setup,
// is in the linked repository.
public class CaseAuditEventListener implements CaseEventListener {

    private Producer<String, String> producer; // initialized during listener setup (omitted)

    // Serialize the event payload as JSON and publish it to the
    // "case_events" topic, keyed by case id.
    private void pushToKafka(CaseDefinition caseDefinition) {
        try {
            Future<RecordMetadata> out = producer.send(new ProducerRecord<String, String>(
                    "case_events", caseDefinition.getCaseId(),
                    new ObjectMapper().writeValueAsString(caseDefinition)));
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void afterCaseStarted(CaseStartEvent event) {
        pushToKafka(new CaseDefinition(event.getCaseId(), "Case Started", null, null, new Date()));
    }

    @Override
    public void afterCaseDataAdded(CaseDataEvent event) {
        pushToKafka(new CaseDefinition(event.getCaseId(), "Case Data Added", event.getData(), null, new Date()));
    }

    @Override
    public void afterCaseDataRemoved(CaseDataEvent event) {
        pushToKafka(new CaseDefinition(event.getCaseId(), "Case Data Removed", event.getData(), null, new Date()));
    }

    @Override
    public void afterCaseClosed(CaseCloseEvent event) {
        pushToKafka(new CaseDefinition(event.getCaseId(), "Case Closed", null, null, new Date()));
    }

    @Override
    public void afterCaseCommentAdded(CaseCommentEvent event) {
        CaseComment caseComment = new CaseComment(event.getComment().getComment(), event.getComment().getAuthor());
        pushToKafka(new CaseDefinition(event.getCaseId(), "Comments Added", null, caseComment, new Date()));
    }

    @Override
    public void afterCaseReopen(CaseReopenEvent event) {
        pushToKafka(new CaseDefinition(event.getCaseId(), "Case Reopened", null, null, new Date()));
    }

    // Flush pending records and close the producer when the listener is discarded.
    @Override
    public void finalize() {
        System.out.println("case listener clean up");
        producer.flush();
        producer.close();
    }
}

Notice how we extract only the event data properties we are interested in, so that we can push them downstream for analysis. We will then package the listener class as a Maven project so that we can configure it on our case project. A complete example of the listener can be found in this git repository.
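The CaseDefinition type used above is a plain payload holder; its exact shape lives in the linked repository. A sketch consistent with the constructor calls in the listener (the field names are assumptions inferred from those calls):

```java
import java.util.Date;
import java.util.Map;

// Hypothetical payload class matching the constructor used in the listener:
// new CaseDefinition(caseId, eventType, data, comment, timestamp)
public class CaseDefinition {

    private final String caseId;
    private final String eventType;
    private final Map<String, Object> data; // case file data, if any
    private final Object comment;           // CaseComment, if any
    private final Date timestamp;

    public CaseDefinition(String caseId, String eventType,
                          Map<String, Object> data, Object comment, Date timestamp) {
        this.caseId = caseId;
        this.eventType = eventType;
        this.data = data;
        this.comment = comment;
        this.timestamp = timestamp;
    }

    public String getCaseId() { return caseId; }
    public String getEventType() { return eventType; }
    public Map<String, Object> getData() { return data; }
    public Object getComment() { return comment; }
    public Date getTimestamp() { return timestamp; }
}
```

Keeping the payload a flat, serialization-friendly POJO lets Jackson turn it into JSON without extra configuration.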

Configuring the listener on the case project:

Now that we have created the listener, we can configure it in our case project.

First, we should add the listener jar to Business Central so that our case project can use it. You can use the Business Central UI to upload the jar file. The artifact upload option can be accessed through the Business Central menu: Settings → Artifacts. Upload the jar file:

Now, let’s add the dependency for the listener jar on our case project. You can do this in Business Central via Menu → Design → PROJECT_NAME → Settings → Dependencies.

Next, you can configure the listener using the deployment descriptors. Access them in Business Central via Menu → Design → PROJECT_NAME → Settings → Deployments.
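Under the hood, the Deployments screen edits the project’s kie-deployment-descriptor.xml. Registering the listener there might look roughly like the following sketch (the class name is a placeholder, and the exact element names should be verified against your Process Automation Manager version):

```xml
<deployment-descriptor xmlns="http://www.jboss.org/jbpm">
  <!-- ... other settings ... -->
  <event-listeners>
    <event-listener>
      <resolver>reflection</resolver>
      <identifier>com.myspace.listeners.CaseAuditEventListener</identifier>
      <parameters/>
    </event-listener>
  </event-listeners>
</deployment-descriptor>
```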

Finally, we can build and deploy the changes, and the listener should be able to capture case changes as they occur.

Visualizing the collected data

In order to visualize the data, let us set up a simple UI application. This Quarkus application reads from the Kafka topic to which we push our case metrics and displays them in a responsive UI. The application can be started using:

mvn quarkus:dev

The UI application should be available at http://0.0.0.0:8582/ListenerUI.html
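If the UI application consumes the topic through SmallRye Reactive Messaging (an assumption; the actual wiring is in the demo repository), its application.properties would contain channel configuration along these lines:

```properties
# Hypothetical channel configuration for consuming the case_events topic
kafka.bootstrap.servers=localhost:9092
mp.messaging.incoming.case-events.connector=smallrye-kafka
mp.messaging.incoming.case-events.topic=case_events
mp.messaging.incoming.case-events.value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```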

Testing the Case Audit Metrics:

Let us create a case request.

We can now see that audit metrics start populating on the UI application we created.

Notice how the case start and case data added events have been captured. For every data element added to the case file, the event carries the payload associated with that data.
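For illustration, a “Case Data Added” message on the topic might look like the following (the field names and values are assumptions based on the CaseDefinition payload, not actual demo output):

```json
{
  "caseId": "CASE-0000000001",
  "eventType": "Case Data Added",
  "data": { "hrApproval": true },
  "comment": null,
  "timestamp": "2021-11-16T10:15:30Z"
}
```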

Other case changes, such as comments being added or the case being closed, are captured in the same way.

Summary

This simple demo project for case listeners and the UI can be used with any case project. It shows how we can set up a listener for a case and push its events to Kafka for effective monitoring and audit traceability.

References:

Case Listener

Originally published at https://blog.kie.org on November 16, 2021.
