AA KAFKA INTEGRATION

Introduction

In a modern IT architecture in which events have to be processed and reliably distributed in real time, companies are increasingly turning to Apache Kafka – or its commercial variant Confluent Kafka.

An example: A globally active company wants to distribute status information from a system to several microservices with virtually no delay – including validation of the data formats via a schema registry.
The highlight: the data is to be processed and sent directly via Automic Automation.

What sounds simple in theory poses some challenges in practice – especially when connecting to Confluent Cloud.

What is Kafka?

Kafka is a real-time messaging platform that was originally developed at LinkedIn and handed over to the Apache Software Foundation in 2012.
The original developers later founded Confluent, a commercial provider that builds on Kafka and extends it with numerous functions – including:

  • Easy cloud deployment
  • Intuitive web interface
  • Advanced authentication methods (e.g. OAuth via Microsoft Entra ID)
  • Schema registry for AVRO serialisation

Requirements

  • Kafka version: Confluent Kafka
  • Authentication
    • Topic access: OAuth via Microsoft Entra ID (service principal)
    • Schema Registry: BasicAuth Confluent Kafka
  • Schema Registry: AVRO

Automic Automation Kafka Agent?

The Automic Automation native Kafka agent does not currently support the advanced features of the Confluent Cloud, in particular Schema Registry with AVRO.

The solution: A customised Kafka Producer based on Python, which is controlled via Automic Automation.

Architecture interface

The final integration was realised completely via the Unix agent from Automic Automation.
The solution is divided into the following steps:

  • Installation of the Confluent Kafka libraries and dependencies on the Unix agent systems
  • Installation of the required Python modules on the Automic Automation Unix agent systems
  • Python script – generated generically
  • Automic Automation job – generates the Python script generically and calls it

Installation Confluent Kafka Library

Installation of the Confluent Kafka libraries and dependencies on the Unix agent systems:

yum install -y python3 python3-pip python3-devel gcc make cyrus-sasl-gssapi librdkafka-devel

Installation Python modules

The necessary Python modules must then also be installed on the Automic Automation agents.

A separate job plan was created in Automic Automation that automatically installs these modules on all active agent systems in the corresponding Automic Automation host group.

Note: The Python modules can be installed in the user space of the executing user account – this is usually necessary if they are installed automatically by an Automic Automation job plan.

  • Create the requirements file, e.g. /tmp/kafka_python_requirements.txt:
    • cat <<< 'confluent-kafka==2.4.0
      fastavro
      pydantic
      pydantic_avro' > /tmp/kafka_python_requirements.txt
  • Installing the modules
    • General installation:
      pip3 install -r /tmp/kafka_python_requirements.txt
    • Installation within the current user space:
      pip3 install -r /tmp/kafka_python_requirements.txt --user
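After the installation, a small optional sanity check (not part of the original setup, just a suggestion) can confirm that all four modules actually import on the agent system:

```python
# Quick sanity check: do the freshly installed modules import?
import importlib

for mod in ("confluent_kafka", "fastavro", "pydantic", "pydantic_avro"):
    try:
        version = getattr(importlib.import_module(mod), "__version__", "unknown")
        print(mod, version)
    except ImportError as exc:
        print(mod, "MISSING:", exc)
```

Any line reporting MISSING means the corresponding module did not install correctly for the executing user account.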

Python script Confluent Kafka producer

Here, a generic Python script is generated by Automic Automation – including all parameters.

Important: Do not embed any passwords in the script; they are passed as parameters in the call.

Note: You can find an example producer for Confluent Kafka below.
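In outline, such a generated script could look like the following sketch. It is a hedged illustration only: the parameter order, the AVRO schema, and the payload are assumptions, not the actual script from the download. The OAuth token handling is delegated to librdkafka via sasl.oauthbearer.method=oidc, which fetches and refreshes the token from the Entra ID endpoint itself.

```python
import sys

# Illustrative AVRO value schema (assumption, not the real production schema)
AVRO_SCHEMA = """
{
  "type": "record",
  "name": "StatusEvent",
  "fields": [
    {"name": "system", "type": "string"},
    {"name": "status", "type": "string"}
  ]
}
"""


def build_producer_conf(bootstrap, client_id, client_secret, token_url, scope):
    """Kafka client config for OAuth (OIDC) against Microsoft Entra ID."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "OAUTHBEARER",
        "sasl.oauthbearer.method": "oidc",
        "sasl.oauthbearer.client.id": client_id,
        "sasl.oauthbearer.client.secret": client_secret,
        "sasl.oauthbearer.token.endpoint.url": token_url,
        "sasl.oauthbearer.scope": scope,
    }


def main(argv):
    # confluent_kafka is imported here so the config helper above can be
    # tested without the library being installed
    from confluent_kafka import Producer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer
    from confluent_kafka.serialization import SerializationContext, MessageField

    # All values, including the client secret, arrive as call parameters;
    # nothing sensitive is hard-coded in the generated script.
    (bootstrap, client_id, client_secret, token_url,
     scope, sr_url, sr_user_info, topic) = argv[1:9]

    # Schema Registry access via BasicAuth ("key:secret")
    sr_client = SchemaRegistryClient({"url": sr_url,
                                      "basic.auth.user.info": sr_user_info})
    serializer = AvroSerializer(sr_client, AVRO_SCHEMA)

    producer = Producer(build_producer_conf(bootstrap, client_id,
                                            client_secret, token_url, scope))
    payload = {"system": "demo", "status": "OK"}
    producer.produce(
        topic=topic,
        value=serializer(payload, SerializationContext(topic, MessageField.VALUE)),
        on_delivery=lambda err, msg: print(err or "delivered to %s" % msg.topic()),
    )
    producer.flush()


if __name__ == "__main__":
    main(sys.argv)
```

For Entra ID, the token endpoint URL parameter would typically point at the tenant's OAuth 2.0 v2.0 token endpoint, and the scope at the application's client ID with the /.default suffix.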

AA Job

A customised Python producer is generated by an Automic Automation job, including dynamic transfer of all parameters.
Sensitive data (e.g. passwords) is managed in protected Automic Automation objects and transferred securely at runtime.

The script is called via the ‘UC4 JOBMELDER’ mechanism. This ensures that the sensitive passwords remain protected.
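The call pattern itself can be illustrated in plain Python. This is a hypothetical sketch of the principle only – it is not the UC4 JOBMELDER mechanism – showing how the secret is appended as a runtime argument so that it never lands in the generated script file:

```python
import subprocess
import sys


def call_producer(script_path, params, secret):
    """Invoke the generated producer script with the secret passed as a
    call parameter at runtime (hypothetical sketch of the call pattern)."""
    cmd = [sys.executable, script_path, *params, secret]
    return subprocess.run(cmd, capture_output=True, text=True)
```

The script on disk therefore contains only placeholders and non-sensitive configuration; the password exists solely in the process arguments of the running job.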

Download Confluent Kafka producer – Python

The attached sample script shows what a producer script for Confluent Kafka can look like.
The enclosed readme describes how to use it.

Producer details:

  • Authentication Kafka Topic: OAuth via Microsoft Entra ID
  • Authentication Schema Registry: Access with BasicAuth Confluent Cloud
  • Schema registry: Definition of the data structure via AVRO schema

The Python code provided on this website is for testing. No guarantee is given regarding the accuracy, completeness, or functionality of the code. Use it at your own risk. The website operator is not liable for any direct or indirect damages resulting from the use of the code.

Conclusion

This solution shows how customised integration between commercial platforms such as Confluent Kafka and established automation tools such as Automic Automation is possible – even if the native agent reaches its functional limits.

By using Python and modern authentication, a robust and expandable interface was created that is not only technically clean, but also secure to operate.