Exchanging data using the Hermes Messaging Service
You can produce and consume Kafka messages in your ServiceNow instance using the Hermes Messaging Service.
There are several methods for exchanging data between your ServiceNow instance and your Kafka environment using the Hermes Messaging Service. In all cases, data is produced by one entity and consumed by another.
- Using Stream Connect, you can produce messages from your ServiceNow instance using a Producer step from a flow action or the Producer API, and then consume the messages in your external application. You can also produce messages from an external application and then consume the messages in your ServiceNow instance via any of the following methods:
- Kafka flow trigger
- RTE consumer
- Transform map consumer
- Script consumer
See Stream Connect for Apache Kafka for more information.
- Using the Log Export Service, you can produce logs from your ServiceNow instance, and then consume the logs in your external application. For details on producing and consuming logs for Log Export Service, see Exploring Log Export Service (LES).
- Using the standard Kafka protocol, you can exchange messages with any application that can produce or consume Kafka messages. For example, you can produce messages from a Java application using the standard Kafka protocol and then consume them in your ServiceNow instance, or the reverse.
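For the standard-protocol path, a minimal sketch of an external Java producer using the standard Apache Kafka client might look like the following. The bootstrap address, topic name, key, and payload are placeholders, and any TLS/SASL settings required by your Hermes configuration are omitted.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HermesProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder: use the producer bootstrap address provisioned for your instance.
        props.put("bootstrap.servers", "<producer-bootstrap-address>:4000");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Security settings (TLS/SASL) required by your Hermes configuration are omitted here.

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "example-topic" is a placeholder for a topic you created in Hermes.
            producer.send(new ProducerRecord<>("example-topic", "record-key", "{\"message\":\"hello\"}"));
            producer.flush();
        }
    }
}
```

On the instance side, messages produced this way can be picked up through any of the consumption methods listed above, such as a Kafka flow trigger or a script consumer.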
Producing and consuming messages
To start exchanging messages, create a topic in the Hermes Kafka cluster. After you successfully create the topic, set up a total of three processes to communicate with the Hermes Kafka clusters:
- One process is required to produce messages to Hermes.
- Two processes are required to consume messages from Hermes, because Hermes uses a pair of Kafka clusters for failover purposes. If one cluster goes down, data is produced to the other Hermes Kafka cluster.
Important: You must configure two distinct consumer bootstrap addresses, one for each consumer client.
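A minimal sketch of the two consumer processes, assuming the standard Java Kafka client: the same program runs twice, each time with a different consumer bootstrap address, so that both Hermes clusters are covered. The addresses, topic, and group ID are placeholders, and security settings are omitted.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class HermesConsumerProcess {
    // Run this program twice: once with the Consumer1 bootstrap address and once
    // with the Consumer2 bootstrap address, so that both Hermes clusters are read.
    public static void main(String[] args) {
        String bootstrapAddress = args[0]; // e.g. "<consumer1-bootstrap-address>:4100" (placeholder)

        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapAddress);
        props.put("group.id", "example-consumer-group"); // placeholder group ID
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        // Security settings (TLS/SASL) required by your Hermes configuration are omitted here.

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("example-topic")); // placeholder topic
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```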
When you use the list command to view topics, the current topics are returned from one or both clusters. Topics might be returned from only one cluster, depending on when the last synchronization occurred. Topics created for failover purposes are differentiated by a three-letter cluster identification prefix.
When accessing the Hermes clusters using the CLI, internal topics appear in addition to any topics you've created. For details, see KB1705399.
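If you prefer to inspect topics programmatically rather than through the CLI, a sketch using the Kafka AdminClient (an alternative to the list command, under the same placeholder assumptions as above) could look like this:

```java
import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;

public class ListHermesTopics {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder: point at one of your consumer bootstrap addresses to inspect that cluster.
        props.put("bootstrap.servers", "<consumer1-bootstrap-address>:4100");
        // Security settings (TLS/SASL) required by your Hermes configuration are omitted here.

        try (AdminClient admin = AdminClient.create(props)) {
            // Returns the topics visible on the cluster that this address resolves to.
            // Topics created for failover purposes carry a three-letter cluster prefix,
            // and Hermes-managed internal topics may also appear, as noted above.
            Set<String> topics = admin.listTopics().names().get();
            topics.forEach(System.out::println);
        }
    }
}
```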
You can't delete a topic from both Kafka clusters using a single command in the command-line interface (CLI). However, you can delete the topic from both clusters by deleting the topic record in your instance. See Delete a topic in Hermes.
Required port ranges
- Producer: 4000-4050
- Consumer1: 4100-4150
- Consumer2: 4200-4250
Bootstrap addresses
Use the port mappings listed under Required port ranges to connect producers and consumers to the Kafka cluster bootstrap addresses. All Application Delivery Controllers used by Hermes follow this same convention.
Producer clients use ports ranging from 4000 to 4050.
Because Hermes uses a pair of Kafka clusters, you must configure two consumer clients with separate consumer bootstrap addresses.
- The first consumer client uses ports ranging from 4100 to 4150.
- The second consumer client uses ports ranging from 4200 to 4250.
When you configure producer and consumer properties for your own Kafka client, follow the same port convention in your bootstrap address strings.
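The actual bootstrap addresses are specific to your instance, so the following is only a sketch of how the port convention might be wired into client properties; the hostnames are placeholders.

```java
import java.util.Properties;

public class HermesBootstrapAddresses {
    // Placeholders: substitute the hostnames provisioned for your instance.
    // Only the port convention is taken from the ranges documented above.
    static final String PRODUCER_BOOTSTRAP  = "<producer-host>:4000";   // producer range 4000-4050
    static final String CONSUMER1_BOOTSTRAP = "<consumer1-host>:4100";  // first consumer range 4100-4150
    static final String CONSUMER2_BOOTSTRAP = "<consumer2-host>:4200";  // second consumer range 4200-4250

    static Properties producerProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", PRODUCER_BOOTSTRAP);
        return props;
    }

    static Properties consumerProperties(boolean firstConsumer) {
        Properties props = new Properties();
        props.put("bootstrap.servers", firstConsumer ? CONSUMER1_BOOTSTRAP : CONSUMER2_BOOTSTRAP);
        return props;
    }
}
```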