Stream Connect quick start guide
- Updated Feb 1, 2024
- Washington DC
This quick start guide gives you an overview of how to set up and use each of the six Stream Connect producers and consumers to exchange data between your ServiceNow instance and a Kafka environment.
- Kafka Producer step in Flow Designer
- ProducerV2 API
- Kafka Message trigger in Flow Designer
- Extract Transform Load (ETL) consumer
- Transform Map consumer
- Script consumer
To link your ServiceNow instance to a Kafka environment, Stream Connect uses the Hermes Messaging Service. The following diagram shows each of the producers and consumers and how they connect to your Kafka environment, shown here as the customer site, through Hermes.

For more information, see Using Stream Connect for Apache Kafka and the Hermes Messaging Service.
Getting started
To represent the customer site, this guide uses the Apache Kafka command-line tools run on your computer. The configuration parameters used with these tools can be used to configure any client connection to the Application Delivery Controller, version 2 (ADCv2) gateway using the Kafka protocol.
The commands in this guide were tested with OpenSSL (LibreSSL 2.8.3) and the Apache Kafka binary distribution kafka_2.13-3.4.0.tgz. However, they should also work with other versions in your local environment.
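As a reference for the client configuration, the following is a minimal sketch of an SSL properties file for a mutual-TLS connection to the ADCv2 gateway. The property names are standard Apache Kafka client settings; the file paths, keystore type, and passwords shown here are placeholder assumptions, and the actual values come from the keystore and truststore you create in step 1 below.

```properties
# client-ssl.properties — minimal Kafka client settings for a mutual-TLS
# connection to the ADCv2 gateway. Paths and passwords are placeholders;
# substitute the values from your own keystore and truststore.
security.protocol=SSL
ssl.keystore.location=/path/to/client.keystore.p12
ssl.keystore.type=PKCS12
ssl.keystore.password=changeit
ssl.truststore.location=/path/to/client.truststore.p12
ssl.truststore.type=PKCS12
ssl.truststore.password=changeit
```

You can pass this file to any Kafka command-line tool with the `--producer.config`, `--consumer.config`, or `--command-config` option, depending on the tool.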
Quick start steps
This quick start guide has five steps. Steps 1 and 2 cover how to set up and test a connection to Hermes. Steps 3 through 5 show you how to configure and use each of the Stream Connect producers and consumers to send and receive data.
- The ADCv2 gateway uses mutual authentication to authenticate requests, so the first step is to create a keystore and truststore to use with your favorite Kafka client. The following page shows you how to create the required certificates: Set up a secure connection to the Hermes Messaging Service.
- After you set up a secure connection to Hermes, verify that the keystore and truststore were generated correctly by following Testing the connection to Hermes through the ADCv2 gateway.
- Now you can Use the Kafka Message trigger and Script consumer to consume messages. The first part of this page shows you how to use the Kafka Message trigger in Flow Designer to retrieve messages from Kafka. The second part shows you how to use the Script consumer to retrieve messages.
- Next, Use the ETL and Transform Map consumers to import data. ETL definitions and transform maps specify the transformation logic to use when pulling data through scheduled imports. You can use the same ETL definitions and transform maps to transform the events received through Kafka. Note: To use the ETL consumer, you need a robust import set transformer. Likewise, to use the Transform Map consumer, you need a transform map.
- Finally, you can Use the Kafka Producer step and the ProducerV2 API to publish messages. This page shows you how to use both the Kafka Producer step in Flow Designer and the ProducerV2 API to push messages to Kafka.
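For the Kafka side of steps 3 through 5, the console tools that ship with the Kafka binary distribution can exercise the instance-side producers and consumers. The following command templates are a sketch: the gateway address, topic name, message body, and the name of the SSL configuration file (here `client-ssl.properties`) are placeholders to substitute with your own values.

```shell
# Publish a test message for the instance-side consumers (steps 3 and 4)
# to pick up. <gateway-host:port> and <topic> are placeholders.
echo '{"short_description":"test event"}' | \
  bin/kafka-console-producer.sh \
    --bootstrap-server <gateway-host:port> \
    --producer.config client-ssl.properties \
    --topic <topic>

# Read back messages published from the instance by the Kafka Producer step
# or the ProducerV2 API (step 5).
bin/kafka-console-consumer.sh \
  --bootstrap-server <gateway-host:port> \
  --consumer.config client-ssl.properties \
  --topic <topic> --from-beginning
```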
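For step 1, the following shell sketch illustrates the general shape of keystore and truststore creation with OpenSSL. It uses a self-signed certificate purely as a stand-in: in practice, you obtain the client certificate and CA certificate by following Set up a secure connection to the Hermes Messaging Service, so the file names, password, and CN below are assumptions for this illustration only.

```shell
# Illustration only: a self-signed certificate stands in for the certificates
# you generate through your ServiceNow instance. File names, the password
# (changeit), and the CN are placeholder assumptions.

# 1. Create a client key and certificate (in practice, the certificate is
#    issued via your instance, not self-signed).
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=stream-connect-client" \
  -keyout client.key -out client.crt

# 2. Bundle the key and certificate into a PKCS12 keystore for the Kafka client.
openssl pkcs12 -export -in client.crt -inkey client.key \
  -name stream-connect-client -passout pass:changeit \
  -out client.keystore.p12

# 3. Build a certificate-only PKCS12 truststore from the CA certificate
#    (here the self-signed certificate doubles as its own CA).
openssl pkcs12 -export -nokeys -in client.crt \
  -passout pass:changeit -out client.truststore.p12
```

The resulting `.p12` files can be referenced from a Kafka client properties file with `ssl.keystore.type=PKCS12` and `ssl.truststore.type=PKCS12`; JKS keystores created with `keytool` work equally well.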