Configure the Apache Kafka Consumer connector
Configure the Apache Kafka Consumer connector instance to create events from streaming messages collected by the Apache Kafka connector.
Before you begin
Ensure that the Event Management Connectors (sn_em_connector) plugin is installed on the ServiceNow AI Platform instance.
Role required: evt_mgmt_admin
Procedure
- Navigate to All > Event Management > Integrations > Connector Instances.
- Select New.
-
On the form, fill in the fields.
Table 1. Connector instance form

| Field | Value |
| --- | --- |
| Name | Descriptive and unique name for the Kafka Consumer connector. |
| Description | Description to be used by the Kafka Consumer event collection instance. |
| Connector definition | Name of the required connector definition, which in this case should be Kafka Consumer. |
| Host IP | The host IP. Note: This field must contain a value to complete the creation process, so enter the placeholder 1.1.1.1 as a temporary value. |
| Event collection last run time | This field is automatically set to the last runtime value. |
| Last event collection status | This field is automatically set to the last runtime status. |
| Event collection schedule (seconds) | The frequency, in seconds, at which the system checks for new events from the Kafka Consumer. The default value is 60 seconds. |
| Last error message | This field is automatically set to the last error message. |
-
For PLAINTEXT or SASL_PLAINTEXT, in the Credential field, create a Basic authentication credential.
Use the Kafka SSL credential type for a new Kafka pull connector instance with SASL_PLAINTEXT authentication. Go to step 5.
Use the Kafka SSL credential type when the value of the Kafka server's sasl.enabled.mechanisms or sasl.mechanism parameter isn't PLAIN. Go to step 5. A hypothetical broker configuration excerpt after the substeps below illustrates this distinction.
- Select the Search icon next to the Credential field.
- On the Credentials screen, select New.
- From the list of available credentials, select Basic Auth Credentials.
-
On the Basic Authentication screen, provide information for either PLAINTEXT authentication, which doesn't require credentials, or SASL_PLAINTEXT authentication, which does.
- For PLAINTEXT authentication, in the Name field, type any value and leave the User Name and Password fields empty.
- For SASL_PLAINTEXT authentication, enter a name for the credential in the Name field, preferably prefixed with SASL. Provide the user name and password in the User Name and Password fields.
- Select Submit.
- Select the Search icon next to the Credential field, and then select the credential that you created.
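To make the PLAIN-versus-other-mechanisms distinction concrete, here is a hypothetical excerpt from a Kafka broker's server.properties. The listener addresses and mechanism values are illustrative assumptions, not values taken from this procedure; check your own broker configuration.

```
# Hypothetical Kafka broker server.properties excerpt (values are examples only)

# The broker exposes an unauthenticated listener and a SASL-authenticated one.
listeners=PLAINTEXT://:9092,SASL_PLAINTEXT://:9093

# Only the PLAIN mechanism is enabled, so a Basic Auth credential
# (as created in this step) is sufficient for SASL_PLAINTEXT clients.
sasl.enabled.mechanisms=PLAIN

# If this were set to another mechanism, such as SCRAM-SHA-256,
# you would create a Kafka SSL credential instead (step 5).
# sasl.enabled.mechanisms=SCRAM-SHA-256
```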
-
For SASL_PLAINTEXT, SSL, or SASL_SSL authentication, in the Credential field, create a Kafka SSL credential.
- Select the Search icon next to the Credential field.
- On the Credentials screen, select New.
- From the list of available credentials, select Kafka SSL Credentials.
-
On the Kafka SSL Authentication screen, fill in the Kafka SSL credentials fields.
For descriptions of the Kafka SSL credentials fields, see .
Note: If you clear the Disable hostname verification check box, then in the Additional Kafka consumer properties field, set the ssl.endpoint.identification.algorithm parameter to the same value as the corresponding Kafka server parameter. For example:
ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1; sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username=<username> password=<password>; sasl.enabled.mechanisms=PLAIN;ssl.endpoint.identification.algorithm=;
- Select Submit.
- Select the Search icon next to the Credential field, and then select the credential that you created.
- Right-click the form header and select Save.
-
In the table presenting the connector instance values, verify the populated connector instance values based on your Kafka setup and the message (JSON payload) that you received from the Kafka topic. Change the connector instance values if necessary. A hypothetical sample payload after the table illustrates how these values fit together.
Table 2. Connector instance Values table

| Field | Description |
| --- | --- |
| authentication_type | The Kafka Consumer authentication type. The currently supported values are PLAINTEXT (no authentication; default) and SASL_PLAINTEXT (basic authentication with user name and password). |
| bootstrap_servers | The servers that establish the connection with the Kafka cluster. Values in this field should be comma-separated, in the form host1:port1,host2:port2, and so on. This field is required. |
| consumer_group_name | The consumer group name. If the same Kafka topic is consumed from two different instances, use different consumer group names so that all events are captured for both instances. This field is required. |
| time_of_event_field | The name of the field in the JSON payload/message that contains the time of the event. If no time field value is given, the time the event was received on the ServiceNow instance is set as time_of_event. The time_of_event field should be a first-level field in the JSON payload/message; nested field names aren't allowed. |
| timezone | The time zone of the time_of_event field in the JSON payload/message. The timezone value isn't used when the time_of_event field is empty. For example, if the time_of_event in the payload/message is in the IST time zone, the value would be GMT+0530. Default: GMT. |
| date_format | The date_time format of the time_of_event field in the JSON payload/message. The date_format value isn't used when the time_of_event field is empty; in that case, the GMT time when the event was received on the instance is used. Default: yyyy-MM-dd HH:mm:sss. |
| debug | Displays debug messages. Specify true to see debug messages. Default: false. |
| logPayloadForDebug | Displays payload-related debug messages. Specify true to see payload-related debug messages. Default: false. |
| topic | The Kafka topic from which the messages are fetched. This field is required. |
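To see how these values fit together, consider a hypothetical message consumed from the Kafka topic. The field names and values below are invented for illustration and aren't prescribed by the connector.

```json
{
  "source": "app-server-01",
  "severity": "CRITICAL",
  "description": "Disk usage above 90%",
  "event_time": "2025-01-30 14:05:22"
}
```

For a payload like this, you would set time_of_event_field to event_time (a first-level field, as required), date_format to yyyy-MM-dd HH:mm:ss to match the value's layout, and timezone to the producer's zone, for example GMT+0530 if the timestamps are in IST. If the event_time field were absent, the time the instance received the event would be used instead.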
-
In the MID Servers for Connectors section, specify a MID Server that is up and valid.
Note: You can configure several MID Servers. If the first MID Server is down, the next one is used; if that one isn't available, the next is selected, and so on. MID Servers are tried in the order in which their details were entered in the MID Servers for Connectors section.
If you don't specify a MID Server, an available MID Server with a matching IP range is used.
- Right-click the form header and select Save.
-
Test the connection between the MID Server and the Kafka Consumer connector.
- Select Test Connector.
-
If the test connection fails, verify that the credential is valid and that the MID Server has network connectivity to the Kafka broker. A minimal client-configuration sketch after this procedure can help isolate such failures.
Note: Kafka topic name validation occurs only during Test Connector validation.
- After a successful test, make the connector instance active by selecting the Active check box.
- Select the Update button.
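If the test keeps failing, it can help to check broker reachability and credentials from the MID Server host, independently of the connector. The following is a minimal sketch of a Kafka client configuration file for a SASL_PLAINTEXT listener; the placeholder values are assumptions that you must replace with your own.

```
# client.properties -- minimal sketch for a SASL_PLAINTEXT connectivity check
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<username>" \
  password="<password>";
```

Passing this file to Kafka's console consumer from the MID Server host (for example, kafka-console-consumer.sh --bootstrap-server <host:port> --topic <topic> --consumer.config client.properties) confirms whether the broker is reachable and the credentials are accepted before you retest the connector.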
What to do next
After the connector is created, you must map the fields. For more information, see Map Kafka message payload attributes to alert fields.