ServiceNow Kafka Consumer

Integrates your ServiceNow instance with Kafka topics and stores the consumed event data in ServiceNow tables.

Request apps on the Store

Visit the ServiceNow Store website to view all the available apps and for information about submitting requests to the store. For cumulative release notes information for all released apps, see the ServiceNow Store version history release notes.

Integration Hub subscription

This spoke requires an Integration Hub subscription. For more information, see Legal schedules - IntegrationHub overview.

Supported versions

This spoke was built for Confluent REST Proxy API v2.
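
For reference, a minimal sketch of the Confluent REST Proxy v2 consumer lifecycle that the spoke drives on your behalf: create a consumer instance, subscribe it to a topic, poll for records, and delete the instance. The proxy URL, consumer group, consumer name, and topic below are placeholders; the spoke performs the equivalent calls through IntegrationHub actions, not custom code.

    import requests

    PROXY = "http://rest-proxy.example.com:8082"   # placeholder REST Proxy URL
    V2_JSON = "application/vnd.kafka.v2+json"

    # 1. Create a consumer instance in a consumer group.
    resp = requests.post(
        f"{PROXY}/consumers/my_group",             # placeholder group name
        headers={"Content-Type": V2_JSON},
        json={"name": "my_consumer", "format": "json",
              "auto.offset.reset": "earliest"},
    )
    resp.raise_for_status()
    base_uri = resp.json()["base_uri"]   # .../consumers/my_group/instances/my_consumer

    # 2. Subscribe the instance to one or more topics.
    requests.post(f"{base_uri}/subscription",
                  headers={"Content-Type": V2_JSON},
                  json={"topics": ["my_topic"]}).raise_for_status()

    # 3. Poll for records; an early poll can legitimately return an empty list.
    records = requests.get(f"{base_uri}/records",
                           headers={"Accept": "application/vnd.kafka.json.v2+json"})
    records.raise_for_status()
    for record in records.json():
        print(record["topic"], record["partition"], record["offset"], record["value"])

    # 4. Delete the instance when done so the group can rebalance promptly.
    requests.delete(base_uri, headers={"Content-Type": V2_JSON}).raise_for_status()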

Spoke dependencies

If you’re having trouble installing the app, ensure that these dependent plugins are installed:
  • Confluent Kafka REST Proxy Spoke
  • ServiceNow IntegrationHub Runtime (com.glide.hub.integration.runtime)
  • ServiceNow IntegrationHub Action Step - RTE (com.glide.hub.action_step.rte)
Note: Some of these plugins are licensable features and require appropriate licenses if they are used outside the spoke implementation.

Connection and credential alias requirements

Integration Hub uses aliases to manage connection and credential information. Using an alias eliminates the need to configure multiple credentials and connection information profiles when using multiple environments. If the connection or credential information changes, you don't need to update any actions that use the connection. For more information, see Connections and Credentials.

ServiceNow Kafka Consumer uses the connection and credential alias of the Confluent Kafka REST Proxy Spoke.
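
As a rough analogy, the alias plays the role of the single shared connection profile in the Python sketch below: the functions never hard-code a host or credential, so switching environments means changing the profile once. All values are placeholders.

    import requests

    # One place holds the connection and credential details (the role the alias plays).
    session = requests.Session()
    session.auth = ("kafka_rest_user", "secret")      # credential record (placeholder)
    BASE_URL = "https://rest-proxy.example.com:8082"  # connection record (placeholder)

    def list_topics():
        # GET /topics returns the names of the topics the proxy can see.
        return session.get(f"{BASE_URL}/topics").json()

    def get_topic(name):
        # GET /topics/{name} returns that topic's metadata.
        return session.get(f"{BASE_URL}/topics/{name}").json()

    # Pointing at a different proxy means updating BASE_URL and session.auth once,
    # not editing every function -- the same decoupling the alias gives actions.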

Configure ServiceNow Kafka Consumer

Retrieve events for the specified topics and store them in the required ServiceNow tables.

Before you begin

Procedure

  1. Navigate to All > Process Automation > Kafka Consumer.
  2. Click New.
  3. On the form, fill in these values.
    Note: You must create one Kafka Consumer record for each topic.
  4. Click the Start Sink Listener related link.
    The Kafka Partition Group Listener States table is populated with the topic partition data.
  5. (Optional) To stop listening, click the Stop Sink Listener related link.
  6. Navigate to System Scheduler > Scheduled Jobs.
  7. Open the Kafka Consumer Trigger record.
  8. From the Trigger Type list, select Interval.
  9. Click Update.
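
Conceptually, each polling cycle consumes the records published since the last cycle and writes them into the target table. A minimal external sketch of that poll-and-store flow, assuming a REST Proxy consumer instance that is already subscribed (as in the earlier sketch) and using the standard ServiceNow Table API; the consumer URI, instance URL, table name, credentials, and u_* column names are placeholders.

    import requests

    CONSUMER_URI = ("http://rest-proxy.example.com:8082"
                    "/consumers/my_group/instances/my_consumer")
    INSTANCE = "https://dev00000.service-now.com"   # placeholder ServiceNow instance
    TABLE = "u_kafka_events"                        # placeholder target table
    SN_AUTH = ("integration.user", "secret")        # placeholder credentials

    # Poll the consumer instance for records published since the last cycle.
    resp = requests.get(f"{CONSUMER_URI}/records",
                        headers={"Accept": "application/vnd.kafka.json.v2+json"})
    resp.raise_for_status()

    # Insert each record into the target table through the Table API.
    for record in resp.json():
        requests.post(
            f"{INSTANCE}/api/now/table/{TABLE}",
            auth=SN_AUTH,
            headers={"Content-Type": "application/json"},
            json={
                "u_topic": record["topic"],         # hypothetical column names
                "u_partition": record["partition"],
                "u_offset": record["offset"],
                "u_payload": str(record["value"]),
            },
        ).raise_for_status()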