Use a script to import and process data from your Kafka environment.

Before you begin

  • Role required: integration_hub_admin
  • This consumer requires a Stream Connect subscription. For more information, see https://www.servicenow.com/products/automation-engine.html.
  • The following plugins are required:
    • ServiceNow IntegrationHub Kafka Consumer [com.glide.hub.kafka_consumer]
    • ServiceNow IntegrationHub ETL Consumer - Kafka [com.glide.hub.etl_consumer.kafka]

About this task

To configure a consumer, you create two records:
  1. The consumer record, which specifies how to import and process data.
  2. The Kafka stream record, which defines the stream of data sent to your consumer.
This task covers creating the consumer record. For instructions on creating a Kafka stream, see Create a Kafka stream.
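
For reference, the script on the consumer record receives its messages argument as an array of objects. The sketch below is illustrative only; the values are placeholders, and the shape follows the sample message format shown in the example later in this topic.

// Illustrative sketch of the messages argument passed to the consumer script.
// Values are placeholders; the structure mirrors the sample message in the example below.
var messages = [
    {
        key: 'message_key',                              // Kafka message key
        message: 'message',                              // message payload as a string
        headers: [                                       // Kafka headers, as key/value pairs
            { key: 'header_key', value: 'header_value' }
        ]
    }
];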

Procedure

  1. Navigate to All > IntegrationHub > Consumers > Script Consumer.
  2. Select New.
  3. On the form, fill in the fields.
  4. Select Save.

Example

The following sample script parses each message payload as JSON and logs the record number, short description, and message headers.
(function process(messages) {
    // Add your code here to consume Kafka messages.
    // Sample message: [ { 'key': 'message_key', 'message': 'message', 'headers': [ { 'key': 'header_key', 'value': 'header_value' } ] } ]

    for (var i = 0; i < messages.length; i++) {
        var message = JSON.parse(messages[i].message);
        gs.info('Number ' + message.number + ', short description ' + message.short_description +
            ', headers ' + JSON.stringify(messages[i].headers));
    }
})(messages);
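
If your payloads aren't guaranteed to be valid JSON, you can parse defensively so that one malformed message doesn't stop processing of the rest of the batch. The variant below is a minimal sketch under that assumption; adjust the logging and error handling to fit your environment.

(function process(messages) {
    for (var i = 0; i < messages.length; i++) {
        try {
            // Parse the payload and log a couple of its fields.
            var message = JSON.parse(messages[i].message);
            gs.info('Number ' + message.number + ', short description ' + message.short_description);
        } catch (e) {
            // Assumption: logging and skipping malformed payloads is acceptable here.
            gs.warn('Skipping message with key ' + messages[i].key + ': ' + e.message);
        }
    }
})(messages);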

What to do next

Create a Kafka stream for this consumer. After the stream is activated, you can start receiving messages from your Kafka environment.