Configure the Apache Kafka Consumer connector instance to create events from streaming messages collected by the Apache Kafka connector.

Before you begin

Ensure that the Event Management Connectors (sn_em_connector) plugin is installed on the ServiceNow AI Platform instance.

Role required: evt_mgmt_admin

Procedure

  1. Navigate to All > Event Management > Integrations > Connector Instances.
  2. Select New.
  3. On the form, fill in the fields.
  4. For PLAINTEXT or SASL_PLAINTEXT authentication, in the Credential field, create a Basic authentication credential.
    Use the Kafka SSL credential type instead, and go to step 5, in either of these cases:
      • You're creating a new Kafka pull connector instance with SASL_PLAINTEXT authentication.
      • The value of the Kafka server's sasl.enabled.mechanisms or sasl.mechanism parameter isn't PLAIN.

    1. Select the Search icon next to the Credential field.
    2. On the Credentials screen, select New.
    3. From the list of available credentials, select Basic Auth Credentials.
    4. On the Basic Authentication screen, provide information for either PLAINTEXT authentication, which doesn’t require credentials, or SASL_PLAINTEXT authentication, which does.
      • For PLAINTEXT authentication, enter any value in the Name field and leave the User Name and Password fields empty.
      • For SASL_PLAINTEXT authentication, enter a name for the credential in the Name field, preferably prefixed with SASL. Provide the user name and password in the User Name and Password fields.
    5. Select Submit.
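    For reference, a SASL_PLAINTEXT credential with the PLAIN mechanism corresponds to the following standard Kafka client settings. This is an illustrative sketch only; the exact properties that the connector applies may differ, and the placeholders stand for the values entered in the credential record.

    ```properties
    security.protocol=SASL_PLAINTEXT
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<username>" password="<password>";
    ```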
  5. For a SASL_PLAINTEXT, SSL, or SASL_SSL authentication credential, in the Credential field, create a Kafka SSL credential.
    1. Select the Search icon next to the Credential field.
    2. On the Credentials screen, select New.
    3. From the list of available credentials, select Kafka SSL Credentials.
    4. On the Kafka SSL Authentication screen, fill in the Kafka SSL credentials fields.
      For descriptions of the Kafka SSL credentials fields, see .
      Note: If you clear the Disable hostname verification check box, then in the Additional Kafka consumer properties field, set the ssl.endpoint.identification.algorithm parameter to the same value that is configured on the Kafka server. For example:
      ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1;
      sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username=<username> password=<password>;
      sasl.enabled.mechanisms=PLAIN;
      ssl.endpoint.identification.algorithm=;
    5. Select Submit.
  6. Right-click the form header and select Save.
  7. In the table of connector instance values, verify that the populated values match your Kafka setup and the message (JSON payload) that you received from the Kafka topic.

    Change the connector instance values if necessary.

  8. In the MID Servers for Connectors section, specify a MID Server that is up and valid.
    Note: You can configure several MID Servers. If the first MID Server is down, the next one is used; if that one isn’t available, the next is selected, and so on. MID Servers are tried in the order in which they were added to the MID Servers for Connectors section.

    If you do not specify a MID Server, an available MID Server that has a matching IP range is used.

  9. Right-click the form header and select Save.
  10. Test the connection between the MID Server and the Kafka Consumer connector.
    1. Select Test Connector.
    2. If the test connection fails, verify that the credential is valid and that the MID Server can reach the Kafka broker over the network.
      Note: Kafka topic name validation occurs only in Test Connector validation.
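    When the connection test fails, it can help to first confirm that the broker host and port are reachable over TCP from the MID Server host. The following minimal Python sketch checks basic TCP reachability; the broker host name and port shown are placeholders for your environment, and a successful TCP connection doesn’t by itself prove that authentication or topic access will succeed.

    ```python
    import socket

    def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Placeholder broker address; replace with your Kafka broker host and port.
    # print(tcp_reachable("broker.example.com", 9092))
    ```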
  11. After a successful test, make the connector instance active by selecting the Active check box.
  12. Select the Update button.

What to do next

After the connector is created, you must map the fields. For more information, see Map Kafka message payload attributes to alert fields.
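To illustrate what that mapping involves, the following Python sketch parses a hypothetical JSON payload and maps its attributes to common Event Management event fields (node, severity, metric_name, description). The payload attribute names and the severity translation here are assumptions for illustration only; the actual behavior is determined by the field mapping that you configure.

```python
import json

# Hypothetical Kafka message payload; your topic's schema will differ.
payload = json.loads("""
{
  "host": "web01.example.com",
  "severity": "critical",
  "check": "cpu_load",
  "details": "CPU load above threshold"
}
""")

# Illustrative translation of payload attributes to event fields.
# The severity lookup ("1" = critical, "2" = major, "4" = warning) is an
# assumed example; unknown severities fall back to "5" (info).
event = {
    "node": payload["host"],
    "severity": {"critical": "1", "major": "2", "warning": "4"}.get(payload["severity"], "5"),
    "metric_name": payload["check"],
    "description": payload["details"],
}
```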