Resolve Virtual Agent NLU topic discovery issues
If an intent is not being chosen when expected, you can troubleshoot Natural Language Understanding (NLU) prediction errors.
Quick troubleshooting tips
- Is NLU enabled for the session language?
- Enable the language in Natural Language Understanding (NLU) settings for Virtual Agent. For details, see Enable NLU languages in Virtual Agent settings.
- Is the NLU Model trained and published for the session language?
- Your topic may not be using the latest changes to your model. For ServiceNow® NLU, see Train and try your NLU model and Publish a Virtual Agent topic.
- Is the Virtual Agent topic published, discoverable, and bound to a published NLU Model and Intent for the session language?
- The topic should be bound to a single model and intent for a given language. Ensure there are no duplicate intents bound to other topics. Make sure that the topic, model, and intent are in the same domain. For more information, see Publish a Virtual Agent topic.
- Are any roles or conditions specified for the Virtual Agent topic on the Properties tab in Virtual Agent Designer?
- If the topic is shown only for certain roles or for certain conditions, this can impede topic discovery. For details, see Topic Properties tab.
- Why did Virtual Agent discover my topic in Spanish but not in French?
- There are several possibilities:
- Not all languages are supported by all NLU providers. For details, see Language support for NLU services.
- Not all topics are bound to a language-specific NLU model and intent. For example, Topic A may be mapped as follows:
- Bound to Model A and Intent A for English
- Bound to Model A and Intent A for Spanish
- NOT bound to a model or intent for French
- Why did a topic variable NOT slot fill?
- Check the following:
- The topic's user input node was not configured with an associated entity. Use the NLU entity property for the node in Virtual Agent Designer.
- The prediction result contained an NLU entity with a confidence score that was less than the configured confidence threshold.
- The NLU entity value for the node was invalid. For example, you can't apply the word "red" to a Date/Time entity type.
- Test discovery from the topic.
- Do one of the following:
- From the Topics page, select Test Active Topics. Enter your utterance, and watch the Analyze test phrases tab.
- Open the topic in Virtual Agent Designer, then select Test. In the Test window, select the Include topic discovery check box. Enter your utterance, and watch the Analyze test phrases tab.
For details, see Testing NLU/Keyword topics. For an understanding of how topic discovery works, see NLU topic discovery logic in Virtual Agent.
- Is "Setup Topic" the topic in question?
- If so, it needs to be configured in Conversational Interfaces Chat Settings for it to be discovered. For details, see Configure a Virtual Agent chat experience.
- Questions or issues with mid-topic switching.
- Why did the conversation return to Topic A after Virtual Agent switched to Topic B?
- The Resume topic flow after topic switching attribute is enabled on the topic. You can find this toggle switch on the topic's Properties tab.
- Why did the conversation NOT return to Topic A after Virtual Agent switched to Topic B?
- The Resume topic flow after topic switching attribute is disabled on the topic. You can find this toggle switch on the topic's Properties tab.
- If Virtual Agent doesn't find an intent, will it use a keyword search in mid-topic?
- No.
- Are Topic A's variables available to Topic B after switching?
- No.
- Are Topic B's variables available to Topic A when Topic A resumes?
- No.
For additional troubleshooting, check the information in the following tables.
Check NLU prediction information in the Open NLU tables
When reviewing or debugging topics that use Natural Language Understanding (NLU), you can use various Open NLU tables to view the NLU prediction results for your topics. For example, the Open NLU Predict Intent Feedbacks and Open NLU Predict Entity Feedbacks tables provide detailed information on the NLU processing performed by applications (such as Virtual Agent) to determine topic intent and entities (slot filling).
- Open NLU Predict Intent Feedbacks table
The Open NLU Predict Intent Feedbacks [open_nlu_predict_intent_feedback] table shows the intent processing that an application (in this case Virtual Agent) performs in response to an NLU intent prediction result. The goal for Virtual Agent is to map a predicted NLU intent to a Virtual Agent topic. Whenever Virtual Agent suggests a topic, a record of the prediction result is added to this table. If Virtual Agent doesn't find a topic, no prediction occurs, and a record marked Skipped is added to this table.
To view the table, navigate to All, and then enter open_nlu_predict_intent_feedback.list in the navigation filter.
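If you prefer to check these records from a script rather than the list view, the following background-script sketch lists recent intent feedback records. Only the table name comes from this article; the utterance and state field names are assumptions, so verify them against your instance's dictionary before relying on them.
```javascript
// Sketch: list the 10 most recent Open NLU Predict Intent Feedback records.
// The 'utterance' and 'state' field names are assumptions - confirm them
// in your instance's dictionary for this table.
var intentGr = new GlideRecord('open_nlu_predict_intent_feedback');
intentGr.orderByDesc('sys_created_on');
intentGr.setLimit(10);
intentGr.query();
while (intentGr.next()) {
    gs.info(intentGr.getValue('sys_created_on') + ' | ' +
        intentGr.getValue('utterance') + ' | ' +
        intentGr.getValue('state'));
}
```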
- Open NLU Predict Entity Feedbacks table
The Open NLU Predict Entity Feedbacks [open_nlu_predict_entity_feedback] table shows the entity (slot-filling) processing that an application (in this case Virtual Agent) performs in response to an NLU entity prediction result. The goal for Virtual Agent is to map a predicted NLU entity to a Virtual Agent topic input variable.
To view the table, enter open_nlu_predict_entity_feedback.list in the navigation filter.
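A similar sketch works for the entity feedback table; here it is narrowed to records created in the last hour, which is often enough when you're reproducing a slot-filling problem. Again, only the table name is confirmed by this article, so treat any field you print beyond the system fields as an assumption.
```javascript
// Sketch: show entity (slot-filling) feedback records from the last hour.
var entityGr = new GlideRecord('open_nlu_predict_entity_feedback');
entityGr.addQuery('sys_created_on', '>', gs.hoursAgo(1));
entityGr.orderByDesc('sys_created_on');
entityGr.query();
while (entityGr.next()) {
    // getDisplayValue() with no argument returns the record's display value.
    gs.info(entityGr.getValue('sys_created_on') + ' | ' + entityGr.getDisplayValue());
}
```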
- Open NLU Predict Logs
The Open NLU Predict Logs [open_nlu_predict_log] table provides a consolidated overview of the NLU prediction records for topics. Each record in the log identifies the utterance and corresponding intents (topics) and entities determined by the NLU service. Each record also includes the NLU prediction scores calculated during topic discovery (intent matching) and entity extraction.
Note: NLU prediction node logs are generated automatically. If you're using node logs for debugging but you want to suppress the automatic generation of NLU prediction node logs, add the com.glide.opennlu.predict.node_logging_enabled system property and set the value to false.
To view the Open NLU Predict Logs, enter open_nlu_predict_log.list in the navigation filter.
Note: You can view the detailed intent and entity results in the Open NLU Predict Intent Feedbacks and Open NLU Predict Entity Feedbacks tables.
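If you want to set the node-logging property from a background script rather than through the System Properties module, a minimal sketch follows. The property name comes from the note above; gs.setProperty creates the property if it doesn't already exist.
```javascript
// Suppress automatic generation of NLU prediction node logs.
gs.setProperty('com.glide.opennlu.predict.node_logging_enabled', 'false');

// Confirm the current value (the second argument is the default if unset).
gs.info(gs.getProperty('com.glide.opennlu.predict.node_logging_enabled', 'not set'));
```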
Review HTTP connection information for Open NLU integrations
Use the Open NLU Driver HTTP Connection [open_nlu_driver_http_connection] table to quickly check the HTTP credentials, connection details, and methods for the intents, entities, NLU models, and predictions for your NLU service provider.
To view the table, enter open_nlu_driver_http_connection.list in the navigation filter.
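As with the feedback tables, you can also dump these connection records from a background script for a quick review. Only the table name is given in this article; the sketch below prints each record's display value and last update time without assuming provider-specific column names.
```javascript
// Sketch: list Open NLU Driver HTTP Connection records, newest first.
var connGr = new GlideRecord('open_nlu_driver_http_connection');
connGr.orderByDesc('sys_updated_on');
connGr.query();
while (connGr.next()) {
    gs.info(connGr.getDisplayValue() + ' | updated ' + connGr.getValue('sys_updated_on'));
}
```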