# dialogflow
Dialogflow allows you to build rich conversational experiences through natural language. It gives users new ways to interact with your application through engaging voice- and text-based conversational interfaces.
Dialogflow's natural language processing (NLP) and machine learning allow your agent to understand a user's interactions as natural language and convert them into structured data. Your agent uses machine learning algorithms to match user requests to specific intents and uses entities to extract relevant data from them.
Over time an agent “learns” both from the examples you provide in the Training Phrases and the language models developed by Dialogflow. Based on this data, it builds a model (algorithm) for making decisions on which intent should be triggered by a user input and what data needs to be extracted. This algorithm is unique to your agent.
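To make the idea of intent matching concrete, here is a toy sketch that scores each intent's training phrases by word overlap with the user input. Dialogflow's real matching uses trained language models, not this heuristic, and the intent names and phrases below are invented for the example.

```python
# Toy intent matcher: pick the intent whose training phrase shares the
# most words with the user input. Illustration only -- Dialogflow's
# actual matching is done by its machine-learned language models.

def match_intent(user_input, intents):
    """Return the name of the intent whose training phrases best overlap the input."""
    words = set(user_input.lower().split())
    best_name, best_score = None, 0
    for name, phrases in intents.items():
        for phrase in phrases:
            score = len(words & set(phrase.lower().split()))
            if score > best_score:
                best_name, best_score = name, score
    return best_name

# Hypothetical agent with two intents and their training phrases.
intents = {
    "check_ticket": ["what is the status of my ticket", "check my ticket"],
    "open_ticket": ["open a new ticket", "report a problem"],
}

print(match_intent("can you check my ticket please", intents))  # check_ticket
```

Unlike this word-overlap heuristic, the model Dialogflow builds generalizes beyond the exact training phrases, which is why providing varied examples matters.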
Dialogflow shows conversations with the agent for review and performance improvement. Each user request appears as a list item showing the intent used for processing, as well as the current parameter annotations. You can reassign inputs to the correct intents and fix annotations.
One-click integrations let you bring your conversational app to any platform your users are on, such as the Google Assistant, Slack, Spark, Alexa, and SMS.
Dialogflow will process a conversation with a user in order to parse parameters and send them to your application for processing.
## Dialogflow Terminology (Rosetta Stone)
| Term | Translation |
|---|---|
| Agent | Think of it as an application name |
| Intent | Conversational flow |
| Entities | Parameters to pass to fulfillment |
| Contexts | Conversational mapping for complex flows |
| Fulfillment | How to respond to the user's intent (a webhook) |
In short: an agent transforms natural user requests into actionable data. Agents can also be designed to manage a conversation flow in a specific way. This can be done with the help of:
- Intents
- Context mapping
- Fulfillment via a webhook
Agents are platform agnostic, so you only have to design an agent once. From there, you can integrate it with a variety of platforms using SDKs and Integrations.
An intent represents a mapping between what a user says and what action should be taken by your software.
An intent's interface has the following sections: Training Phrases, Action, Response, and Contexts.
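These four sections can be pictured as a simple data structure. The field names below are illustrative, not Dialogflow's API, and the `reset_circuit` intent is a made-up example.

```python
from dataclasses import dataclass, field

# Hypothetical data shape mirroring the four sections of a Dialogflow
# intent: Training Phrases, Action, Response, Contexts.

@dataclass
class Intent:
    name: str
    training_phrases: list = field(default_factory=list)  # example user utterances
    action: str = ""                                      # action identifier passed to fulfillment
    responses: list = field(default_factory=list)         # static replies the agent can give
    contexts: list = field(default_factory=list)          # contexts used for conversation flow

reset_intent = Intent(
    name="reset_circuit",
    training_phrases=["reset circuit CKT-001234", "bounce the circuit"],
    action="circuit.reset",
    responses=["Resetting the circuit now."],
    contexts=["circuit-selected"],
)
print(reset_intent.action)  # circuit.reset
```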
Entities are used for extracting parameter values from natural language inputs. Any important data you want to get from a user's request will have a corresponding entity. You do not need to create entities for every possible concept mentioned in the agent, only for those needed for actionable data.
Examples of Entities could be:
- Given Name
- Technician ID
- Circuit ID
- Contact Information
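As a rough illustration of what entity extraction produces, here is a sketch that pulls IDs out of free text with regular expressions. Real Dialogflow entities are defined in the console and extracted by its NLP, and the ID formats below (e.g. `TECH-1234`, `CKT-123456`) are invented for this example.

```python
import re

# Sketch of entity extraction. The ID formats are hypothetical; real
# entity definitions live in the Dialogflow console.
ENTITY_PATTERNS = {
    "technician_id": re.compile(r"\bTECH-\d{4}\b"),
    "circuit_id": re.compile(r"\bCKT-\d{6}\b"),
}

def extract_entities(text):
    """Return a dict of entity name -> list of matched values."""
    found = {}
    for name, pattern in ENTITY_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            found[name] = matches
    return found

print(extract_entities("TECH-0042 reported an outage on CKT-001234"))
# {'technician_id': ['TECH-0042'], 'circuit_id': ['CKT-001234']}
```

The extracted values are exactly the "actionable data" the agent later hands to fulfillment.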
Contexts represent the current context of a user's request. This is helpful for differentiating phrases which may be vague or have different meanings depending on the user’s preferences, geographic location, the current page in an app, or the topic of conversation.
For example, if a user is listening to music and finds a band that catches their interest, they might say something like: “I want to hear more of them”. As a developer, you can include the name of the band in the context with the request, so that the agent can use it in other intents.
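The band example can be sketched as a small session object that stores a parameter under a context name and uses it to resolve a follow-up request. The `Session` class and context structure here are illustrative only, not Dialogflow's API.

```python
# Sketch of a context carrying a parameter (the band name) across turns,
# so a follow-up like "I want to hear more of them" can be resolved.
# The session/context structure is invented for illustration.

class Session:
    def __init__(self):
        self.contexts = {}  # context name -> parameters

    def set_context(self, name, params):
        self.contexts[name] = params

    def resolve_pronoun(self, utterance):
        # If a 'music-band' context is active, substitute the stored
        # band name for the pronoun "them".
        band = self.contexts.get("music-band", {}).get("band")
        if band and "them" in utterance:
            return utterance.replace("them", band)
        return utterance

session = Session()
session.set_context("music-band", {"band": "The Beatles"})
print(session.resolve_pronoun("I want to hear more of them"))
# I want to hear more of The Beatles
```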
Dialogflow fulfillments allow you to pass information from a matched intent to a web service and get a result back. Once you have captured all actionable data from the user's conversation, you can make a webhook request to your backend service.
Your web service receives a POST request from Dialogflow whenever a user query matches an intent with webhook fulfillment enabled.
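A minimal handler for that POST request might look like the sketch below, which follows the Dialogflow ES v2 webhook JSON shape (`queryResult.intent.displayName`, `queryResult.parameters`, `fulfillmentText`). The intent name and parameter are invented examples, and in production this function would sit behind an HTTPS endpoint rather than being called directly.

```python
# Sketch of a fulfillment webhook handler. It reads the matched intent
# and parameters from the request body and returns a fulfillment
# response. The "reset_circuit" intent and "circuit_id" parameter are
# hypothetical.

def handle_webhook(request_json):
    query = request_json.get("queryResult", {})
    intent = query.get("intent", {}).get("displayName", "")
    params = query.get("parameters", {})

    if intent == "reset_circuit":
        circuit = params.get("circuit_id", "unknown")
        text = f"Resetting circuit {circuit} now."
    else:
        text = "Sorry, I can't handle that request."

    return {"fulfillmentText": text}

sample_request = {
    "queryResult": {
        "queryText": "reset circuit CKT-001234",
        "intent": {"displayName": "reset_circuit"},
        "parameters": {"circuit_id": "CKT-001234"},
    }
}
print(handle_webhook(sample_request))
# {'fulfillmentText': 'Resetting circuit CKT-001234 now.'}
```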