Besides the Atura Core and customisable elements such as FORMS ENGINE, there are some elements common to all Atura bots that you should be aware of before building your own.
Building Dialogs
Root (Coordinator) Dialogs
New conversations are always initiated from a root or "coordinator" dialog. Root dialogs are defined as follows:
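A minimal sketch of a root dialog is shown below. The member signatures on CoordinatorDialog are assumptions made for illustration and may differ from the actual Atura base class:

```csharp
using System;
using System.Threading.Tasks;

[Serializable]
public class MyAssistantRootDialog : CoordinatorDialog
{
    // Client-specific launcher that maps interpreted intents to subdialogs.
    protected override AturaDialogLauncherBase DialogLauncher => new MyAssistantDialogLauncher();

    // Called when the DialogLauncher cannot map the utterance to any intent.
    protected override async Task HandleDontUnderstandResponse(AturaDialogContext context, AturaMessage message)
    {
        await context.PostAsync("Sorry, I didn't understand that. Try asking about our products, or type \"help\".");
    }
}
```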
The two members that apply specifically to CoordinatorDialogs are "DialogLauncher" and "HandleDontUnderstandResponse". The DialogLauncher property points to a client-specific subclass of AturaDialogLauncherBase (explained below) that tries to launch the appropriate non-root dialog based on the user's message text. The HandleDontUnderstandResponse method executes when the user types something the DialogLauncher does not understand (i.e. there is no intent that can be mapped to the utterance).
In this approach, the root dialog only has to ask the DialogLauncher to interpret the user's message and launch the appropriate subdialog (or delegate handling to HandleDontUnderstandResponse where appropriate). This "divide and conquer" pattern lets the assistant expose a very large set of functionality without the coordinator needing to know the details of every flow.
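Conceptually, the coordinator's message handling reduces to something like the following. This is a simplified illustration; the Interpreter property and the TryLaunchDialogAsync method name are assumptions:

```csharp
// Simplified illustration of what the CoordinatorDialog does with each incoming message.
protected override async Task ProcessAturaMessageAsync(AturaDialogContext context, AturaMessage message)
{
    // Interpret the raw text into a LanguageResult (see "The Language API" below).
    var languageResult = await Interpreter.InterpretAsync(message, context);

    // Ask the client-specific launcher to pick and start the appropriate subdialog.
    bool launched = await DialogLauncher.TryLaunchDialogAsync(context, languageResult, message);

    if (!launched)
    {
        // No intent could be mapped to the utterance.
        await HandleDontUnderstandResponse(context, message);
    }
}
```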
DialogLauncher
Each AI assistant needs its own specialised subclass of AturaDialogLauncherBase. This class is responsible for inspecting an already interpreted LanguageResult and the user's message text, and for determining which subdialog (if any) to launch. For more details on how interpreted LanguageResults are created, see the "Language API" section below.
Here is a sample of a dialog launcher class that knows how to launch a ContactUsDialog or a MoreFundInfoDialog (but only for a specific type of user!). This approach is very unit-testable, extensible and flexible:
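A sketch of such a launcher is shown below. The Intent property on LanguageResult, the LaunchDialogAsync method and the user-type check are assumptions used for illustration:

```csharp
using System;
using System.Threading.Tasks;

[Serializable]
public class MyAssistantDialogLauncher : AturaDialogLauncherBase
{
    public override async Task<bool> TryLaunchDialogAsync(
        AturaDialogContext context, LanguageResult languageResult, AturaMessage message)
    {
        switch (languageResult.Intent)
        {
            case "ContactUs":
                await context.LaunchDialogAsync(new ContactUsDialog());
                return true;

            case "MoreFundInfo":
                // Only registered investors may ask for detailed fund information.
                if (context.UserProfile.UserType == UserType.RegisteredInvestor)
                {
                    await context.LaunchDialogAsync(new MoreFundInfoDialog());
                    return true;
                }
                return false;

            default:
                // Unknown intent - the coordinator will fall back to HandleDontUnderstandResponse.
                return false;
        }
    }
}
```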
Non-root Dialogs
Most AI assistant logic will be implemented in "non-root" dialogs, simply called "Dialogs".
Dialogs must inherit from AturaDialogBase and implement the following members (a sketch of a minimal Dialog follows below):
StartDialogMessage is a text message that will be shown to the user as soon as the Dialog launches, or null if nothing should be rendered.
ExtLanguageApiId and ExtLanguageApiAppSecret are keys to pass to the NLP engine, whichever has been chosen. Any NLP engine may be used as long as it accepts an APIId and an APISecret when connecting.
ClientSpecificStrings should be a reference to the .resx file's "ResourceManager". This is used by the Atura base classes when accessing strings for common messages like "Thanks" and "Back".
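A minimal sketch of these members on a Dialog subclass follows. The configuration keys and the Strings resource class are placeholders for illustration, not part of Atura:

```csharp
using System;
using System.Configuration;
using System.Resources;

[Serializable]
public class ProductListDialog : AturaDialogBase
{
    // Shown as soon as the dialog launches; return null to render nothing.
    protected override string StartDialogMessage => "Let's have a look at our products.";

    // Credentials for whichever external NLP engine has been configured.
    protected override string ExtLanguageApiId => ConfigurationManager.AppSettings["NlpApiId"];
    protected override string ExtLanguageApiAppSecret => ConfigurationManager.AppSettings["NlpApiSecret"];

    // Reference to the client-specific .resx ResourceManager used for common strings.
    protected override ResourceManager ClientSpecificStrings => Strings.ResourceManager;
}
```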
Dialog logic is usually implemented by overriding the PostStartDialogMessageAsync or ProcessAturaMessageAsync methods. The following dialog shows a "Welcome back" message to existing users, and asks users if they would like to see a product list. Note the use of the "context" to show prompts to the user.
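A sketch of such a dialog is shown below. The IsExistingUser flag, the prompt helper and the Done/LaunchDialogAsync calls are assumptions about the Atura context API:

```csharp
[Serializable]
public class WelcomeDialog : AturaDialogBase
{
    // Required members (StartDialogMessage, NLP keys, ClientSpecificStrings) omitted for brevity.

    protected override async Task PostStartDialogMessageAsync(AturaDialogContext context)
    {
        if (context.UserProfile.IsExistingUser)
        {
            await context.PostAsync("Welcome back!");
        }

        // Use the context to show a prompt to the user.
        await context.PromptConfirmAsync("Would you like to see our product list?");
    }

    protected override async Task ProcessAturaMessageAsync(AturaDialogContext context, AturaMessage message)
    {
        if (message.IsAffirmative)
        {
            await context.LaunchDialogAsync(new ProductListDialog());
        }
        else
        {
            await context.PostAsync("No problem. Ask me anything else whenever you're ready.");
            context.Done();
        }
    }
}
```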
Implementing the Consultant/Agent Takeover
Consultant/agent takeover is a useful feature that few bot-building frameworks implement natively. Atura adds customised consultant ("live agent") takeover capability to the Bot Framework. To pause the AI assistant and transfer the user to a consultant, simply call SetUserNeedsAssistance() on a subclass of AturaLiveAgentDialog. For example:
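The sketch below shows what the handoff might look like. The parameterless SetUserNeedsAssistance() call comes from the description above; the surrounding dialog is illustrative:

```csharp
[Serializable]
public class ComplaintDialog : AturaLiveAgentDialog
{
    protected override async Task ProcessAturaMessageAsync(AturaDialogContext context, AturaMessage message)
    {
        if (message.Text.IndexOf("consultant", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            await context.PostAsync("Let me get a consultant to help you with that.");

            // Pause the AI assistant and hand the conversation over to a live consultant.
            SetUserNeedsAssistance();
            return;
        }

        // ...normal automated handling continues here...
    }
}
```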
Including Unit Tests
Since bots hide most features behind a single text input, it is often difficult to test the entirety of a bot end-to-end. Unit tests are therefore especially important to ensure that all flows continue to behave as expected, even those that haven't been exercised manually in a while.
Atura has unit testing helper classes that make writing TDD-style tests very simple. For example, the following test was written before the actual implementation, and served as a good "roadmap" to build the flow from start to finish:
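A sketch of such a test follows. The fixture class and its helper methods are assumptions standing in for the actual Atura testing helpers:

```csharp
[TestMethod]
public async Task ProductListFlow_ExistingUser_SeesWelcomeBackAndProductList()
{
    // The fixture fakes the channel, user profile and dialog context.
    var fixture = new AturaTestFixture()
        .WithExistingUser()
        // Mock the NLP response so the test is fast and never calls the real engine.
        .WithMockedIntent("show me your products", "ProductList");

    await fixture.SendMessageAsync("hi");
    fixture.AssertLastReplyContains("Welcome back");

    await fixture.SendMessageAsync("show me your products");
    fixture.AssertLastReplyContains("product list");
}
```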
Note how even the expected responses from the NLP engine (the "intents") are mocked out to ensure the tests are fast and don't actually call the NLP engine.
The Language API
Flow
Each dialog has an instance of an Interpreter, and an Interpreter has an Actualizer. The Interpreter is responsible for calling an external NLP engine to get back an intent - that is, it takes a raw string and returns the user's "intent" for that text.
From there, the Interpreter passes the raw NLP intent to the Actualizer. The Actualizer is responsible for turning the raw NLP result into a concrete LanguageResult that the Dialog knows what to do with. The concrete LanguageResult instance may be very specific and can even contain additional information, e.g. details about bot-specific entities fetched from a data store.
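In code, the two steps look roughly like this (the NLP client and the Actualizer method name are assumptions):

```csharp
// Inside an Interpreter: raw text -> raw NLP intent -> concrete LanguageResult.
public async Task<LanguageResult> InterpretAsync(AturaMessage message, AturaDialogContext context)
{
    // 1. Call the external NLP engine with the raw string.
    var rawNlpResult = await _nlpClient.GetIntentAsync(message.Text);

    // 2. Let the Actualizer turn the raw result into a concrete LanguageResult,
    //    possibly enriched with bot-specific entities fetched from a data store.
    return await Actualizer.ActualizeAsync(rawNlpResult, message, context);
}
```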
Overriding the Defaults
The AturaDialogBase class defines the default interpreter as follows. If desired, your Dialog subclass can replace the Interpreter or the Actualizer with different implementations.
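Something along these lines; the exact property and class names used by Atura may differ:

```csharp
// Default wiring on AturaDialogBase: a standard Interpreter backed by the configured
// NLP keys, with a standard Actualizer. Override either one in a Dialog subclass.
protected virtual IInterpreter Interpreter =>
    new DefaultInterpreter(ExtLanguageApiId, ExtLanguageApiAppSecret)
    {
        Actualizer = new DefaultActualizer()
    };
```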
An interpreter simply has to implement the following member and return a LanguageResult instance:
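That member has roughly this shape (the interface name and parameter list are assumptions):

```csharp
public interface IInterpreter
{
    // Take the raw user message and return a LanguageResult the dialog can act on.
    Task<LanguageResult> InterpretAsync(AturaMessage message, AturaDialogContext context);
}
```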
The custom Interpreter may use an Actualizer, but it does not have to as long as the Interpreter knows how to transform a raw user Message into a LanguageResult that the dialog can handle.
It is important to note that when a dialog receives a LanguageResult that it does not know how to handle, it should usually finish and pass the result up to its parent dialog for processing (eventually ending up at the root dialog, which knows how to handle everything). This is the recommended approach, though you may deviate from it depending on an individual Dialog's requirements.
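In a non-root dialog that pattern is typically just a few lines (CanHandle and the Done overload shown here are illustrative assumptions):

```csharp
// Hand an unrecognised LanguageResult back to the parent dialog for processing.
if (!CanHandle(languageResult))
{
    // Finish this dialog; the result bubbles up until some ancestor
    // (ultimately the root dialog) knows what to do with it.
    context.Done(languageResult);
    return;
}
```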
Helper Dialogs
The Atura framework provides useful dialogs that can be used within your flows to handle common challenges. These include:
SlowCallPurgatoryDialog
MultiSlowWebServiceCallerDialog
PollingPurgatoryDialog
SlowCallPurgatoryDialog
Sometimes external API calls take a long time to complete, and often you don't have much control over the APIs themselves.
This is problematic, since the Bot Framework will throw an exception if the assistant takes longer than 15 seconds to respond to a user (15 seconds is a long time to wait in a chat). If you don't have control over external APIs, you may have to cater for the possibility that some calls will exceed this limit and find a way to handle the situation gracefully.
The SlowCallPurgatoryDialog keeps the user in the dialog until a slow call is complete. If the user types while the slow call is still in progress, the SlowCallPurgatoryDialog will respond with a nice message asking the user to wait.
If the user gets frustrated at the wait, the dialog can allow the user to "break out", but it remains up to the calling code to handle a null response from the dialog if the "break out" is triggered.
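Usage might look something like this (the constructor parameters are assumptions; the null result on break-out is described above):

```csharp
// Wrap the slow call in a SlowCallPurgatoryDialog. While the call is running,
// anything the user types gets a polite "please wait" response.
var purgatory = new SlowCallPurgatoryDialog<BalanceResult>(
    slowCall: () => fundApiClient.GetBalancesAsync(investorId),
    waitMessage: "I'm still fetching your balances, please bear with me...",
    allowBreakOut: true);

var balances = await context.LaunchDialogAsync(purgatory);

if (balances == null)
{
    // The user broke out of the wait - handle the missing result gracefully.
    await context.PostAsync("I couldn't get your balances right now. Please try again a bit later.");
    context.Done();
    return;
}

await context.PostAsync($"Your total balance is {balances.Total:C}.");
```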
MultiSlowWebServiceCallerDialog
The MultiSlowWebServiceCallerDialog serves the same purpose as the SlowCallPurgatoryDialog, except that it executes multiple web service calls in parallel. Imagine the following scenario:
You have five "investors" for which you want to load data, so you have five InvestorIDs.
You want to call the GetBalances service for each investor.
And you want to call the GetRecentHistory service for each investor.
You would use the MultiSlowWebServiceCallerDialog as follows:
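A sketch of that usage, assuming the dialog accepts a collection of calls to run in parallel (the constructor and result handling are assumptions):

```csharp
// Build one call per investor per service: 5 investors x 2 services = 10 parallel calls.
var calls = new List<Func<Task<object>>>();
foreach (var investorId in investorIds)
{
    calls.Add(async () => (object)await fundApiClient.GetBalancesAsync(investorId));
    calls.Add(async () => (object)await fundApiClient.GetRecentHistoryAsync(investorId));
}

var multiCaller = new MultiSlowWebServiceCallerDialog(
    calls,
    waitMessage: "I'm collecting data for all your investors, this can take a little while...");

var results = await context.LaunchDialogAsync(multiCaller);

if (results == null)
{
    // The user broke out of the wait - handle the missing data gracefully.
    await context.PostAsync("Sorry, I couldn't load all the investor data right now.");
    context.Done();
    return;
}
```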
PollingPurgatoryDialog
The PollingPurgatoryDialog is similar to the SlowCallPurgatoryDialog, except it "polls" another service as long as is necessary (until a predicate matches).
In the following sample we poll a login service until a user has accepted or rejected a second factor auth request:
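A sketch of that flow follows; the constructor parameters (poll delegate, completion predicate, interval) are assumptions:

```csharp
// Poll the login service every few seconds until the user accepts or rejects
// the second-factor authentication request on their phone.
var pollingDialog = new PollingPurgatoryDialog<TwoFactorStatus>(
    poll: () => authApiClient.GetTwoFactorStatusAsync(loginRequestId),
    isComplete: status => status == TwoFactorStatus.Accepted || status == TwoFactorStatus.Rejected,
    pollIntervalSeconds: 3,
    waitMessage: "Please approve the login request on your phone...");

var finalStatus = await context.LaunchDialogAsync(pollingDialog);

if (finalStatus == TwoFactorStatus.Accepted)
{
    await context.PostAsync("Thanks, you're logged in!");
}
else
{
    await context.PostAsync("The login request was rejected or timed out, so I can't log you in.");
}
```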
Publishing to the Service Bus
Chat messages are sent to the service bus whenever a "Post" method is called on the AturaDialogContext. So as long as the AturaDialogContext methods are used, and not the Bot Framework methods in IDialogContext, no additional work is needed:
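For example (the exact Post/Prompt member names below are assumptions; any of the AturaDialogContext "Post" methods behaves this way):

```csharp
// Posting through the AturaDialogContext both replies to the user and publishes
// the message to the service bus - no extra code is required.
await context.PostAsync("Here is your statement.");

// Prompts posted through the context are published in the same way.
await context.PromptConfirmAsync("Would you like it emailed to you as well?");
```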
Archiving Conversations
Chat messages that arrive at the service bus are archived automatically; no additional work is necessary. Archiving is done by an Azure Function app that subscribes to messages on the bus.
Conversations are stored in Azure Table Storage in the "conversations" table (which is part of every Atura installation).
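Conceptually, the archiving function looks something like this. This is a sketch only; the topic, subscription, connection and entity names are assumptions, not the actual Atura function:

```csharp
public static class ConversationArchiver
{
    [FunctionName("ArchiveConversationMessage")]
    public static void Run(
        // Fires for every chat message published to the service bus.
        [ServiceBusTrigger("chatmessages", "archiver", Connection = "ServiceBusConnection")] string messageJson,
        // Output binding to the "conversations" table in Azure Table Storage.
        [Table("conversations")] ICollector<ConversationEntity> conversationsTable,
        ILogger log)
    {
        var message = JsonConvert.DeserializeObject<ChatMessage>(messageJson);

        // ConversationEntity is assumed to derive from TableEntity.
        conversationsTable.Add(new ConversationEntity
        {
            PartitionKey = message.ConversationId,
            RowKey = message.Timestamp.ToString("o"),
            Text = message.Text,
            From = message.From
        });
    }
}
```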