As the world of Skynet slowly engulfs our everyday lives, an entire computational ecosystem of services has emerged to handle the specifics of our computing on our behalf. Starting with IBM Watson, a new class of conversational computer personality has taken shape. We now have Watson, Siri, Alexa, and Google Assistant helping our devices respond to requests in natural language. Fundamentally, we are approaching the ability to interact with a “cloud being” using natural-language audio requests.
If we take a step back and try to visualize what we are ultimately creating with these natural language interfaces, we arrive at the fictional “computer” aboard the USS Enterprise.
In this post we investigate building a chat bot to handle some of the Enterprise computer’s more basic functionality.
We are going to implement a very simple chat bot feature that can locate a crew member, using IBM’s Conversation API. With this product you define various semantic landmarks (Intents and Entities), a conversation flow, and a series of responses for the various combinations of Intents and Entities.
Semantically, most requests to the computer have an intent and a set of entities to which that intent is being applied. In language we call these the verb and the direct object of the verb. For example, in “I hammer the nail,” the intent would be hammer and the entity would be the nail.
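To make the intent/entity split concrete, here is a minimal sketch in plain Python, entirely independent of Watson, that pulls a verb-like intent and a direct-object entity out of a toy request. The word lists and function name are illustrative assumptions, not part of the Conversation API, which learns this mapping from training examples instead of keyword tables.

```python
# Toy intent/entity extraction: the "verb" becomes the intent and the
# "direct object" becomes the entity. A real service like Watson
# Conversation generalizes from training data rather than fixed lists.
KNOWN_INTENTS = {"hammer", "locate", "find"}   # assumed vocabulary
KNOWN_ENTITIES = {"nail", "data", "riker"}     # assumed vocabulary

def parse_request(text):
    """Return (intent, entity) found in a short imperative sentence."""
    words = [w.strip(".,?!").lower() for w in text.split()]
    intent = next((w for w in words if w in KNOWN_INTENTS), None)
    entity = next((w for w in words if w in KNOWN_ENTITIES), None)
    return intent, entity

print(parse_request("I hammer the nail"))  # → ('hammer', 'nail')
```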
In the video, the Enterprise computer understands a set of locations as entities, as well as crew members, and there is an intent to locate crew members at various locations. With the Conversation API we can set up this same underlying structure for our interactions and create a chat bot that does much the same.
Below are the configurations of our bot in the Conversation Workspace configuration screen.
Since the computer we are modeling has only one function at this point, locating crew members, we define only one intent. We train it with example ways to ask for someone’s location. Watson generalizes from this set of requests and then responds to any similar request, even ones it was not trained on.
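The training set for the single #locate intent might look something like the list below. These exact utterances are illustrative guesses at what goes into the workspace, not a dump of the real configuration; the point is that Watson matches paraphrases that never appear in the list.

```python
# Example utterances used to train the #locate intent. Watson
# generalizes from these, so similar requests not in the list
# (e.g. "Computer, where's Commander Data?") still match.
LOCATE_EXAMPLES = [
    "Where is Commander Riker?",
    "Locate Lt Cmdr Data",
    "Computer, find Riker",
    "What is Data's current location?",
    "Can you tell me where Riker is?",
]
```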
Here we define two types of entities: a CrewMember type that includes Lt Cmdr Data and Cmdr Riker, and a Location type covering various locations on the Enterprise.
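Expressed as data, the two entity types might look like the following. The value lists here are assumptions standing in for the workspace configuration; the helper function is a local sketch of the substring matching the service performs on user input.

```python
# Entity types mirroring the Conversation workspace configuration:
# each entity has named values the service matches in user input.
# The specific location values are illustrative placeholders.
ENTITIES = {
    "CrewMember": ["Data", "Riker"],
    "Location": ["Bridge", "Engineering", "Ten Forward", "Sick Bay"],
}

def match_entity(text, entity_type):
    """Return the first value of entity_type mentioned in text, if any."""
    for value in ENTITIES[entity_type]:
        if value.lower() in text.lower():
            return value
    return None

print(match_entity("Where is Riker?", "CrewMember"))  # → 'Riker'
```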
Here we define the conversational flow. Since the computer responds to only one thing, we create a single flow: if the #locate intent is detected and the @CrewMember entity is Riker, the bot responds with his (albeit predefined) location, and likewise for Data. We then define a fallback response for when no matching intent is found.
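The dialog flow amounts to a small decision table: one branch per crew member under the #locate intent, plus a fallback. A local stand-in for it, with made-up placeholder locations for the predefined responses, might look like:

```python
# Minimal stand-in for the Conversation dialog flow. The location
# strings are made-up placeholders for the predefined responses.
LOCATIONS = {
    "Riker": "Commander Riker is on the Bridge.",
    "Data": "Lt Cmdr Data is in Engineering.",
}

def respond(intent, crew_member):
    """Route a detected intent/entity pair to a canned response."""
    if intent == "locate" and crew_member in LOCATIONS:
        return LOCATIONS[crew_member]
    return "I did not understand that request."  # fallback branch

print(respond("locate", "Riker"))  # → 'Commander Riker is on the Bridge.'
print(respond("play", None))       # → fallback response
```

In the real workspace this branching lives in the dialog nodes rather than in code, but the structure is the same: condition on intent and entity, emit a response.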
Let’s See It Work
As you can see, our computer responds to various ways of asking for various crew members’ locations. We even get Spanish understanding for free. We are essentially approaching a universal language of intents applied to entities.