Bot Builder Community – Alexa Adapter Update Preview

Since its launch, the Alexa Adapter for the Bot Framework, part of the Bot Builder Community Project on GitHub, which allows you to surface a Microsoft Bot Framework bot via an Amazon Alexa skill, has received great feedback. Today I am excited to announce a preview of the next major iteration of the Alexa adapter, one which I hope provides additional benefits for developers, ensures full compatibility with the latest developments in the Bot Framework and puts us in the best position to support the Alexa platform moving forward. This post is intended to provide an insight into what is happening and why, along with details of how you can try the preview update yourself and provide feedback.

So, here are the key details this post will cover.

  1. Obtaining / installing the preview and providing feedback
  2. Key Changes
    Adoption of Alexa.NET
    Adding support for Bot Builder Skills and the Virtual Assistant
    New Activity Mapping Middleware
    Integration Changes
  3. Updated Sample
Read More

Building conversational forms with FormFlow and Microsoft Bot Framework – Part 2 – Customising your form

In my last post I gave an introduction to FormFlow (Building conversational forms with FormFlow and Microsoft Bot Framework – Part 1), the part of the Bot Framework which allows you to create conversational forms automatically based on a model, taking information from a user while handling many of the complexities for you, such as validation, moving between fields and confirmation steps. If you have not read that post yet, I encourage you to give it a quick read now, as this post follows on directly from it.

As promised, in this post we will dig further into FormFlow and how you can customise the form process, showing how you can change prompt text, control the order in which fields are requested from the user, and work with concepts like conditional fields.
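
To give you a flavour of the sort of customisation we will cover, below is a minimal sketch using FormFlow's Prompt attribute and FormBuilder field ordering. The feedback-form model and field names are purely illustrative (they are not the example from this post), but the attributes and builder calls are the standard FormFlow ones.

```csharp
using System;
using Microsoft.Bot.Builder.FormFlow;

// Illustrative model - the form and field names are placeholders.
[Serializable]
public class FeedbackForm
{
    // The Prompt attribute overrides the default prompt text for a field.
    // {&} is replaced with the field name, {||} lists the available choices.
    [Prompt("What is your {&}?")]
    public string Name { get; set; }

    [Prompt("How would you rate us? {||}")]
    public RatingOptions? Rating { get; set; }

    public string Comments { get; set; }

    public static IForm<FeedbackForm> BuildForm()
    {
        // Listing the fields explicitly controls the order they are asked in.
        return new FormBuilder<FeedbackForm>()
            .Field(nameof(Rating))
            .Field(nameof(Name))
            .Field(nameof(Comments))
            .Build();
    }
}

public enum RatingOptions { Poor = 1, Average, Good, Excellent }
```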

Read More

Building conversational forms with FormFlow and Microsoft Bot Framework – Part 1

Forms are common. Forms are everywhere. Forms on web sites and forms in apps. Forms can be complicated – even the simple ones. For example, when a user completes a contact form they might provide their name, address, contact details, such as email and telephone, and their actual contact message. We have multiple ways that we might take that information, such as drop down lists or simply free text boxes. Then there is the small matter of handling validation as well: required fields, fields where the value needs to come from a pre-defined set of choices, and even conditional fields, where whether they are required is determined by the user's previous answers.

So, what about when we need to get this type of information from a user within the context of a bot? We could build the whole conversational flow ourselves using traditional Bot Framework dialogs, but handling a conversation like this can be really complex. For example, what if the user wants to go back and change a value they previously entered? The good news is that the Bot Framework has a fantastic way of handling this sort of guided conversation – FormFlow. With FormFlow we can define our form fields and have the user complete them, whilst getting help along the way.

In this post I will walk through what is needed to get a basic form using FormFlow working.
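
To set expectations for where we will end up, here is a rough sketch of a basic FormFlow form based on the contact form described above. The class and property names are illustrative rather than the exact code from this post, but the FormBuilder and FormDialog calls are the standard FormFlow APIs.

```csharp
using System;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Builder.FormFlow;

// FormFlow generates the whole conversation from this simple model.
[Serializable]
public class ContactForm
{
    public string Name { get; set; }
    public string Email { get; set; }
    public string Message { get; set; }

    public static IForm<ContactForm> BuildForm()
    {
        return new FormBuilder<ContactForm>()
            .Message("Please answer a few questions so we can get in touch.")
            .OnCompletion(async (context, form) =>
            {
                await context.PostAsync("Thanks, we will be in touch soon!");
            })
            .Build();
    }
}

// The form is then started from a dialog (or your MessagesController), roughly like this:
// var form = FormDialog.FromForm(ContactForm.BuildForm, FormOptions.PromptInStart);
// context.Call(form, OnContactFormCompleted);
```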

Read More

Making Amazon Alexa smarter with Microsoft Cognitive Services

Recently, those of us who work at Mando were lucky enough to receive an Amazon Echo Dot to start playing with and to see if we could innovate with it in any interesting ways. As I have been doing a lot of work recently with the Microsoft Bot Framework and Microsoft Cognitive Services, this was something I was keen to do. The Echo Dot, hardware that sits on top of the Alexa service, is a very nice piece of kit for sure, but I quickly found some limitations once I started extending it with some skills of my own. In this post I will talk about my experience so far and how you might be able to use Microsoft services to make up for some of the current Alexa shortcomings.

Read More

Using speech recognition and synthesis in Windows 10 to talk to your bot (and have it talk back!)

In my previous posts I have shown how we can use the Microsoft Bot Framework to create smart, intelligent bots that users can interact with using natural language, which is interpreted using the LUIS service (part of Microsoft Cognitive Services). The bots that we can create using these tools are really powerful, but in most cases users are still typing text when communicating with the bot. This is perfectly OK in most situations, but for some requirements it would certainly be fitting to allow the user to just talk to the bot using speech and have the bot respond in the same way. There are examples of where this can be done on the existing bot channels already, such as using the Slack mobile app with a voice recognition function so you don't have to type your text, but what if we wanted to bake this into our own app? In this post I am going to provide some basic code that allows a user to converse with a bot using speech in a Windows 10 UWP app.
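
As a taste of the approach, here is a rough sketch of the two UWP APIs involved: SpeechRecognizer to capture what the user says and SpeechSynthesizer to speak the bot's reply. The helper class below is illustrative (the bot connectivity itself is omitted), and it assumes a MediaElement on your page and the Microphone capability enabled in the app manifest.

```csharp
using System.Threading.Tasks;
using Windows.Media.SpeechRecognition;
using Windows.Media.SpeechSynthesis;
using Windows.UI.Xaml.Controls;

public static class SpeechHelper
{
    // Listen for the user's utterance using the default dictation grammar
    // and return the recognised text to send to the bot.
    public static async Task<string> ListenAsync()
    {
        using (var recognizer = new SpeechRecognizer())
        {
            await recognizer.CompileConstraintsAsync();
            SpeechRecognitionResult result = await recognizer.RecognizeAsync();
            return result.Text;
        }
    }

    // Speak the bot's reply through a MediaElement declared in the page's XAML.
    public static async Task SpeakAsync(string text, MediaElement mediaElement)
    {
        using (var synthesizer = new SpeechSynthesizer())
        {
            SpeechSynthesisStream stream = await synthesizer.SynthesizeTextToStreamAsync(text);
            mediaElement.SetSource(stream, stream.ContentType);
            mediaElement.Play();
        }
    }
}
```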

Read More

Give your bot some ‘manners’ with the BestMatchDialog

This post is a follow up to my earlier post which introduced the new BestMatchDialog, now available via NuGet. One of the best uses for the BestMatchDialog, and the reason I created it in the first place, is adding ‘manners’ to a bot, i.e. being able to respond to those common things that people say that fall outside of the usual conversation you would handle with your bot. This post will therefore focus on how to use a BestMatchDialog as a child dialog to respond to general messages like “hello” and “thanks”. It is amazing what a huge difference handling these sorts of messages can make and how much more natural talking to your bot will feel for your end users.
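
To give you an idea of what this looks like, below is a rough sketch of the kind of child dialog we will build. The attribute usage and base class shown here are from memory and purely illustrative (the post and the package documentation cover the exact API), but it should convey the shape of the approach.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;

// A child dialog for common pleasantries. Attribute and base class names are
// as I recall them from the BestMatchDialog package and are illustrative -
// check the package documentation for the exact API.
[Serializable]
public class CommonResponsesDialog : BestMatchDialog<bool>
{
    [BestMatch(new string[] { "hi", "hi there", "hello", "hey" })]
    public async Task HandleGreeting(IDialogContext context, string messageText)
    {
        await context.PostAsync("Hello! What can I do for you today?");
        context.Done(true);
    }

    [BestMatch(new string[] { "thanks", "thank you", "cheers" })]
    public async Task HandleThanks(IDialogContext context, string messageText)
    {
        await context.PostAsync("You're welcome!");
        context.Done(true);
    }
}
```

Read More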

BestMatchDialog for Microsoft Bot Framework now available via NuGet

Recently I have been developing a number of bots using the Microsoft Bot Framework, with the LUIS service allowing users to interact with them using natural language. One thing that struck me when we released the bots to a wider user base was just how polite everybody was towards them. The bots were receiving messages like “hi”, “how are you?”, “thanks” and “bye bye” – the only problem was that they didn’t know how to deal with these messages. Sure, they could deal with a ton of more complex messages / intents using LUIS, but they weren’t able to ‘understand’ and provide such common responses.

I set out to think about how I could solve this problem in a re-usable way, and whilst doing so I ended up looking through the open source code for the Bot Framework on GitHub to see how dialogs like the LUIS dialog worked. Whilst browsing, I came across some code in the Node section of the Framework that handled the matching of intents within the IntentDialog – a few hours later I had my first version of the BestMatchDialog.

Read More