I’ve unintentionally developed a bit of a reputation as the “food guy” at conferences. It’s selfish, mostly. I like good food (who doesn’t?) and don’t want to spend a ton of money, so I’m always looking for hidden gems in new cities: the places that the locals prize and knowing visitors covet.

My recent Yelp projects have centered on developing a Power App of sorts that capitalizes on an “Eat Like Ed” algorithm. The end goal is to bring good food to the traveling masses, but I'm starting internally with Teams first. Ever the polymath, I was distracted by chatbots and decided to first integrate my Yelp calls with them. Here is part of that journey:


Connecting Microsoft Flow and Yelp

It may be easiest to describe the process from the inside out, with the Yelp call at the core. Yelp has very robust documentation for developers and details the parameters so clearly that even a code-averse builder like myself could fumble through and produce something useful.

Important Insight: The methodology used here can be applied to any site with open-enough connectors. Don’t get locked into thinking this will only be useful if you want to connect to Yelp. Use the process to fit your specific use-case. Experiment, Fail, Learn.

My first effort to translate my food-picking methods into a quantifiable Microsoft Flow action looked like this:

image

I covered the connection aspect in my last article, but the query parameters on this one are a little different. I am building in the following limitations:

  • Price (one or two dollar signs)
  • A limit of 5 results
  • Sorting by rating
  • A radius of roughly 2 miles (expressed in meters, per Yelp)

The only dynamic value (currently) is the location. My next step might be a question about the user’s mood, or maybe a guess based on the time of the request: “lunch”, “dinner”, or something like that.
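For the curious, here’s a minimal Python sketch of the HTTP GET this Flow action performs. The parameter names come from Yelp’s Fusion search documentation; the API key placeholder and the exact meter value are my own choices:

```python
# A rough sketch of the same Yelp query the Flow's HTTP GET makes.
# YELP_API_KEY is a placeholder -- substitute your own key from the
# Yelp developer portal.
from urllib.parse import urlencode

YELP_SEARCH_URL = "https://api.yelp.com/v3/businesses/search"

def build_search_request(location: str, api_key: str):
    """Return the URL and headers for the food search, mirroring the Flow."""
    params = {
        "location": location,   # the only dynamic value, from the user
        "price": "1,2",         # one or two dollar signs
        "limit": 5,             # five results max
        "sort_by": "rating",    # highest rated first
        "radius": 3219,         # roughly 2 miles, in meters
    }
    url = f"{YELP_SEARCH_URL}?{urlencode(params)}"
    headers = {"Authorization": f"Bearer {api_key}"}
    return url, headers

url, headers = build_search_request("Seattle, WA", "YELP_API_KEY")
```

Sending that request with any HTTP client returns the same JSON body the Flow’s HTTP action receives.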

AtBot for Microsoft Teams

Triggering specific actions, such as “Where to eat?”, is accomplished by a no-code chatbot from AtBot. If you are just getting started, AtBot has a free tier which offers 3 personal skills per user and 6 shared skills per tenant. A “Skill” is essentially a Flow the bot has learned, and AtBot has several customizable templates already available to fit your needs.

Insight: To get the natural language integration and adaptive cards (both covered later) you will need the Enterprise level license. After the 30-day trial, pricing can start as low as $6.99 per named user per month.

Getting started takes just a few clicks from the main AtBot page. Click “Get AtBot for Teams” and log in with your usual tenant credentials. You can choose either a personal chatbot or add the bot to a Teams channel; the Flow will connect to either, and you can add both at any point.

image


To build the trigger, start with a new, blank Flow and find the AtBot trigger called “When an Intent is Used”:


image


When you are testing out your Flow and bot, it’s okay to leave the Trigger Type as “Personal”. Just remember, before anyone else can use the bot, you will need to switch the Trigger Type to “Shared”.

Natural Language Chatbots Meet AI

With the trigger as it is now, the user will need to type exactly “Where to eat” in the chatbot window before the Flow will be triggered. To fix that, we use Microsoft’s Language Understanding service, LUIS (luis.ai). Without any code, LUIS will translate phrases (Microsoft calls these “utterances”) like “I’m hungry”, “Where should I eat?”, and “What’s good to eat?” into the intent “Where to eat?” The utterance triggers the Flow just as if the user had typed the keywords exactly. We will add the natural language bit now, before moving forward.

Begin at luis.ai and sign in with your Microsoft tenant credentials, then from your dashboard, click “Create new app”. From there, you’ll be taken to the “Intents” section, where you’ll add the utterances (phrases people might say) that express the intent you want. You can select from some pre-built intents, but in this case, I just added one called “Where to Eat”. Then add (at least) five different phrases people might use to initiate the intent. Clicking “Train” and then “Publish” will make the AI portion usable by your Flow.
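If it helps to see the shape of the data, here are some hypothetical labeled utterances for that intent, in roughly the format LUIS uses for batch-labeled examples (the phrases themselves are just examples; use whatever your users are likely to type):

```python
# Hypothetical labeled utterances for the "Where to Eat" intent.
# The {"text", "intent", "entities"} shape mirrors LUIS's labeled
# example format; no entities are needed for this simple intent.
import json

utterances = [
    {"text": "I'm hungry", "intent": "Where to Eat", "entities": []},
    {"text": "Where should I eat?", "intent": "Where to Eat", "entities": []},
    {"text": "What's good to eat?", "intent": "Where to Eat", "entities": []},
    {"text": "Any good food around here?", "intent": "Where to Eat", "entities": []},
    {"text": "Feed me", "intent": "Where to Eat", "entities": []},
]

print(json.dumps(utterances, indent=2))
```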

To connect your Luis.ai app to your AtBot trigger:

  1. Click “Manage” near the top of the app where you set up the intent and utterances
  2. Click “Keys and Endpoints” on the left side
  3. Copy the “Authoring Key”
  4. Go back to the trigger on your Flow and select “Show advanced options”
  5. Paste the Authoring Key into the “LUIS API Key” field
  6. Select your App and Intent from the drop-downs

Now, phrases that are close will trigger the intent and the Flow.

image

Yelp HTTP GET Response

Now we have a Flow triggered by a natural language query from Teams. We still need to collect additional information from the user (location), submit the query to Yelp, and then present the results somehow.

Notice, in the very first screenshot, “location” has a dynamic value in Flow labeled “Response Text”, which comes from the AtBot action “Get Response from User”. In this action, you specify the phrase you would like the chatbot to use. You also decide whether you will take a response from “any user” or only the “original user”. I envisioned a group meal scenario, so I wanted anyone in the Microsoft Teams channel to be able to respond.

Insight: Reply Activity took me a while to figure out; it lets AtBot and Flow know that the input is part of an ongoing conversation. Use the dynamic value “Reply Activity” from the trigger as you continue to build your AtBot Flows.


image 

With the intent trigger (layered with language understanding), the location dialogue, and the HTTP action from the top, we have a Flow that will fetch information when someone asks for it.

Run With The Flow

My suggestion, at this point, is to run the Flow if you haven’t already. The HTTP GET will return all of our Yelp data as JSON. To keep this no-code, we want real results from the HTTP GET to serve as the sample payload for the Parse JSON schema.

  1. Go to the Flow runs and open one up.
  2. Go to the HTTP step and open that up so you can see the Output Body (you’ll need to scroll a bit to get to that).
  3. Copy the whole Body section to your clipboard.
  4. Add an action called “Parse JSON”.  
  5. Under Content, you’ll add the dynamic value “Body” from the HTTP step.
  6. Click the words at the bottom, “Use sample payload to generate schema”. 

This will pop up a window where you can paste the Body from your previous Flow run, and when you click OK, Flow will translate that to the actual schema.
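For reference, here’s a heavily trimmed, made-up example of the kind of body the Yelp search returns; the real payload carries many more fields per business:

```python
# A trimmed, hypothetical example of the JSON body the HTTP GET
# returns. Parsing it in Python is one json.loads call -- the Parse
# JSON action does the equivalent inside Flow.
import json

sample_body = """
{
  "businesses": [
    {
      "name": "Example Diner",
      "rating": 4.5,
      "price": "$$",
      "location": {"address1": "123 Main St", "city": "Seattle"},
      "coordinates": {"latitude": 47.6, "longitude": -122.3}
    }
  ],
  "total": 1
}
"""

data = json.loads(sample_body)
top_pick = data["businesses"][0]
print(top_pick["name"], top_pick["rating"])
```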

image image

Insight: I’m going to step outside of the “no-code” bubble for a moment, but don’t worry…we’ll come right back after I talk about some errors you will most likely get. The biggest one vexed me for a long while, until I figured out the right search terms to find John Liu’s article, A Thesis on the Parse JSON action in Microsoft Flow. When (not “if”) you get an error that the schema is invalid, take a look at this article. Even though I don’t code, I was able to understand his section under “Problem 4 – null value properties” enough to see that it was causing my error. Stick with me as we look (quickly) at why.

When Flow developed the JSON schema based on the sample payload you submitted (from the trial run of the HTTP GET), it was quite literal and added everything. In the sample below, you can see that it brought in the longitude Yelp returns and labeled its type as “number”. Meaning, Flow will expect a number every time! If it gets something else, or even a blank (null), Flow freaks out and gives you the invalid schema error. Imagine how bumpy things could get on a field like “Address 3”, where there is rarely data. The fix, according to John, was simply to delete the line that identifies the “type”. This way, Flow expects something, but isn’t totally heartbroken if it doesn’t see it.
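To make the principle concrete, here’s a toy sketch (not Flow’s actual validator) of why a property declared with "type": "number" chokes on null, while the same property with the type line deleted sails through:

```python
# A toy illustration of John Liu's fix: a field declared as
# {"type": "number"} rejects null, while a field with no "type"
# accepts anything. Flow's real validation is more involved; this
# only shows the principle.

def validates(value, field_schema):
    """Return True if value satisfies the (simplified) field schema."""
    expected = field_schema.get("type")
    if expected is None:          # no "type" line: accept anything, even null
        return True
    if expected == "number":
        return isinstance(value, (int, float)) and not isinstance(value, bool)
    if expected == "string":
        return isinstance(value, str)
    return False

strict = {"type": "number"}   # what Flow generated from the sample payload
relaxed = {}                  # the same property with the "type" line deleted

assert validates(-122.3, strict)     # a real longitude is fine
assert not validates(None, strict)   # a null longitude breaks the strict schema
assert validates(None, relaxed)      # ...but passes once "type" is removed
```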


image image

Insight: The other error happens when your search yields no results and returns an empty array. More on that later.
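One way to handle that case, sketched in Python: check the array length before looping, which is what a Flow Condition on the “businesses” array would do (the reply wording here is just an example):

```python
# A sketch of guarding against the empty-array case before looping
# over results. In Flow this would be a Condition on the length of
# the "businesses" array; here it's plain Python.

def pick_reply(data: dict) -> str:
    businesses = data.get("businesses") or []
    if not businesses:
        return "I couldn't find anything nearby. Try another location?"
    names = ", ".join(b["name"] for b in businesses)
    return f"Here's what I found: {names}"

print(pick_reply({"businesses": []}))
print(pick_reply({"businesses": [{"name": "Example Diner"}]}))
```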

Create Adaptive Cards in AtBot

Now that we’ve parsed the information returned from our Yelp HTTP GET using Parse JSON (still with no code), we need to present the results to the user. We will display results using adaptive cards in AtBot. AtBot actually has a very nice video detailing this exact process, so I’ll cover some of the highlights here.

To create an adaptive card, begin at your AtBot Admin Portal:

  1. Go to AI Integrations and then Adaptive Card Templates.
  2. Click “Create Card” and choose whether you want a Form or a Display (we want a Display, since we’re only presenting data here).
  3. Open the Card Editor, which gives you an almost Power Apps-like interface for adding fields to your adaptive card template. Pay attention to the part of the video where they cover the IDs. If you want to access a field in Flow, you must populate its ID field with something.

image


If you’re feeling adventurous, you can check out samples at adaptivecards.io and copy the code into the JSON portion of the editor at the bottom. This will take some editing in the interface to customize, but it is a great way to learn about the possibilities. Once you click “Save & Close Editor”, you’ll see your card and a list of available fields on the right.
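For a sense of what lands in that JSON portion, here’s a minimal card of the kind you could paste in. The schema URL and element types come from adaptivecards.io; the IDs are hypothetical names chosen so Flow could reference the fields:

```python
# A minimal Adaptive Card body. "TextBlock" and the schema URL are
# standard Adaptive Card elements; the "id" values are my own
# placeholders for the fields Flow will populate.
import json

card = {
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "type": "AdaptiveCard",
    "version": "1.0",
    "body": [
        {"type": "TextBlock", "id": "name", "size": "Large", "text": "Restaurant name"},
        {"type": "TextBlock", "id": "rating", "text": "Rating"},
        {"type": "TextBlock", "id": "address", "text": "Address"},
    ],
}

print(json.dumps(card, indent=2))
```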


image


As I’ve said before, I may not be able to code, but I can follow instructions, so I created the array just as they did in the video. I used the “Generate Adaptive Card” action from AtBot, which triggered an automatic “Apply to Each” when I populated the first field:


image


Then, I added each card result to the array:


image


Finally, outside of the “Apply to Each”, I presented the adaptive card set using another AtBot action inside Flow. Note the “Reply Activity” keeping the conversation together here as well.
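In plain Python, the whole “Apply to Each” amounts to something like this (the input data and card field names are made up for illustration):

```python
# What the "Apply to Each" is doing, in plain terms: for every
# business in the parsed results, fill in a card and collect it,
# then present the whole set at once.

def make_card(biz: dict) -> dict:
    """Map one parsed Yelp business onto the card template's fields."""
    return {
        "name": biz["name"],
        "rating": f'{biz["rating"]} stars',
        "address": biz["location"]["address1"],
    }

businesses = [
    {"name": "Example Diner", "rating": 4.5, "location": {"address1": "123 Main St"}},
    {"name": "Sample Bistro", "rating": 4.0, "location": {"address1": "456 Oak Ave"}},
]

card_set = [make_card(b) for b in businesses]   # the "Apply to Each"
print(f"Presenting {len(card_set)} cards")
```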


image

Finished Flow: Conversations with Chatbots

Now, you should have a chatbot that answers natural language questions about food and hopefully presents relevant results:


image


I still have some work to do on it, because I haven’t created an “out” for when the query isn’t close to my utterances. LUIS scores each query, and I think there’s a way to say, “If the score is less than x, ask for clarification”. Currently, I could type “My teeth hurt” and the bot will ask where I’d like to eat.
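If the prediction score is exposed to your Flow, that “out” could look something like this sketch. The response shape follows LUIS’s topScoringIntent output; the 0.5 cutoff is an arbitrary choice:

```python
# A sketch of thresholding the LUIS confidence score. Low-confidence
# queries get a clarification prompt instead of triggering the Flow.

CONFIDENCE_THRESHOLD = 0.5   # arbitrary; tune against real queries

def handle(prediction: dict) -> str:
    top = prediction["topScoringIntent"]
    if top["score"] < CONFIDENCE_THRESHOLD:
        return "Sorry, I didn't catch that. Are you asking where to eat?"
    return f'Triggering intent: {top["intent"]}'

# "My teeth hurt" should score low and get a clarification instead
print(handle({"topScoringIntent": {"intent": "Where to Eat", "score": 0.12}}))
print(handle({"topScoringIntent": {"intent": "Where to Eat", "score": 0.93}}))
```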

I’d also like to add some time intelligence to narrow the search by meal (e.g. breakfast, lunch, and dinner), then maybe ask if the user was in the mood for something specific. All of this is possible now that I’ve got some of the key basics down and can begin to really Try, Fail, Learn, and Repeat!
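As a first pass at that time intelligence, the meal guess could be as simple as this (the hour boundaries are my own guesses):

```python
# Guess the meal from the hour of the request. The cutoffs here are
# placeholders; adjust them to taste.
from datetime import datetime

def meal_for(hour: int) -> str:
    if 5 <= hour < 11:
        return "breakfast"
    if 11 <= hour < 16:
        return "lunch"
    return "dinner"

print(meal_for(datetime.now().hour))
```

Since Yelp’s search accepts a term parameter, the guess could feed straight into the existing query.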