Full free online course on NLP with Azure Cognitive Services

Hi,

With the recent publication of my 6th episode, I have just closed the chapter on using NLP with Azure Cognitive Services. In this course, I explain step by step how to build a chatbot that deals with various tasks, each task being associated with one of the Cognitive Services.

Now that the NLP chapter is closing, here is a recap of what I have covered so far:

Episode 1 In this episode, I will draw the AI landscape of the Microsoft ecosystem. I want you to become a little more familiar with fundamental topics such as Machine Learning, Deep Learning and Natural Language Processing, which might sound a little confusing to many developers. Once the high-level concepts are covered, I'll introduce the Azure Cognitive Services and try to quickly answer the "what's in it for me?" question with real-world examples mapped to the various services. If you're a hardcore developer, you might be disappointed by this episode as I will not show code yet, but by the end of it, you should understand when to use what and how to manage customer expectations. For the how-to bits, I invite you to join me in Episode 2.

Episode 2 Episode 1 was a global introduction to AI & Cognitive Services. Episode 2 is much more hands-on and aims at building fundamental assets for the next episodes. You will learn how to get started with a minimal chatbot (that we'll reuse throughout the entire course) and the typical steps involved in creating & consuming a Cognitive Service. This is an intensive 15-minute session with 13 minutes of pure step-by-step demos.
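
To give you a feel for what "consuming a Cognitive Service" looks like before you watch the episode, here is a minimal Python sketch of the generic pattern: a REST call authenticated with the subscription key you copy from the Azure portal. The region, key and the language-detection operation used here are illustrative placeholders, not the episode's own demo code.

import requests

# Placeholders: copy the real key and region from the Azure portal.
SUBSCRIPTION_KEY = "<your-subscription-key>"
ENDPOINT = "https://westeurope.api.cognitive.microsoft.com"

def detect_language(text):
    # Every Cognitive Service follows the same pattern: a REST operation
    # secured by the subscription key in the Ocp-Apim-Subscription-Key header.
    response = requests.post(
        ENDPOINT + "/text/analytics/v2.0/languages",
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"documents": [{"id": "1", "text": text}]},
    )
    response.raise_for_status()
    return response.json()["documents"][0]["detectedLanguages"][0]["name"]

print(detect_language("Mon imprimante ne fonctionne plus"))  # expected: "French"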

Episode 3 Episode 1 was about analyzing what's available in the Microsoft ecosystem in terms of artificial intelligence. In Episode 2, we set up the foundations to get us started with a minimal bot & we saw the typical steps involved in creating & consuming a Cognitive Service.
In this demo-intensive episode, we'll first see what the challenges are when dealing with natural language and building chatbots. We'll then capitalize on what we've done already by adding a LUIS layer to our bot. I will not only explain how LUIS works but also give you concrete hands-on demos drawn from real-world experience.
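
As a taste of what that LUIS layer boils down to, here is a minimal Python sketch (not the course's actual code): the bot forwards the raw utterance to the LUIS prediction endpoint and branches on the top scoring intent. The app id, key, region and the ReportIncident intent name are placeholders.

import requests

LUIS_APP_ID = "<your-luis-app-id>"      # placeholder
LUIS_KEY = "<your-luis-endpoint-key>"   # placeholder
LUIS_URL = "https://westeurope.api.cognitive.microsoft.com/luis/v2.0/apps/" + LUIS_APP_ID

def predict(utterance):
    # Ask LUIS for the top scoring intent and the entities it extracted.
    response = requests.get(
        LUIS_URL,
        params={"subscription-key": LUIS_KEY, "q": utterance, "verbose": "true"},
    )
    response.raise_for_status()
    result = response.json()
    return result["topScoringIntent"], result["entities"]

intent, entities = predict("My laptop screen is broken, please open a ticket")
if intent["intent"] == "ReportIncident" and intent["score"] > 0.5:
    print("Routing to the incident dialog with entities:", entities)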

Episode 4 In the first 3 episodes, we built a minimal chatbot and created the LUIS app that fulfills the following purposes: handling casual chat with end users, responding to IT-related questions, and letting users view and report incidents, find documents and find experts who can help them on specific matters. Now that you are familiar with intents, entities, active learning and LUIS training, it is time to implement the actual actions.

In this episode, we see how to take advantage of QnA Maker to handle the casual chat and IT knowledge base functionalities our chatbot has to deal with. I will also highlight the strengths and current limitations of QnA Maker.
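
For readers who want a preview of that wiring, here is a hedged Python sketch of the call the bot can make once LUIS routes an utterance to the knowledge base: it asks the published QnA Maker endpoint for its best answer and falls back when the confidence score is too low. The host name, knowledge base id and endpoint key are placeholders from the KB's publish page, and the threshold is arbitrary.

import requests

QNA_HOST = "https://<your-qna-resource>.azurewebsites.net/qnamaker"  # placeholder
KB_ID = "<your-knowledge-base-id>"                                   # placeholder
ENDPOINT_KEY = "<your-endpoint-key>"                                 # placeholder

def ask_knowledge_base(question, threshold=50):
    response = requests.post(
        QNA_HOST + "/knowledgebases/" + KB_ID + "/generateAnswer",
        headers={"Authorization": "EndpointKey " + ENDPOINT_KEY},
        json={"question": question, "top": 1},
    )
    response.raise_for_status()
    best = response.json()["answers"][0]
    # Scores range from 0 to 100; below the threshold we let the bot fall back
    # to a default "I don't know" reply instead of returning a poor match.
    return best["answer"] if best["score"] >= threshold else None

print(ask_knowledge_base("How do I reset my VPN password?"))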

Episode 5 So far in this course, we have seen the high-level AI concepts and built a chatbot bound to a LUIS app. Our bot is already able to respond to IT-related questions and can tackle day-to-day conversations thanks to QnA Maker.

While QnA Maker is a great tool for textual question/answer scenarios, it cannot be used when dealing with documents or data stored within a database, and customers are not always willing to relocate their information into QnA Maker even when that is feasible.

In this episode, we'll see how to take advantage of another Cognitive Service, namely the Linguistic Analysis API, to perform natural search queries against external sources of information. I will show two use cases: querying a SharePoint document center and interacting with a SQL database, but the principles depicted here can be used to query any kind of information system.
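
To illustrate the idea, here is an assumption-laden Python sketch (not the episode's code): the part-of-speech analyzer of the Linguistic Analysis API lets the bot keep the nouns of a question and drop the filler words before building a query. The key and region are placeholders, the analyzer id is discovered at runtime, and the whitespace tokenization used to pair words with tags is a deliberate simplification.

import requests

KEY = "<your-subscription-key>"   # placeholder
BASE = "https://westus.api.cognitive.microsoft.com/linguistics/v1.0"
HEADERS = {"Ocp-Apim-Subscription-Key": KEY}

def pos_analyzer_id():
    # Discover the id of the English part-of-speech analyzer.
    analyzers = requests.get(BASE + "/analyzers", headers=HEADERS).json()
    return next(a["id"] for a in analyzers
                if a["kind"] == "POS_Tags" and "en" in a["languages"])

def extract_search_terms(question):
    response = requests.post(
        BASE + "/analyze",
        headers=HEADERS,
        json={"language": "en", "analyzerIds": [pos_analyzer_id()], "text": question},
    )
    response.raise_for_status()
    tags = response.json()[0]["result"][0]   # POS tags of the first sentence
    words = question.split()                 # naive tokenization, for illustration only
    # Keep the noun-like tokens (NN, NNS, NNP...) as search keywords.
    return [w for w, tag in zip(words, tags) if tag.startswith("NN")]

print(extract_search_terms("find the documents about the GDPR compliance policy"))

The keywords returned this way can then feed a SharePoint search query or a parameterized SQL statement, in the spirit of the two use cases the episode covers.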

Episode 6 So far in this course, we have seen the high-level AI concepts and built a chatbot bound to a LUIS app. We also saw how to take advantage of the Linguistic Analysis API to perform natural search queries against external data sources. In this episode, we will send documents to our chatbot, which will automatically tag and route them into a document management system thanks to Text Analytics, Entity Linking & the Language Understanding Intelligent Service (LUIS).
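
As a rough sketch of the tagging step (the endpoint, key and routing rule here are placeholders of mine, not the episode's code), the text extracted from an incoming document can be sent to the Text Analytics key phrase operation, and the returned phrases become the metadata used to classify and route it.

import requests

KEY = "<your-subscription-key>"   # placeholder
ENDPOINT = "https://westeurope.api.cognitive.microsoft.com/text/analytics/v2.0"

def key_phrases(document_text, language="en"):
    # Ask Text Analytics for the key phrases of a single document.
    response = requests.post(
        ENDPOINT + "/keyPhrases",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"documents": [{"id": "1", "language": language, "text": document_text}]},
    )
    response.raise_for_status()
    return response.json()["documents"][0]["keyPhrases"]

tags = key_phrases("Invoice for the Contoso subscription renewal, due 30 days after receipt.")
print("Tagging and routing the document with:", tags)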

I've already finished recording my 7th episode, which covers the Vision Services and mostly the Custom Vision Service; it will be released soon. I'm currently working on Custom Speech, so stay tuned!

Feel free to watch & share this course!

Happy AI!

About Stephane Eyskens

Office 365, Azure PaaS and SharePoint platform expert
