As you might have seen, the Linguistic Analysis API of Azure Cognitive Services is available as part of the Language category. It allows you to perform POS tagging, which is essentially a way to identify each word and its grammatical role within a piece of text.
I find POS tagging particularly useful whenever you want to capture the essence of a phrase. I've used it a few times to simplify user search queries and to build dynamic queries programmatically. Whatever use you make of POS tagging, Microsoft's current implementation has a small shortcoming: it never answers with tokens & tags grouped together. To give you a concrete example, here is a screenshot of all possible results (at the time of writing):
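Regrouping them yourself is straightforward. Here is a minimal sketch, assuming (this is a simplification, not the exact API payload) that the tokens and their tags come back as two parallel lists:

```python
# Sketch: pair up tokens and POS tags that the service returns separately.
# The two-parallel-lists shape below is an assumption for illustration.

def regroup(tokens, tags):
    """Pair each token with its POS tag, since the service returns them apart."""
    if len(tokens) != len(tags):
        raise ValueError("token/tag count mismatch")
    return list(zip(tokens, tags))

tokens = ["The", "quick", "brown", "fox"]
tags = ["DT", "JJ", "JJ", "NN"]
print(regroup(tokens, tags))
# [('The', 'DT'), ('quick', 'JJ'), ('brown', 'JJ'), ('fox', 'NN')]
```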
It is always dangerous to compare software and services from different vendors, as benchmarking is rarely exhaustive and can sometimes be subject to interpretation and misunderstanding. On top of that, hardcore fans of a vendor might lose their common sense and objectivity, as the discussion can quickly turn emotional.
However, I recently had the opportunity to attend a demo of Watson by a seasoned IBM consultant, which led me to try out and explore Watson a little further. I've been working with Azure Cognitive Services for more than a year, especially using LUIS and the Bot Framework to build chatbots. On top of my Azure experience, I have some background in AI & NLP in general, as I've been involved in multiple initiatives (for instance, a package I wrote on DBpedia Spotlight) over the past 3 years, using neither IBM nor Microsoft services. Continue reading
Today, bots & more particularly chatbots are on everyone's lips! Why this buzz? The answer is very easy: AI has become mainstream thanks to vendors such as Microsoft, IBM and others. Chatbots make use of computational linguistics behind the scenes, which is not a new concept: Alan Turing was already working on it in the nineteen-fifties! So, what has changed in the meantime, and why do we suddenly reach a new paradigm? Resources & data are the answer: today, the amount of available information & hardware capabilities have increased dramatically. Continue reading
I recently realized, thanks to a colleague @MMeuree, that the ID_TOKEN that's supposed to contain the group membership, as shown below:
does not list more than 4 groups (here I grabbed the token using another flow). So, if the user belongs to more than 4 groups, you will see hasgroups: true in the token instead of the actual groups. This behavior is by design, no matter what you specified in the app manifest for the groupMembershipClaims attribute. So, the alternative is simply to query the Graph API.
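The fallback can be sketched as follows: check the token claims for the hasgroups overage marker, and when it is present, call the Microsoft Graph memberOf endpoint with an access token you acquired separately (the token acquisition itself is out of scope here and assumed done):

```python
# Sketch: when the ID token only carries "hasgroups": true, fall back to
# Microsoft Graph to enumerate the user's groups. Acquiring the access token
# (e.g. via an OAuth2 flow with the right Graph permissions) is assumed done.
import json
import urllib.request

GRAPH_MEMBEROF = "https://graph.microsoft.com/v1.0/me/memberOf"

def needs_graph_fallback(claims):
    """True when the token signals an overage ("hasgroups") instead of listing groups."""
    return bool(claims.get("hasgroups")) or "groups" not in claims

def get_group_ids(access_token):
    """Return the ids of the groups the signed-in user belongs to, following paging."""
    groups, url = [], GRAPH_MEMBEROF
    while url:
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {access_token}"}
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        groups += [g["id"] for g in data.get("value", [])]
        url = data.get("@odata.nextLink")  # None when there are no more pages
    return groups
```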
In this blog post, I'm going to explain what I consider a creative way of exposing on-premises APIs. Let's envision the following scenario:
You have an on-premises API that is secured with Windows Authentication and for which you need to know the identity of the caller. This API is already consumed by various on-premises consumers, and you now also want to make it available to online consumers while benefiting from the throttling and caching capabilities of Azure API Management.
A traditional way of doing this would be to host your on-premises API in a DMZ and point API Management at that DMZ endpoint. Another way is to use VNETs and VPN techniques to control and establish connectivity. That said, while you would control connectivity, you would still need to control identity: in our scenario, it is a prerequisite that your backend API knows the identity of the user consuming it (via an app).
This year at Techorama, I'm going to speak about API Management in Azure. Although this topic is not new, I realized it's pretty unknown among the peers I talk to. I cover it in two different sessions; here are their agendas:
- What is APIMGMT?
- Let’s have a look at the management portals
- How to publish an API
- How to deal with policies
- Monetizing APIs & integrating with other systems
- Using network-related techniques to prevent unexpected access to backend APIs
- Controlling who’s accessing backend APIs (gateway, other?)
- Enabling different consumption routes
- Exposing on-premises APIs from Azure AD Proxy published on-premises applications
Enabling Bing Spell Check, one of the Azure Cognitive Services, at the LUIS level is a piece of cake; it involves the following steps:
- Getting a key
- Registering the key in LUIS
- Associating that key to your LUIS application
Once done, you also need to change a few things in code:
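On the calling side, the change boils down to two extra query-string parameters on the LUIS endpoint URL: spellCheck and bing-spell-check-subscription-key. A minimal sketch (region, app id and keys below are placeholders):

```python
# Sketch: build a LUIS v2 endpoint URL with Bing Spell Check enabled.
# The region, app id and keys are placeholders, not real values.
from urllib.parse import urlencode

def build_luis_url(region, app_id, luis_key, spell_key, query):
    """Return a LUIS v2 query URL with spell checking turned on."""
    params = {
        "subscription-key": luis_key,
        "q": query,
        "spellCheck": "true",                            # turn spell checking on
        "bing-spell-check-subscription-key": spell_key,  # the key you registered
    }
    return (f"https://{region}.api.cognitive.microsoft.com"
            f"/luis/v2.0/apps/{app_id}?{urlencode(params)}")

url = build_luis_url("westus", "<your-app-id>", "<luis-key>", "<spell-key>", "helo wrld")
```

With those parameters in place, the LUIS response includes an alteredQuery field containing the spell-corrected utterance.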