Azure Data & AI MVP Challenge - Part 3


It has been two weeks since I started the Azure Data & AI MVP Challenge. After exploring machine learning and image analysis, this past week was more about text analysis, speech, natural language understanding and building a bot for online help. Again, here is a summary of what I learned over the past week.

The first part is available here: Part 1
The second part is available here: Part 2
The third part: you are here
The fourth part is available here: Part 4
The fifth part is available here: Part 5

Text analysis

First of all, what is text analysis? It is a process in which an artificial intelligence algorithm evaluates the attributes of a text to extract information from it. Modern text analysis is very sophisticated: it works from the semantics of words rather than a word-for-word interpretation. For that reason, it is generally better to use a service like Text Analytics on Azure than to program it ourselves, especially since it is only a matter of creating the resource on Azure and then calling the exposed endpoint.

Thanks to this kind of analysis, we can determine the language of a text, but also do more advanced things such as detecting the sentiment of a text. This kind of analysis could, for example, determine whether a review on a transactional site is positive or negative, and even allow us to react more quickly. We can also extract the key phrases, the points of interest, from a document.
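As a minimal sketch of what calling the service looks like, here is the sentiment feature over the v3.0 REST endpoint, using only the standard library. The endpoint and key are placeholders for your own resource's values, and the sample response below is only shaped like the service's output for illustration.

```python
import json
import urllib.request

# Placeholders: replace with your own Text Analytics resource values.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-key>"

def build_sentiment_request(texts, language="en"):
    """Build the JSON body the v3.0 sentiment API expects."""
    return {
        "documents": [
            {"id": str(i), "language": language, "text": t}
            for i, t in enumerate(texts, start=1)
        ]
    }

def overall_sentiment(response):
    """Map each returned document id to its sentiment label."""
    return {d["id"]: d["sentiment"] for d in response["documents"]}

def analyze_sentiment(texts):
    """Send the request to Azure (requires a real endpoint and key)."""
    body = json.dumps(build_sentiment_request(texts)).encode("utf-8")
    req = urllib.request.Request(
        f"{ENDPOINT}/text/analytics/v3.0/sentiment",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": KEY,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Local demonstration with a response shaped like the service's output:
sample = {"documents": [{"id": "1", "sentiment": "positive"}]}
print(overall_sentiment(sample))  # {'1': 'positive'}
```

The same request/response pattern applies to the language-detection and key-phrase endpoints; only the path and the fields in the answer change.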

Speech recognition and synthesis

Whether it's with our smart speakers or our mobile phones, we use this feature every day. Speech recognition is the process of interpreting audio (our voice, for example) and converting it into text that can then be processed. Speech synthesis is the reverse of this process: it takes text and converts it to audio, with the text to say and the choice of voice as inputs.

Like everything I am currently exploring on AI with Azure, a service that is already trained for this is in place. With the Speech resource (or Cognitive Services, if we use several features), we can access the Speech-to-Text and Text-to-Speech APIs through an endpoint to get the job done.
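To give an idea of the synthesis side, here is a sketch of calling the Text-to-Speech REST endpoint with the standard library. The region, key, and voice name are placeholders to replace with your own values; the input is an SSML document that names the voice and wraps the text to say.

```python
import urllib.request

# Placeholders: replace with your own Speech resource values.
REGION = "<your-region>"
KEY = "<your-key>"

def build_ssml(text, voice="en-US-JennyNeural", lang="en-US"):
    """SSML body for the endpoint: the text to say plus the voice to use."""
    return (
        f"<speak version='1.0' xml:lang='{lang}'>"
        f"<voice name='{voice}'>{text}</voice></speak>"
    )

def synthesize(text):
    """POST the SSML and get audio bytes back (needs a real key)."""
    req = urllib.request.Request(
        f"https://{REGION}.tts.speech.microsoft.com/cognitiveservices/v1",
        data=build_ssml(text).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/ssml+xml",
            "X-Microsoft-OutputFormat": "audio-16khz-32kbitrate-mono-mp3",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

print(build_ssml("Hello"))
```

The recognition direction works the same way in reverse: we POST audio to the Speech-to-Text endpoint and get a JSON transcript back.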

Text or speech translation

Again, something we use frequently in our daily life: translation. We are a long way from what it used to be, now that semantics are considered in translations. One thing I learned about the Translator Text service on Azure, although it doesn't surprise me, is that over 60 languages are supported for text-to-text translation. You can also add filters to mask or remove vulgar words, and add tags on content that you don't want translated, like the name of a company.
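A minimal sketch of the Translator v3.0 REST call, again with only the standard library: the key and region are placeholders, and the `profanityAction` query parameter is how the vulgar-word filtering mentioned above is requested.

```python
import json
import urllib.parse
import urllib.request

# Placeholders: replace with your own Translator resource values.
KEY = "<your-key>"
REGION = "<your-region>"
BASE = "https://api.cognitive.microsofttranslator.com/translate"

def build_url(to_lang, profanity_action=None):
    """Query string for the v3.0 translate call."""
    params = {"api-version": "3.0", "to": to_lang}
    if profanity_action:
        # e.g. "Marked" or "Deleted" to filter vulgar words
        params["profanityAction"] = profanity_action
    return BASE + "?" + urllib.parse.urlencode(params)

def extract_translations(response):
    """Pull the translated strings out of the service's response."""
    return [t["text"] for item in response for t in item["translations"]]

def translate(texts, to_lang="fr"):
    """Send the texts to Azure for translation (needs a real key)."""
    body = json.dumps([{"text": t} for t in texts]).encode("utf-8")
    req = urllib.request.Request(
        build_url(to_lang),
        data=body,
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Ocp-Apim-Subscription-Region": REGION,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return extract_translations(json.load(resp))

# Local demonstration with a response shaped like the service's output:
sample = [{"translations": [{"text": "Bonjour", "to": "fr"}]}]
print(extract_translations(sample))  # ['Bonjour']
```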

Natural language understanding

Language Understanding is the art of taking what a user says or writes in their natural language and transforming it into action. Three concepts are essential here. The first is the utterance, i.e. the sentence to be interpreted, for example the phrase "turn on the light". Then we have the intent, which represents what the user wants to do within the utterance; in this example, an "Ignite" intent would perform the task of turning something on. Finally, we have the entities, which are the items the utterance refers to; still in the same example, it would be "light".

Once you have these concepts, setting it all up is pretty straightforward using LUIS (Language Understanding Intelligent Service). We define a list of utterances, intents and entities, something that can also be started from a predefined list for certain domains. We call this the authoring model. Once it is trained (a simple click of a button), we can publish it and use its prediction endpoint in our applications.
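On the application side, the published model returns a JSON prediction containing the top intent, per-intent scores, and the extracted entities. Here is a small sketch of consuming that result; the "Ignite" intent, the entity name, and the scores come from the running example above and are illustrative only.

```python
def top_intent(prediction_response, threshold=0.5):
    """Return the top intent only if its score clears a confidence
    threshold, otherwise None (so the app can ask the user to rephrase)."""
    pred = prediction_response["prediction"]
    intent = pred["topIntent"]
    score = pred["intents"][intent]["score"]
    return intent if score >= threshold else None

# A prediction shaped like the service's response for "turn on the light":
sample = {
    "query": "turn on the light",
    "prediction": {
        "topIntent": "Ignite",
        "intents": {"Ignite": {"score": 0.97}},
        "entities": {"device": ["light"]},
    },
}
print(top_intent(sample))  # Ignite
print(sample["prediction"]["entities"])  # {'device': ['light']}
```

The confidence threshold is a design choice: rather than blindly executing the top intent, the app can fall back to a clarifying question when the model is unsure.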

Construction of a bot

The QnA Maker cognitive service on Azure is the basis for building a bot that can automatically answer questions, for an automated support service for example. We build a bank of question-answer pairs, from which we train the knowledge base. Training applies natural language processing to the questions and answers, ensuring that the system can respond even when a question is not phrased exactly as it was entered.

Once we have this, we just have to build a bot that uses the knowledge base. It's possible to use an SDK and jump into some code to do it, but I must admit I loved simply pushing a button in QnA Maker, which automatically created a bot for me. All I had to do next was use it in my app.
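For the SDK-and-code route, querying a knowledge base amounts to sending the user's question and picking the best-scoring answer it returns. A minimal sketch, where the question, answers, and scores are made up for illustration and a low-confidence fallback keeps the bot from answering badly:

```python
def best_answer(response, threshold=30, fallback="Sorry, I don't know."):
    """Pick the highest-scoring answer from a knowledge-base query,
    falling back when confidence is low. Scores range from 0 to 100."""
    answers = response.get("answers", [])
    if answers and answers[0]["score"] >= threshold:
        return answers[0]["answer"]
    return fallback

# A response shaped like a knowledge-base query result:
sample = {
    "answers": [
        {
            "answer": "Hold the power button for 10 seconds.",
            "score": 82,
            "questions": ["How do I restart the device?"],
        }
    ]
}
print(best_answer(sample))  # Hold the power button for 10 seconds.
```

Thanks to the natural language processing applied at training time, "my device is frozen, how do I reboot it?" can still match the stored question even though the wording differs.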

Microsoft Azure AI-900 certification

Once all these lessons were completed, I found that I had covered everything needed to prepare for the Azure AI Fundamentals certification exam. So, having spent the last two weeks learning all this, I didn't wait: I scheduled my exam and earned my certification. If you follow these lessons steadily, I am sure you will also be able to take the exam and get your certification; the lessons prepare us really well.


I have added a few lessons to my general lesson collection, as well as a new collection for everything related to text analysis, which made up most of my last week. Don't hesitate to let me know in the comments if you are working towards certification, or if you have any questions!

Collection on Data & AI: See the collection
Collection on Machine Learning: See the collection
Collection on image analysis: See the collection
Collection on text analysis: See the collection



Author: Bruno

By day, I am a developer, team leader and co-host of the Bracket Show. By evening, I am a husband and the father of two wonderful children. The time I have left after all of this, I spend trying to stay active, playing and writing video game reviews, and preparing material for the Bracket Show recordings and for this blog. Through it all, I am also passionate about music and microbreweries.