Azure Communication Services Blog

AI-Powered Chat with Azure Communication Services and Azure OpenAI

seankeegan
Microsoft
Dec 20, 2024

Many applications offer chat with automated capabilities but lack the depth to fully understand and address user needs. What if a chat app could not only connect people but also improve conversations with AI insights? Imagine detecting customer sentiment, bringing in experts as needed, and supporting global customers with real-time language translation. These aren’t hypothetical AI features, but ways you can enhance your chat apps using Azure Communication Services and Azure OpenAI today.

In this blog post, we guide you through a quickstart, available on GitHub, that you can clone and try on your own. We highlight key features and functions, making it easy to follow along. Learn how to upgrade basic chat functionality using AI to analyze user sentiment, summarize conversations, and translate messages in real time.

Natural Language Processing for Chat Messages

First, let’s go through the key features of this project.

  • Chat Management: The Azure Communication Services Chat SDK enables you to manage chat threads and messages, including sending messages and adding and removing participants.
  • AI Integration: Use Azure OpenAI GPT models to perform:
    • Sentiment Analysis: Determine if user chat messages are positive, negative, or neutral.
    • Summarization: Get a summary of chat threads to understand the key points of a conversation.
    • Translation: Translate chat messages into different languages.
  • RESTful endpoints: Easily integrate these AI capabilities and chat management through RESTful endpoints.
  • Event Handling (optional): Use Azure Event Grid to handle chat message events and trigger the AI processing.

The starter code for the quickstart is designed to get you up and running quickly. After entering your Azure Communication Services and OpenAI credentials in the config file and running a few commands in your terminal, you can observe the features listed above in action. There are two main components to this example.

  • The first is the ChatClient, which captures and sends messages via a basic chat application built on Azure Communication Services.
  • The second component, OpenAIClient, enhances your chat application by transmitting messages to Azure OpenAI along with instructions for the desired types of AI analysis.
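To give a sense of the chat side, here is a minimal sketch of the kind of Azure Communication Services Chat SDK calls a ChatClient component makes. The endpoint, access token, user ID, topic, and message text are placeholder assumptions, not the quickstart's actual values.

```typescript
import { ChatClient } from "@azure/communication-chat";
import { AzureCommunicationTokenCredential } from "@azure/communication-common";

// Placeholder values: supply your own resource endpoint, user access token,
// and Communication Services user ID.
const endpoint = "https://<your-acs-resource>.communication.azure.com";
const userAccessToken = "<user-access-token>";
const userId = "<acs-user-id>";

async function sendSampleMessage(): Promise<void> {
  const chatClient = new ChatClient(
    endpoint,
    new AzureCommunicationTokenCredential(userAccessToken)
  );

  // Create a chat thread with one participant, then send a message to it.
  const { chatThread } = await chatClient.createChatThread(
    { topic: "Support conversation" },
    { participants: [{ id: { communicationUserId: userId }, displayName: "Bob" }] }
  );
  const threadClient = chatClient.getChatThreadClient(chatThread!.id!);
  await threadClient.sendMessage({ content: "Hi, I need help with my order." });
}
```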

AI Analysis with OpenAIClient

Azure OpenAI can perform a multitude of AI analyses, but this quickstart focuses on summarization, sentiment analysis, and translation. To achieve this, we created three distinct prompts, one for each type of AI analysis we want to perform on our chat messages. These system prompts serve as the instructions for how Azure OpenAI should process the user messages.

To summarize a message, we hard-coded a system prompt that says, “Act like you are an agent specialized in generating summary of a chat conversation, you will be provided with a JSON list of messages of a conversation, generate a summary for the conversation based on the content message.” Like the best LLM prompts, it’s clear, specific, and provides context for the inputs it will get. The system prompts for translating and sentiment analysis follow a similar pattern.
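Only the summarization prompt above is quoted from the quickstart; the sentiment and translation prompts in the sketch below are illustrative stand-ins written in the same style, so treat them as assumptions rather than the repository's exact wording.

```typescript
// System prompts for the three analyses. The summarization prompt is the one
// quoted above; the other two are illustrative examples in the same style.
const summarizationPrompt =
  "Act like you are an agent specialized in generating summary of a chat conversation, " +
  "you will be provided with a JSON list of messages of a conversation, " +
  "generate a summary for the conversation based on the content message.";

const sentimentPrompt =
  "Act like you are an agent specialized in sentiment analysis, you will be provided " +
  "with a chat message, reply only with 'positive', 'negative', or 'neutral'.";

const translationPrompt =
  "Act like you are an agent specialized in translation, you will be provided " +
  "with a chat message, translate it into the requested target language.";
```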

The quickstart provides the basic architecture that enables you to take the chat content and pass it to Azure OpenAI for analysis.

 

A printout from the console of AI-generated sentiment analysis, translations, and summarization.

The Core Function: getChatCompletions

The getChatCompletions function is a pivotal part of the AI chat sample project. It processes user messages from a chat application, sends them to the OpenAI service for analysis, and returns the AI-generated responses.

Screenshot of the full code for the getChatCompletions function

Here’s a detailed breakdown of how it works:

Parameters

The getChatCompletions function takes in two required parameters:

  • systemPrompt: A string that provides instructions or context to the AI model. This helps guide OpenAI to generate appropriate and relevant responses.
  • userPrompt: A string that contains the actual message from the user. This is what the AI model analyzes and responds to.

Deployment Name: The getChatCompletions function starts by retrieving the deployment name for the OpenAI model from the environment variables.

Message Preparation: The function formats and prepares messages to send to OpenAI. This includes the system prompt with instructions for the AI model and user prompts that contain the actual chat messages.

Sending to OpenAI: The function sends these prepared messages to the OpenAI service using the openAiClient’s getChatCompletions method. This method interacts with the OpenAI model to generate a response based on the provided prompts.

Processing the Response: The function receives the response from OpenAI, extracts the AI-generated content, logs it, and returns it for further use.
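Putting those steps together, here is a minimal sketch of what getChatCompletions might look like with the @azure/openai client. The environment variable names are assumptions; refer to the repository for the actual implementation.

```typescript
import { OpenAIClient, AzureKeyCredential } from "@azure/openai";

// Assumed environment variable names; the quickstart's config may differ.
const openAiClient = new OpenAIClient(
  process.env["AZURE_OPENAI_ENDPOINT"] ?? "",
  new AzureKeyCredential(process.env["AZURE_OPENAI_KEY"] ?? "")
);

async function getChatCompletions(systemPrompt: string, userPrompt: string): Promise<string> {
  // Deployment name: which Azure OpenAI model deployment to call.
  const deploymentName = process.env["AZURE_OPENAI_DEPLOYMENT_NAME"] ?? "";

  // Message preparation: the system prompt carries the analysis instructions,
  // the user prompt carries the chat content to analyze.
  const messages = [
    { role: "system" as const, content: systemPrompt },
    { role: "user" as const, content: userPrompt },
  ];

  // Sending to OpenAI: ask the deployed model for a chat completion.
  const response = await openAiClient.getChatCompletions(deploymentName, messages);

  // Processing the response: extract the AI-generated content, log it, and return it.
  const content = response.choices[0]?.message?.content ?? "";
  console.log(content);
  return content;
}
```

For example, calling getChatCompletions(summarizationPrompt, JSON.stringify(chatMessages)) would return a summary of the conversation, while swapping in the sentiment or translation prompt changes the type of analysis without touching the function itself.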

Explore and Customize the Quickstart

The goal of the quickstart is to demonstrate how to connect a chat application and Azure OpenAI, then expand on the capabilities. To run this project locally, make sure you meet the prerequisites and follow the instructions in the GitHub repository. The system prompts and user messages are provided as samples for you to experiment with. The sample chat interaction is quite pleasant. Feel free to play around with the system prompts and change the sample messages between fictional Bob and Alice in client.ts to something more hostile and see how the analysis changes. Below is an example of changing the sample messages and running the project again.

A sample console output after changing the tone of the AI-analyzed messages
Real-time messages

For your chat application, you should analyze messages in real-time. This demo is designed to simulate that workflow for ease of setup, with messages sent through your local demo server. However, the GitHub repository for this quickstart project provides instructions for implementing this in your actual application. To analyze real-time messages, you can use Azure Event Grid to capture any messages sent to your Azure Communication Resource along with the necessary chat data. From there, you trigger the function that calls Azure OpenAI with the appropriate context and system prompts for the desired analysis. More information about setting up this workflow is available with "optional" tags in the quickstart's README on GitHub.
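As one illustration of that workflow, the sketch below shows a hypothetical Express webhook that receives Azure Event Grid chat-message events and forwards the message text to the getChatCompletions function and sentiment prompt from the earlier sketches. The route, port, and server setup are assumptions; the quickstart's README describes the supported setup.

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Hypothetical webhook endpoint that Azure Event Grid calls for chat events.
app.post("/api/chat-events", async (req, res) => {
  for (const event of req.body) {
    // Event Grid sends a one-time validation event when the subscription is created.
    if (event.eventType === "Microsoft.EventGrid.SubscriptionValidationEvent") {
      res.json({ validationResponse: event.data.validationCode });
      return;
    }

    // A chat message was sent to the Azure Communication Services resource.
    if (event.eventType === "Microsoft.Communication.ChatMessageReceived") {
      // Pair the message with the desired system prompt (sentiment, summary, or translation).
      await getChatCompletions(sentimentPrompt, event.data.messageBody);
    }
  }
  res.sendStatus(200);
});

app.listen(3000);
```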

Conclusion

Integrating Azure Communication Services with Azure OpenAI enables you to enhance your chat applications with AI analysis and insights. This guide helps you set up a demo that shows sentiment analysis, translation, and summarization, improving user interactions and engagement.

To dive deeper into the code, check out the Natural Language Processing of Chat Messages repository, and build your own AI-powered chat application today!

 

Updated Jan 02, 2025
Version 2.0