Create your own QA RAG Chatbot with LangChain.js + Azure OpenAI Service
Demo: Mpesa for Business Setup QA RAG Application

In this tutorial we are going to build a question-answering RAG chat web app. We use Node.js with HTML, CSS, and JavaScript, and we incorporate LangChain.js + Azure OpenAI + a MongoDB vector store (MongoDB Search Index). Get a quick look below.

Note: Documents and illustrations shared here are for demo purposes only; Microsoft and its products are not affiliated with Mpesa. The content demonstrated here should be used for educational purposes only. Additionally, all views shared here are solely mine.

What you will need:

- An active Azure subscription — get Azure for Students for free, or get started with Azure free for 12 months
- VS Code
- Basic knowledge of JavaScript (not a must)
- Access to Azure OpenAI (click here if you don't have access)
- A MongoDB account (you can also use the Azure Cosmos DB vector store)

Setting Up the Project

To build this project, fork this repository and clone it. GitHub repository link: https://github.com/tiprock-network/azure-qa-rag-mpesa. Follow the steps highlighted in the README.md to set up the project under "Setting Up the Node.js Application".

Create the Resources that you Need

You will need the Azure CLI or the Azure Developer CLI installed on your computer. Follow the steps indicated in the README.md to create the Azure resources under "Azure Resources Set Up with Azure CLI". You might want to log in to the Azure CLI with a device code instead. Instead of a plain az login, you can run:

az login --use-device-code

Or, if you prefer the Azure Developer CLI, execute this command instead:

azd auth login --use-device-code

Remember to update the .env file with the values you used to name the Azure OpenAI instance and model deployments, as well as the API keys you obtained while creating your resources.

Setting Up MongoDB

After accessing your MongoDB account, get the URI of your database and add it to the .env file, along with your database name and the vector store collection name you specified while creating your index for vector search.

Running the Project

To run this Node.js project, start it with the following command:

npm run dev

The Vector Store

The vector store used in this project is MongoDB: the word embeddings are stored in a MongoDB collection. From the embeddings model instance we created on Azure AI Foundry, we are able to create embeddings that can be stored in a vector store. The following code shows our embeddings model instance:

```javascript
//create new embedding model instance
const azOpenEmbedding = new AzureOpenAIEmbeddings({
    azureADTokenProvider,
    azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
    azureOpenAIApiEmbeddingsDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_EMBEDDING_NAME,
    azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
    azureOpenAIBasePath: "https://eastus2.api.cognitive.microsoft.com/openai/deployments"
});
```

The code in uploadDoc.js offers a simple way to create embeddings and store them in MongoDB. In this approach, the text from the documents is loaded using the PDFLoader from the LangChain community package and split into chunks before embedding.
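That loading-and-splitting step happens in a helper the post doesn't reproduce (returnSplittedContent, called in the next snippet). Here is a minimal sketch of how such a helper might look in LangChain.js; the file path and chunking parameters are illustrative assumptions, not the repo's actual values:

```javascript
import { PDFLoader } from "@langchain/community/document_loaders/fs/pdf";
import { RecursiveCharacterTextSplitter } from "@langchain/textsplitters";

// Hypothetical helper: load a PDF and split it into overlapping chunks
// suitable for embedding. Path and chunk sizes are placeholders.
const returnSplittedContent = async () => {
    const loader = new PDFLoader("./docs/mpesa_setup_guide.pdf");
    const rawDocs = await loader.load();

    const splitter = new RecursiveCharacterTextSplitter({
        chunkSize: 1000,   // characters per chunk
        chunkOverlap: 100, // overlap preserves context across chunk boundaries
    });
    return splitter.splitDocuments(rawDocs);
};
```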
The following code then demonstrates how the embeddings are stored in the vector store:

```javascript
// Call the function and handle the result with await
const storeToCosmosVectorStore = async () => {
    try {
        const documents = await returnSplittedContent()

        //create store instance
        const store = await MongoDBAtlasVectorSearch.fromDocuments(
            documents,
            azOpenEmbedding,
            {
                collection: vectorCollection,
                indexName: "myrag_index",
                textKey: "text",
                embeddingKey: "embedding",
            }
        )

        if(!store){
            console.log('Something wrong happened while creating store or getting store!')
            return false
        }

        console.log('Done creating/getting and uploading to store.')
        return true
    } catch (e) {
        console.log(`This error occurred: ${e}`)
        return false
    }
}
```

In this setup, question answering (QA) is achieved by integrating Azure OpenAI's GPT-4o with MongoDB Vector Search through LangChain.js. The system processes user queries via an LLM (large language model), which retrieves relevant information from a vectorized database, ensuring contextual and accurate responses. Azure OpenAI embeddings convert text into dense vector representations, enabling semantic search within MongoDB. The LangChain RunnableSequence structures the retrieval and response generation workflow, while the StringOutputParser ensures proper text formatting. The most relevant pieces are the AzureChatOpenAI instantiation, the MongoDB connection setup, and the API endpoint that handles QA queries using vector search and embeddings. The code snippets below explain the major parts of the code.

Azure AI Chat Completion Model

This is the model used in this implementation of RAG as the chat completion model. Below is a code snippet for it:

```javascript
const llm = new AzureChatOpenAI({
    azTokenProvider,
    azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
    azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
    azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION
})
```
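The post doesn't reproduce the repository's retrieval endpoint, so here is a minimal hedged sketch of how the pieces above are typically wired together in LangChain.js: a retriever over the MongoDB vector store feeding the AzureChatOpenAI model through a RunnableSequence. The prompt wording, variable names, and k value are illustrative assumptions, not the repo's actual code ('store' and 'llm' refer to the instances shown above):

```javascript
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { RunnableSequence, RunnablePassthrough } from "@langchain/core/runnables";

// Retrieve the 4 most similar chunks for each query (k is illustrative)
const retriever = store.asRetriever({ k: 4 });

const qaPrompt = ChatPromptTemplate.fromMessages([
    ["system", "Answer the question using only the context below.\n\nContext:\n{context}"],
    ["human", "{question}"],
]);

const qaChain = RunnableSequence.from([
    {
        // Fetch matching chunks and flatten them into one context string
        context: retriever.pipe((docs) => docs.map((d) => d.pageContent).join("\n\n")),
        question: new RunnablePassthrough(),
    },
    qaPrompt,
    llm,
    new StringOutputParser(),
]);

// Usage: const answer = await qaChain.invoke("How do I set up Mpesa for Business?");
```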
Using a Runnable Sequence to Give Chat Output

This shows how a runnable sequence can be used to produce a response in the particular output format of the output parser added to the chain:

```javascript
//output parser referenced by the chain below
const outPutParser = new StringOutputParser()

//Stream response
app.post(`${process.env.BASE_URL}/az-openai/runnable-sequence/stream/chat`, async (req, res) => {
    //check for human message
    const { chatMsg } = req.body
    if(!chatMsg) return res.status(400).json({
        message: 'Hey, you didn\'t send anything.'
    })

    //put the code in an error-handler
    try {
        //create a prompt template
        const prompt = ChatPromptTemplate.fromMessages([
            ["system", `You are a French-to-English translator that detects if a message isn't in French. If it's not, you respond, "This is not French." Otherwise, you translate it to English.`],
            ["human", `${chatMsg}`]
        ])

        //runnable chain
        const chain = RunnableSequence.from([prompt, llm, outPutParser])

        //chain result (the prompt has no input variables, so pass an empty object)
        let result_stream = await chain.stream({})

        //set response headers
        res.setHeader('Content-Type', 'application/json')
        res.setHeader('Transfer-Encoding', 'chunked')

        //create readable stream (Readable comes from Node's 'stream' module)
        const readable = Readable.from(result_stream)

        res.status(200).write(`{"message": "Successful translation.", "response": "`)

        readable.on('data', (chunk) => {
            // Convert chunk to string and write it
            res.write(`${chunk}`)
        })

        readable.on('end', () => {
            // Close the JSON response properly
            res.write('" }')
            res.end()
        })

        readable.on('error', (err) => {
            console.error("Stream error:", err)
            res.status(500).json({ message: "Translation failed.", error: err.message })
        })
    } catch(e) {
        //deliver a 500 error response
        return res.status(500).json({
            message: 'Failed to send request.',
            error: e
        })
    }
})
```

To run the front end, go to your BASE_URL with the given port. This lets you run the chatbot above and achieve similar results. The chatbot is basically HTML + CSS + JS, where JavaScript mainly uses the Fetch API to get a response (a hedged sketch of that client-side call follows at the end of this post). Thanks for reading. I hope you play around with the code and learn some new things.

Additional Reads

- Introduction to LangChain.js
- Create an FAQ Bot on Azure
- Build a basic chat app in Python using Azure AI Foundry SDK
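As promised above, here is a minimal hedged sketch of the kind of client-side Fetch API call the front end makes. The endpoint path mirrors the Express route shown earlier; the function name and the way the result is consumed are illustrative assumptions:

```javascript
// Hypothetical front-end handler: send the user's message and
// read the chunked response as it streams in.
async function sendChatMessage(chatMsg) {
    const response = await fetch('/az-openai/runnable-sequence/stream/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ chatMsg }),
    });

    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let fullText = '';

    // Accumulate chunks as the server streams them
    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        fullText += decoder.decode(value, { stream: true });
    }
    return fullText; // the raw JSON envelope written by the server
}
```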
Enhancing Infrastructure as Code Generation with GitHub Copilot for Azure

Discover the latest update to GitHub Copilot for Azure, designed to streamline Infrastructure as Code (IaC) generation using Bicep or Terraform. With a new update panel, developers can easily modify project details, hosting services, target services, bindings, and environment variables, all within an intuitive UI. This enhancement eliminates the need for chat-based modifications, improving efficiency and reducing errors. Save time, automate infrastructure deployment, and experience seamless cloud configuration. Try the new GitHub Copilot for Azure update today and optimize your Azure development workflow effortlessly!
Azure ADF ServiceNow connector can't retrieve table columns but same login can do in REST API

I have tried to create a pipeline using a copy activity to extract data from a table on our ServiceNow dev platform. I first used the latest version of the ServiceNow connector; however, it didn't work. When I tried to import the schema, it showed the error message below:

Failed to load. The API request to ServiceNow failed. Request Url: https://airtrunkautemp.service-now.com/api/now/table/sys_dictionary?sysparm_query=name%3dfacilities_request^ORname%3dsm_order^ORname%3dtask, Status Code: Forbidden, Error message: {"error":{"message":"Insufficient rights to query records","detail":"Field(s) present in the query do not have permission to be read"},"status":"failure"} Activity ID: 5a99e871-893d-4426-809e-0b22654248f8

Then I tried to use the legacy version of the ServiceNow connector and extract the full table data using a query. After I executed the pipeline, only one column, sys_id, was returned. I contacted ServiceNow support about the issue; they checked and got back to me that the login has no access issue. Then I wrote Python to use the REST API to retrieve data from the same table, and it worked: I could extract all table columns without the insufficient-rights issue. Has anyone had this experience before? How did you solve it?
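One observation worth noting: the failing request quoted in the error targets sys_dictionary (ServiceNow's schema table), not the data table itself, which may explain why a direct Table API read succeeds while the connector's schema import fails. For reference, here is a hedged Node.js sketch of the kind of direct Table API call the asker describes (they used Python; this is the equivalent with Node 18+ global fetch, and the table name and credentials are placeholders):

```javascript
// Hedged sketch: read rows straight from the ServiceNow Table API.
// Instance URL is taken from the error message above; everything else
// is a placeholder assumption.
const instance = 'https://airtrunkautemp.service-now.com';

async function fetchTableRows(table) {
    const auth = Buffer.from(`${process.env.SN_USER}:${process.env.SN_PASSWORD}`).toString('base64');
    const res = await fetch(`${instance}/api/now/table/${table}?sysparm_limit=10`, {
        headers: {
            'Authorization': `Basic ${auth}`,
            'Accept': 'application/json',
        },
    });
    if (!res.ok) throw new Error(`ServiceNow returned ${res.status}`);
    const { result } = await res.json();
    return result; // rows include all columns the account can read
}

// fetchTableRows('facilities_request').then(rows => console.log(rows.length));
```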
Partner Case Study | Esri

With Esri's cloud-forward geospatial platform, organizations can view, edit, manage, and analyze data to make more informed decisions. The agricultural industry depends on data to feed and clothe populations from roughly 38% of the global land surface. In fact, according to the United Nations Environment Programme, by the year 2050 there will be a need to feed 9 to 10 billion people as the population continues to grow. That, coupled with environmental factors like climate change, presents a challenge for growers. The data the world relies on should be organized and housed in a way that is easily accessible even to the smallest growers. Esri, a global market leader in geographic information system (GIS) software and location intelligence (LI), transforms decision-making with spatial insights and mapping through its ArcGIS technology. Esri supports its customers in the agricultural industry to ensure every farmer has the data they need, and developed a solution powered by Microsoft Azure to meet this growing need for accessible data. Continue reading here.
Use AI for Free with GitHub Models and TypeScript! 💸💸💸

Learn how to use AI for free with GitHub Models! Test models like GPT-4o without paying for APIs or setting up infrastructure. This step-by-step guide shows how to integrate GitHub Models with TypeScript in the Microblog AI Remix project. Start exploring AI for free today!
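For a taste of what the guide covers, here is a minimal hedged sketch of calling a GitHub Models-hosted model through its OpenAI-compatible inference endpoint; a GitHub personal access token stands in for a paid API key, and the model name and prompt are illustrative:

```javascript
import OpenAI from "openai";

// GitHub Models exposes an OpenAI-compatible endpoint; authenticate
// with a GitHub personal access token instead of an API key.
const client = new OpenAI({
    baseURL: "https://models.inference.ai.azure.com",
    apiKey: process.env.GITHUB_TOKEN,
});

const completion = await client.chat.completions.create({
    model: "gpt-4o", // illustrative model name
    messages: [{ role: "user", content: "Summarize what RAG is in one sentence." }],
});

console.log(completion.choices[0].message.content);
```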
How can I avoid over-consumption in Azure cloud, especially AKS?

Over-consumption in Azure Kubernetes Service (AKS) can lead to unnecessary costs and resource wastage if not managed properly. Here are some key ways to avoid it:

1. Right-size your cluster:
   a. Optimize node pools: use appropriate VM sizes for worker nodes based on workload requirements; don't provision more than what is required.
   b. Use the autoscaling feature to scale nodes up and down (see the sketch after this list).
   c. Set requests and limits: define CPU and memory requests/limits for pods to prevent over-allocation.
2. Optimize workload scaling.
3. Enable monitoring to see where costs can be optimized.
4. Use managed AKS instead of shared AKS.
5. Use Azure Files or managed disks efficiently to optimize storage.
6. Clean up unused resources: delete idle workloads and unused namespaces.
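As a hedged illustration of point 1b, the cluster autoscaler can be enabled on an existing cluster with the Azure CLI; the resource group, cluster name, and node counts below are placeholders:

az aks update \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --enable-cluster-autoscaler \
  --min-count 1 \
  --max-count 5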
What service principal is used to authenticate Logic Apps to Azure resources?

This question is a bit more academic than practical, but I'm just trying to enhance my knowledge of how Azure authentication works under the hood. The default way to authenticate managed Logic Apps connections is through an OAuth popup asking you to grant permissions. Based on my reading of the Azure docs, this means that you're granting access to the delegated permissions of a service principal. For connectors that access the Graph API, you can find such a service principal in your tenant with the correct delegated permissions. However, I'm struggling to find an equivalent service principal for connectors that use the Azure Resource Management API to interact with services like Log Analytics, Sentinel, Logic Apps, etc. I do see a service principal called Azure Logic Apps, but it doesn't have any permissions associated with it. My understanding is that it would need to have the delegated permission user_impersonation to access Azure resources. So my questions here are:

1. What service principal is used for the OAuth connection to the Azure Resource Management API?
2. If the Azure Logic Apps service principal is used, how is it able to connect to the ARM API without any permissions?
3. Is there some Azure magic happening under the hood here?
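Not an answer, but for anyone digging into the same question: the service principals mentioned above can also be inspected from the CLI rather than the portal. The display name below is the one observed in the question:

az ad sp list --display-name "Azure Logic Apps" --query "[].{appId:appId, name:displayName}" -o table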
Azure Database for MySQL triggers for Azure Functions (Public Preview)

Developers can now accelerate development and focus on the core business logic of their applications when building event-driven applications with Azure Database for MySQL as the backend data store. We are excited to announce that you can now invoke an Azure Function based on changes to an Azure Database for MySQL table. This new capability is made possible through the Azure Database for MySQL triggers for Azure Functions, now available in public preview.

Azure Database for MySQL triggers

The Azure Database for MySQL trigger uses change tracking functionality to monitor a MySQL table for changes and trigger a function when a row is created, updated, or deleted, enabling customers to build highly scalable event-driven applications. As with the Azure Database for MySQL input and output bindings for Azure Functions, a connection string for the MySQL database is stored in the application settings of the Azure Function, and the function is triggered when a change is detected on the tables.

Note: In public preview, Azure Database for MySQL triggers for Azure Functions are available only for the dedicated and premium plans of Azure Functions.

To enable change tracking on an existing Azure Database for MySQL table so it can be used with trigger bindings for an Azure Function, it is necessary to alter the table structure, for example, enabling change tracking on an employees table:

```sql
ALTER TABLE employees
ADD COLUMN az_func_updated_at TIMESTAMP
DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP;
```

The Azure Database for MySQL trigger uses the az_func_updated_at column's data to monitor the table for any changes on which change tracking is enabled. Changes are then processed in the order they were made, with the oldest changes being processed first.

Important: If changes to multiple rows are made at once, the exact order in which they're sent to the function is determined by the ascending order of the az_func_updated_at and primary key columns. If multiple changes are made to a row in between iterations, only the latest change for that particular row is considered.

The following example demonstrates a C# function that is triggered when changes occur in the employees table. The MySQL trigger uses attributes for the table name and the connection string.

```csharp
using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.MySql;
using Microsoft.Extensions.Logging;

namespace EmployeeSample.Function
{
    // The Employee POCO is not shown in the original post; this minimal
    // definition is assumed so the sample compiles.
    public class Employee
    {
        public int EmployeeId { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string Company { get; set; }
        public string Department { get; set; }
        public string Role { get; set; }
    }

    public static class EmployeesTrigger
    {
        [FunctionName(nameof(EmployeesTrigger))]
        public static void Run(
            [MySqlTrigger("Employees", "MySqlConnectionString")]
            IReadOnlyList<MySqlChange<Employee>> changes,
            ILogger logger)
        {
            foreach (MySqlChange<Employee> change in changes)
            {
                Employee employee = change.Item;
                logger.LogInformation($"Change operation: {change.Operation}");
                logger.LogInformation($"EmployeeId: {employee.EmployeeId}, FirstName: {employee.FirstName}, LastName: {employee.LastName}, Company: {employee.Company}, Department: {employee.Department}, Role: {employee.Role}");
            }
        }
    }
}
```

Join the preview and share your feedback! We are eager for you to try out the new Azure Database for MySQL triggers for Azure Functions and build highly scalable event-driven and serverless applications.
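One practical note: the "MySqlConnectionString" name in the trigger attribute above refers to an application setting. As a hedged sketch (all values are placeholders), during local development it might live in local.settings.json like this:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "MySqlConnectionString": "Server=<server>.mysql.database.azure.com;Database=<database>;Uid=<user>;Pwd=<password>;SslMode=Required"
  }
}
```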
For more information about using MySQL triggers with all the supported programming frameworks, along with detailed step-by-step instructions, refer to https://aka.ms/mysqltriggers. If you have any feedback or questions about the information provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!