AI-900: Microsoft Azure AI Fundamentals Study Guide
This comprehensive study guide provides a thorough overview of the topics covered in the Microsoft Azure AI Fundamentals (AI-900) exam, including Artificial Intelligence workloads, fundamental principles of machine learning, and computer vision and natural language processing workloads. Learn about the exam's intended audience, how to earn the certification, and the skills measured as of April 2022. Discover the important considerations for responsible AI, the capabilities of Azure Machine Learning Studio, and more. Get ready to demonstrate your knowledge of AI and ML concepts and related Microsoft Azure services with this helpful study guide.
Azure OpenAI Service is now generally available

Early this week, Satya Nadella, Microsoft CEO, and Eric Boyd, Corporate Vice President for AI, announced the general availability of Azure OpenAI Service, which will soon include ChatGPT, the fine-tuned version of GPT-3.5 built on Azure AI infrastructure that has gone viral over the last few weeks. But let's take a step back. What is Azure OpenAI? And how can you get started?
ChatGPT- What? Why? And How?

This blog discusses ChatGPT, a pre-trained language model that has garnered significant attention in the AI community due to its innovative capabilities. It explores the technology behind ChatGPT, including its purpose, function, and usage, as well as its potential applications and impact on the field of artificial intelligence. The blog also covers ChatGPT's limitations, architecture, and inner workings, including the use of advanced machine learning techniques such as the Transformer and fine-tuning.
How to Build an AI-Powered Developer Newsletter with Power Platform and ChatGPT3

Build a developer newsletter with Power Platform and ChatGPT3 to help developers stay up to date with the latest trends in technology. This solution does the heavy lifting by taking advantage of the power of ChatGPT3 and Power Platform. You can generate a newsletter by following the steps outlined in this guide, such as signing up for the OpenAI API, creating an adaptive card, and building a solution in Power Automate. Once the solution is established, you can run it and adjust the trigger to get the desired results. Use ChatGPT3 to generate a newsletter from text input in Teams, and use Power Automate to send an email to the desired recipients.
Mastering Azure OpenAI Services: A Comprehensive Learning Path for Aspiring AI Engineers

Are you a computer science student looking to delve into the world of Azure OpenAI Services? Look no further! In the Microsoft learning pathway "Develop Generative AI solutions with Azure OpenAI Service," you'll embark on an exciting journey to harness the power of OpenAI's large language models such as ChatGPT, GPT, Codex, and Embeddings. These models are pivotal for creating innovative Natural Language Processing (NLP) solutions that can comprehend, converse, and generate content.
Azure OpenAI Services in teaching and education

With the advent of large language models (LLMs) like GPT-3, we are seeing a transformation in education. In this article, I present my views on the future of education in light of these developments. The views presented here are informed by my teaching, but they are a personal perspective. Today, there is a lot of excitement and speculation about GPT-3, and it is natural to ask how intelligent GPT-3 is and whether it approaches human-level intelligence. But in many ways, that's the wrong question to ask. It is more interesting to explore how we can build ChatGPT-like functionality using our own data. When framed this way, we focus on the pragmatic and ignore the esoteric. If the industry adopts a 'co-pilot first' approach, educators must follow this trend to keep up with the new job roles. This will require a complete rethinking of many current ideas on education and the adoption of some new ideas that I propose in this article. The conversation changes from 'is ChatGPT used for exam cheating or not?' to 'how can I empower my students to take up the jobs of the future if the co-pilot first mode of work becomes the default?'
Unlock the Future of Secure Authentication: Moving to Keyless Authentication with Managed Identity

Why Managed Identity?
Traditional authentication methods often rely on keys, secrets, and passwords that can be easily compromised. Managed identity, on the other hand, provides a secure and seamless way to authenticate without the need to manage credentials. By leveraging managed identity, you can:
Reduce the Risk of Compromise: As most security breaches start from identity-related issues, moving to a keyless authentication system significantly reduces the chances of such compromises.
Simplify Credential Management: Managed identity eliminates the need for managing keys and secrets, making the authentication process more straightforward and less error-prone.
Enhance Security: With managed identity, your applications are granted access to resources securely, without the risk of exposing sensitive credentials.

Getting Started with Managed Identity
To help you get started with managed identity, Microsoft offers comprehensive training modules for different programming languages. These modules cover the basics of using managed identity to authenticate to Azure OpenAI, providing you with the knowledge and skills needed to implement secure authentication in your applications (a minimal code sketch follows at the end of this post).

Available Microsoft Learn Training Modules:
Introduction to using Managed Identity to authenticate to Azure OpenAI with .NET - Training | Microsoft Learn
Introduction to Azure OpenAI Managed Identity Authentication with Java - Training | Microsoft Learn
Introduction to Azure OpenAI Managed Identity Authentication with Python - Training | Microsoft Learn
Introduction to Azure OpenAI Managed Identity Authentication with JavaScript - Training | Microsoft Learn

Why Should Students Learn Managed Identity?
As a student, learning about managed identity and keyless authentication is not just about enhancing your technical skills; it's about preparing for the future. Here are a few reasons why you should dive into managed identity:
Stay Ahead in the Job Market: With cybersecurity being a top priority for organizations, having expertise in secure authentication methods like managed identity will make you a valuable asset to potential employers.
Build Secure Applications: By implementing managed identity, you can build applications that are more secure, reliable, and less susceptible to breaches.
Understand Modern Security Practices: Gaining knowledge about managed identity and keyless authentication will give you a deeper understanding of modern security practices and how to protect applications in today's digital landscape.

Conclusion
Moving to keyless authentication through managed identity is a game-changer for securing applications. As students and future developers, embracing this technology will not only enhance your skills but also contribute to building a safer and more secure digital world. So, take the first step today by exploring the training modules and mastering the art of managed identity!
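As a rough illustration of what keyless authentication can look like in Python, the sketch below connects to Azure OpenAI with Microsoft Entra ID through the azure-identity library instead of an API key. It is not the exact code from the training modules listed above; the endpoint, deployment name, and API version are placeholders, it assumes the azure-identity package is installed alongside openai, and it assumes your identity (or the app's managed identity when running in Azure) has been granted an appropriate role such as Cognitive Services OpenAI User on the resource.

    from azure.identity import DefaultAzureCredential, get_bearer_token_provider
    from openai import AzureOpenAI

    # DefaultAzureCredential picks up a managed identity when running in Azure,
    # and falls back to your developer sign-in (for example, the Azure CLI) locally.
    token_provider = get_bearer_token_provider(
        DefaultAzureCredential(),
        "https://cognitiveservices.azure.com/.default",  # token scope for Azure OpenAI
    )

    client = AzureOpenAI(
        azure_endpoint="https://your-resource.openai.azure.com",  # placeholder endpoint
        azure_ad_token_provider=token_provider,                   # no API key anywhere
        api_version="2024-06-01",                                 # example API version
    )

    response = client.chat.completions.create(
        model="your-deployment-name",  # placeholder deployment name
        messages=[{"role": "user", "content": "Hello from a keyless client!"}],
    )
    print(response.choices[0].message.content)

The only change compared with key-based code is how the client is constructed; every request afterwards looks the same, which is what makes the switch to managed identity relatively painless.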
Understanding the Difference in Using Different Large Language Models: Step-by-Step Guide

Unlock the secrets of deploying large language models on Azure with our comprehensive guide! Learn step-by-step integration techniques for models like GPT-2, Llama 2, and Dolly v1 in your web applications or Power Apps. Explore detailed instructions, ready-made code, and expert tips. Join us for a live session on November 2nd, 2023, to harness the power of AI and Microsoft tools. Become an entrepreneur with Microsoft Founders Hub, offering up to $2,500 in OpenAI credits and $1,000 in Azure credits. Dive into the world of tech solutions and creative writing ideas today!
Building a Basic Chatbot with Azure OpenAI

Overview
In this tutorial, we'll build a simple chatbot that uses Azure OpenAI to generate responses to user queries. To create a basic chatbot, we need to set up a language model resource that enables conversation capabilities. In this tutorial, we will:
Set up the Azure OpenAI resource using the Azure AI Foundry portal.
Retrieve the API key needed to connect the resource to your chatbot application.
Once the API key is configured in your code, you will be able to integrate the language model into your chatbot and enable it to generate responses. By the end of this tutorial, you'll have a working chatbot that can generate responses using the Azure OpenAI model.

Signing In and Setting Up Your Azure AI Foundry Workspace

Signing In to Azure AI Foundry
Open the Azure AI Foundry page in your web browser.
Log in to your Azure account. If you don't have an account, you can sign up.

Setting Up Your Azure AI Foundry Workspace
Select + Create project to create a new project, then perform the following tasks:
Enter a project name. It must be a unique value.
Select the hub you'd like to use (create a new one if needed).
Select Create.

Setting Up the Azure OpenAI Resource in Azure AI Foundry
In this step, you'll set up the Azure OpenAI resource in Azure AI Foundry. Azure OpenAI provides access to pre-trained language models that can generate responses to user queries; we'll be using one of these models in our chatbot.
Select Models + endpoints from the left-side menu. On this page, you can deploy language models and set up Azure AI resources. In this step, we will deploy the Azure OpenAI GPT-4o language model.
Select + Deploy model.
Select Deploy base model.
In this tutorial, we will deploy the GPT-4o model. Select GPT-4o.
Select Confirm.
Select Deploy.
The model will be deployed. Once the deployment is complete, you will see it listed on the Models + endpoints page.
Now that the model is deployed, you can retrieve the API key needed to connect it to your chatbot application.
Select the model you deployed on the Models + endpoints page.
On the model details page, you can view information about the model, including the API key. We will come back to this page later to add the required information to the environment variables.

Setting Up the Project and Installing the Libraries
Now you will create a folder to work in and set up a virtual environment to develop the program.

Creating a Folder to Work In
Open a terminal window and type the following command to create a folder named basic-chatbot in the default path.

    mkdir basic-chatbot

Type the following command inside your terminal to navigate to the basic-chatbot folder you created.

    cd basic-chatbot

Creating a Virtual Environment
Type the following command inside your terminal to create a virtual environment named .venv.

    python -m venv .venv

Type the following command inside your terminal to activate the virtual environment.

    .venv\Scripts\activate.bat

NOTE: If it worked, you should see (.venv) before the command prompt.

Installing the Required Packages
Type the following command inside your terminal to install the required packages.
openai: a Python library that provides integration with the Azure OpenAI API.
python-dotenv: a Python library for managing environment variables stored in an .env file.

    pip install openai python-dotenv
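Two small optional checks, which are additions beyond the original Windows-focused steps: on macOS or Linux the activation command is source .venv/bin/activate rather than the .bat script, and once installation finishes you can confirm that both packages import cleanly before writing any code.

    python -c "import openai, dotenv; print('openai', openai.__version__)"

If this prints the openai version without an error, the virtual environment is ready.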
Setting Up the Project in Visual Studio Code
To create a basic chatbot program, you will need two files:
example.py: This file will contain the code to interact with Azure resources.
.env: This file will store the Azure credentials and configuration details.

NOTE: Purpose of the .env file. The .env file is essential for storing the Azure information required to connect to and use the resources you created. By keeping the Azure credentials in the .env file, you can ensure a secure and organized way to manage sensitive information.

Setting Up the example.py File
Open Visual Studio Code.
Select File from the menu bar.
Select Open Folder.
Select the basic-chatbot folder that you created, which is located at C:\Users\yourUserName\basic-chatbot.
In the left pane of Visual Studio Code, right-click and select New File to create a new file named example.py.
Add the following code to the example.py file.

    from openai import AzureOpenAI
    from dotenv import load_dotenv
    import os

    # Load environment variables from the .env file
    load_dotenv()

    # Retrieve environment variables
    AZURE_OPENAI_ENDPOINT = os.getenv("AZURE_OPENAI_ENDPOINT")
    AZURE_OPENAI_API_KEY = os.getenv("AZURE_OPENAI_API_KEY")
    AZURE_OPENAI_MODEL_NAME = os.getenv("AZURE_OPENAI_MODEL_NAME")
    AZURE_OPENAI_CHAT_DEPLOYMENT_NAME = os.getenv("AZURE_OPENAI_CHAT_DEPLOYMENT_NAME")
    AZURE_OPENAI_API_VERSION = os.getenv("AZURE_OPENAI_API_VERSION")

    # Initialize Azure OpenAI client
    client = AzureOpenAI(
        api_key=AZURE_OPENAI_API_KEY,
        api_version=AZURE_OPENAI_API_VERSION,
        base_url=f"{AZURE_OPENAI_ENDPOINT}/openai/deployments/{AZURE_OPENAI_CHAT_DEPLOYMENT_NAME}"
    )

    print("Chatbot: Hello! How can I assist you today? Type 'exit' to end the conversation.")

    while True:
        user_input = input("You: ")
        if user_input.lower() == "exit":
            print("Chatbot: Ending the conversation. Have a great day!")
            break

        response = client.chat.completions.create(
            model=AZURE_OPENAI_MODEL_NAME,
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": user_input}
            ],
            max_tokens=200
        )
        print("Chatbot:", response.choices[0].message.content.strip())

Setting Up the .env File
To set up your development environment, we will create a .env file and store the necessary credentials directly.

NOTE: Complete folder structure:

    └── YourUserName
        └── basic-chatbot
            ├── example.py
            └── .env

In the left pane of Visual Studio Code, right-click and select New File to create a new file named .env.
Add the following lines to the .env file to include your Azure information.

    AZURE_OPENAI_API_KEY=your_azure_openai_api_key
    AZURE_OPENAI_ENDPOINT=https://your_azure_openai_endpoint
    AZURE_OPENAI_MODEL_NAME=your_model_name
    AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=your_deployment_name
    AZURE_OPENAI_API_VERSION=your_api_version

Retrieving Environment Variables from Azure AI Foundry
Now you will retrieve the required information from Azure AI Foundry and update the .env file.
Go to the Models + endpoints page and select your deployed model.
On the model details page, copy the following information into the .env file:
AZURE_OPENAI_API_KEY
AZURE_OPENAI_ENDPOINT
AZURE_OPENAI_MODEL_NAME
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME
Paste this information into the .env file in the respective placeholders.

Running the Chatbot Program
Type the following command inside your terminal to run the program and see if it can answer questions.

    python example.py

Interact with the chatbot by typing your questions or messages. The chatbot will generate responses based on the Azure OpenAI model you deployed.

NOTE: You can find the full example of this chatbot, including the code and .env template, in my GitHub repository: GitHub Repository
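One limitation worth noting: the loop in example.py sends only the latest user message, so the model has no memory of earlier turns. As a small optional extension that is not part of the original tutorial, you could keep the running conversation in the messages list and pass it on every call. The sketch below reuses the client, AZURE_OPENAI_MODEL_NAME, and input loop already defined in example.py.

    # Optional extension: keep conversation history so the model remembers earlier turns.
    # Assumes `client` and AZURE_OPENAI_MODEL_NAME are defined as in example.py above.
    messages = [{"role": "system", "content": "You are a helpful assistant."}]

    while True:
        user_input = input("You: ")
        if user_input.lower() == "exit":
            print("Chatbot: Ending the conversation. Have a great day!")
            break

        # Append the user's turn, send the full history, then record the reply.
        messages.append({"role": "user", "content": user_input})
        response = client.chat.completions.create(
            model=AZURE_OPENAI_MODEL_NAME,
            messages=messages,
            max_tokens=200
        )
        reply = response.choices[0].message.content.strip()
        messages.append({"role": "assistant", "content": reply})
        print("Chatbot:", reply)

Long conversations will eventually exceed the model's context window, so a real application would trim or summarize older messages before sending them.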