modern apps
Building new AI skills for developers
For this post, we’re focusing on learning new AI skills. We explore resources that will help developers take their AI skills (and their applications) to the next level. Whether you’re new to AI and don’t know where to get started, or you’re experienced but want to advance your skillset with some new tools and capabilities, we have resources that will get you there. Join a challenge, find a Microsoft Learn path, get info on the latest tools and updates, watch in-depth videos, join a live event for hands-on learning, and more.

How to develop AI Apps and Agents in Azure: A Visual Guide
Move beyond proof-of-concept and build production-ready AI apps and agents in Azure. This visual guide walks you through the essential steps and decisions for building intelligent apps in Azure.

Join the Powerful Devs Challenge and level up your Power Platform skills
Join the Powerful Devs Challenge and take your developer skills to the next level! Learn how to build, extend, secure, and deploy enterprise-ready solutions with Power Platform. Now through March 28, 2025.

Microsoft Reactor: Python + AI
Use Python to build applications that support Gen AI. Join Python + AI—a live 6-part Reactor series—to learn how. Sessions will include live examples and code samples. Sessions are live throughout March and available on demand.

Generative AI for Beginners .NET Edition
The “Generative AI for Beginners .NET" course on GitHub is designed to help .NET developers dive into the world of Gen AI. It features short 5-10 minute videos and code samples, offering an easy way to start adding GenAI to your .NET projects.

GitHub Copilot Agent Mode: Build an iOS App in Minutes
Discover the power of Agent Mode in VS Code Insiders. Watch to see how you can build a fully functional iOS app from scratch in minutes. Let AI handle everything from generating code to fixing build errors.

Start your AI learning journey
Ready to get started with AI but don’t know where to begin? The AI learning hub from Microsoft Learn is packed with skilling resources, organized around roles, to help you build practical skills you can use right away.

Let's Learn .NET: GitHub Copilot
Discover the power of Copilot and .NET. Watch this Let’s Learn .NET workshop on demand to learn about GitHub Copilot, Copilot Chat, and AI-powered coding. Master the basics of Copilot, explore prompt engineering, and learn essential best practices.

Bells and whistles: Building with Microsoft Copilot Studio
Don’t know how to start building an agent in Microsoft Copilot Studio? Watch the Building with Copilot Studio series to explore the different capabilities of building agents with Copilot Studio.

Get Started with GitHub Copilot in VS Code
Use GitHub Copilot in Visual Studio Code to write better code faster. This video will show you how.

AI Agents for Beginners
AI agents for beginners! This free 10-part course on GitHub will get you started building AI agents, from concept to code.

Evolve with generative AI: Operationalize your AI solutions with fine-tuning and prompt flow
This Microsoft Learn Plan will guide you through the process of managing and evolving your Gen AI solutions. Learn how to use prompt flow, collaborate with others through hubs and projects, and integrate fine-tuned models.

Sip and Sync with Azure
Explore real AI and cloud use cases in the Sip and Sync with Azure video series. Watch to learn how customers use AI and cloud solutions to solve real-world business problems.
Official plan: Accelerate gen AI model selection and evaluation with Azure AI Foundry
This Official Plan from Microsoft Learn is designed to guide you through the process of selecting and applying the best Gen AI models for your needs using Azure AI Foundry. Work through this structured learning journey at your own pace and build critical skills.

A look at the new Copilot Next Edit Suggestions preview
GitHub Copilot has 3 exciting new preview capabilities: Next Edit Suggestions, Agent Mode for Copilot Edits, and vision support. Take a closer look at Next Edit Suggestions and learn about the latest step in intelligent code completions.

Microsoft Reactor: Model Mondays
How well do you know your AI models? With thousands available, how do you choose the best one for your project? Model Mondays can help. Each week, this series digs into a different model with the experts who know it best.

Architecture Recipes for AI-Powered Applications
Build intelligent apps at a live Reactor event near you. Explore practical examples and best practices as you learn how to integrate machine learning models, data pipelines, and cloud services.

GitHub Copilot for Azure: 6 Must-Try Features
Want to supercharge your Azure game right within GitHub Copilot? Watch these short videos to quickly learn about 6 must-try features in GitHub Copilot for Azure.

RAG Time: Ultimate Guide to Mastering RAG
Become a master of Retrieval-Augmented Generation (RAG). This new series for developers will help you unlock the full potential of RAG with expert-level discussions, code samples, and step-by-step guides. New episodes on Wednesdays, March 5 - April 2.

Microsoft Learn Collection: Microsoft 365 Copilot extensibility
Want to extend Microsoft 365 Copilot? This collection of resources from Microsoft Learn has everything you need to know to extend Microsoft 365 Copilot with agents, Microsoft Graph connectors, and more. Find a learning path and get started.

Introducing the Adaptive Cards documentation hub and new Adaptive Cards updates
Learn about the latest Adaptive Cards updates and the new Adaptive Cards documentation hub. Discover how Adaptive Cards can transform your apps with interactive experiences that streamline workflows.

Use DeepSeek R1 on your GPU to power custom engine agents
Learn how to use DeepSeek R1 (hosted locally on your GPU) to power a custom engine agent built with Teams Toolkit and the Teams AI Library.

POSETTE: An Event for Postgres 2025 schedule announced
Get details about POSETTE, a free digital developer event organized by the Postgres team at Microsoft. Check out the schedule and speakers for this year’s event, taking place June 10-12, 2025.

JDConf 2025: Code the Future with AI
Get ready for JDConf 2025, Microsoft's annual event for Java developers. Learn about building modern apps in the cloud, integrating AI, using AI-assisted dev tools, and more. This year’s event will take place April 9-10, 2025.

VS Code + GitHub Copilot Release Party
Join the VS Code + GitHub Copilot release party on April 16, 2025. The Visual Studio Code and GitHub Copilot teams will take a deep dive into the latest features and host a Q&A about the March release.
More News and Resources from around Microsoft
· Introducing Copilot Next Edit Suggestions (preview) in VS Code
· New: Semantic Ranker Solution Accelerator for PostgreSQL
· New: GraphRAG Solution Accelerator for PostgreSQL
· Preview: Azure AI Agent Service
· Startup Spotlight
· Get Started with Copilot Free in VS Code
· Raising the bar for RAG excellence: New performance benchmarks
· Automate Dev Environments with Microsoft Dev Box and Teams Customizations
· Dev Proxy: Simulate APIs
· Microsoft 365 Copilot Chat
· Microsoft Copilot Studio: Enabling agents in Microsoft 365 Copilot Chat

Meet First Round of Speakers for Microsoft JDConf 2025: Code the future with Java and AI
We are excited to share the initial lineup of speakers and sessions for Microsoft JDConf 2025, taking place on April 9-10. Whether you are an experienced developer or just starting out, JDConf offers valuable opportunities to explore the latest advancements in Java, Cloud, and AI technologies, gain practical insights, and connect with Java experts from across the globe. Secure your spot now at jdconf.com. Here are the initial sessions and speakers who will provide valuable insights into Java, Cloud, and AI.

· Java 25. Explore The Hidden Gems of Java 25 with Mohamed Taman as he uncovers key Java SE features, updates, and fixes that will simplify migration to new Java and enhance your daily development workflow.
· Virtual Threads. Virtual Threads in Action with Jakarta EE Core Profile by Daniel Kec will explore Helidon 4, the first Jakarta EE Core Profile runtime built on a pure Virtual Thread-based web server.
· Spring Boot. Bootiful Spring Boot: A DOGumentary by Josh Long will dive into Spring Boot 3.x and Java 21, exploring AI, modularity, and powerful optimizations like virtual threads, GraalVM, and AppCDS.
· Jakarta EE 12. What Comes After Jakarta EE 11? Reza Rahman and Emily Jiang will share the roadmap, contribution pathways, and key updates, including Security, Concurrency, Messaging, and new APIs.
· GitHub Copilot. Use GitHub Copilot in your favorite Java IDEs by Julia Kordick and Brian Benz will show how to maximize productivity with GitHub Copilot’s latest features in IntelliJ, VS Code, and Eclipse.
· AI Dev Experience. Boosting AI Developer Experience with Quarkus, LangChain4j, and Azure OpenAI by Daniel Oh will demonstrate how this trio streamlines development and powers intelligent applications.
· Spring AI. How to Build Agents with Spring AI by Adib Saikali will showcase building intelligent AI agents, covering key patterns like self-editing memory, task orchestration, and collaborative multi-agent systems.
· LangChain4j. AI-Powered Development: Hands-On Techniques for Immediate Impact by Lize Raes will explore AI tools like Cursor, Devin, and GitHub Workspace to help developers accelerate workflows and embrace AI-driven coding practices.
· Data and AI. Powering Spring AI with RAG and NoSQL by Theo van Kraay will demo how integrating Cosmos DB as a vector store with Spring AI enables scalable, intelligent, and high-performing applications.
· Automated Refactoring. The State of AI in Large Scale Automated Refactoring by Jonathan Schneider will show how OpenRewrite’s Lossless Semantic Tree enhances AI-driven refactoring for accurate decision-making.
· Java Modernization. Cloud Migration of Java Applications Using Various Tools and Techniques by Yoshio Terada will demo modernizing legacy apps with tools like VS Code, GitHub Copilot, and Azure Migrate.
· AI-Driven Testing. Test Smarter, Not Harder: AI-Driven Test Development by Loiane Groner will demo how AI-powered tools like GitHub Copilot enhance TDD through automated test generation and improved test coverage, even for legacy code.

RSVP Now
Join us at Microsoft JDConf 2025 and code the future with Java, Cloud, and AI. RSVP today at jdconf.com to secure your spot. Your registration grants access to live streams, on-demand sessions, and a collection of valuable resources. Stay tuned for updates on more engaging sessions and inspiring speakers. Connect with a community shaping tomorrow’s technology and gain practical insights from industry leaders. Follow the conversation using #JDConf, and visit jdconf.com for the latest agenda and schedule.
Secure your spot now at jdconf.com!

Introducing Serverless GPUs on Azure Container Apps
We're excited to announce the public preview of Azure Container Apps Serverless GPUs accelerated by NVIDIA. This feature provides customers with NVIDIA A100 GPUs and NVIDIA T4 GPUs in a serverless environment, enabling effortless scaling and flexibility for real-time custom model inferencing and other machine learning tasks.

Serverless GPUs accelerate the speed of your AI development team by allowing you to focus on your core AI code and less on managing infrastructure when using NVIDIA accelerated computing. They provide an excellent middle-layer option between Azure AI Model Catalog's serverless APIs and hosting models on managed compute. They offer full data governance, as your data never leaves the boundaries of your container, while still providing a managed, serverless platform from which to build your applications. Serverless GPUs are designed to meet the growing demands of modern applications by providing powerful NVIDIA accelerated computing resources without the need for dedicated infrastructure management.

"Azure Container Apps' serverless GPU offering is a leap forward for AI workloads. Serverless NVIDIA GPUs are well suited for a wide array of AI workloads from real-time inferencing scenarios with custom models to fine-tuning. NVIDIA is also working with Microsoft to bring NVIDIA NIM microservices to Azure Container Apps to optimize AI inference performance.” - Dave Salvator, Director, Accelerated Computing Products, NVIDIA

Key benefits of serverless GPUs
· Scale-to-zero GPUs: Support for serverless scaling of NVIDIA A100 and T4 GPUs.
· Per-second billing: Pay only for the GPU compute you use.
· Built-in data governance: Your data never leaves the container boundary.
· Flexible compute options: Choose between NVIDIA A100 and T4 GPUs.
· Middle layer for AI development: Bring your own model on a managed, serverless compute platform.

Scenarios
Whether you choose NVIDIA A100 or T4 GPUs will depend on the types of apps you're creating. The following are a couple of example scenarios. For each scenario with serverless GPUs, you pay only for the compute you use with per-second billing, and your apps will automatically scale in and out from zero to meet the demand.

NVIDIA T4
· Real-time and batch inferencing: Using custom open-source models with fast startup times, automatic scaling, and a per-second billing model, serverless GPUs are ideal for dynamic applications that don't already have a serverless API in the model catalog.

NVIDIA A100
· Compute-intensive machine learning scenarios: Significantly speed up applications that implement fine-tuned custom generative AI models, deep learning, or neural networks.
· High performance computing (HPC) and data analytics: Applications that require complex calculations or simulations, such as scientific computing and financial modeling, as well as accelerated data processing and analysis across massive datasets.

Get started with serverless GPUs
Serverless GPUs are now available for workload profile environments in the West US 3, Australia East, and Sweden Central regions, with more regions to come. You will need to have quota enabled on your subscription in order to use serverless GPUs. By default, all Microsoft Enterprise Agreement customers will have one quota. If additional quota is needed, please request it here.

Note: In order to achieve the best performance with serverless GPUs, use an Azure Container Registry (ACR) with artifact streaming enabled for your image tag. Follow the steps here to enable artifact streaming on your ACR.
From the portal, you can select to enable GPUs for your Consumption app in the container tab when creating your Container App or your Container App Job. You can also add a new consumption GPU workload profile to your existing Container App environment through the workload profiles UX in the portal or through the CLI commands for managing workload profiles (a CLI sketch follows at the end of this post).

Deploy a sample Stable Diffusion app
To try out serverless GPUs, you can use the Stable Diffusion image, which is provided as a quickstart during the container app create experience:
1. In the container tab, select the Use quickstart image box.
2. In the quickstart image dropdown, select GPU hello world container.
If you wish to pull the GPU container image into your own ACR to enable artifact streaming for improved performance, or if you wish to manually enter the image, you can find the image at mcr.microsoft.com/k8se/gpu-quickstart:latest. For full steps on using your own image with serverless GPUs, see the tutorial on using serverless GPUs in Azure Container Apps.

Learn more about serverless GPUs
With serverless GPUs, Azure Container Apps now simplifies the development of your AI applications by providing scale-to-zero compute, pay-as-you-go pricing, reduced infrastructure management, and more. To learn more, visit:
· Using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn
· Tutorial: Generate images using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn
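If you prefer the command line over the portal, here is a minimal sketch of the same flow using the Azure CLI. It assumes an existing resource group and workload profile environment in a region with serverless GPU support and approved GPU quota; the resource names and the T4 profile type string are illustrative assumptions rather than values from this post, so confirm the supported profile types for your region first.

```bash
# Minimal sketch (assumptions: resource group "my-rg" and environment "my-env"
# already exist in a region with serverless GPU support, and GPU quota has been
# approved on the subscription; the profile type name below is illustrative,
# so confirm it with the list-supported command first).

# See which workload profile types (including serverless GPU profiles) your region offers.
az containerapp env workload-profile list-supported --location westus3 --output table

# Add a serverless (Consumption) GPU workload profile to the existing environment.
az containerapp env workload-profile add \
  --resource-group my-rg \
  --name my-env \
  --workload-profile-name gpu-t4 \
  --workload-profile-type Consumption-GPU-NC8as-T4

# Deploy the GPU "hello world" quickstart image onto that profile.
# Ingress settings are omitted here; add them as your app requires.
az containerapp create \
  --resource-group my-rg \
  --environment my-env \
  --name gpu-quickstart \
  --image mcr.microsoft.com/k8se/gpu-quickstart:latest \
  --workload-profile-name gpu-t4
```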
Self Hosted AI Application on AKS in a day with KAITO and CoPilot

In this blog post I document my experience of spending a full day using KAITO and Copilot to accelerate deployment and development of a self-managed, AI-enabled chatbot deployed in a managed cluster. The goal is to showcase how quickly, using a mix of AI tooling, we can go from zero to a self-hosted, tuned LLM and chatbot application. At the top of this article I want to share my perspective on the future of projects such as KAITO. At the moment I believe KAITO to be somewhat ahead of its time: as most enterprises begin adopting abstracted artificial intelligence, it is brilliant to see projects like KAITO being developed, ready for the eventual abstraction pendulum to swing back, motivated by the usual factors such as increased skills in the market, cost, and governance. Enterprises will undoubtedly in the future look to take centralised control of the AI models being used across their organisations as GPUs become cheaper, more readily available, and more powerful. When this shift happens, open source projects like KAITO will become commonplace in enterprises. It is also my opinion that Kubernetes lends itself perfectly to being the AI platform of the future, a position shared by the CNCF (albeit both sources here may be somewhat biased). The resiliency, scaling, and existence of Kubernetes primitives such as "Jobs" mean that Kubernetes is already the de facto platform for machine learning training and inference. These same reasons also make Kubernetes the best underlying platform for AI development. Companies including DHL, Wayve, and even OpenAI all run ML or AI workloads on Kubernetes. That does not mean that data scientists and engineers will suddenly be creating Dockerfiles or exploring admission controllers; Kubernetes, as a platform, will instead sit multiple layers of abstraction away (full-scale self-service platform engineering). However, the engineers responsible for running and operating the platform will hail projects like KAITO.
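To give a flavour of what the full article walks through, here is a minimal sketch of requesting a self-hosted model with KAITO once its operator is installed on an AKS cluster. The workspace name, GPU instance type, and model preset below are illustrative values drawn from KAITO's public examples rather than from this post, so check them against the KAITO release you actually deploy.

```bash
# Minimal sketch (assumptions: the KAITO operator and its Workspace CRD
# (kaito.sh/v1alpha1) are installed on the AKS cluster, and the subscription
# has quota for the GPU VM size below; the instance type and model preset are
# illustrative values taken from KAITO's public examples).
kubectl apply -f - <<'EOF'
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-falcon-7b-instruct
resource:
  instanceType: "Standard_NC12s_v3"
  labelSelector:
    matchLabels:
      apps: falcon-7b-instruct
inference:
  preset:
    name: "falcon-7b-instruct"
EOF

# Once the workspace reports ready, KAITO has provisioned the GPU node and
# exposed the model behind a cluster service that the chatbot application can
# call instead of a hosted AI API.
```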
Learn New Skills in the New Year

New year’s resolution: Start writing better code faster in 2025. Kick off the new year by learning new developer skills and elevate your career to the next level. In this post, we explore learning resources and live events that will help you build critical skills and get started with cutting-edge technologies. Learn how to build custom agents, code intelligent apps with familiar tools, discover new possibilities in .NET 9, use Copilot for testing and debugging, and more. Plus, get details about using GitHub Copilot in Visual Studio Code—for free!

New AI for Developers page
Check out the new AI for Developers page. It's packed with free GitHub courses on building apps, machine learning, and mastering GitHub Copilot for pair programming. Learn your way and skill up for what's next in AI.

Use GitHub Copilot in Visual Studio Code for free
Did you hear the news? You can now use GitHub Copilot in Visual Studio Code for free. Get details about the new Copilot Free plan and add Copilot to your developer toolbox.

What is Copilot Studio?
Have questions about Copilot Studio? This article from Microsoft Learn covers all the basics you need to know about Copilot Studio—the low-code tool for easily building agents and extending Microsoft 365 Copilot.

From C# to ChatGPT: Build Generative AI Solutions with Azure
Combine your C# skills with the cutting-edge power of ChatGPT and Azure OpenAI Service. This free learning path introduces you to building GenAI solutions, using REST APIs, SDKs, and Azure tools to create more intelligent applications.

Register for the Powerful Devs Conference + Hackathon
Register for the Powerful Devs Conference + Hackathon (February 12-28, 2025) and get more out of Power Platform. This one-day online conference is followed by a 2-week hackathon focused on building intelligent applications with less effort.

Code the future with Java and AI: RSVP for Microsoft JDConf 2025 today
Get ready for JDConf 2025—Microsoft's annual event for Java developers. Taking place April 9-10, this year’s event will have three separate live streams to cover different regions. Join to explore tools and skills for building modern apps in the cloud and integrating AI.

Build custom agents for Microsoft Teams
Learn how to build custom agents for Microsoft Teams. This free learning path will teach you about different copilot stacks, working with Azure OpenAI, and building a custom engine agent. Start building intelligent Microsoft Teams apps using LLMs and AI components.

Microsoft Learn: Debug your app with GitHub Copilot in Visual Studio
Debug more efficiently using GitHub Copilot. This Microsoft Learn article shows you how. Discover how Copilot will answer detailed questions about your code and provide bug fixes.

Make Azure AI Real: Watch Season 2
Elevate your AI game with Make Azure AI Real on demand. Season 2 digs into the latest Azure AI advancements, with practical demos, code samples, and real-world use cases.

GitHub Copilot Bootcamp
Streamline your workflow with GitHub Copilot—craft more effective prompts and automate repetitive tasks like testing. This GitHub Copilot Bootcamp is a 4-part live streaming series that will help you master GitHub Copilot.

10 Days of GenAI – Gift Guide Edition
Start building your own Gen AI application. These short videos outline 10 steps for creating your app—choose a model, add functions, fine-tune responses, and more.
Extend Microsoft 365 Copilot with declarative agents using Visual Studio Code
Check out this new learning path from Microsoft Learn to discover how you can extend Microsoft 365 Copilot with declarative agents using VS Code. Learn about declarative agents and how they work.

Developer's guide to building your own agents
Want to build your own agents? Watch this Ignite session on demand for a look at the new agent development tools. Find out how to create agents built on Microsoft 365 Copilot or your custom AI engine.

Master distributed application development with .NET Aspire
Get started with .NET Aspire—an opinionated, cloud-ready stack for building distributed applications with .NET. This series covers everything from setup to deployment. Start your journey toward mastering distributed app development.

Learn: What's new in .NET 9
Discover what's new in .NET 9. Learn about new features for AI, improvements for building cloud-native apps, performance enhancements, updates to C#, and more. Read the overview and get started with .NET 9.

Become a .NET AI engineer using the OpenAI library for .NET
Use your .NET skills to become an AI engineer. With the OpenAI library, .NET developers can quickly master critical AI skills and apply them to real-world apps. Read the blog to learn more about the OpenAI library for .NET.

Test like a pro with Playwright and GitHub Copilot
Supercharge your testing using Playwright and GitHub Copilot. Watch this in-depth demo and discover how you can easily create end-to-end tests using Playwright's powerful built-in code generator.

Other news and resources from around Microsoft
· Microsoft Learn: Why and how to adopt AI in your organization
· Microsoft Learn: Learn to use Copilot in Microsoft Fabric
· AI Toolkit for Visual Studio Code: Update highlights
· Teams Toolkit for Visual Studio Code update
· RAG Deep Dive: Live streams
· Learn Together: SQL database in Fabric
· Become an AI security expert using OpenAI with Azure Managed Identity
· Deploy, monitor, and manage development resources with Microsoft Dev Box
· Microsoft Playwright testing
· Introduction to artificial intelligence and Azure AI services
· Azure AI-900 Fundamentals Training event series
· Leveraging cloud-native infra for your intelligent apps
· Platform engineering with GitHub
· Extend declarative agents for Microsoft 365 Copilot with API plugins using Visual Studio Code
· Introducing the Microsoft 365 Agents SDK
· Azure Live Q&A events
· Get started with multimodal parsing for RAG using GPT-4o, Azure AI Search, and LlamaParse

Easily deploy .NET apps to Azure Container Apps with default configuration for data protection
The Azure Container Apps and .NET teams have made it easier than ever to deploy your .NET application by supporting automatic configuration for data protection. This support is currently available as an opt-in feature in the Container Apps API version 2024-02-02-preview. This blog post will discuss the feature and what it enables, how to determine if your application is correctly configured, and how to enable configuration for data protection across a variety of .NET versions.

Operationalize AI apps innovation at scale by modernizing apps and data on Microsoft Azure
This blog explores how modernizing apps and data on Microsoft Azure can help operationalize AI applications at scale, providing businesses with the tools and infrastructure needed to thrive.

Deploy secure App Service resources to prevent dangling DNS entries and avoid subdomain takeover
Back in May 2024, we announced the Public Preview of Secure Unique Default Hostnames on Web Apps. We are excited to announce that this feature is now in General Availability on Web Apps and is now in Public Preview for Functions! This feature works similarly for both Web Apps and Functions, so you can refer to the Public Preview announcement for more in-depth information regarding this feature.

The secure unique default hostname feature is a long-term solution to protect your resources from dangling DNS entries and subdomain takeover. If you have this feature enabled for your App Service resources, then no one outside of your organization would be able to recreate resources with the same default hostname. This means that malicious actors can no longer take advantage of your dangling DNS entries and take over your subdomains. We highly encourage everyone to enable secure unique default hostnames on their net-new App Service deployments.

Addressing pre-existing resources without secure unique default hostnames enabled
Since this feature can only be enabled upon resource creation, if you’d like to use this feature for your pre-existing resources, you can:
· Clone a pre-existing app to a new app with secure unique default hostname enabled.
[Screenshot: cloning a pre-existing app to a new app that is being created with secure unique default hostname enabled.]
· Use a backup of a pre-existing app to restore to a new app with secure unique default hostname enabled.
[Screenshot: using a backup of a pre-existing app to restore to a new app that is being created with secure unique default hostname enabled.]

Looking ahead
We highly encourage everyone to enable secure unique default hostnames on all net-new App Service deployments. Now is the time to adopt this feature in your testing and production environments so that you can build more secure App Service resources, prevent dangling DNS entries, and avoid subdomain takeover. Keep an eye out for future announcements where we will launch secure unique default hostnames in Public Preview for Logic Apps (Standard)!