Azure Logic Apps Hybrid Deployment Model - Public Preview Refresh
This past October, we announced the public preview of the Logic Apps hybrid deployment model, which allows customers to run Logic Apps workloads on customer-managed infrastructure. Since then, we have had the opportunity to speak with many customers across many industries, and the response to our offering has been great. Some of this interest comes from customers migrating from BizTalk Server who want an on-premises offering. Others, such as those in manufacturing, retail, or mining settings, must co-locate their integration platform near key line-of-business systems to avoid dependencies on the public internet when processing transactions. The sweet spot of this interest has come from existing Logic Apps customers with specific use cases that require processing transactions locally, truly taking advantage of hybrid hosting while leveraging the same skills, tools, and monitoring experiences across their landscape. This aligns with our promise that this offering is a deployment model, giving customers more options to meet their business needs.

Public Preview Refresh Updates

.NET Framework Custom Code Support on Linux Containers: For BizTalk customers looking to 'lift and shift' their data transformations, this capability enables those solutions to execute in Logic Apps Hybrid. It brings the hybrid model to parity with what we currently offer in Logic Apps (Standard) in Azure.

Support for Running the SAP Built-in Connector on Linux Containers: The SAP connector has dependencies on .NET Framework NCo DLLs, which are available from the SAP marketplace. We have now added support for running the SAP built-in connector on Linux containers, enabling enterprise customers to run SAP scenarios at edge locations with Hybrid Logic Apps.

RabbitMQ Built-in Connector for On-Premises Queue Scenarios: One popular request we have received from customers is the ability to connect to on-premises message brokers. While Azure Logic Apps already supports some on-premises message brokers, customers were looking for additional options, including RabbitMQ. This release includes an in-app RabbitMQ connector. Because the Logic Apps hybrid deployment model is Logic Apps Standard, this connector will soon also be available in Logic Apps Standard running in the cloud. Please see the following video for more information on the RabbitMQ solution. The RabbitMQ connector is currently in public preview and currently works with non-durable queues. We are looking to add more capabilities to this connector and are actively seeking feedback from customers on what we should prioritize next. We would love to hear from you through this form: https://aka.ms/RabbitMQwithLAStandardSurvey

Performance Optimization: We have made several performance improvements to ensure that Hybrid Logic Apps run efficiently and effectively. We plan to share benchmarks/baselines soon that highlight the expected throughput of the hybrid deployment model.

Next Steps

Existing customers: When you change an existing workflow or an app setting, a new revision is created. When that happens, the latest container image is pulled from the Azure Container Registry and instantiated in your runtime.

New customers: Please follow our getting started guide on Microsoft Learn.
When you follow these steps, you will automatically start with the latest runtime image, which includes all of these new capabilities.

General Availability

We anticipate promoting this offering to general availability soon. We are now focused on publishing our performance benchmarks, documenting CI/CD processes, and addressing customer feedback. So if you have any outstanding needs, please reach out.

Access [Logic Apps / App Services] Site Files with FTPS using Logic Apps
You may need to access the storage files for your site, whether it is a Logic App Standard, Function App, or App Service. Depending on your App Service plan (ASP) SKU, these files can be accessed using FTP/FTPS. Some customers encounter difficulties when attempting to connect using implicit/explicit FTPS. This post aims to simplify the process by using a Logic App to list files, retrieve file content, and update files. An explicit FTPS connection is used in this scenario, as it is the mode supported by both the FTP connector and the FTP site when using FTPS.

Steps:

1. Create user/password credentials to access the FTP site. You can do this from the portal or using the CLI; you can run a Cloud Shell from the reference below to execute the command.

Reference: Configure deployment credentials - Azure App Service | Microsoft Learn

CLI: az webapp deployment user set --user-name <username> --password <password>

2. Enable FTP basic authentication on the destination (Logic App Standard, Function App, App Service). It is highly advised to use an "FTPS only" connection, as it provides a secure, encrypted connection. To disable unencrypted FTP, select FTPS Only in FTP state. To disable both FTP and FTPS entirely, select Disabled. When finished, select Save. If using FTPS Only, you must enforce TLS 1.2 or higher by navigating to the TLS/SSL settings page of your web app; TLS 1.0 and 1.1 aren't supported with FTPS Only.

Reference: Deploy content using FTP/S - Azure App Service | Microsoft Learn

3. Note that the FTP connector supports explicit connections only: FTP - Connectors | Microsoft Learn. For secure FTP, make sure to set up explicit File Transfer Protocol Secure (FTPS) rather than implicit FTPS. Also, some FTP servers, such as ProFTPD, require that you enable the NoSessionReuseRequired option if you use Transport Layer Security (TLS) mode, the successor to Secure Sockets Layer (SSL). The FTP connector doesn't work with implicit FTPS and supports only explicit FTP over TLS (FTPS).

4. Create the Logic App workflow and the FTP connection. Add an FTP action to list the files, or any FTP action based on your requirements. To test the connection for the first time, I recommend using the "List files in folder" action. In the connection configuration:

- Server Address: xxxxxxx.ftp.azurewebsites.windows.net (get this value from the Properties of the destination service; don't include the "ftps://" prefix or the "/site/wwwroot" section)
- User Name and Password: xxxxxx\xxxxx (the credentials we created in the FTP credentials tab under the Deployment Center, in the User scope section, or using the CLI command)
- FTP Server Port: 21 (use port 21 to force the connection to be explicit)
- Enable SSL?: checked (use SSL to force the connection to use FTPS)

5. After creating the connection, use "/site/wwwroot" to access your folder. Test this and see if it works!

Troubleshooting reference: Deploy content using FTP/S - Azure App Service | Microsoft Learn

I recommend securing the connection password using Key Vault. The main steps:

- Put the connection string in Key Vault.
- Give the Logic App access to Key Vault.
- Add the reference in the app settings of the Logic App.

The steps are described here: Use Key Vault references - Azure App Service | Microsoft Learn. There is an example of this at the end of this article: A walkthrough of parameterization of different connection types in Logic App Standard | Microsoft Community Hub.

And that's how you access those files!
You can make use of this secure connection for multiple tasks based on your requirements.
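If you prefer scripting, the portal steps above can also be applied with the Azure CLI. A minimal sketch, assuming a hypothetical app named contoso-la in resource group contoso-rg (both names are placeholders); the values mirror the portal settings described earlier:

  # Create the FTP deployment credentials (same as the portal's FTP credentials tab)
  az webapp deployment user set --user-name <username> --password <password>

  # Allow FTPS only, disabling unencrypted FTP
  az webapp config set --resource-group contoso-rg --name contoso-la --ftps-state FtpsOnly

  # Enforce TLS 1.2, which is required when using FTPS Only
  az webapp config set --resource-group contoso-rg --name contoso-la --min-tls-version 1.2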
Deploying Standard Logic App to Storage Account behind Firewall using Service or Private Endpoints

Standard Logic Apps are powered by the new Azure single-tenant Logic Apps runtime, which runs as an extension on top of the Azure Functions runtime. Like Azure Functions, a Standard Logic App relies on storage account services such as Blob, File Share, Queue, and Table for various purposes. For instance, the website content is stored in a file share that the Logic App site must access in order to run. You can refer to this blog to understand how the storage services are used by a Standard Logic App. In this article, we will see how to access the storage account services on a secured network using service endpoints or private endpoints from a Standard Logic App. By default, the portal creation experience for Logic Apps expects the storage account to be accessible on its public endpoint. Let us explore the options below for accessing the storage account over a secured network.

Deploy a Standard Logic App with secured access to the storage account using the portal:

1. The storage account needs to be accessible on its public endpoint initially, i.e. network access set to 'All Networks'.
2. Create a Logic App resource, selecting the storage account created above during the Logic App creation process.
3. Enable service endpoints or private endpoints (for all services: Blob, Table, File, Queue) on the storage account.
4. Enable VNET integration for the Logic App resource, using a VNET and subnet that have access to the storage account over the service endpoints (SE) or private endpoints (PE).
5. Set the configuration settings (app settings) below to access the storage over SE or PEs (they can also be applied from the CLI; see the sketch after this section):

- WEBSITE_VNET_ROUTE_ALL (value: 1; mandatory: yes). This is a legacy setting that routes all outbound traffic through the integrated subnet. A newer 'Route All' toggle is available in the VNET integration blade; you can use either, but you must set one of them. If you set neither and only have VNET integration, only private traffic goes through your subnet and the rest goes over the internet.
- WEBSITE_DNS_SERVER (value: 168.63.129.16 or a custom DNS server IP address; mandatory: no). Forces the Logic App to use a specific DNS server. Set it if required; otherwise the Logic App uses whatever DNS servers are configured on the integrated VNET.
- WEBSITE_CONTENTOVERVNET (value: 1; mandatory: yes). Enables the Logic App resource to access the website content over VNET traffic, i.e. over the service or private endpoints.
- WEBSITE_DNS_ALT_SERVER (value: an alternate DNS server IP address; mandatory: no). Forces the Logic App to fall back to a specific DNS server when WEBSITE_DNS_SERVER is unable to resolve. Set it if required; otherwise the Logic App uses whatever DNS servers are configured on the integrated VNET.

There is no need to restart the Logic App site when you update app settings or add new workflows; the Logic App restarts gracefully in an incremental mode without impacting existing runs. However, it isn't recommended to make changes while there are in-process transactions.

6. Change the storage account network settings to selected networks. The Logic App may experience some interruption, as the connectivity switch between public and private endpoints can take some time, and workflows may not be visible for a while. You can restart the Logic App, wait a few minutes, and check whether the workflows load.
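As promised above, the app settings from step 5 can also be set from the Azure CLI. A minimal sketch, assuming a hypothetical Standard Logic App named contoso-la in resource group contoso-rg (placeholder names); since Logic Apps Standard runs on the App Service platform, the generic az webapp command is used here:

  # Apply the VNET/DNS app settings described in the list above
  az webapp config appsettings set \
    --resource-group contoso-rg \
    --name contoso-la \
    --settings WEBSITE_CONTENTOVERVNET=1 WEBSITE_VNET_ROUTE_ALL=1 WEBSITE_DNS_SERVER=168.63.129.16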
Deploy a Standard Logic App with secured access to the storage account using automated deployment tools:

You can overcome the portal creation experience's requirement of exposing the storage account to the public internet by using an ARM template deployment. With an ARM deployment, you don't need to open the storage account to all networks; it can be deployed directly with service endpoints or private endpoints. If you would like to deploy a Standard Logic App against a secured storage account from automated tools such as Azure DevOps using ARM templates, you can refer to the sample templates in the GitHub repository below.

VeeraMS/LogicApp-deployment-with-Secure-Storage: Deploying Logic App standard resource with Storage account having Private endpoints (github.com)

Note: When using automated deployment tools, you must make sure the file share is created in the storage account prior to the Logic App template deployment, and that the same share name is referenced in the Logic App configuration/app setting (WEBSITE_CONTENTSHARE). A CLI sketch for creating the share ahead of deployment follows at the end of this section.

The GIFs below give a glimpse of how to configure the Logic App to access the storage account using service or private endpoints.

Access over storage service endpoints:
Access over storage private endpoints:

Note: The vnetRouteAllEnabled setting replaces, overrides, and takes precedence over the legacy WEBSITE_VNET_ROUTE_ALL setting.
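Since the file share must exist before the template deployment runs, you can create it as an early pipeline step. A minimal sketch, assuming a hypothetical storage account contosostorage and a share name matching your WEBSITE_CONTENTSHARE value (all placeholders); az storage share-rm is used here because it goes through the management plane and therefore works even when the storage data endpoints are already firewalled:

  # Create the content file share ahead of the Logic App ARM deployment
  az storage share-rm create \
    --resource-group contoso-rg \
    --storage-account contosostorage \
    --name contoso-la-content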
Common errors when the storage account is behind a firewall:

We generally observe the common errors below when the deployment storage account is behind a firewall; they indicate that the Logic App is unable to access the storage account services.

Access to host.json is denied:

System.Private.CoreLib: Access to the path 'C:\\home\\site\\wwwroot\\host.json' is denied.

Unable to load workflows in the Logic App: You may observe that workflows are not visible in the Logic App after storage access is changed to selected networks or private endpoints. If you check the browser logs, they may show the error below.

{"Code":"BadRequest","Message":"Encountered an error (ServiceUnavailable) from host runtime.","Target":null,"Details":[{"Message":"Encountered an error (ServiceUnavailable) from host runtime."},{"Code":"BadRequest"},{"ErrorEntity":{"Code":"BadRequest","Message":"Encountered an error (ServiceUnavailable) from host runtime."}}],"Innererror":null}

Troubleshooting common errors:

Make sure the storage account endpoints are reachable from the Logic App VNET over port 443, and over port 445 for the file endpoint. If you are able to load the Kudu console of the Logic App, validate the following:

1. Run the nameresolver command to make sure the storage endpoints resolve to the expected IP addresses (private IPs for private endpoints; public IPs for service endpoints or the public endpoint):

nameresolver {StorageaccountName}.blob.core.windows.net
nameresolver {StorageaccountName}.file.core.windows.net
nameresolver {StorageaccountName}.queue.core.windows.net
nameresolver {StorageaccountName}.table.core.windows.net

Note: Avoid using nslookup in Kudu; by default it uses the Azure default DNS IP and won't use a custom DNS server configured on the VNET or in the app settings.

2. Validate the reachability of the storage endpoints over 443 (445 for the file service):

tcpping {StorageaccountName}.blob.core.windows.net:443
tcpping {StorageaccountName}.file.core.windows.net:445
tcpping {StorageaccountName}.queue.core.windows.net:443
tcpping {StorageaccountName}.table.core.windows.net:443

If the Kudu console itself isn't loading and shows errors, follow the troubleshooting steps below. We can't troubleshoot the common errors above from the Kudu console when the Logic App site itself isn't up, so use the following to troubleshoot access to the storage account services instead:

1. Create an Azure VM within the same VNET the Logic App is integrated with; it can be in a different subnet. The simplest test is to access the storage account services using the Storage Explorer tool. If there are any connectivity issues with this tool, continue with the steps below.

2. Run nslookup in the command prompt and make sure the storage services resolve to the intended IP addresses. With service endpoints they should resolve to public IPs; with private endpoints, verify that all services resolve to the respective NIC private IP addresses.

nslookup [StorageaccountHostName] [OptionalDNSServer]

Verify all storage services:

nslookup {StorageaccountName}.blob.core.windows.net
nslookup {StorageaccountName}.file.core.windows.net
nslookup {StorageaccountName}.queue.core.windows.net
nslookup {StorageaccountName}.table.core.windows.net

3. If the DNS queries resolve, check psping or tcpping to the storage account over port 443 (445 for the file service):

psping [StorageaccountHostName] [Port] [OptionalDNSServer]

Verify all storage services:

psping {StorageaccountName}.blob.core.windows.net:443
psping {StorageaccountName}.file.core.windows.net:445
psping {StorageaccountName}.queue.core.windows.net:443
psping {StorageaccountName}.table.core.windows.net:443

4. If everything resolves from the Azure VM, check which DNS server the VM uses for resolution, set the same server in the Logic App's WEBSITE_DNS_SERVER setting, and verify again. Also make sure that the Logic App's VNET integration uses the appropriate VNET and subnet.

References: You may refer to the blogs below for a deep dive into the Standard Logic App runtime and deployment using DevOps.

Azure Logic Apps Running Anywhere – Runtime Deep Dive (microsoft.com)
Deploying an Azure Logic Apps Standard workflow through Azure DevOps Pipeline - Microsoft Tech Community
Use PowerShell Script to Manage Your API Connection of Logic App (Consumption) Resources

When you are developing Logic Apps (Consumption) or testing existing Logic Apps, you might create many API connections that are never used afterwards. Those orphaned resources can make your resource group a great mess and make it hard to choose the right API connection in the Logic App. So, I wrote a PowerShell script to help manage the API connections in the resource group.
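The script in the post is PowerShell; purely as an illustration of the same idea, here is a minimal Azure CLI sketch that lists the API connection resources (type Microsoft.Web/connections) in a resource group so you can spot the orphans (the resource group name is a placeholder):

  # List all API connection resources in a resource group
  az resource list \
    --resource-group contoso-rg \
    --resource-type "Microsoft.Web/connections" \
    --query "[].name" --output tsv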
Scaling mechanism in hybrid deployment model for Azure Logic Apps Standard

Hybrid Logic Apps offer a unique blend of on-premises and cloud capabilities, making them a versatile solution for various integration scenarios. A key feature of hybrid deployment models is their ability to scale efficiently to manage different workloads. This capability enables customers to optimize their compute costs during peak usage by scaling up to handle temporary spikes in demand and then scaling down to reduce costs when the demand decreases. This blog will explore the scaling mechanism in hybrid deployment models, focusing on the role of the KEDA operator and its integration with other components.

Sending Messages to Confluent Cloud topic using Logic App
With Logic Apps, we can create workflows that connect to various services and systems, allowing us to automate tasks and streamline business operations. In this blog post, we will explore how to use Azure Logic Apps to send messages to a Kafka Confluent topic. Currently, there is no out-of-box Kafka Confluent connector in Logic Apps. However, Kafka Confluent provides a REST API (Confluent Cloud API Reference Documentation). This sample shows how to use the HTTP action in a workflow to call the Kafka Confluent API that produces a record to a topic.

Prerequisites

- An Azure account and access to the Azure portal.
- Access to Confluent Cloud. Confluent Cloud is a fully managed, pay-as-you-go Kafka service. You can get a free trial here.

Set up the Kafka cluster and topic

If you are new to Confluent Kafka, you can check their tutorials: Quick Start for Confluent Cloud | Confluent Documentation.

1. Create a new Kafka cluster on Confluent Cloud.
2. Navigate to the cluster and click Cluster settings. Note the REST endpoint. We will use the following endpoint in this example:
3. Create a new Kafka topic called "LAtest" using the default topic settings.
4. Create a new API key and secret. Navigate to the cluster and, from the left menu, select Cluster Overview -> API Keys. Click Create key and follow the prompts to create a Global access API key. Note down the key and secret values. To communicate with the REST API, we need to use this API key ID and the corresponding secret to create the base64-encoded string for the authorization header included in the REST calls. To learn more, see the Authentication section, which describes Cloud and Cluster API keys and base64 encoding.

Create the Logic App workflow

To produce a message to a topic, we need to provide JSON data and a base64-encoded API key and secret to the REST Produce endpoint: /kafka/v3/clusters/<cluster-id>/topics/<topic-name>/records. Below is a sample REST code snippet (non-streaming mode):

curl \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Basic <base64-encoded API key and secret>" \
  https://xxx.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-mxpx52/topics/LAtest/records \
  -d '{"value":{"type":"JSON","data":"Hello World!"}}'

In the Logic App workflow, we can add a "When a HTTP request is received" trigger to fire the workflow for testing. Then we add an HTTP action with the following configuration:

Run the workflow

We can send the body message to the target topic successfully. View the Messages tab of the LAtest topic in the Confluent Cloud UI:
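For reference, the Basic credential in the Authorization header used above is just the API key and secret joined by a colon and base64-encoded. A minimal shell sketch, with placeholder values for the key and secret:

  # Build the base64-encoded Basic auth value from a Confluent API key/secret
  API_KEY="<api-key>"
  API_SECRET="<api-secret>"
  printf '%s:%s' "$API_KEY" "$API_SECRET" | base64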
Logic Apps Aviators Newsletter - March 2025

In this issue:
- Ace Aviator of the Month
- News from our product group
- News from our community

Ace Aviator of the Month

March's Ace Aviator: Dieter Gobeyn

What's your role and title? What are your responsibilities?

I work as an Azure Solution Architect; however, I remain very hands-on and regularly develop solutions to stay close to the technology. I design and deliver end-to-end solutions, ranging from architectural analysis to full implementation. My responsibilities include solution design, integration analysis, contributing to development, reviewing colleagues' work, and proposing improvements to our platform. I also provide production support when necessary.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?

My days can vary greatly, but collaboration with my globally distributed team is always a priority. I begin my day promptly at 8 AM to align with different time zones. After our daily stand-up, I often reach out to colleagues to see if they need assistance or follow up on emails and team messages. A significant portion of my day involves solution design: gathering requirements, outlining integration strategies, and collaborating with stakeholders. I also identify potential enhancements, perform preliminary analysis, and translate them into user stories. I also spend time on technical development, building features, testing them thoroughly, and updating documentation for both internal and client use. On occasions where deeper investigation is needed, I support advanced troubleshooting, collaborating with our support team if issues demand additional expertise. If a release is scheduled, I sometimes manage deployment activities in the evening.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?

I've always valued the sense of community that comes from sharing knowledge. Early in my career, attending events and meeting fellow professionals helped me bridge the gap between theory and real-world practice. This informal environment encourages deeper, hands-on knowledge exchange, which often goes beyond what official documentation can provide. Now that I'm in a more senior role, I believe it's my responsibility, and my pleasure, to give back. Contributing to the community enables me to keep learning, connect with fantastic people, and grow both technically and personally.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?

Master the fundamentals, not just the tools. It's easy to get caught up in the newest frameworks, cloud platforms, and programming languages. However, what remains constant are the core concepts such as networking, data structures, security, and system design. By understanding the 'why' behind each technology, you'll be better equipped to design future-proof solutions and adapt fast as tools and trends evolve.

What has helped you grow professionally?

Curiosity and a commitment to continuous learning have been key. I'm always keen to understand the 'why' behind how things work. Outside my normal job, I pursue Microsoft Reactor sessions, community events, and personal projects to expand my skills. Just as important is receiving open, honest feedback from peers and being honest with oneself. Having mentors or colleagues who offer both challenges and support is crucial for growth, as they provide fresh perspectives and help you refine your skills.
In many cases, I've found it takes effort outside standard working hours to truly develop my skills, but it has always been worth it.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?

I'd love to see more uniformity and predictability across adapters, for example in terms of their availability for both stateless and stateful workflows. Currently, certain adapters, like the timer trigger, are either unavailable in stateless workflows or behave differently. Unifying adapter support would not only simplify solution design decisions but also reduce proof-of-concept overhead and streamline transitions between stateless and stateful workflows as requirements evolve.

News from our product group

Logic Apps Live Feb 2025
Missed Logic Apps Live in February? You can watch it here. You will find a live demo of exporting Logic Apps Standard to VS Code, some updates on the new Data Mapper user experience, and lots of examples of how to leverage Logic Apps to create your Gen AI solutions.

Exporting Logic App Standard to VS Code
Bringing existing Logic Apps Standard deployed in Azure to VS Code is now simpler with the new "Create Logic Apps Workspaces from package" capability.

New & Improved Data Mapper UX in Azure Logic Apps – Now in Public Preview!
We're excited to announce that a UX update for Data Mapper in Azure Logic Apps is now in public preview! We have continuously improved Data Mapper, which is already generally available (GA), based on customer feedback.

Parse or chunk content for workflows in Azure Logic Apps (Preview)
When working with Azure AI Search or Azure OpenAI actions, it's often necessary to convert content into tokens or divide large documents into smaller pieces. The Data Operations actions "Parse a document" and "Chunk text" can help by transforming content like PDFs, CSVs, and Excel files into tokenized strings and splitting them based on the number of tokens. These outputs can then be used in subsequent actions within your workflow.

Connect to Azure AI services from workflows in Azure Logic Apps
Integrate enterprise services, systems, and data with AI technologies by connecting your logic app workflows to Azure OpenAI and Azure AI Search resources. This guide offers an overview and practical examples of how to use these connector operations effectively in your workflow.

Power Automate migration to Azure Logic Apps (Standard)
Development teams often need to build scalable, secure, and efficient automation solutions. If your team is considering migrating flows from Microsoft Power Automate to Standard workflows in Azure Logic Apps, this guide outlines the key advantages of making the transition. Azure Logic Apps (Standard) is particularly beneficial for enterprises running complex, high-volume, and security-sensitive workloads.

AI playbook, examples, and other resources for workflows in Azure Logic Apps
AI capabilities are increasingly essential in applications and software, offering time-saving and innovative tasks like chat interactions. They also facilitate the creation of integration workloads across various services, systems, apps, and data within enterprises. This guide provides building blocks, examples, samples, and resources to demonstrate how to use AI services, such as Azure OpenAI and Azure AI Search, in conjunction with other services and systems to build automated workflows in Azure Logic Apps.
Collect ETW trace in Logic App Standard
An inline C# script to collect Event Tracing for Windows (ETW) traces and store them in a text file, from within your Logic Apps.

Typical Storage access issues troubleshooting
With this blog post, we intend to give you more tools and visibility for troubleshooting your Logic App and accelerating the restoration of your service availability.

Download Logic App content for Consumption and Standard Logic App in the Portal
It's common to see customers needing to download the JSON contents of their Logic Apps, either to keep a copy of the code or to initiate CI/CD. The method to download this is very simple, accessible from a single button.

Running PowerShell inline with Az commands - Logic App Standard
With the availability of the inline "Execute PowerShell code" action, a few questions have been brought to us, for example how to execute Az commands with this action.

Deploy Logic App Standard with Application Routing Feature Based on Terraform and Azure Pipeline
This article shares a mature plan for deploying Logic App Standard and then setting the application routing features automatically. It's based on a Terraform template and an Azure DevOps pipeline.

News from our community

Azure Logic Apps: create Standard Logic App projects in Visual Studio Code from Azure portal export
Post by Stefano Demiliani
How many times have you had the need to create a new Azure Logic App workflow starting from an existing one? Personally, this happens a lot... Starting with version 5.18.7 (published some days ago), the Azure Logic Apps (Standard) extension for Visual Studio Code provides the capability to create Standard Azure Logic App projects from an existing Logic App exported from the Azure portal.

Bridging the Gap: Azure Logic Apps Meets On-Prem Fileshares
Post by Tim D'haeyer
The end of BizTalk Server is fast approaching, signaling a significant shift in the Microsoft integration landscape. With this transition, the era of on-premises integration is drawing to a close, prompting many organizations to migrate their integration workloads to Azure. One key challenge in this process is: "How can I read and write from an on-premises file share using Logic Apps?" Thankfully, this functionality has been available for some time with Azure Logic Apps Standard.

Azure Logic Apps vs. Power Apps vs. Power Automate: What to Use When?
Post by Prashant Singh
The Architect's Dilemma: Logic Apps vs. Power Apps vs. Power Automate! In my latest blog, I compare Logic Apps, Power Automate, and Power Apps, helping you pick the right one!

Securing Azure Logic Apps: Prevent SQL Injection in Complex SQL Server Queries
Post by Cameron McKay
Executing COMPLEX queries as raw SQL is tempting in Logic App workflows. It's clear how to protect SQL CRUD actions in Logic Apps. BUT how do we protect our complex queries?

In the Logic App Standard tier, built-in connectors run locally within the same process as the logic app
Post by Sandro Pereira
In the Logic App Standard tier, built-in connectors run locally within the same process as the logic app, reducing latency and improving performance. This contrasts with the Consumption model, where many connectors rely on external dependencies, leading to potential delays due to network round-trips. This makes Logic App Standard an ideal choice for scenarios where performance and low-latency integration are critical, such as real-time data processing and enterprise API integrations.
Scaling Logic Apps Hybrid
Post by Massimo Crippa
Logic Apps Hybrid provides a consistent development, deployment, and observability experience across both cloud and edge applications. But what about scaling? Let's dive into that in this blog post.

Calling API Management in a different subscription on LA Standard
Post by Sandro Pereira
Welcome again to another Logic Apps Best Practices, Tips, and Tricks post. Today, we will discuss how to call, from Logic App Standard, an API exposed in API Management in a different subscription using the in-app API Management connector.

How to enable API Management Connector inside VS Code Logic App Standard Workflow Designer
Post by Sandro Pereira
If you've been working with Azure Logic Apps Standard in Visual Studio Code and noticed that the API Management connector is conspicuously absent from the list of connectors inside the workflow designer, you're not alone. This is a typical behavior that many developers encounter, and understanding why it happens, and how to enable it, can save you a lot of headaches.

Do you have strict security requirements for your workflows? Azure Logic Apps is the solution.
Post by Stefano Demiliani
Azure Logic Apps offers robust solutions for enterprise-level workflows, emphasizing high performance, scalability, and stringent security measures. This article explores how Logic Apps ensures business continuity with geo-redundancy, automated backups, and advanced security features like IP restrictions and VNET integration. Discover why Azure Logic Apps is the preferred choice for secure and scalable automation in large organizations.

Download Logic App content for Consumption and Standard Logic App in the Portal
It's common to see customers needing to download the JSON contents of their Logic Apps, either to keep a copy of the code or to initiate CI/CD. The method to download this is very simple, accessible from a single button. We will only cover the portal method to extract the workflows' JSON content, not the other available methods (Visual Studio, VS Code, PowerShell, etc.).