Access [Logic Apps / App Services] Site Files with FTPS using Logic Apps
You may need to access storage files for your site, whether it is a Logic App Standard, Function App, or App Service. Depending on your App Service plan SKU, these files can be accessed using FTP/FTPS. Some customers encounter difficulties when attempting to connect using implicit or explicit FTPS. This post aims to simplify the process by using a Logic App to list files, retrieve file content, and update files. Explicit FTPS is used in this scenario because it is the mode mutually supported by the FTP connector and by the FTP site when using FTPS.

Steps:

1. Create username/password credentials to access the FTP site. You can do this from the portal or using the CLI (you can run a command shell from the reference below to execute the command).
Reference: Configure deployment credentials - Azure App Service | Microsoft Learn
CLI: az webapp deployment user set --user-name <username> --password <password>

2. Enable FTP basic authentication on the destination (Logic App Standard, Function App, or App Service). It is highly advised to use an "FTPS Only" connection, as it provides a secure, encrypted connection. To disable unencrypted FTP, select FTPS Only in FTP state. To disable both FTP and FTPS entirely, select Disabled. When finished, select Save. If using FTPS Only, you must enforce TLS 1.2 or higher by navigating to the TLS/SSL settings page of your web app. TLS 1.0 and 1.1 aren't supported with FTPS Only.
Reference: Deploy content using FTP/S - Azure App Service | Microsoft Learn

3. Note that the FTP connector supports explicit connections only: FTP - Connectors | Microsoft Learn. For secure FTP, make sure to set up explicit File Transfer Protocol Secure (FTPS) rather than implicit FTPS. Also, some FTP servers, such as ProFTPd, require that you enable the NoSessionReuseRequired option if you use Transport Layer Security (TLS), the successor to Secure Socket Layer (SSL). The FTP connector doesn't work with implicit FTPS and supports only explicit FTPS, which runs FTP over TLS.

4. Create the Logic App and the FTP connection. Create the Logic App workflow and add an FTP action to list the files, or any FTP action based on your requirements. To test the connection for the first time, I recommend using the "List files in folder" action. In the connection configuration:
- Server Address: xxxxxxx.ftp.azurewebsites.windows.net (get this value from the Properties of the destination service; don't include the "ftps://" prefix or the "/site/wwwroot" section)
- User Name and Password: xxxxxx\xxxxx (this is what we created in the FTP credentials tab under the Deployment Center, in the User scope section, or using the CLI command)
- FTP Server Port: 21 (use port 21 to force the connection to be explicit)
- Enable SSL?: checked (use SSL to force the connection to use FTPS)

5. After creating the connection, use "/site/wwwroot" to access your folder. Test this and see if it works!

Troubleshooting reference: Deploy content using FTP/S - Azure App Service | Microsoft Learn

I recommend securing the connection password using Key Vault. Main steps:
- Put the connection string in Key Vault.
- Give the Logic App access to Key Vault.
- Add the reference in the app settings for the Logic App.
The steps are described here: Use Key Vault references - Azure App Service | Microsoft Learn
An example of this is at the end of this article: A walkthrough of parameterization of different connection types in Logic App Standard | Microsoft Community Hub
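For reference, the portal steps above can also be scripted. Below is a minimal sketch using the Azure CLI; the app and resource group names are placeholders to adapt to your environment:

```powershell
# Assumption: <app-name> and <resource-group> stand in for your Logic App Standard,
# Function App, or App Service and its resource group.

# Step 1: create user-scope deployment credentials for FTPS.
az webapp deployment user set --user-name <username> --password <password>

# Step 2: enforce FTPS-only access and TLS 1.2 on the destination app.
az webapp config set --name <app-name> --resource-group <resource-group> `
    --ftps-state FtpsOnly --min-tls-version 1.2

# Step 4 helper: read the FTPS server address from the app's publishing profile.
# Strip the "ftps://" prefix and the "/site/wwwroot" suffix before using it in the connector.
az webapp deployment list-publishing-profiles --name <app-name> --resource-group <resource-group> `
    --query "[?publishMethod=='FTP'].publishUrl" --output tsv
```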
And that's how you access those files! You can make use of this secure connection for multiple tasks based on your requirements.

Scaling mechanism in hybrid deployment model for Azure Logic Apps Standard
Hybrid Logic Apps offer a unique blend of on-premises and cloud capabilities, making them a versatile solution for various integration scenarios. A key feature of hybrid deployment models is their ability to scale efficiently to manage different workloads. This capability enables customers to optimize their compute costs during peak usage by scaling up to handle temporary spikes in demand and then scaling down to reduce costs when the demand decreases. This blog will explore the scaling mechanism in hybrid deployment models, focusing on the role of the KEDA operator and its integration with other components.

Logic Apps Aviators Newsletter - March 2025
In this issue:
- Ace Aviator of the Month
- News from our product group
- News from our community

Ace Aviator of the Month

March's Ace Aviator: Dieter Gobeyn

What's your role and title? What are your responsibilities?
I work as an Azure Solution Architect; however, I remain very hands-on and regularly develop solutions to stay close to the technology. I design and deliver end-to-end solutions, ranging from architectural analysis to full implementation. My responsibilities include solution design, integration analysis, contributing to development, reviewing colleagues' work, and proposing improvements to our platform. I also provide production support when necessary.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?
My days can vary greatly, but collaboration with my globally distributed team is always a priority. I begin my day promptly at 8 AM to align with different time zones. After our daily stand-up, I often reach out to colleagues to see if they need assistance, or follow up on emails and team messages. A significant portion of my day involves solution design: gathering requirements, outlining integration strategies, and collaborating with stakeholders. I also identify potential enhancements, perform preliminary analysis, and translate them into user stories. I spend time on technical development as well, building features, testing them thoroughly, and updating documentation for both internal and client use. On occasions where deeper investigation is needed, I support advanced troubleshooting, collaborating with our support team if issues demand additional expertise. If a release is scheduled, I sometimes manage deployment activities in the evening.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?
I've always valued the sense of community that comes from sharing knowledge. Early in my career, attending events and meeting fellow professionals helped me bridge the gap between theory and real-world practice. This informal environment encourages deeper, hands-on knowledge exchange, which often goes beyond what official documentation can provide. Now that I'm in a more senior role, I believe it's my responsibility, and pleasure, to give back. Contributing to the community enables me to keep learning, connect with fantastic people, and grow both technically and personally.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?
Master the fundamentals, not just the tools. It's easy to get caught up in the newest frameworks, cloud platforms, and programming languages. However, what remains constant are the core concepts such as networking, data structures, security, and system design. By understanding the 'why' behind each technology, you'll be better equipped to design future-proof solutions and adapt fast as tools and trends evolve.

What has helped you grow professionally?
Curiosity and a commitment to continuous learning have been key. I'm always keen to understand the 'why' behind how things work. Outside my normal job, I pursue Microsoft Reactor sessions, community events, and personal projects to expand my skills. Just as important is receiving open, honest feedback from peers and being honest with oneself. Having mentors or colleagues who offer both challenges and support is crucial for growth, as they provide fresh perspectives and help you refine your skills.
In many cases, I've found it takes effort outside standard working hours to truly develop my skills, but it has always been worth it.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?
I'd love to see more uniformity and predictability across adapters, for example in terms of their availability for both stateless and stateful workflows. Currently, certain adapters, like the timer trigger, are either unavailable in stateless workflows or behave differently. Unifying adapter support would not only simplify solution design decisions, but also reduce proof-of-concept overhead and streamline transitions between stateless and stateful workflows as requirements evolve.

News from our product group

Logic Apps Live Feb 2025
Missed Logic Apps Live in February? You can watch it here. You will find a live demo of exporting Logic Apps Standard to VS Code, some updates on the new Data Mapper user experience, and lots of examples of how to leverage Logic Apps to create your Gen AI solutions.

Exporting Logic App Standard to VS Code
Bringing existing Logic Apps Standard deployed in Azure to VS Code is now simpler with the new "Create Logic Apps Workspaces from package" capability.

New & Improved Data Mapper UX in Azure Logic Apps – Now in Public Preview!
We're excited to announce that a UX update for Data Mapper in Azure Logic Apps is now in Public Preview! We have continuously improved Data Mapper, which is already generally available (GA), based on customer feedback.

Parse or chunk content for workflows in Azure Logic Apps (Preview)
When working with Azure AI Search or Azure OpenAI actions, it's often necessary to convert content into tokens or divide large documents into smaller pieces. The Data Operations actions "Parse a document" and "Chunk text" can help by transforming content like PDFs, CSVs, and Excel files into tokenized strings and splitting them based on the number of tokens. These outputs can then be used in subsequent actions within your workflow.

Connect to Azure AI services from workflows in Azure Logic Apps
Integrate enterprise services, systems, and data with AI technologies by connecting your logic app workflows to Azure OpenAI and Azure AI Search resources. This guide offers an overview and practical examples on how to use these connector operations effectively in your workflow.

Power Automate migration to Azure Logic Apps (Standard)
Development teams often need to build scalable, secure, and efficient automation solutions. If your team is considering migrating flows from Microsoft Power Automate to Standard workflows in Azure Logic Apps, this guide outlines the key advantages of making the transition. Azure Logic Apps (Standard) is particularly beneficial for enterprises running complex, high-volume, and security-sensitive workloads.

AI playbook, examples, and other resources for workflows in Azure Logic Apps
AI capabilities are increasingly essential in applications and software, offering time-saving and innovative tasks like chat interactions. They also facilitate the creation of integration workloads across various services, systems, apps, and data within enterprises. This guide provides building blocks, examples, samples, and resources to demonstrate how to use AI services, such as Azure OpenAI and Azure AI Search, in conjunction with other services and systems to build automated workflows in Azure Logic Apps.
Collect ETW trace in Logic App Standard
An inline C# script to collect Event Tracing for Windows (ETW) traces and store them in a text file, from within your Logic Apps.

Typical Storage access issues troubleshooting
With this blog post we intend to give you more tools and visibility on how to troubleshoot your Logic App and accelerate restoring your service availability.

Download Logic App content for Consumption and Standard Logic App in the Portal
It's common to see customers needing to download the JSON contents of their Logic Apps, either to keep a copy of the code or to initiate CI/CD. The methods to download this are very simple, accessible through a single button.

Running PowerShell inline with Az commands - Logic App Standard
With the availability of the inline "Execute PowerShell Code" action, a few questions have been brought to us, such as how to execute Az commands with this action.

Deploy Logic App Standard with Application Routing Feature Based on Terraform and Azure Pipeline
This article shares a mature plan to deploy Logic App Standard and then set up the application routing features automatically. It's based on a Terraform template and an Azure DevOps pipeline.

News from our community

Azure Logic Apps: create Standard Logic App projects in Visual Studio Code from Azure portal export
Post by Stefano Demiliani
How many times have you had the need to create a new Azure Logic App workflow starting from an existing one? Personally, this happens a lot. Starting with version 5.18.7 (published some days ago), the Azure Logic Apps (Standard) extension for Visual Studio Code provides the capability to create Standard Azure Logic App projects from an existing Logic App exported from the Azure portal.

Bridging the Gap: Azure Logic Apps Meets On-Prem Fileshares
Post by Tim D'haeyer
The end of BizTalk Server is fast approaching, signaling a significant shift in the Microsoft integration landscape. With this transition, the era of on-premises integration is drawing to a close, prompting many organizations to migrate their integration workloads to Azure. One key challenge in this process is: "How can I read and write from an on-premises file share using Logic Apps?" Thankfully, this functionality has been available for some time with Azure Logic Apps Standard.

Azure Logic Apps vs. Power Apps vs. Power Automate: What to Use When?
Post by Prashant Singh
The Architect's Dilemma: Logic Apps vs. Power Apps vs. Power Automate! In my latest blog, I compare Logic Apps, Power Automate, and Power Apps, helping you pick the right one!

Securing Azure Logic Apps: Prevent SQL Injection in Complex SQL Server Queries
Post by Cameron McKay
Executing complex queries as raw SQL is tempting in Logic App workflows. It's clear how to protect SQL CRUD actions in Logic Apps, but how do we protect our complex queries?

In the Logic App Standard tier, built-in connectors run locally within the same process as the logic app
Post by Sandro Pereira
In the Logic App Standard tier, built-in connectors run locally within the same process as the logic app, reducing latency and improving performance. This contrasts with the Consumption model, where many connectors rely on external dependencies, leading to potential delays due to network round-trips. This makes Logic App Standard an ideal choice for scenarios where performance and low-latency integration are critical, such as real-time data processing and enterprise API integrations.
Scaling Logic Apps Hybrid
Post by Massimo Crippa
Logic Apps Hybrid provides a consistent development, deployment, and observability experience across both cloud and edge applications. But what about scaling? Let's dive into that in this blog post.

Calling API Management in a different subscription on LA Standard
Post by Sandro Pereira
Welcome again to another Logic Apps Best Practices, Tips, and Tricks post. Today, we will discuss how to call, from Logic App Standard, an API exposed in API Management from a different subscription using the in-app API Management connector.

How to enable API Management Connector inside VS Code Logic App Standard Workflow Designer
Post by Sandro Pereira
If you've been working with Azure Logic Apps Standard in Visual Studio Code and noticed that the API Management connector is conspicuously absent from the list of connectors inside the workflow designer, you're not alone. This is a typical behavior that many developers encounter, and understanding why it happens, and how to enable it, can save you a lot of headaches.

Do you have strict security requirements for your workflows? Azure Logic Apps is the solution.
Post by Stefano Demiliani
Azure Logic Apps offers robust solutions for enterprise-level workflows, emphasizing high performance, scalability, and stringent security measures. This article explores how Logic Apps ensures business continuity with geo-redundancy, automated backups, and advanced security features like IP restrictions and VNet integration. Discover why Azure Logic Apps is the preferred choice for secure and scalable automation in large organizations.

🚀 New & Improved Data Mapper UX in Azure Logic Apps – Now in Public Preview!
We're excited to announce that a UX update for Data Mapper in Azure Logic Apps is now in Public Preview! We have continuously improved Data Mapper, which is already generally available (GA), based on customer feedback. Last year, we conducted a private preview to assess the improvements in the new user experience and confirm that we are on the right track in simplifying complex data transformations, including EDI schemas. With the insights gained, we made significant UI enhancements and added features to streamline the mapping process.

Feedback
We value your feedback to make the Data Mapper even better. Please share your thoughts, suggestions, and overall experience with us through our feedback form.

How feedback shaped the Public Preview
Throughout the evolution of Data Mapper, we gathered valuable feedback from customers and partners. Key themes that emerged include:
- Reliability: ensuring the Data Mapper can handle large schemas and complex transformation logic, including functions.
- Error handling: providing real-time validation by allowing users to test payloads and catch errors while authoring maps.
- Looping: clearly indicating when repeating nodes are mapped and ensuring complex objects are properly represented.
- Drag & drop enhancements: improving how connections between nodes are created for better usability.
- Deserialization & namespace honoring: ensuring XML deserialization correctly loads mappings without data loss, preserving namespace integrity for seamless schema validation.
We've incorporated these suggestions into the public preview, ensuring a more refined and user-friendly experience.

What's new in the Data Mapper UX?
1. Easier navigation: docked schema panels keep you oriented within the data map, and you can easily search for specific nodes to streamline mapping.
2. Side-by-side function panel: search and use 100+ built-in functions, including collection functions (for repeating elements), string manipulations, mathematical operations, and conditional logic.
3. Automatic looping for repeating nodes: when mapping repeating nodes, a new loop connection is automatically added on the immediate parent nodes at source and destination. Repeating parent nodes are denoted by "A1, A2" notation on hover. Note: if the child node in the source has a deeper nesting level than in the destination, you must manually map the connection from the repeating source node to the destination to ensure proper data transformation.
4. Real-time error detection: on saving the map, instantly view warnings and errors for missing mappings or incorrect configurations.
5. Test your map instantly: preview the output before running your workflow.

How to set up and test the new Data Mapper experience
1. Enable the preview: go to your Azure Logic Apps (Standard) extension -> Settings -> Data Mapper and select "Version ~2" to try out the new user experience.
2. Light theme: enable "Light Theme" in VS Code before creating a new data map. Dark theme is not supported yet, but it is on the roadmap and will be prioritized soon.
3. Create a new data map: navigate to the Azure tab on the left-hand panel of VS Code, select "Create New Data Map", and name it. Once loaded, select the schemas for source and destination.
4. Upload schemas: upload your source and destination schemas before creating the map (e.g., .xsd or .json files).
Limitations
While the new Data Mapper UX brings significant improvements, a few limitations remain:
- Filter function: the filter function correctly processes numeric conditions when they are enclosed in quotes (e.g., ">= 10"), but does not behave consistently for string comparisons (e.g., checking whether an item name equals "Pen"). We are actively working on refining this behavior.
- Custom functions: support for custom functions is coming in the next refresh to enhance flexibility in data mapping.
- Usability enhancements: improved tooltips, function labels, error messages, and other UX refinements are on the way to provide clearer guidance and a smoother mapping experience, especially for complex transformations.

Future investments
The product will continue to improve, and more features are coming soon. Some immediate investments include:
- Enhanced test map experience: making it easier to validate mappings during development.
- Panel resizing: giving users flexibility to view larger schemas and functions when multiple panels are expanded.

Introducing GenAI Gateway Capabilities in Azure API Management
We are thrilled to announce GenAI Gateway capabilities in Azure API Management: a set of features designed specifically for GenAI use cases.

Azure OpenAI Service offers a diverse set of tools, providing access to advanced models like GPT-3.5 Turbo, GPT-4, and GPT-4 Vision, enabling developers to build intelligent applications that can understand, interpret, and generate human-like text and images. One of the main resources you have in Azure OpenAI is tokens. Azure OpenAI assigns quota for your model deployments, expressed in tokens per minute (TPM), which is then distributed across your model consumers: different applications, developer teams, departments within the company, and so on.

Starting with a single application integration, Azure makes it easy to connect your app to Azure OpenAI: your intelligent application connects to Azure OpenAI directly using an API key, with a TPM limit configured at the model deployment level. However, when your application portfolio grows, you end up with multiple apps calling single or even multiple Azure OpenAI endpoints, deployed as pay-as-you-go or Provisioned Throughput Units (PTUs) instances. That comes with certain challenges:
- How can we track token usage across multiple applications?
- How can we cross-charge multiple applications/teams that use Azure OpenAI models?
- How can we make sure that a single app does not consume the whole TPM quota, leaving other apps with no option to use Azure OpenAI models?
- How can we make sure that the API key is securely distributed across multiple applications?
- How can we distribute load across multiple Azure OpenAI endpoints?
- How can we make sure that PTUs are used first before falling back to pay-as-you-go instances?

To tackle these operational and scalability challenges, Azure API Management has built a set of GenAI Gateway capabilities:
- Azure OpenAI Token Limit policy
- Azure OpenAI Emit Token Metric policy
- Load balancer and circuit breaker
- Import Azure OpenAI as an API
- Azure OpenAI Semantic Caching policy (in public preview)

Azure OpenAI Token Limit Policy
The Azure OpenAI Token Limit policy allows you to manage and enforce limits per API consumer based on the usage of Azure OpenAI tokens. With this policy you can set limits, expressed in tokens per minute (TPM). The policy provides the flexibility to assign token-based limits on any counter key, such as subscription key, IP address, or any other arbitrary key defined through a policy expression. The policy also enables pre-calculation of prompt tokens on the Azure API Management side, minimizing unnecessary requests to the Azure OpenAI backend if the prompt already exceeds the limit. Learn more about this policy here.

Azure OpenAI Emit Token Metric Policy
The Azure OpenAI Emit Token Metric policy lets you send token usage metrics to Azure Application Insights, providing an overview of the utilization of Azure OpenAI models across multiple applications or API consumers. The policy captures prompt, completion, and total token usage metrics and sends them to the Application Insights namespace of your choice. Moreover, you can configure or select from pre-defined dimensions to split token usage metrics, enabling granular analysis by subscription ID, IP address, or any custom dimension of your choice. Learn more about this policy here.
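As a sketch of how these two policies fit together, the fragment below applies them to an API with the Az.ApiManagement PowerShell module. The policy names are the documented ones; the attribute values (5000 TPM, the "genai" namespace, the subscription dimension) and all resource names are assumptions to adapt to your scenario:

```powershell
# Assumption: <rg>, <apim-name>, and <api-id> are placeholders for your API Management
# instance and the imported Azure OpenAI API.
Import-Module Az.ApiManagement

$context = New-AzApiManagementContext -ResourceGroupName '<rg>' -ServiceName '<apim-name>'

# Limit each subscription to 5000 tokens per minute, pre-estimating prompt tokens,
# and emit token usage metrics split by subscription ID.
$policyXml = @"
<policies>
  <inbound>
    <base />
    <azure-openai-token-limit counter-key="@(context.Subscription.Id)"
                              tokens-per-minute="5000"
                              estimate-prompt-tokens="true" />
    <azure-openai-emit-token-metric namespace="genai">
      <dimension name="Subscription ID" />
    </azure-openai-emit-token-metric>
  </inbound>
  <backend><base /></backend>
  <outbound><base /></outbound>
  <on-error><base /></on-error>
</policies>
"@

Set-AzApiManagementPolicy -Context $context -ApiId '<api-id>' -Policy $policyXml
```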
Load Balancer and Circuit Breaker
The load balancer and circuit breaker features allow you to spread the load across multiple Azure OpenAI endpoints. With support for round-robin, weighted (new), and priority-based (new) load balancing, you can now define your own load distribution strategy according to your specific requirements. Define priorities within the load balancer configuration to ensure optimal utilization of specific Azure OpenAI endpoints, particularly those purchased as PTUs. In the event of any disruption, a circuit breaker mechanism kicks in, seamlessly transitioning to lower-priority instances based on predefined rules. Our updated circuit breaker now features dynamic trip duration, leveraging values from the retry-after header provided by the backend. This ensures precise and timely recovery of the backends, maximizing the utilization of your priority backends to their fullest. Learn more about the load balancer and circuit breaker here.

Import Azure OpenAI as an API
The new "Import Azure OpenAI as an API" experience in Azure API Management provides an easy, single-click way to import your existing Azure OpenAI endpoints as APIs. We streamline the onboarding process by automatically importing the OpenAPI schema for Azure OpenAI and setting up authentication to the Azure OpenAI endpoint using managed identity, removing the need for manual configuration. Additionally, within the same user-friendly experience, you can pre-configure Azure OpenAI policies, such as token limit and emit token metric, enabling swift and convenient setup. Learn more about Import Azure OpenAI as an API here.

Azure OpenAI Semantic Caching Policy
The Azure OpenAI Semantic Caching policy empowers you to optimize token usage by leveraging semantic caching, which stores completions for prompts with similar meaning. Our semantic caching mechanism leverages Azure Cache for Redis Enterprise or any other external cache compatible with RediSearch and onboarded to Azure API Management. By leveraging the Azure OpenAI Embeddings model, this policy identifies semantically similar prompts and stores their respective completions in the cache. This approach enables completion reuse, resulting in reduced token consumption and improved response performance. Learn more about the semantic caching policy here.
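For completeness, here is a hedged illustration of what the caching policies look like in a policy document. The azure-openai-semantic-cache-lookup and azure-openai-semantic-cache-store names come from the public documentation; the score threshold, cache duration, and embeddings backend ID are assumptions you must replace with values from your own setup:

```powershell
# Illustrative fragment only: <embeddings-backend-id> must reference an embeddings
# backend already configured in your API Management instance.
$cachePolicyXml = @"
<inbound>
  <base />
  <azure-openai-semantic-cache-lookup score-threshold="0.85"
                                      embeddings-backend-id="<embeddings-backend-id>"
                                      embeddings-backend-auth="system-assigned" />
</inbound>
<outbound>
  <azure-openai-semantic-cache-store duration="120" />
  <base />
</outbound>
"@
```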
Get Started with GenAI Gateway Capabilities in Azure API Management
We're excited to introduce these GenAI Gateway capabilities in Azure API Management, designed to empower developers to efficiently manage and scale their applications leveraging Azure OpenAI services. Get started today and bring your intelligent application development to the next level with Azure API Management.

Inbound private endpoint for Standard v2 tier of Azure API Management

Standard v2 was announced in general availability on April 1st, 2024. Customers can now configure an inbound private endpoint (preview) for their API Management Standard v2 instance to allow clients in a private network to securely access the API Management gateway over Azure Private Link.

The private endpoint uses an IP address from the Azure virtual network in which it's hosted. Network traffic between a client on your private network and API Management traverses the virtual network and a Private Link on the Microsoft backbone network, eliminating exposure from the public internet. Further, you can configure custom DNS settings or an Azure DNS private zone to map the API Management hostname to the endpoint's private IP address.

With a private endpoint and Private Link, you can:
- Create multiple Private Link connections to an API Management instance.
- Use the private endpoint to send inbound traffic on a secure connection.
- Use policy to distinguish traffic that comes from the private endpoint.
- Limit incoming traffic to private endpoints only, preventing data exfiltration.
- Combine with outbound virtual network integration to provide end-to-end network isolation of your API Management clients and backend services.

Preview limitations
Today, only the API Management instance's Gateway endpoint supports inbound Private Link connections. In addition, each API Management instance can support at most 100 Private Link connections. To participate in the preview and add an inbound private endpoint to your Standard v2 instance, you must complete a request form. The Azure API Management team will review your request and respond via email within five business days.

Learn more
- API Management v2 tiers FAQ
- API Management v2 tiers documentation
- API Management overview documentation
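Once your preview request is approved, creating the inbound private endpoint itself follows the standard Private Link pattern. A minimal sketch with the Azure CLI, assuming the virtual network and subnet already exist (all names are placeholders; "Gateway" is the sub-resource for the API Management gateway endpoint):

```powershell
# Look up the resource ID of the API Management Standard v2 instance.
$apimId = az apim show --name <apim-name> --resource-group <rg> --query id --output tsv

# Create the inbound private endpoint against the instance's Gateway sub-resource.
az network private-endpoint create --name <pe-name> --resource-group <rg> `
    --vnet-name <vnet-name> --subnet <subnet-name> `
    --private-connection-resource-id $apimId `
    --group-id Gateway --connection-name <connection-name>
```

After this, configure custom DNS or an Azure DNS private zone, as described above, so the API Management hostname resolves to the endpoint's private IP address.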
Typical Storage access issues troubleshooting

We get a large number of cases where the Storage Account connection is failing, and we often see that customers are not aware of the troubleshooting steps they can take to accelerate the resolution of the issue. As such, we've compiled some scenarios and the usual troubleshooting steps we ask you to take. Always remember that if you have made changes to your infrastructure, consider rolling them back to ensure they are not the root cause. Even a small change that apparently has no effect may cause downtime on your application.

Common messages
The errors shown in the portal when the Storage Account connectivity is down are very similar, and they may not correctly indicate the cause. Error messages that surface in the portal for Logic Apps Standard include:
- System.Private.CoreLib: Access to the path 'C:\home\site\wwwroot\host.json' is denied
- Cannot reach host runtime. Error details, Code: 'BadRequest', Message: 'Encountered an error (InternalServerError) from host runtime.'
- System.Private.CoreLib: The format of the specified network name is invalid. : 'C:\\home\\site\\wwwroot\\host.json'.'
- System.Private.CoreLib: The user name or password is incorrect. : 'C:\home\site\wwwroot\host.json'.
- Microsoft.Windows.Azure.ResourceStack: The SSL connection could not be established, see inner exception.
- System.Net.Http: The SSL connection could not be established, see inner exception.
- System.Net.Security: Authentication failed because the remote party has closed the transport stream
- Unexpected error occurred while loading workflow content and artifacts

The errors don't really indicate what the root cause is, but very commonly it's a broken connection with the Storage Account.

What to verify?
There are four major components to verify in these cases:
1. Logic App environment variables and network settings
2. Storage Account networking settings
3. Network settings
4. DNS settings

Logic App environment variables and network
From an app settings point of view, there is not much to verify, but these are important steps that are sometimes overlooked. At this time, all or nearly all Logic Apps have been migrated to the dotnet Functions_Worker_Runtime (under the Environment Variables tab), but this is good to confirm. It's also good to confirm that your Platform setting is set to 64 Bit (under Configuration tab / General Settings). We've seen that some deployments use old templates that set this to 32 Bit, which doesn't make full use of the available resources.

Check whether the Logic App has one of the following environment variables set:
- WEBSITE_CONTENTOVERVNET set to 1, or
- WEBSITE_VNET_ROUTE_ALL set to 1, or
- vnetRouteAllEnabled set to 1.
Reference: Configure virtual network integration with application and configuration routing - Azure App Service | Microsoft Learn

These settings can also be replaced by the UI setting in the Virtual Network tab, when you select "Content Storage" in the configuration routing. For better understanding, vnetContentShareEnabled takes precedence: if it is set (true/false), WEBSITE_CONTENTOVERVNET is ignored; only if vnetContentShareEnabled is null is WEBSITE_CONTENTOVERVNET taken into account.

Also keep this in mind (see Storage considerations for Azure Functions | Microsoft Learn):
- WEBSITE_CONTENTAZUREFILECONNECTIONSTRING and AzureWebJobsStorage hold the connection string as in the Storage Account. References: Website_contentazurefileconnectionstring | App settings reference for Azure Functions | Microsoft Learn, and Azurewebjobsstorage | App settings reference for Azure Functions | Microsoft Learn.
- WEBSITE_CONTENTSHARE holds the file share name. Reference: Website_contentshare | App settings reference for Azure Functions | Microsoft Learn.

These are the first points to validate.
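These settings are quick to inspect from the CLI. A small sketch, assuming placeholder names and using the functionapp command group, which also works against Logic App Standard apps:

```powershell
# List current app settings and check WEBSITE_CONTENTOVERVNET, WEBSITE_VNET_ROUTE_ALL,
# WEBSITE_CONTENTAZUREFILECONNECTIONSTRING, and WEBSITE_CONTENTSHARE.
az functionapp config appsettings list --name <logicapp-name> --resource-group <rg> --output table

# Route content share traffic through the VNet (one of the equivalent switches discussed above).
az functionapp config appsettings set --name <logicapp-name> --resource-group <rg> `
    --settings WEBSITE_CONTENTOVERVNET=1
```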
Storage Account settings
If all of these are matching/properly configured and the Logic App is still in error, we move to the next step: validating the Storage Account network settings.

When the Storage Account does not have VNet integration enabled, there should be no issues, because the connection is made through the public endpoints. Still, even then, you must ensure that at least "Allow storage account key access" is enabled, because at this time the Logic App depends on the access key to connect to the Storage Account. Although you can set AzureWebJobsStorage to run with managed identity, you can't fully disable storage account key access for Standard logic apps that use the Workflow Service Plan hosting option. With the ASE v3 hosting option, however, you can disable storage account key access after you finish the steps to set up managed identity authentication. Reference: Create example Standard workflow in Azure portal - Azure Logic Apps | Microsoft Learn

If this setting is enabled, check whether the Storage Account is behind a firewall. Access may be enabled for selected networks or fully disabled; both options require Service Endpoints or Private Endpoints to be configured. Reference: Deploying Standard Logic App to Storage Account behind Firewall using Service or Private Endpoints | Microsoft Community Hub

Check the Networking tab under the Storage Account and confirm the following (a CLI sketch follows this list):
- If you select the "selected networks" option, confirm that the VNet is the same one the Logic App is integrated with. Your Logic App and Storage Account may be hosted in different VNets, but you must ensure there is full connectivity between them: they must be peered, with HTTPS and SMB traffic allowed (more on this in the network section below).
- You can select "Disabled" network access as well.
- Confirm that the file share is created. Usually it is created automatically with the creation of the Logic App, but if you use Terraform or ARM, the deployment may not create the file share and you must do it manually.
- Confirm that all four Private Endpoints are created and approved (File, Table, Queue, and Blob). All these resources are used by different components of the Logic App. This is not fully documented, as it is internal engine behavior; for Azure Functions, the runtime base, it is partially documented in Storage considerations for Azure Functions | Microsoft Learn.
- If a Private Endpoint is missing, create it and link it to the VNet as a shared resource. Not having all Private Endpoints created may result in runtime errors, connection errors, or trigger failures. For example, if a workflow does not generate its URL even though it saves correctly, the Table and Queue Private Endpoints may be missing, as we've seen many times with customers.
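The file share and private endpoint checks in the list above can also be done from the CLI. A sketch with placeholder names:

```powershell
# Confirm the content file share exists; create it manually if your Terraform/ARM
# deployment did not (the name must match the WEBSITE_CONTENTSHARE app setting).
az storage share-rm list --resource-group <rg> --storage-account <storage-name> --output table
az storage share-rm create --resource-group <rg> --storage-account <storage-name> --name <share-name>

# Confirm all four private endpoint connections (file, blob, queue, table) exist and are approved.
az network private-endpoint-connection list --resource-group <rg> --name <storage-name> `
    --type Microsoft.Storage/storageAccounts --output table
```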
You can read a bit more about the integration of the Logic App with a firewall-secured Storage Account, and the needed configuration, in these articles:
- Secure traffic between Standard workflows and virtual networks - Azure Logic Apps | Microsoft Learn
- Deploy Standard logic apps to private storage accounts - Azure Logic Apps | Microsoft Learn

You can use the Kudu console (Advanced Tools tab) to further troubleshoot the connection with the Storage Account using some network troubleshooting commands (see the consolidated sketch at the end of this section). If the Kudu console is not available, we recommend using a VM in the same VNet as the Logic App to mimic the scenario.
- Nslookup [hostname or IP] [DNS HOST IP]
- TCPPing [hostname or IP]:[PORT]
- Test-NetConnection [hostname] -Port [PORT]
If you have a custom DNS, the nslookup command will not return the results from your DNS unless you specify its IP address as a parameter. Instead, you can use the nameresolver command, which uses the VNet DNS settings to resolve the endpoint name:
- nameresolver [endpoint hostname or IP address]
Reference: Networking Related Commands for Azure App Services | Microsoft Community Hub

VNet configuration
Configuring a private endpoint for the Logic App does not affect traffic to the Storage Account, because the private endpoint covers inbound traffic only. The storage communication is considered outbound traffic, as it's the Logic App that actively communicates with the Storage Account. Reference: Secure traffic between Standard workflows and virtual networks - Azure Logic Apps | Microsoft Learn

The link between these resources must not be interrupted. The Logic App uses both HTTPS and SMB protocols to communicate with the Storage Account, meaning that traffic on ports 443 and 445 needs to be fully allowed in your VNet. If you have a Network Security Group associated with the Logic App subnet, you need to confirm that its rules allow this traffic; you may need to explicitly create rules for it:
- Source port: * | Destination port: 443 | Source: subnet integrated with the Standard logic app | Destination: storage account | Protocol: TCP | Purpose: storage account (HTTPS)
- Source port: * | Destination port: 445 | Source: subnet integrated with the Standard logic app | Destination: storage account | Protocol: TCP | Purpose: Server Message Block (SMB) file share

If you have forced routing to a network virtual appliance (i.e., a firewall), you must also ensure that this resource is not filtering or blocking the traffic. TLS inspection must also be disabled in your firewall for the Logic App traffic. In short, this is because the firewall replaces the certificate in the message, so the Logic App does not recognize the returned certificate, invalidating the message. You can read more about TLS inspection here: Azure Firewall Premium features | Microsoft Learn

DNS
If you are using Azure DNS, this section should not apply, because all records are automatically created when you create the resources. But if you're using a custom DNS, when you create an Azure resource (e.g., a Storage Private Endpoint), the IP address won't be registered in your DNS, so you must do it manually. You must ensure that all A records are created and maintained, keeping in mind that they need to point to the correct IP and name. If there are mismatches, you may see communication severed between the Logic App and other resources, such as the Storage Account. So double-check all DNS records and confirm that everything is in its proper state and place.
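Putting the commands above together, a consolidated connectivity check against the storage endpoints looks like this. Run it from the Kudu console, or from a VM in the same VNet; <storage> is a placeholder for your storage account name:

```powershell
# Name resolution through the VNet's DNS settings (works with custom DNS).
nameresolver <storage>.file.core.windows.net

# SMB reachability for the content file share (port 445).
tcpping <storage>.file.core.windows.net:445

# HTTPS reachability for the blob endpoint (port 443).
Test-NetConnection <storage>.blob.core.windows.net -Port 443
```

If name resolution returns a public IP where you expect a private one, or the port probes fail, revisit the NSG rules, firewall routing, and DNS records described above.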
And to make it even easier, with the help of my colleague Mohammed_Barqawi, this information is now translated into an easy-to-understand flowchart. If you continue to have issues after all these steps are verified, I suggest you open a case with us so that we can validate what else may be happening, because either a step may have been missed or some other issue may be occurring.

Collect ETW trace in Logic App Standard
ETW stands for Event Tracing for Windows. It's a powerful, built-in tracing mechanism in the Windows operating system that allows developers and system administrators to capture detailed information about the behavior of applications, the system itself, and even kernel-mode drivers. Usually we can use the Logman tool to collect ETW traces, but this tool is not allowed in the Logic App Kudu environment, so the solution was to come up with a code-based approach.

How to collect the ETW using C#?
In a previous article I built a small tool that collects ETW traces from HTTP operations. The whole idea is to implement a class derived from EventListener.

How this tool works
Download the ETW collector flow and the C# file from GitHub. In the Compose action, modify the URL to your test-case URL, where you are calling an HTTP endpoint, SFTP, SMTP, etc. Then run the collector flow. After the flow finishes, open Kudu, and in the log folder you will find the log file.

What the C# does
It creates the ETW listener and then calls the child logic app, which is the logic app that we will test. There are some event sources that I am ignoring because they are very noisy, and some of them will cause stack overflow errors, like System.Data.DataCommonEventSource:

```csharp
// Event sources excluded from the trace because they are noisy or can cause stack overflows.
private readonly Hashtable ignoredList = new Hashtable
{
    { "Private.InternalDiagnostics.System.Net.Http_decrypt", null },
    { "System.Data.DataCommonEventSource_Trace", null },
    { "System.Buffers.ArrayPoolEventSource", null },
    { "System.Threading.Tasks.TplEventSource", null },
    { "Microsoft-ApplicationInsights-Data", null }
};
```

It then puts all the logs in a DataTable and finally converts this DataTable to a text file.

How to open the log file?
I recommend using something like Klogg (variar/klogg) or plain VS Code.

Running PowerShell inline with Az commands - Logic App Standard
With the availability of the inline "Execute PowerShell Code" action, a few questions have been brought to us, such as how to execute Az commands with this action.

First, let's talk about requirements. As the documentation states, we need to use a Logic App Standard workflow. When we add the action, it creates two files in the workflow folder: a requirements.psd1 file and the execute_powershell_code.ps1 file. Reference: Add and run PowerShell in Standard workflows - Azure Logic Apps | Microsoft Learn

We can import private and public modules, but these aren't imported automatically; we need to specify the modules we need. To make our lives easier, the great folks in the product group already gave us some hints on how to import them. For our purposes, we will import only the Az module. The requirements file needs to look like this:

```powershell
# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
@{
    # For the latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
    'Az' = '10.*'
}
```

With the module imported, we can now face the script itself. This demo is a very simplistic script that lists the resource groups in the subscriptions:

```powershell
$action = Get-TriggerOutput

Connect-AzAccount -Identity

$results = Get-AzSubscription | ForEach-Object {
    $subscriptionName = $_.Name
    Set-AzContext -SubscriptionId $_.SubscriptionId
    (Get-AzResourceGroup).ResourceGroupName | ForEach-Object {
        [PSCustomObject]@{
            Subscription  = $subscriptionName
            ResourceGroup = $_
        }
    }
}

Push-WorkflowOutput -Output $results
```

The trick that makes this work is connecting to the Az account with the managed identity. The Logic App has a system-assigned managed identity out of the box that allows you to connect to the Az environment, but you will need to assign permissions to it as well. For my example, I assigned the managed identity a Contributor role on the subscription, as I was listing all resource groups in it, but you may restrict this as needed.

With the test, you can see that this simple script executes quite nicely, taking about 45 seconds to complete (this may vary with the script's complexity). Keep in mind that this is running on a WS1 plan, so it may be a bit slow. Once the request is cached, it's quite fast.

So, to summarize, the steps taken to achieve proper execution of Az commands with the inline PowerShell action were:
1. Add the action.
2. Import the Az module in the requirements file.
3. Assign the proper role to the Logic App managed identity.
4. Create the script.
5. Test!
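For completeness, the role assignment from step 3 can also be scripted. A sketch with the Azure CLI, assuming placeholder names (the functionapp command group also works for Logic App Standard apps); narrow the role and scope as your scenario allows:

```powershell
# Get the object ID of the Logic App's system-assigned managed identity.
$principalId = az functionapp show --name <logicapp-name> --resource-group <rg> `
    --query identity.principalId --output tsv

# Grant it Contributor on the subscription (needed here to list all resource groups).
az role assignment create --assignee $principalId --role 'Contributor' `
    --scope /subscriptions/<subscription-id>
```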