Azure DevOps
How to sync date and time formats between Azure DevOps and Jira?
Jira Cloud and Azure DevOps have different APIs that support different data formats, even for the same type of data: Azure DevOps uses HTML formatting, while Jira uses wiki markup. When date-time information is transferred between the two platforms, transformers need to work behind the scenes to convert between the two formatting styles; that is the only way to keep the data coherent. This conversion is not available out of the box, so a third-party tool is needed to bridge the gap. In this article, I'll discuss how to sync date and time formats between Azure DevOps and Jira Cloud, using a third-party integration tool called Exalate to implement the use case.

Use case requirements

If your Jira and Azure DevOps instances are hosted in different time zones and you need to manipulate the date-time fields so that the same date is kept on both sides, here is how to do it. Let us assume the two teams are 5 hours apart, and that:

- "Due Date" and "Start Date" are custom date-time fields in Azure DevOps
- "Start date" is a custom date field in Jira
- "Due" is the standard due date field in Jira

We would like to add 5 hours to the Jira timestamps when they are received in the Azure DevOps instance, and subtract 5 hours from the Azure DevOps timestamps when they are received in Jira.

How to use Exalate to sync date and time formats between Azure DevOps and Jira Cloud?

Exalate comes with an AI-powered scripting engine that allows you to write mapping rules for any connection. It supports the Groovy language, which you can use to write the transformation logic for the date-time information.

To implement this use case, install Exalate on both your Jira Cloud and Azure DevOps instances, then set up a connection in Script mode. Once connected, click Configure Sync and navigate to the Rules tab, where you'll set up the sync rules. The Rules tab contains default scripts for syncing basic fields like summary, description, comments, and attachments; to sync custom fields or behaviors, you'll need to add your own scripts. The Rules tab is divided into:

- Outgoing Sync: defines the information sent from Jira to Azure DevOps.
- Incoming Sync: maps the information received from Azure DevOps into Jira.

The same structure exists on the Azure DevOps side.

Jira outgoing script

replica.customFields."Start date" = issue.customFields."Start date"
replica.due = issue.due

The outgoing script copies the value of the custom field named "Start date" and the standard due date onto the replica (the payload sent to Azure DevOps) as replica.customFields."Start date" and replica.due.
Jira incoming script

import java.text.SimpleDateFormat
import java.text.DateFormat
import java.util.Calendar
import java.util.Date

def datePattern = "yyyy-MM-dd HH:mm:ss"
DateFormat formatter = new SimpleDateFormat(datePattern)

// Start date: strip the ISO "T" and "Z" markers, parse, then shift back 5 hours
dateString = replica."start"
dateString = dateString.replaceAll("T", " ").trim()
dateString = dateString.replaceAll("Z", " ").trim()
date = formatter.parse(dateString)
def timestamp = date.time

Calendar calendar = Calendar.getInstance()
calendar.timeInMillis = timestamp
calendar.add(Calendar.HOUR_OF_DAY, -5)
def updatedTimestamp = calendar.timeInMillis
issue.customFields."Start date".value = updatedTimestamp

// Due date: same parsing and -5 hour shift
dateString = replica."duedate"
dateString = dateString.replaceAll("T", " ").trim()
dateString = dateString.replaceAll("Z", " ").trim()
date = formatter.parse(dateString)
timestamp = date.time
calendar.timeInMillis = timestamp
calendar.add(Calendar.HOUR_OF_DAY, -5)
issue.due = calendar.getTime()

The snippet above defines a datePattern, parses the incoming value through a SimpleDateFormat formatter, and converts the resulting Date into a millisecond timestamp via a Calendar. The calendar.add(Calendar.HOUR_OF_DAY, -5) call makes the timestamp coming into Jira exactly 5 hours behind the original time received from Azure DevOps.

Azure DevOps outgoing script

replica."start" = workItem."Microsoft.VSTS.Scheduling.StartDate"
replica."duedate" = workItem."Microsoft.VSTS.Scheduling.DueDate"

This snippet fetches the start and due dates from the built-in Microsoft.VSTS.Scheduling fields (the VSTS prefix is a holdover from Visual Studio Team System, an earlier name for the product).

Azure DevOps incoming script

import java.text.SimpleDateFormat
import java.util.Calendar
import java.util.Date

def convertJiraTimeToAdoTime(String dateString) {
    if (dateString == null) return

    String inputFormat = "yyyy-MM-dd HH:mm:ss.S"
    String outputFormat = "yyyy-MM-dd'T'HH:mm:ss'Z'"

    // Create SimpleDateFormat objects for the input and output formats
    SimpleDateFormat inputDateFormat = new SimpleDateFormat(inputFormat)
    SimpleDateFormat outputDateFormat = new SimpleDateFormat(outputFormat)

    // Parse the input date string into a Date object
    Date date = inputDateFormat.parse(dateString)
    def timestamp = date.time

    Calendar calendar = Calendar.getInstance()
    calendar.timeInMillis = timestamp
    calendar.add(Calendar.HOUR_OF_DAY, 5)
    def updatedTimestamp = calendar.timeInMillis

    // Convert the shifted timestamp into the output format
    return outputDateFormat.format(updatedTimestamp) // String
}

// does not set the field
String inputDateString = replica.customFields."Start date"?.value
workItem."Microsoft.VSTS.Scheduling.StartDate" = convertJiraTimeToAdoTime(inputDateString)

inputDateString = replica.due
workItem."Microsoft.VSTS.Scheduling.DueDate" = convertJiraTimeToAdoTime(inputDateString)

This snippet converts the incoming date-time value from its string form into a Date object, shifts it 5 hours ahead of the time received from the Jira Cloud instance, and formats the result in the pattern Azure DevOps expects.
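As an aside, the same fixed-offset shift can be written more compactly with the java.time API instead of Calendar arithmetic. The sketch below is illustrative only: it assumes the incoming value is a full ISO-8601 instant (for example 2024-05-01T10:00:00Z), and the commented usage line simply reuses the field names from the scripts above rather than anything Exalate provides by default.

import java.time.Instant
import java.time.Duration

// Minimal sketch: shift an ISO-8601 timestamp by a fixed number of hours.
def shiftIsoTimestamp(String isoString, long hours) {
    if (isoString == null) return null
    return Instant.parse(isoString).plus(Duration.ofHours(hours)).toString()
}

// Hypothetical usage on the Azure DevOps side (add 5 hours on the way in):
// workItem."Microsoft.VSTS.Scheduling.StartDate" = shiftIsoTimestamp(replica.customFields."Start date"?.value as String, 5)

Whether this drops in cleanly depends on the exact string formats your instances exchange, so treat it as a starting point rather than a replacement for the scripts above.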
Once you're done, review the scripts to make sure everything is in order before publishing the changes. That's all! Your date-time formats are now in sync between Azure DevOps and Jira Cloud.

Using script-based solutions like Exalate can seem overwhelming at first, but done correctly they can be very effective for your integration needs. Have a specific use case that requires handling different data formats? Just ask our integration engineers for a walkthrough of what's possible, or drop a comment here.

How to reduce downloading time for releasing pipeline
When I run a deployment task on a given deployment server, it downloads all linked artifacts for the entire release, so I'm downloading a ton of stuff I don't need. For example, if the task takes 10 minutes to complete, 8 of those minutes are spent just downloading artifacts. Is there a way to avoid downloading unnecessary artifacts?
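For reference: in classic release pipelines, each stage's agent job has an artifact download option where you can deselect the artifacts that stage does not need. In YAML pipelines the equivalent is to control the download step explicitly; the sketch below is illustrative and uses a placeholder artifact name (WebApp):

jobs:
- deployment: deploy_web
  environment: dev
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none       # turn off the implicit "download everything" behavior
        - download: current    # then pull only the artifact this job actually needs
          artifact: WebApp     # placeholder artifact name
          patterns: '**/*.zip' # optionally narrow to specific files
        - script: echo "Deploying from $(Pipeline.Workspace)/WebApp"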
Technical Walkthrough: Deploying a SQL DB like it's Terraform

Introduction

This post is a union of multiple topics. It is part of the SQL CI/CD series and as such builds upon Deploying .dacpacs to Multiple Environments via ADO Pipelines | Microsoft Community Hub and Managed SQL Deployments Like Terraform | Microsoft Community Hub, while also crossing over with the YAML Pipeline series. This is an advanced topic with regard to both Azure DevOps YAML and SQL CI/CD; if either concept is new to you, please refer to the links above, as this is not designed to be a beginner's introduction to either domain.

Assumptions

To get the most out of this and follow along, we are going to assume that you are:

1. On board with templating your Azure DevOps YAML pipelines. By doing this we will see the benefit of quickly onboarding new pipelines, standardizing our deployment steps, and increasing our security.
2. On board with Managed SQL Deployments Like Terraform | Microsoft Community Hub for deploying your SQL projects. By adopting this we can increase our data security, improve confidence in source control, and speed our time to deployment.

For this post we will continue to leverage the example cicd-adventureWorks repository as the source of our SQL project and the home of the pipeline definition.

Road mapping the Templates

Just like my other YAML posts, let's outline the pieces required in each stage; we will then break down each job.

Build Stage
- Build .dacpac job
  - run `dotnet build` and pass in the appropriate arguments
  - execute a Deploy Report from the dacpac produced by the build against the target environment
  - copy the Deploy Report to the build output directory
  - publish the pipeline artifact

Deploy Stage
- Deploy .dacpac job
  - run a Deploy Report from the dacpac artifact (optional)
  - deploy the dacpac, including pre/post scripts

Build Stage

For the purposes of this stage, we should think of building our .dacpac much like a Terraform or single-page application build: we produce an artifact per environment, all generated from the same codebase. Additionally, we run a "plan", which is the proposed result of deploying our dacpac file.

Build Job

We will have one instance of the build job for each environment. Each instance passes a different build configuration and therefore produces a different artifact, which in turn results in a different .dacpac per environment. If you are familiar with YAML templating, then feel free to jump to the finished job template.

One of the key differences between this job structure and the one outlined in Deploying .dacpacs to Multiple Environments via ADO Pipelines is the need for a Deploy Report. This is the key to unlocking the CI/CD approach that aligns with Terraform: the Deploy Report detects our changes at build time, similar to running a terraform plan. Creating a Deploy Report is achieved by setting the DeploymentAction input on the SqlAzureDacpacDeployment@1 task to 'DeployReport'.

There is one minor "bug" in the Microsoft SqlAzureDacpacDeployment task, which I have raised with the ADO task team: the output paths for the Deploy Report and the Drift Report are hardcoded to the same location. To get around this, I had to find out where the Deploy Report was being published and, for our purposes, add a task that copies the Deploy Report to the same location as the .dacpac so both can be published as a single folder.
Here is the code for a single environment to build the associated .dacpac and produce the Deploy Report:

- stage: adventureworksentra_build
  variables:
  - name: solutionPath
    value: $(Build.SourcesDirectory)//
  jobs:
  - job: build_publish_sql_sqlmoveme_dev_dev
    steps:
    - task: UseDotNet@2
      displayName: Use .NET SDK vlatest
      inputs:
        packageType: 'sdk'
        version: ''
        includePreviewVersions: true
    - task: NuGetAuthenticate@1
      displayName: 'NuGet Authenticate'
    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration dev /p:NetCoreBuild=true /p:DacVersion=1.0.1
    - task: SqlAzureDacpacDeployment@1
      displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
      inputs:
        DeploymentAction: DeployReport
        azureSubscription: AzureDevServiceConnection
        AuthenticationType: servicePrincipal
        ServerName: sql-adventureworksentra-dev-cus.database.windows.net
        DatabaseName: sqlmoveme
        deployType: DacpacTask
        DacpacFile: $(Agent.BuildDirectory)\s/src/sqlmoveme/bin/dev/sqlmoveme.dacpac
        AdditionalArguments: ''
        DeleteFirewallRule: True
    - task: CopyFiles@2
      inputs:
        SourceFolder: GeneratedOutputFiles
        Contents: '**'
        TargetFolder: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev/cus
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact sqlmoveme_dev_dev'
      inputs:
        targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev
        artifact: sqlmoveme_dev_dev
        properties: ''

The end result will be similar to the screenshot below (I have two environments). You can see I have configured this to run a Deploy Report against each regional instance of the SQL DB, hence the `cus` folder. I do this to identify and catch any potential schema and data issues.

The Deploy Reports are the key to tying this to the idea of deploying and managing SQL databases like Terraform. These reports execute when a pull request is created, as part of the build, and again at deployment, to surface any changes that may have occurred between PR and deployment. For the purposes of this blog, here is a deployment report indicating a schema change (see screenshot). This is an important artifact for organizations whose auditing policy requires documentation around deployments; the same information is also available in the ADO job logs.

This experience should feel similar to Terraform CI/CD...THAT'S A GOOD THING! It means we are developing and refining practices and principles across our tech stacks when it comes to the SDLC. If this feels new to you, then please read Terraform, CI/CD, Azure DevOps, and YAML Templates - John Folberth.

Deploy Stage

We will have a deploy stage for each environment, and within that stage a job for each region and/or database we are deploying our dacpac to. This job can be a template because, in theory, our deployment process across environments is identical. We will run a Deploy Report and then deploy the .dacpac that was built for the specific environment, including any and all associated pre/post scripts. Again, this process has already been walked through in Deploying .dacpacs to Multiple Environments via ADO Pipelines | Microsoft Community Hub.

Deploy Job

The deploy job takes what we built in the deployment process in Deploying .dacpacs to Multiple Environments via ADO Pipelines | Microsoft Community Hub and adds a prerequisite step that creates a second Deploy Report.
This process ensures we are aware of any changes in the deployed SQL database that may have occurred after the original dacpac and Deploy Report were created at the time of the pull request. By doing this we have a tight log identifying any changes that were made right before we deployed the code.

Next, we need to override the default arguments of the dacpac publish command so that changes which may result in data loss can be deployed automatically. A complete list of the available properties is in SqlPackage Publish - SQL Server | Microsoft Learn. The two we are most interested in are DropObjectsNotInSource and BlockOnPossibleDataLoss.

DropObjectsNotInSource is defined as: "Specifies whether objects that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. This value takes precedence over DropExtendedProperties."

This is important as it will drop and delete objects that are not defined in our source code. As I've written about previously, this will drop all those instances of "shadow data", the copies of tables we were storing. This value defaults to false as a safeguard against destructive data actions. Our intention, though, is to ensure our deployed database objects match our definitions in source control, so we want to enable it.

BlockOnPossibleDataLoss is defined as: "Specifies that the operation will be terminated during the schema validation step if the resulting schema changes could incur a loss of data, including due to data precision reduction or a data type change that requires a cast operation. The default (True) value causes the operation to terminate regardless if the target database contains data. An execution with a False value for BlockOnPossibleDataLoss can still fail during deployment plan execution if data is present on the target that cannot be converted to the new column type."

This is another safeguard put in place to ensure data isn't lost through type conversions or schema changes such as dropping a column. We want this set to `false` so that our deployment will actually run in an automated fashion. If it is left at the default `true` and we want to update schemas or columns, we would be forced into the anti-pattern of a manual deployment to accommodate the change. When possible, we want to automate our deployments, and in this specific case we have already mitigated unintentional data loss through our implementation of the Deploy Report. Again, we should have confidence in our deployment, and if we have that confidence then we should be able to automate it.
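In the pipeline, these two properties surface as AdditionalArguments on the Publish step. Here is an annotated excerpt of that step, trimmed to the relevant inputs; the same step appears again in full in the stage below, with the server and database names carried over from this walkthrough's example:

- task: SqlAzureDacpacDeployment@1
  displayName: Publish sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
  inputs:
    DeploymentAction: Publish
    azureSubscription: AzureDevServiceConnection
    ServerName: sql-adventureworksentra-dev-cus.database.windows.net
    DatabaseName: sqlmoveme
    DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
    # Drop objects that exist in the target but not in source control, and allow
    # schema changes that SqlPackage would otherwise block as potential data loss.
    AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false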
Here is that same deployment process, now including the Deploy Report step:

- stage: adventureworksentra_dev_cus_dacpac_deploy
  jobs:
  - deployment: adventureworksentra_app_dev_cus
    environment:
      name: dev
    dependsOn: []
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlAzureDacpacDeployment@1
            displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
            inputs:
              DeploymentAction: DeployReport
              azureSubscription: AzureDevServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-dev-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
              AdditionalArguments: ''
              DeleteFirewallRule: False
          - task: CopyFiles@2
            inputs:
              SourceFolder: GeneratedOutputFiles
              Contents: '**'
              TargetFolder: postDeploy/sql-adventureworksentra-dev-cus.database.windows.net/sqlmoveme
          - task: SqlAzureDacpacDeployment@1
            displayName: Publish sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
            inputs:
              DeploymentAction: Publish
              azureSubscription: AzureDevServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-dev-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
              AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false
              DeleteFirewallRule: True

Putting it Together

Let's put all these pieces together. This example shows an expanded pipeline with the following stages and jobs:

- Build stage
  - Build dev job
  - Build tst job
- Deploy dev stage
  - Deploy dev job
- Deploy tst stage
  - Deploy tst job

And here is the code:

resources:
  repositories:
  - repository: templates
    type: github
    name: JFolberth/TheYAMLPipelineOne
    endpoint: JFolberth
trigger:
  branches:
    include:
    - none
pool:
  vmImage: 'windows-latest'
parameters:
- name: projectNamesConfigurations
  type: object
  default:
  - projectName: 'sqlmoveme'
    environmentName: 'dev'
    regionAbrvs:
    - 'cus'
    projectExtension: '.sqlproj'
    buildArguments: '/p:NetCoreBuild=true /p:DacVersion=1.0.1'
    sqlServerName: 'adventureworksentra'
    sqlDatabaseName: 'moveme'
    resourceGroupName: adventureworksentra
    ipDetectionMethod: 'AutoDetect'
    deployType: 'DacpacTask'
    authenticationType: 'servicePrincipal'
    buildConfiguration: 'dev'
    dacpacAdditionalArguments: '/p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false'
  - projectName: 'sqlmoveme'
    environmentName: 'tst'
    regionAbrvs:
    - 'cus'
    projectExtension: '.sqlproj'
    buildArguments: '/p:NetCoreBuild=true /p:DacVersion=1.0'
    sqlServerName: 'adventureworksentra'
    sqlDatabaseName: 'moveme'
    resourceGroupName: adventureworksentra
    ipDetectionMethod: 'AutoDetect'
    deployType: 'DacpacTask'
    authenticationType: 'servicePrincipal'
    buildConfiguration: 'tst'
    dacpacAdditionalArguments: '/p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false'
- name: serviceName
  type: string
  default: 'adventureworksentra'
stages:
- stage: adventureworksentra_build
  variables:
  - name: solutionPath
    value: $(Build.SourcesDirectory)//
  jobs:
  - job: build_publish_sql_sqlmoveme_dev_dev
    steps:
    - task: UseDotNet@2
      displayName: Use .NET SDK vlatest
      inputs:
        packageType: 'sdk'
        version: ''
        includePreviewVersions: true
    - task: NuGetAuthenticate@1
      displayName: 'NuGet Authenticate'
    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration dev /p:NetCoreBuild=true /p:DacVersion=1.0.1
    - task: SqlAzureDacpacDeployment@1
      displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
      inputs:
        DeploymentAction: DeployReport
        azureSubscription: AzureDevServiceConnection
        AuthenticationType: servicePrincipal
        ServerName: sql-adventureworksentra-dev-cus.database.windows.net
        DatabaseName: sqlmoveme
        deployType: DacpacTask
        DacpacFile: $(Agent.BuildDirectory)\s/src/sqlmoveme/bin/dev/sqlmoveme.dacpac
        AdditionalArguments: ''
        DeleteFirewallRule: True
    - task: CopyFiles@2
      inputs:
        SourceFolder: GeneratedOutputFiles
        Contents: '**'
        TargetFolder: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev/cus
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact sqlmoveme_dev_dev'
      inputs:
        targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev
        artifact: sqlmoveme_dev_dev
        properties: ''
  - job: build_publish_sql_sqlmoveme_tst_tst
    steps:
    - task: UseDotNet@2
      displayName: Use .NET SDK vlatest
      inputs:
        packageType: 'sdk'
        version: ''
        includePreviewVersions: true
    - task: NuGetAuthenticate@1
      displayName: 'NuGet Authenticate'
    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration tst /p:NetCoreBuild=true /p:DacVersion=1.0
    - task: SqlAzureDacpacDeployment@1
      displayName: DeployReport sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
      inputs:
        DeploymentAction: DeployReport
        azureSubscription: AzureTstServiceConnection
        AuthenticationType: servicePrincipal
        ServerName: sql-adventureworksentra-tst-cus.database.windows.net
        DatabaseName: sqlmoveme
        deployType: DacpacTask
        DacpacFile: $(Agent.BuildDirectory)\s/src/sqlmoveme/bin/tst/sqlmoveme.dacpac
        AdditionalArguments: ''
        DeleteFirewallRule: True
    - task: CopyFiles@2
      inputs:
        SourceFolder: GeneratedOutputFiles
        Contents: '**'
        TargetFolder: $(Build.SourcesDirectory)/src/sqlmoveme/bin/tst/cus
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact sqlmoveme_tst_tst'
      inputs:
        targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/tst
        artifact: sqlmoveme_tst_tst
        properties: ''
- stage: adventureworksentra_dev_cus_dacpac_deploy
  jobs:
  - deployment: adventureworksentra_app_dev_cus
    environment:
      name: dev
    dependsOn: []
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlAzureDacpacDeployment@1
            displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
            inputs:
              DeploymentAction: DeployReport
              azureSubscription: AzureDevServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-dev-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
              AdditionalArguments: ''
              DeleteFirewallRule: False
          - task: CopyFiles@2
            inputs:
              SourceFolder: GeneratedOutputFiles
              Contents: '**'
              TargetFolder: postDeploy/sql-adventureworksentra-dev-cus.database.windows.net/sqlmoveme
          - task: SqlAzureDacpacDeployment@1
            displayName: Publish sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
            inputs:
              DeploymentAction: Publish
              azureSubscription: AzureDevServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-dev-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
              AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false
              DeleteFirewallRule: True
- stage: adventureworksentra_tst_cus_dacpac_deploy
  jobs:
  - deployment: adventureworksentra_app_tst_cus
    environment:
      name: tst
    dependsOn: []
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlAzureDacpacDeployment@1
            displayName: DeployReport sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
            inputs:
              DeploymentAction: DeployReport
              azureSubscription: AzureTstServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-tst-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_tst_tst\**\*.dacpac
              AdditionalArguments: ''
              DeleteFirewallRule: False
          - task: CopyFiles@2
            inputs:
              SourceFolder: GeneratedOutputFiles
              Contents: '**'
              TargetFolder: postDeploy/sql-adventureworksentra-tst-cus.database.windows.net/sqlmoveme
          - task: SqlAzureDacpacDeployment@1
            displayName: Publish sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
            inputs:
              DeploymentAction: Publish
              azureSubscription: AzureTstServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-tst-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_tst_tst\**\*.dacpac
              AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false
              DeleteFirewallRule: True

In ADO, the run will look like the screenshot below: we can see the important Deploy Report being created, and we can confirm that there are Deploy Reports for each environment/region combination.

Conclusion

With the inclusion of Deploy Reports we now have the ability to create Azure SQL deployments that adhere to modern DevOps approaches. We can ensure our environments stay in sync with how we have defined them in source control, and by doing so we achieve a higher level of security, greater confidence in our code, and a reduction in shadow data. To learn more about these approaches to SQL deployments, check out my other blog articles in the "SQL Database Series" in the "Healthcare and Life Sciences Blog" | Microsoft Community Hub, and be sure to follow me on LinkedIn.

Data Migration from Azure DevOps Server to a new one
I have a problem with data migration from Azure DevOps Server to a new one. In my case I have a split database on a different machine, so I have two environments, each containing an Azure DevOps Server and a database on separate machines. I want to migrate all of the data from the old tools to the new one. The problem is that, because of the organization's roles, I do not have full administrative privileges on the database and Azure DevOps machines. So I am looking for a well-defined way to get all of the data from the old environment to the new one, with clear steps, and especially the list of privileges I need to request from the admin.

Post-job: Cache fails
In the pic above I am trying to cache based on my yarn.lock file to reduce runtime, and I am following the Azure DevOps docs to do that. The first Cache step seems to work fine: there is a cache miss, and then it runs yarn install. But the error comes in the Post-job: Cache step, which takes 10 minutes and then, I think, times out. --Note-- I am using a self-hosted agent here, and I added GNU Tar and 7-Zip and restarted the agent. --End Note-- I don't know if that has anything to do with my problem, but it didn't work before or after. Can anyone help out here?
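For reference, the Yarn caching setup documented for Azure Pipelines looks roughly like the sketch below (the key and path may need adjusting for your repo). The Post-job: Cache step is where the cached path gets archived and uploaded, so a very long post-job usually points at the size of the folder being cached or at the archiving tools (GNU Tar / 7-Zip) on a self-hosted agent:

variables:
  YARN_CACHE_FOLDER: $(Pipeline.Workspace)/.yarn   # cache Yarn's package cache, not node_modules

steps:
- task: Cache@2
  displayName: Cache Yarn packages
  inputs:
    key: 'yarn | "$(Agent.OS)" | yarn.lock'        # cache is keyed on the lock file
    restoreKeys: |
      yarn | "$(Agent.OS)"
    path: $(YARN_CACHE_FOLDER)

- script: yarn --frozen-lockfile
  displayName: yarn install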
Error Unauthorized when triggering a flow with Webhook and HTTP request

Hi guys, I have an error when I want to send a webhook from Azure DevOps to trigger a flow with an HTTP request. In Azure DevOps I get an Unauthorized error, but when I access the HTTP request URL in the browser everything works fine. Can anybody help? Thx!!

Getting secrets from Key Vault in YAML pipeline
If you have ever created an Azure App Service or Azure Function App that uses app settings, then you have dealt with the problem of keeping those settings secure and correctly updated in each environment. You need a secure location to store this information and a way to access it during your deployment process. Azure Key Vault, together with the Azure Key Vault task inside a deployment pipeline in Azure DevOps, can solve this problem for you. If you prefer video, then have a look at this, as it will walk you through the steps of getting this set up.
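As a rough illustration of the approach described above, a YAML pipeline can pull secrets with the Azure Key Vault task and then reference them as pipeline variables. This is only a sketch: the service connection, vault, and secret names are placeholders, not values from the post.

steps:
# Fetch secrets from Key Vault; each secret becomes a pipeline variable with the same name.
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder Azure service connection
    KeyVaultName: 'kv-myapp-dev'                 # placeholder Key Vault name
    SecretsFilter: 'SqlPassword'                 # '*' would fetch every secret the identity can read
    RunAsPreJob: false

# Use the fetched secret later in the job; its value is masked in the logs.
- script: echo "SqlPassword is available as $(SqlPassword)"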
Utilizing Azure Key Vault with Private Link in DevOps

Azure Key Vault is a cloud service that provides secure storage and access to secrets such as API keys, passwords, certificates, or cryptographic keys. To enhance security and disable public access, Azure Key Vault can be integrated with a private endpoint powered by Azure Private Link. This private endpoint uses a private IP address from your VNet and brings the service into your VNet, effectively eliminating exposure from the public Internet by carrying traffic between your virtual network and the service over the Microsoft backbone network.
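One practical implication for pipelines, offered here as an assumption rather than something stated above: once public network access to the vault is disabled, Microsoft-hosted agents can no longer reach it, so the Key Vault task typically has to run on a self-hosted agent with network line of sight to the private endpoint. A minimal sketch, with placeholder pool and connection names, reusing the same task shown in the previous example:

# Run on a self-hosted pool whose agents sit in (or are peered with) the VNet
# that hosts the Key Vault private endpoint.
pool:
  name: 'vnet-selfhosted-pool'                   # placeholder agent pool name

steps:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder service connection
    KeyVaultName: 'kv-private-demo'              # vault reachable only through its private endpoint
    SecretsFilter: '*'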