PowerShell script to delete all Containers from a Storage Account
After moving the BootDiag settings out of a custom Storage Account, the Storage Accounts originally used for them keep consuming space for nothing. Cleaning them up is part of the standard clean-up stream that should be considered in the FinOps plan. This script will help you clean these Storage Accounts quickly and stop paying for unused data.

```powershell
Connect-AzAccount

# Your subscription and Storage Account details
$MySubscriptionToClean = "MyGuid-MyGuid-MyGuid-MyGuid-MyGuid"
$MyStorageAccountName  = "MyStorageAccountForbootdiags"
$MyStorageAccountKey   = "MySAKeyWithAllCodeProvidedByYourStorageAccountSetting+MZ3cUvdQ=="
$ContainerStartName    = "bootdiag*"

# Set the subscription context
Set-AzContext -Subscription $MySubscriptionToClean
Get-AzContext

$ctx = New-AzStorageContext -StorageAccountName $MyStorageAccountName -StorageAccountKey $MyStorageAccountKey

$myContainers = Get-AzStorageContainer -Name $ContainerStartName -Context $ctx -MaxCount 1000

foreach ($mycontainer in $myContainers)
{
    Remove-AzStorageContainer -Name $mycontainer.Name -Force -Context $ctx
}
```

I used this script to remove millions of BootDiag containers from several Storage Accounts. You can also use it for any other cleanup use case if you need it.

Fab
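As an aside to the script above, you may want to verify what the wildcard actually matches before deleting anything. A minimal read-only sketch, reusing the same context and filter:

```powershell
# Dry run (sketch): list the containers the wildcard would match, without deleting anything.
$ctx = New-AzStorageContext -StorageAccountName $MyStorageAccountName -StorageAccountKey $MyStorageAccountKey
Get-AzStorageContainer -Name $ContainerStartName -Context $ctx -MaxCount 1000 |
    Select-Object Name, LastModified |
    Format-Table -AutoSize
```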
Azure PowerShell find LastOwnershipUpdateTime on disk

Hello: I'm wondering if it's possible to find LastOwnershipUpdateTime on a disk via PowerShell. I can see this information in the portal, but I cannot figure out how to retrieve it via script. It looks like Microsoft released it recently, but even after updating my Az.Compute module to the latest version (9.0.0) I still do not see it. Any help would be really appreciated. Thank you!
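One possible workaround while waiting for the cmdlets to surface the property is to query the ARM REST API directly with Invoke-AzRestMethod. This is a sketch, assuming the property is exposed as properties.LastOwnershipUpdateTime and that the chosen api-version supports it; the names are placeholders:

```powershell
# Sketch (assumptions: placeholder names, api-version 2023-10-02, property path):
# read the disk resource straight from ARM and pull LastOwnershipUpdateTime out of the JSON.
$subscriptionId = "<subscription-id>"
$resourceGroup  = "<resource-group>"
$diskName       = "<disk-name>"

$path = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Compute/disks/${diskName}?api-version=2023-10-02"

$response = Invoke-AzRestMethod -Method GET -Path $path
($response.Content | ConvertFrom-Json).properties.LastOwnershipUpdateTime
```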
Powershell Script to remove all Blobs from Storage account

With a large number of blobs in a Storage Account, manual cleanup from the portal is complicated and time consuming, as it works in sets of 10,000. This script is simple and can be executed in the background to clean all items from a defined blob container. You have to specify the Storage Account connection string and the blob container name.

```powershell
[string]$myConnectionString = "DefaultEndpointsProtocol=https;AccountName=YourStorageAccountName;AccountKey=YourKeyFromStorageAccountConnectionString;EndpointSuffix=core.windows.net"
[string]$ContainerName = "YourBlobContainerName"
[int]$blobCountAfter = 0
[int]$blobCountBefore = 0

$context = New-AzStorageContext -ConnectionString $myConnectionString

$blobCountBefore = (Get-AzStorageBlob -Container $ContainerName -Context $context).Count
Write-Host "Total number of blobs in the container before deletion: $blobCountBefore" -ForegroundColor Yellow

# Delete every blob in the container
Get-AzStorageBlob -Container $ContainerName -Context $context | ForEach-Object {
    $_ | Remove-AzStorageBlob
    # or: Remove-AzStorageBlob -ICloudBlob $_.ICloudBlob -Context $context
}

$blobCountAfter = (Get-AzStorageBlob -Container $ContainerName -Context $context).Count
Write-Host "Total number of blobs in the container after deletion : $blobCountAfter" -ForegroundColor Green
```

It was used for a large blob container with more than 5 million blob items.

Sources:
https://learn.microsoft.com/en-us/powershell/module/az.storage/new-azstoragecontext?view=azps-13.0.0#examples
https://learn.microsoft.com/en-us/answers/questions/1637785/what-is-the-easiest-way-to-find-the-total-number-o
https://stackoverflow.com/questions/57119087/powershell-remove-all-blobs-in-a-container

Fab
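As a follow-up to the script above: for containers with millions of blobs, enumerating everything in one call can be slow and memory-hungry. A paged variant is possible, sketched here from the documented -MaxCount / -ContinuationToken listing pattern:

```powershell
# Sketch: delete blobs in pages of 5,000 using a continuation token instead of
# loading the whole listing into memory at once.
$maxCount = 5000
$token = $null
do {
    $blobs = Get-AzStorageBlob -Container $ContainerName -Context $context -MaxCount $maxCount -ContinuationToken $token
    $blobs | ForEach-Object { $_ | Remove-AzStorageBlob }
    if ($blobs.Count -gt 0) { $token = $blobs[$blobs.Count - 1].ContinuationToken } else { $token = $null }
} while ($null -ne $token)
```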
Azure - PowerShell script to change the Table Retention in Azure Log Analytics Workspaces

With a large-scale Azure implementation, the Log Analytics Workspace volume can grow quickly, and the default retention is quite long if you do not change it. This PowerShell script will help you reset the two retention values applied to workspace tables (live and total). I applied a selection criterion based on the name, as we use a naming convention with status (prod vs. non-prod); you can adapt this part to your own context.

```powershell
#Install-Module -Name Az -Repository PSGallery -Force
Import-Module Az
Connect-AzAccount

$RetentionDays           = 30
$TotalRetentionDays      = 30
$AzureRetentionDays      = 90
$AzureTotalRetentionDays = 90
$namecriteria            = "nonprod"

$All_Az_Subscriptions = Get-AzSubscription

foreach ($Az_Subscription in $All_Az_Subscriptions)
{
    # Set the context
    Write-Host "Working on subscription ""$($Az_Subscription.Name)"""
    Set-AzContext -SubscriptionObject $Az_Subscription | Out-Null

    $AllWorkspaces = Get-AzOperationalInsightsWorkspace
    foreach ($myWorkspace in $AllWorkspaces)
    {
        Write-Host " ---------------", $myWorkspace.Name, "---------------- " -ForegroundColor "gray"
        if ($myWorkspace.Name -match $namecriteria)
        {
            Write-Host " >>> WORKSPACE TO APPLY RETENTION ADJUSTMENT:", $myWorkspace.Name -ForegroundColor "green"
            if ($myWorkspace.retentionInDays -gt $RetentionDays)
            {
                Write-Host " >>> APPLYING DEFAULT RETENTION PERIOD:", $RetentionDays -ForegroundColor "yellow"
                Set-AzOperationalInsightsWorkspace -ResourceGroupName $myWorkspace.ResourceGroupName -Name $myWorkspace.Name -RetentionInDays $RetentionDays
            }

            $GetAllTables = Get-AzOperationalInsightsTable -ResourceGroupName $myWorkspace.ResourceGroupName -WorkspaceName $myWorkspace.Name
            foreach ($MyTable in $GetAllTables)
            {
                if (($MyTable.Name -eq "AzureActivity") -or ($MyTable.Name -eq "Usage"))
                {
                    if (($MyTable.RetentionInDays -gt $AzureRetentionDays) -or ($MyTable.TotalRetentionInDays -gt $AzureTotalRetentionDays))
                    {
                        Write-Host " >>> APPLYING SPECIFIC RETENTION PERIOD:", $AzureRetentionDays, "- TABLE:", $MyTable.Name -ForegroundColor "yellow"
                        Update-AzOperationalInsightsTable -ResourceGroupName $MyTable.ResourceGroupName -WorkspaceName $MyTable.WorkspaceName -TableName $MyTable.Name -RetentionInDays $AzureRetentionDays -TotalRetentionInDays $AzureTotalRetentionDays
                    }
                    else
                    {
                        Write-Host " >>> NO CHANGE FOR RETENTION PERIOD FOR TABLE:", $MyTable.Name -ForegroundColor "green"
                    }
                }
                else
                {
                    if (($MyTable.RetentionInDays -gt $RetentionDays) -or ($MyTable.TotalRetentionInDays -gt $RetentionDays))
                    {
                        Write-Host " >>> APPLYING NEW RETENTION PERIOD:", $RetentionDays, "- TABLE:", $MyTable.Name -ForegroundColor "yellow"
                        Update-AzOperationalInsightsTable -ResourceGroupName $MyTable.ResourceGroupName -WorkspaceName $MyTable.WorkspaceName -TableName $MyTable.Name -RetentionInDays $RetentionDays -TotalRetentionInDays $TotalRetentionDays
                    }
                    else
                    {
                        Write-Host " >>> NO CHANGE FOR RETENTION PERIOD FOR TABLE:", $MyTable.Name -ForegroundColor "green"
                    }
                }
            }
        }
        else
        {
            Write-Host " >>> WORKSPACE NOT CONCERNED BY THIS CHANGE:", $myWorkspace.Name -ForegroundColor "green"
        }
    }
}
```

With this script, we drastically reduced the workspace cost for non-prod by keeping only the last 30 days live, without any archive.
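Before or after running the script, a read-only check can confirm the retention values actually applied to each table. A small sketch with placeholder workspace and resource group names:

```powershell
# Verification sketch (placeholder names): list current retention values for every table
# in one workspace, so you can compare before and after running the script above.
Get-AzOperationalInsightsTable -ResourceGroupName "rg-logs-nonprod" -WorkspaceName "law-nonprod-weu" |
    Select-Object Name, RetentionInDays, TotalRetentionInDays |
    Sort-Object Name |
    Format-Table -AutoSize
```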
The material used for this script is:
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-retention-archive?tabs=portal-3%2Cportal-1%2Cportal-2
https://learn.microsoft.com/en-us/powershell/module/az.operationalinsights/get-azoperationalinsightsworkspace?view=azps-11.6.0
https://learn.microsoft.com/en-us/powershell/module/az.operationalinsights/update-azoperationalinsightstable?view=azps-11.6.0

Fabrice Romelard
Azure - PowerShell Script to delete a specific Tag for any resources in all your Subscriptions

A classic question after many months of usage and delegation to different admins relates to tag cleanup. You can be faced with a large diversity of tags created at some point but no longer useful and, above all, no longer maintained. This small script will help you execute this cleanup in all the subscriptions you manage.

```powershell
Import-Module Az
Connect-AzAccount

[string]$TagName = "YourSpecificTagKey"
$TagCount = 0

$All_Az_Subscriptions = Get-AzSubscription

foreach ($Az_Subscription in $All_Az_Subscriptions)
{
    Write-Host " "
    Write-Host " --------------------------------------- "
    Write-Host "Working on subscription ""$($Az_Subscription.Name)""" -ForegroundColor "yellow"

    $TagCount = 0
    Set-AzContext -SubscriptionObject $Az_Subscription | Out-Null

    $AllTaggedresources = Get-AzResource -TagName $TagName
    $TagCount = $AllTaggedresources.Count
    Write-Host " >> TAG "" $($TagName) "" found "" $($TagCount) "" times" -ForegroundColor "green"

    if ($TagCount -gt 0)
    {
        $AllTaggedresources.ForEach{
            if ($_.tags.ContainsKey($TagName))
            {
                $_.tags.Remove($TagName)
            }
            $_ | Set-AzResource -Tags $_.tags -Force
        }
    }
}
```

This script was inspired by these pages:
https://stackoverflow.com/questions/54162372/how-to-fix-this-error-in-azure-powershell-can-not-remove-tag-tag-value-becaus
https://learn.microsoft.com/en-us/powershell/module/az.resources/set-azresource?view=azps-11.6.0

Fabrice Romelard
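If you want to see the blast radius before removing anything, a read-only pass over the same subscriptions can simply count the tagged resources. A minimal sketch reusing the cmdlets from the script above:

```powershell
# Preview sketch: count resources carrying the tag per subscription, without changing anything.
$TagName = "YourSpecificTagKey"
foreach ($sub in Get-AzSubscription)
{
    Set-AzContext -SubscriptionObject $sub | Out-Null
    $count = (Get-AzResource -TagName $TagName | Measure-Object).Count
    Write-Host ("{0}: {1} resource(s) tagged with '{2}'" -f $sub.Name, $count, $TagName)
}
```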
Azure function app - PowerShell runtime stack

Hello community, can anyone please help me with a query I have? I have an Azure PowerShell function app set up in my environment, with two timer-triggered functions, say F1 and F2, running at 5-minute intervals. They both connect to two different SharePoint sites within the same tenant using Connect-PnPOnline with a client ID and certificates. The issue is that each function's run conflicts with the other's. Errors I get:

Save conflict. Your changes conflict with those made concurrently by another user.

OR

List 'X' is not present in the site F2

My understanding is that the functions, even though they are in the same app, can run independently of each other as separate instances. Is there a way to fix this issue other than scheduling the two functions at different times?
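One thing worth checking (an assumption about the root cause, not a confirmed fix): PnP PowerShell keeps a single default connection per process, so two functions sharing a worker can overwrite each other's SharePoint context. Using -ReturnConnection and passing the connection object explicitly keeps each function isolated. A sketch with hypothetical site URL and certificate parameters:

```powershell
# Sketch (hypothetical names): keep a private connection object per function
# instead of relying on the process-wide default PnP connection.
$conn = Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/SiteA" `
                          -ClientId $env:CLIENT_ID `
                          -Thumbprint $env:CERT_THUMBPRINT `
                          -Tenant "contoso.onmicrosoft.com" `
                          -ReturnConnection

# Pass the connection explicitly to every PnP cmdlet in this function.
Get-PnPList -Connection $conn
```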
How to pass ARM Variable as parameter of PowerShell in ARM

Hi there, I am working on an ARM template and it is working fine; however, I want to pass ARM variable(s) as the script arguments rather than hardcoding them. For example, the same storage account name should be used in the arguments: I want to pass the storageAccountName variable to the -storageaccount argument of the Microsoft.Resources/deploymentScripts resource (see the sketch after the template). Here is the ARM template:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "functionAppName": {
      "type": "string",
      "defaultValue": "[format('func-{0}', uniqueString(resourceGroup().id))]",
      "metadata": { "description": "The name of the Azure Function app." }
    },
    "storageAccountType": {
      "type": "string",
      "defaultValue": "Standard_LRS",
      "allowedValues": [ "Standard_LRS", "Standard_GRS", "Standard_RAGRS" ],
      "metadata": { "description": "Storage Account type" }
    },
    "location": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "metadata": { "description": "Location for all resources." }
    },
    "functionWorkerRuntime": {
      "type": "string",
      "defaultValue": "node",
      "allowedValues": [ "dotnet", "node", "python", "java" ],
      "metadata": { "description": "The language worker runtime to load in the function app." }
    },
    "functionPlanOS": {
      "type": "string",
      "defaultValue": "Windows",
      "allowedValues": [ "Windows", "Linux" ],
      "metadata": { "description": "Specifies the OS used for the Azure Function hosting plan." }
    },
    "functionAppPlanSku": {
      "type": "string",
      "defaultValue": "S1",
      "allowedValues": [ "S1", "S2", "S3" ],
      "metadata": { "description": "Specifies the Azure Function hosting plan SKU." }
    },
    "createfunctionkey": { "type": "string" },
    "apimanagementkey": {
      "type": "securestring",
      "defaultValue": "[base64(newGuid())]"
    },
    "linuxFxVersion": {
      "type": "string",
      "defaultValue": "",
      "metadata": { "description": "Only required for Linux app to represent runtime stack in the format of 'runtime|runtimeVersion'. For example: 'python|3.9'" }
    },
    "useridentity": { "type": "string" },
    "scriptname": { "type": "string" }
  },
  "variables": {
    "hostingPlanName": "[parameters('functionAppName')]",
    "storageAccountName": "[concat(uniquestring(resourceGroup().id), 'azfunctions')]",
    "functionhostkey": "[parameters('apimanagementkey')]",
    "isReserved": "[if(equals(parameters('functionPlanOS'), 'Linux'), true(), false())]"
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2021-02-01",
      "name": "[variables('storageAccountName')]",
      "location": "[parameters('location')]",
      "sku": { "name": "[parameters('storageAccountType')]" },
      "kind": "Storage"
    },
    {
      "type": "Microsoft.Web/serverfarms",
      "apiVersion": "2021-02-01",
      "name": "[variables('hostingPlanName')]",
      "location": "[parameters('location')]",
      "sku": {
        "tier": "Standard",
        "name": "[parameters('functionAppPlanSku')]",
        "family": "S",
        "capacity": 1
      },
      "properties": { "reserved": "[variables('isReserved')]" }
    },
    {
      "condition": "[equals(parameters('createfunctionkey'), 'yes')]",
      "type": "Microsoft.Web/sites/host/functionkeys",
      "apiVersion": "2020-06-01",
      "dependsOn": [ "[resourceId('Microsoft.Web/sites', parameters('functionAppName'))]" ],
      "name": "[concat(parameters('functionAppName'), '/default/apiManagementKey')]",
      "properties": {
        "name": "apiManagementKey",
        "value": "[variables('functionhostkey')]"
      }
    },
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2021-02-01",
      "name": "[parameters('functionAppName')]",
      "location": "[parameters('location')]",
      "kind": "[if(variables('isReserved'), 'functionapp,linux', 'functionapp')]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
        "[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
      ],
      "properties": {
        "reserved": "[variables('isReserved')]",
        "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
        "siteConfig": {
          "alwaysOn": true,
          "linuxFxVersion": "[if(variables('isReserved'), parameters('linuxFxVersion'), json('null'))]",
          "appSettings": [
            {
              "name": "AzureWebJobsStorage",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';EndpointSuffix=', environment().suffixes.storage, ';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName')), '2019-06-01').keys[0].value)]"
            },
            { "name": "FUNCTIONS_EXTENSION_VERSION", "value": "~4" },
            { "name": "FUNCTIONS_WORKER_RUNTIME", "value": "[parameters('functionWorkerRuntime')]" },
            { "name": "WEBSITE_NODE_DEFAULT_VERSION", "value": "~14" },
            { "name": "WEBSITE_RUN_FROM_PACKAGE", "value": "1" }
          ]
        }
      }
    },
    {
      "type": "Microsoft.Resources/deploymentScripts",
      "name": "[parameters('scriptname')]",
      "apiVersion": "2020-10-01",
      "location": "[parameters('location')]",
      "kind": "AzurePowerShell",
      "identity": {
        "type": "UserAssigned",
        "userAssignedIdentities": {
          "[parameters('useridentity')]": {}
        }
      },
      "properties": {
        "azPowerShellVersion": "3.0",
        "primaryScriptUri": "https://raw.githubusercontent.com/INGourav/Azure-Resources/master/KeyVaultSecretUsingSAS.ps1",
        "arguments": "-azsub 'Goukumar' -rg 'pstest' -keyvault 'pstestk' -storageaccount 'pstests' -secretname 'secretarm333'",
        "timeout": "PT30M",
        "forceUpdateTag": "utcNow()",
        "retentionInterval": "PT1H",
        "cleanupPreference": "OnSuccess"
      }
    }
  ]
}
```
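One way to avoid hardcoding the storage account name (a sketch, not a verified answer from the original thread) is to build the arguments string with a template expression, so it references variables('storageAccountName'); inside an ARM expression, single quotes are escaped by doubling them:

```json
"arguments": "[format('-azsub ''Goukumar'' -rg ''pstest'' -keyvault ''pstestk'' -storageaccount ''{0}'' -secretname ''secretarm333''', variables('storageAccountName'))]"
```

The same pattern with concat() would also work; either way, the deploymentScripts resource should then depend on the storage account so the name resolves to a deployed resource.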
Deploying bot channel registration New-AzResourceGroupDeployment CHANNEL_NOT_SUPPORTED

Hello, I'm trying to automate bot creation on Azure using PowerShell. I'm able to create an Azure application using the New-AzureADApplication command and an Azure Bot using the New-AzBotService command. Now I'm struggling to add a new channel registration, the Microsoft Teams channel. The New-AzResourceGroupDeployment command returns the following error (correlation ID: 3ca89268-5443-45f9-96e6-6f5cfd1a7e57):

```
PS C:\Users\XXXRoot> New-AzResourceGroupDeployment -ResourceGroupName $ResourceGroup.ResourceGroupName -TemplateFile $PathTemplate
New-AzResourceGroupDeployment : 10:52:44 AM - The deployment 'ChannelTemplate' failed with error(s). Showing 1 out of 1 error(s).
Status Message: Channel is not supported (Code:CHANNEL_NOT_SUPPORTED)
CorrelationId: 3ca89268-5443-45f9-96e6-6f5cfd1a7e57
At line:1 char:1
+ New-AzResourceGroupDeployment -ResourceGroupName $ResourceGroup.Resou ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [New-AzResourceGroupDeployment], Exception
    + FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet

DeploymentName          : ChannelTemplate
ResourceGroupName       : XXXPREPROD_group
ProvisioningState       : Failed
Timestamp               : 2021-11-25 9:52:41 AM
Mode                    : Incremental
TemplateLink            :
Parameters              :
Outputs                 :
DeploymentDebugLogLevel :
```

I didn't find anything on this particular error (CHANNEL_NOT_SUPPORTED). Here is the template file I used:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "condition": true,
      "type": "Microsoft.BotService/botServices/channels",
      "apiVersion": "2021-03-01",
      "name": "Office_A_Handle/msteamschannel",
      "location": "global",
      "tags": {},
      "sku": { "name": "F0" },
      "kind": "azurebot",
      "properties": {
        "channelName": "MsTeamsChannel",
        "properties": {
          "callingWebHook": "https://server:port/route",
          "enableCalling": true,
          "isEnabled": true
        }
      }
    }
  ]
}
```

Can someone help me? I'm not sure I understood everything about the New-AzResourceGroupDeployment command. Regards,
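To dig further into this kind of failure, the per-operation details of the failed deployment often carry more context than the summary message. A generic debugging sketch (not a confirmed fix for CHANNEL_NOT_SUPPORTED), using the names from the run above:

```powershell
# Debugging sketch: list the individual operations of the failed deployment
# and their full status messages.
Get-AzResourceGroupDeploymentOperation -ResourceGroupName "XXXPREPROD_group" -DeploymentName "ChannelTemplate" |
    Format-List *
```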
Azure Pipeline - Powershell Task Output

Hello all, I am looking to leverage Azure Pipelines to replace an existing scheduled task that I have running on an on-premises server. Today, my script connects to a SQL database using the dbatools PowerShell module, runs a few queries, dumps the results to files in a specific format, and then uploads these files to an SFTP server.

My repository contains the SQL files to run, a Get-Data.ps1 script that connects to the database, retrieves the data, and exports it to the required format, and a Send-Data.ps1 script that takes the files created by Get-Data.ps1 and sends them to an SFTP server. Send-Data.ps1 has three parameters: localpath, remotepath, and credential.

I am curious how I could have Get-Data.ps1 write the value needed for the localpath parameter as output, and then pass it to Send-Data.ps1 as a second task in the pipeline.

Additionally, is there a recommended way to pass credentials into PowerShell scripts in a pipeline? I saw that I can use the library to store a username and password to pass into a script, but that isn't a PSCredential object. Would I do this by using Azure Key Vault? Could I do it using the library and an inline script step in the pipeline? Or read the values in from the key vault/library, create the PSCredential object, store that as a variable(?), and pass it into the PowerShell script task?

Thanks in advance for any input. Steve
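A common pattern covers both questions (a sketch with hypothetical step, variable, and secret names): emit the path as an output variable with the task.setvariable logging command, consume it in the next step of the same job, and map secret library/Key Vault variables into the script through env: before building a PSCredential.

```yaml
steps:
# Step 1 (hypothetical names): Get-Data.ps1 prints a logging command that turns
# $exportFolder into an output variable called 'localPath' on the step 'getData'.
- powershell: |
    $exportFolder = ".\output"   # wherever Get-Data.ps1 wrote its files
    Write-Host "##vso[task.setvariable variable=localPath;isOutput=true]$exportFolder"
  name: getData
  displayName: 'Run Get-Data.ps1'

# Step 2: read the output variable from the previous step and build a PSCredential
# from secret variables (e.g. from a variable group or an Azure Key Vault task).
- powershell: |
    $securePwd = ConvertTo-SecureString $env:SFTP_PASSWORD -AsPlainText -Force
    $cred = New-Object System.Management.Automation.PSCredential ($env:SFTP_USER, $securePwd)
    ./Send-Data.ps1 -localpath "$(getData.localPath)" -remotepath "/incoming" -credential $cred
  displayName: 'Run Send-Data.ps1'
  env:
    SFTP_USER: $(sftpUser)          # library variable
    SFTP_PASSWORD: $(sftpPassword)  # secret variables must be mapped explicitly via env:
```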
Azure DevOps - How to monitor the files & folders placed into the Build Pipeline Workfolder

When we are building a new pipeline in Azure DevOps, we don't have any visibility on the agent itself and the content placed in its work folder. In many situations, such as debugging, that view is really useful to find a file or a path to use in another task (like a Code Coverage report path). The simplest option is to add a PowerShell step to the pipeline flow with this simple execution.

In YAML mode:

```yaml
- powershell: |
    Write-Host "Show all folder content"
    Get-ChildItem -Path $(Agent.WorkFolder)\*.* -Recurse -Force
  errorActionPreference: continue
  displayName: 'PowerShell Script List folder structure'
  continueOnError: true
```

In visual editor mode, add a PowerShell task with the same inline script (shown as a screenshot in the original post).

At the next execution, the pipeline execution log will contain all folders and files placed in the agent workspace (shown as a screenshot in the original post). You can use the search button or the "View raw log" option to look for the file or folder you need.

This task can be disabled in standard usage and enabled only when you need to debug (see the sketch below).

Fabrice Romelard
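One way to leave the step in the pipeline but only run it on demand (a sketch based on the standard condition syntax and the System.Debug variable; adjust to your own convention):

```yaml
# Sketch: keep the listing step but run it only when System.Debug is 'true'
# (for example when the run is queued with system diagnostics enabled).
- powershell: |
    Write-Host "Show all folder content"
    Get-ChildItem -Path $(Agent.WorkFolder)\*.* -Recurse -Force
  displayName: 'PowerShell Script List folder structure (debug only)'
  continueOnError: true
  condition: eq(variables['System.Debug'], 'true')
```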