storage
PowerShell script to delete all Containers from a Storage Account
After moving the BootDiag settings out of a custom Storage Account, the original Storage Account they used still consumes space for nothing. Cleaning these up is part of the standard clean-up stream that needs to be considered in the FinOps plan. This script will help you clean these Storage Accounts quickly and avoid paying for unused storage.

```powershell
Connect-AzAccount

# Your subscription and storage account details
$MySubscriptionToClean = "MyGuid-MyGuid-MyGuid-MyGuid-MyGuid"
$MyStorageAccountName  = "MyStorageAccountForbootdiags"
$MyStorageAccountKey   = "MySAKeyWithAllCodeProvidedByYourStorageAccountSetting+MZ3cUvdQ=="
$ContainerStartName    = "bootdiag*"

# Set the subscription context
Set-AzContext -Subscription $MySubscriptionToClean
Get-AzContext

# Build a storage context and remove every matching container
$ctx = New-AzStorageContext -StorageAccountName $MyStorageAccountName -StorageAccountKey $MyStorageAccountKey
$myContainers = Get-AzStorageContainer -Name $ContainerStartName -Context $ctx -MaxCount 1000
foreach ($myContainer in $myContainers) {
    Remove-AzStorageContainer -Name $myContainer.Name -Force -Context $ctx
}
```

I used this script to remove millions of BootDiag containers from several Storage Accounts. You can also use it for any other cleanup use case if you need it.
Fab
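Note that a single Get-AzStorageContainer call returns at most one page of results (-MaxCount caps it at 1000 here), so one pass can leave containers behind when there are millions of them. A minimal paged variant, assuming the same $ctx and $ContainerStartName as in the script above:

```powershell
# Page through all matching containers via the continuation token,
# so the loop keeps deleting until nothing matching remains.
$token = $null
do {
    $page = Get-AzStorageContainer -Name $ContainerStartName -Context $ctx `
        -MaxCount 1000 -ContinuationToken $token
    foreach ($container in $page) {
        Remove-AzStorageContainer -Name $container.Name -Force -Context $ctx
    }
    # The last item in each page carries the token for the next page
    $token = ($page | Select-Object -Last 1).ContinuationToken
} while ($null -ne $token)
```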
Linking containers in azure storage blockblob

Hi! I need to link two containers to each other, much like a soft link in a file system. I need it to avoid using storage twice for the exact same file, which I want available under two containers. For example, the file https://account.blob.core.windows.net/1/image.jpg under container 1 needs to be accessible via https://account.blob.core.windows.net/2/image.jpg as well (under container 2). Note that the whole point is to have one blob named image.jpg under two containers, not two identical files! Does anybody have an idea how to make this happen? I did not find any documentation describing such a feature. Thanks!
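For what it's worth, Blob Storage has no soft-link primitive, so there seems to be no supported way to surface one blob under two containers. The closest built-in operation is a server-side copy, which avoids moving data over the wire but, unlike what is asked for here, does store the blob twice. A minimal sketch, assuming a $ctx storage context and the container names from the question:

```powershell
# Server-side copy: no data leaves Azure, but the blob is stored twice.
Start-AzStorageBlobCopy -SrcContainer "1" -SrcBlob "image.jpg" `
    -DestContainer "2" -DestBlob "image.jpg" -Context $ctx

# Optionally block until the copy finishes
Get-AzStorageBlobCopyState -Container "2" -Blob "image.jpg" -Context $ctx -WaitForComplete
```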
How to Sync Azure Blob Storage with AzCopy

A couple of months ago, I wrote a blog post about how you can sync files to Azure Blob storage using AzCopy. Today I got an exciting update on one of my AzCopy issues on GitHub: AzCopy version 10.3 now supports the synchronization feature from Azure Blob storage to Azure Blob storage. This means you can now sync Azure Blob to Azure Blob using AzCopy. You can read more here: https://www.thomasmaurer.ch/2019/11/how-to-sync-azure-blob-storage-with-azcopy/
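A minimal sketch of the blob-to-blob sync, assuming AzCopy 10.3+ is installed and the account names, container names, and <SAS> values are placeholders for your own:

```powershell
# Sync one container to another; only new or changed blobs are transferred.
# Replace <SAS> with your own SAS query strings.
azcopy sync `
    "https://sourceaccount.blob.core.windows.net/source-container?<SAS>" `
    "https://destaccount.blob.core.windows.net/dest-container?<SAS>" `
    --recursive
```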
Azure Site Recovery breaking SQL Server log chain

I'm protecting a SQL Server with ASR and doing backups with Commvault. Commvault is telling me that transaction log backups are being converted to fulls because it detected a break in the log chain, probably caused by another backup product. I dug into it and found that ASR's VSS writer is taking a snapshot once an hour, and SQL Server is seeing this as a full backup. Has anyone had experience with this? I feel like the ASR VSS snapshot shouldn't be counted as a full backup of the database.
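One way to confirm what SQL Server actually recorded is msdb's backup history, where VSS snapshot backups are flagged with is_snapshot = 1. A diagnostic sketch, assuming the SqlServer PowerShell module and a placeholder instance name:

```powershell
# List recent backups and flag VSS snapshots (is_snapshot = 1).
# 'MySqlInstance' is a placeholder for your server\instance name.
Invoke-Sqlcmd -ServerInstance "MySqlInstance" -Database "msdb" -Query @"
SELECT TOP (50)
       database_name,
       backup_start_date,
       type,          -- D = full, L = log, I = differential
       is_snapshot,   -- 1 = VSS/snapshot-based backup
       is_copy_only
FROM   dbo.backupset
ORDER BY backup_start_date DESC;
"@
```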
Azure Recovery purging recovery points?

If you have set up a backup job in Azure and decided at a later date that you do not need quite as many recovery points as you once thought, is it possible to take, for instance, Azure backups that are currently holding 198 recovery points and trim them down to, say, 30, without losing all your backup jobs? Or do you have to delete all your old backups and start completely over?
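For anyone hitting this later: reducing the retention settings on the backup policy should prune the excess recovery points on the next cleanup pass without touching the backup items themselves. A minimal sketch with Az.RecoveryServices; the vault, policy, and retention values are placeholders:

```powershell
# Select the Recovery Services vault (names are placeholders)
$vault = Get-AzRecoveryServicesVault -ResourceGroupName "MyRG" -Name "MyVault"
Set-AzRecoveryServicesVaultContext -Vault $vault

# Fetch the policy and shrink its daily retention to 30 days
$policy = Get-AzRecoveryServicesBackupProtectionPolicy -Name "DefaultPolicy"
$retention = $policy.RetentionPolicy
$retention.DailySchedule.DurationCountInDays = 30

Set-AzRecoveryServicesBackupProtectionPolicy -Policy $policy -RetentionPolicy $retention
```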
Problems mapping Azure file shares on-premises

Hi. I'm new to Azure. I'm an MSP and a Microsoft Partner. I just created my Storage Account and first file share. I'm trying to map the file share from an on-premises Windows 10 workstation and a Windows Server 2012 machine. Neither works. I keep getting the error "network path not found", error code 0x80070035. I tried to map the drive using PowerShell, and from the GUI using the instructions in the Azure Portal (Connect). I have verified that port 445 is open. Is this supported from on-premises locations?
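Mapping from on-premises is supported, and 0x80070035 most often means outbound TCP 445 is blocked somewhere along the path (many ISPs block it even when it looks open locally). A sketch of the usual test-and-map sequence; the account name, share name, and key are placeholders:

```powershell
# 1. Verify TCP 445 reaches the storage endpoint (TcpTestSucceeded should be True)
Test-NetConnection -ComputerName "mystorageaccount.file.core.windows.net" -Port 445

# 2. Persist the storage account credentials
cmd.exe /c "cmdkey /add:mystorageaccount.file.core.windows.net /user:AZURE\mystorageaccount /pass:<storage-account-key>"

# 3. Map the share
net use Z: \\mystorageaccount.file.core.windows.net\myshare /persistent:yes
```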
Moving files to Azure

We are looking to move files to Azure, not just back them up. We have an on-prem server that we use as a file repository, and we would like to accomplish the following:

1. Is there a way to configure backup services, or another app, to move the data to the cloud rather than just create a backup? Ideally the program would pull files and folders off the on-prem server, to make room, and store them in the cloud.
2. Can we configure it so the data is moved after a certain time frame, say everything older than 6 months based on file modification date?
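There is no single built-in "move to Azure" action, but a scheduled script that uploads anything older than the cutoff and then deletes the local copy gets close; Azure File Sync with cloud tiering is the managed alternative worth evaluating. A minimal sketch, assuming AzCopy is installed and the paths, SAS URL, and cutoff are placeholders:

```powershell
# Archive files not modified in the last 6 months, then free local space.
$cutoff  = (Get-Date).AddMonths(-6)
$source  = "D:\FileRepository"
$destSas = "https://mystorageaccount.blob.core.windows.net/archive?<SAS>"

$oldFiles = Get-ChildItem -Path $source -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff }

foreach ($file in $oldFiles) {
    # Upload first; only delete the local copy if the upload succeeded
    azcopy copy $file.FullName $destSas
    if ($LASTEXITCODE -eq 0) {
        Remove-Item $file.FullName
    }
}
```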
Unable to unzip file using Extract archive to folder

Hi Team, I used a Logic App to download a zip file from a URL and saved it in a blob. Now I want to unzip the file, so I used the "Extract archive to folder" action in Logic Apps. But since my zip file is larger than 50 MB, the extract isn't successful and I am facing the below error:

```
{
  "status": 413,
  "message": "The file contains 50.306 megabytes which exceeds the maximum 50 megabytes.\r\nclientRequestId: abcd",
  "error": {
    "message": "The file contains 50.306 megabytes which exceeds the maximum 50 megabytes."
  },
  "source": "azureblob-ci.azconn-ci.p.azurewebsites.net"
}
```

How can I increase this 50 MB threshold and get my action to run? Please help.

Regards,
Mitesh Agrawal
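As far as I know, the 50 MB cap on the connector's "Extract archive to folder" action is fixed rather than configurable, so the usual workaround is to do the extraction outside the connector, for example in an Azure Function the Logic App calls. A minimal sketch of the unzip step with Az.Storage; the account, key, and container names are placeholders:

```powershell
# Download the zip, extract it locally, and upload the contents.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<key>"
$tempZip = Join-Path $env:TEMP "archive.zip"
$tempDir = Join-Path $env:TEMP "archive-extracted"

Get-AzStorageBlobContent -Container "zips" -Blob "archive.zip" -Destination $tempZip -Context $ctx -Force
Expand-Archive -Path $tempZip -DestinationPath $tempDir -Force

# Upload each extracted file into the target container
# (this sketch flattens subfolder paths into the blob name)
Get-ChildItem -Path $tempDir -Recurse -File | ForEach-Object {
    Set-AzStorageBlobContent -Container "unzipped" -File $_.FullName -Blob $_.Name -Context $ctx -Force
}
```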
Reference to database and/or server name in 'Azure.dbo.XXX' is not supported in this version of SQL

I am getting an error that seemingly indicates I'm trying to use a linked server or cross-database query when I am in no way trying to. I've opened a Stack Exchange question with full details, but no one has seen this scenario.

Reference to database and/or server name in 'Azure.dbo.XXX' is not supported in this version of SQL Server (where XXX is my table name)

See the full details on Stack Exchange, but basically I can SELECT, INSERT, and UPDATE against this particular table but cannot DELETE from it. When I try to run a DELETE statement, I receive the error above. I have tried disabling TRIGGERs and various combinations of schema/database/table name in the FROM clause. Pulling my hair out here.
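Since the error cites a database-qualified name ('Azure.dbo.XXX') that the DELETE statement itself doesn't contain, one thing worth checking is whether a trigger, view, or other module fired by the DELETE still uses the three-part form, which Azure SQL Database rejects. A hedged diagnostic sketch; the server and database names are placeholders:

```powershell
# Find any module whose definition still references the table
# with a database-qualified (three-part) name.
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "Azure" -Query @"
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id)        AS module_name
FROM   sys.sql_modules
WHERE  definition LIKE '%Azure.dbo.%';
"@
```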
Azure API Management for Authenticating Internal Tools

Trying to wrap my head around the "best" way to authenticate some internal tooling for our organization that integrates nicely with all the Microsoft 365 resources. I'd like to be able to transparently use our existing Azure AD to authenticate these client-side interfaces (Teams apps, SharePoint web parts, generic web apps/desktop apps, etc.) when they call the back-end web APIs that will exist. I'm trying to stay fairly low cost, as these are mostly going to be tiny apps, e.g. some mild automation and information surfacing. Right now I am looking at Azure Functions + a storage account using the Table storage API + API Management to handle auth for the API. The auth problem with Functions seems to be a fair bit more complicated, even with API Management in front, as the functions seem to be "accessible" by just going to them directly, circumventing the management layer. Does this make sense for some pretty small, almost "toy" applications for now?
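On the point about functions being reachable directly: a common mitigation is to restrict inbound access on the Function App so only the API Management gateway's IP can call it (function keys or the built-in authentication are alternatives). A minimal sketch with Az.Websites; the resource names and IP are placeholders:

```powershell
# Allow only the APIM gateway's public IP to reach the Function App;
# once at least one Allow rule exists, all other traffic is denied.
Add-AzWebAppAccessRestrictionRule -ResourceGroupName "MyRG" `
    -WebAppName "my-function-app" -Name "Allow APIM only" `
    -Priority 100 -Action Allow -IpAddress "203.0.113.10/32"
```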