Announcing the Next Generation Azure Data Box Devices
Microsoft Azure Data Box is an offline data transfer solution that allows you to send petabytes of data into Azure Storage in a quick, inexpensive, and reliable manner. The secure data transfer is accelerated by hardware transfer devices that enable offline data ingestion to Azure. Our customers use the Data Box family to move petabyte-scale data into Azure for backup, archival, data analytics, media and entertainment, training, and workload migrations. We continue to receive requests for moving truly massive amounts of data in a secure, simple, and quick manner, and to address those needs we've designed a new, enhanced product.

About the latest innovation in the Azure Data Box family

Today, we're excited to announce the preview of Azure Data Box 120 and Azure Data Box 525, our next-generation compact, NVMe-based Data Box devices. The new offerings reflect insights gained from working with our customers over the years and understanding their evolving data transfer needs. These new devices incorporate several improvements to accelerate offline data transfers to Azure, including:

- Fast copy - built with NVMe drives for high-speed transfers, improved reliability, and support for faster network connections
- Easy to use - larger capacity offering (525 TB) in a compact form factor for easy handling
- Resilient - ruggedized devices built to withstand rough conditions during transport
- Secure - enhanced physical, hardware, and software security features
- Broader availability - presence in more Azure regions, meeting local compliance standards and regulations

What's new?

Improved speed & efficiency

NVMe devices offer faster data transfer rates, with copy speeds up to 7 GBps via SMB Direct on RDMA (100 GbE) for medium to large files, a 10x improvement in data transfer speed compared to previous-generation devices. Uploads to Azure are up to 5x faster for medium to large files, reducing the lead time for your data to become accessible in the Azure cloud. Networking is improved with support for up to 100 GbE connections, compared to 10 GbE on the older generation of devices. Two options with usable capacities of 120 TB and 525 TB come in a compact form factor meeting OSHA requirements, and devices ship next-day air in most regions. Learn more about the performance improvements on Data Box 120 and Data Box 525.

Enhanced security

The next-generation devices come with several new physical, hardware, and software security enhancements, in addition to the built-in Azure security baseline for Data Box and the Data Box service security measures currently supported by the service:

- Secure boot functionality with hardware root of trust and Trusted Platform Module (TPM) 2.0.
- Custom tamper-proof screws and a built-in intrusion detection system to detect unauthorized device access.
- AES 256-bit BitLocker software encryption for data at rest is available today. Hardware encryption via the RAID controller, which will be enabled by default on these devices, is coming soon. Once available, customers can enable double encryption through both software and hardware encryption to meet their sensitive data transfer requirements.
- These ISTA 6A-compliant devices are built to withstand rough conditions during shipment while keeping both the device and your data safe and intact.

Learn more about the enhanced security features on Data Box 120 and Data Box 525.

Broader Azure region coverage

A recurring request from our customers has been wider availability of our higher-capacity device to ease large migrations. We're happy to share that Data Box 525 will be available across most Azure regions where the Data Box service is currently live. This marks a significant improvement in the availability of a large-capacity device compared to the current Data Box Heavy.
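As a rough sanity check on the quoted figures, filling the larger device at the peak copy speed would take under a day. This is a back-of-the-envelope estimate only; sustained real-world rates will be lower than the peak:

```python
# Back-of-the-envelope transfer-time estimate at the quoted peak copy speed.
# Assumes perfectly sustained throughput, which real workloads rarely achieve.
capacity_tb = 525        # usable capacity of the larger device
speed_gb_per_s = 7       # peak copy speed via SMB Direct on RDMA (100 GbE)

seconds = capacity_tb * 1000 / speed_gb_per_s   # 1 TB = 1000 GB
hours = seconds / 3600
print(f"Filling 525 TB at 7 GB/s takes about {hours:.0f} hours")
```

In practice, file-size mix, network configuration, and source-storage throughput all pull the effective rate below the peak, so plan copy windows accordingly.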
What our customers have to say

For the last several months, we've been working directly with customers of all industries and sizes as they leverage the next-generation devices for their data migration needs. Customers love the larger capacity with form-factor familiarity, the seamless setup, and the faster copy.

"This new offering brings significant advantages, particularly by simplifying our internal processes. With deployments ranging from hundreds of terabytes to even petabytes, we previously relied on multiple regular Data Box devices—or occasionally Data Box Heavy devices—which required extensive operational effort. The new solution offers sizes better aligned with our needs, allowing us to achieve optimal results with fewer logistical steps. Additionally, the latest generation is faster and provides more connectivity options at data centre premises, enhancing both efficiency and flexibility for large-scale data transfers." - Lukasz Konarzewski, Senior Data Architect, Commvault

"We have been using the devices to move 1 PB of archival media data to Azure Blob Storage. The next generation devices provided a very smooth setup and copy experience, and we were able to transfer data in larger chunks and much faster than before. Overall, this has helped shorten our migration lead times and land the data in the cloud quickly and seamlessly." - Daniel Perry, Kohler

"We have had a positive experience overall with the new Data Box devices to move our data to Azure Blob storage. The devices offer easy plug-and-play installation, detailed documentation, especially for the security features, and good data copy performance. We would definitely consider using it again for future large data migration projects." - Bas Boeijink, Cloud Engineer, Eurofiber Cloud Infra

Sign up for the Preview

The preview is available in the US, Canada, EU, UK, and US Gov Azure regions, and we will continue to expand to more regions in the coming months.
If you are interested in the preview, we want to hear from you.

- Customers can sign up here
- ISV partners can sign up here

You can learn more about the all-new Data Box devices here. We are committed to delivering innovative solutions that lower the barrier for bringing data to Azure. Your feedback is important to us. Tell us what you think about the new Azure Data Box preview by writing to us at DataBoxPM@microsoft.com - we can't wait to hear from you.

Stop by and see us! Now that you've heard about the latest innovation in the product family, come by and see the new devices at the Ignite session "What's new in Azure Storage: Supercharge your data centric workloads" on November 21st starting 11:00 AM CST. You can also drop by the Infra Hub to learn more from our product experts and sign up to try the new devices for your next migration!

Introducing Virtual Machine restore points – a simpler way to protect Azure workloads
Virtual Machine restore points are now generally available. Customers and Azure partners who are looking to build business continuity and disaster recovery solutions can use VM restore points to capture application-consistent and crash-consistent backups natively on the Azure platform. These can then be used to restore disks and VMs in scenarios such as data loss, data corruption, or disaster recovery.

Azure File Sync: faster, more secure and Windows Server 2025 support
Azure File Sync enables seamless tiering of data from on-premises Windows Servers to Azure Files for hybrid use cases and simplified migration. It lets you keep the performance, flexibility, and compatibility of your on-premises file server while leveraging the scale and cost effectiveness of Azure Files. The latest updates for Azure File Sync bring a host of exciting features and improvements:

- Faster server onboarding and disaster recovery (7x improvement), significantly reducing the time to access data on new server endpoints.
- Significantly improved sync performance (10x improvement), reducing the time to migrate shares and sync a large number of changes (for example, permission changes).
- Windows Server 2025 support, ensuring that organizations can stay on the cutting edge of technology. Windows Server 2025 introduces enhanced capabilities, offering better scalability, security, and cloud integration.
- Copilot in Azure can help you quickly troubleshoot and resolve common Azure File Sync issues.
- Managed identities support is now in preview, enabling a more secure method of authenticating to your Azure file shares.

In this blog post, we'll explore these key updates and what they mean for businesses looking to maximize their Azure File Sync experience. Whether it's reducing your on-premises footprint or ensuring seamless and secure cloud integration, now is the ideal time to embrace Azure File Sync and take full advantage of what it has to offer.

Faster server provisioning and improved disaster recovery for Azure File Sync server endpoints

One of the most significant updates in Azure File Sync is the dramatic reduction in the time required to provision new server endpoints. Previously, setting up a new server endpoint could take hours or even days, but with the v19 release and later, we've drastically cut the time it takes to access data on the new server endpoint.
This enhancement is critical for disaster recovery and is especially impactful when the Azure file share contains millions of files and folders. Furthermore, to enhance the management experience, we've introduced a Provisioning Steps tab in the portal, which allows you to easily determine when server endpoints are ready for use. You can now access data before syncing is complete: as users or applications navigate through their data, the system prioritizes the relevant items for quicker access, eliminating the need to wait for a full download. These improvements help businesses get their server endpoints up and running quickly, improving overall operational efficiency. For more information, see the Create an Azure File Sync server endpoint documentation.

Improved sync performance for migrations & bulk updates

Another exciting update for Azure File Sync is the substantial improvement in sync performance, now reaching up to 200 items per second, a tenfold improvement over the past two years. This enhancement strengthens Azure File Sync's role as a seamless migration tool, enabling faster data transfers, especially those that require a large number of file changes (for example, when file permissions are changed). It's particularly beneficial for customers aiming to replace on-premises file servers and manage larger data sizes with Azure File Sync.

Support for Windows Server 2025

Azure File Sync now supports Windows Server 2025, which has improved security, performance, and manageability. The Azure File Sync extension for Windows Admin Center now supports Windows Server versions from Windows Server 2025 down to Windows Server 2012 R2, making Azure File Sync suitable for a wide range of organizations regardless of their current server version. Azure File Sync facilitates the modernization of file servers, allowing organizations to seamlessly transition to newer servers running Windows Server 2025.

The integration with Windows Admin Center (WAC) provides centralized management, offering a unified interface for managing configurations across multiple File Sync servers. This integration simplifies the management process, reducing complexity and saving time. With this configuration, businesses can use Windows Server as a fast cache for their Azure file share and optionally implement cloud tiering for more efficient data management.

Enhancing File Sync with Copilot in Azure

With Copilot in Azure, you can supercharge your Azure deployments by taking advantage of AI technology that simplifies troubleshooting and resolution. Whether it's network misconfigurations, incorrect RBAC permissions, or accidental file share deletions, Copilot makes fixing these issues faster and easier than ever. Copilot automatically detects errors and misconfigurations, guides you through the necessary steps to resolve them, and can even take action on your behalf to fix common problems instantly.

If you encounter challenges with Azure File Sync due to incorrect network settings, simply enter a prompt like "Help me troubleshoot Azure File Sync issues." Copilot in Azure will walk you through the steps to identify and correct the network misconfigurations, ensuring that your files sync smoothly again. By leveraging Copilot's intelligent capabilities, you save time on manual troubleshooting and gain the confidence to resolve issues independently, maximizing productivity and minimizing downtime in your Azure environment. For more information, see Troubleshoot and resolve Azure File Sync issues using Microsoft Copilot.

Preview: Managed identities support for enhanced security

Azure File Sync now includes support for managed identities (MI).
This feature allows organizations to authenticate to Azure file shares using an Entra ID identity, replacing the need for a shared key. Managed identities support enables more secure authentication across several areas of Azure File Sync, including:

- Storage Sync Service authentication to Azure file shares
- Registered server authentication to Azure file shares
- Registered server authentication to the Storage Sync Service

For more information, see How to use managed identities with Azure File Sync (preview).

Get Started with File Sync

Don't have Azure File Sync yet? To get started, see How to deploy Azure File Sync.

Share Your Feedback

Your feedback is invaluable to us as it shapes and refines Azure File Sync and Azure Files. Please take a moment to share your feedback with us.

Enable Secure access to Azure Storage Account across multiple subscriptions
Public read access to Azure containers and blob storage is an easy and convenient way to share data; however, it also poses a security risk. For enhanced security, public access can be disallowed for the entire storage account, regardless of the public access setting on any individual container within it. Disallowing public access at the storage account level prevents a user from enabling public access for any container in that account.

Ensuring secure access to storage accounts across subscriptions can become tedious as you grow. Here is a solution that helps you disallow public access to storage accounts at scale. You can extract the list of all storage accounts from your Azure subscription(s) and use that .csv file as input to the solution below to disallow public access to storage account containers across all your subscriptions.

Prerequisites:

- The Az modules must be installed.
- The service principal created in Step 1 must have Contributor-level access to the subscriptions.

Steps to follow:

Step 1: Create a service principal

Please refer to:
https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal
https://docs.microsoft.com/en-us/powershell/azure/create-azure-service-principal-azureps?view=azps-5.7.0

After creating the service principal, retrieve the following values:
- Tenant Id
- Client Secret
- Client Id

Step 2: Create a PowerShell function that generates the authorization token

```powershell
function Get-apiHeader {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory=$true)]
        [System.String]
        [ValidateNotNullOrEmpty()]
        $TENANTID,
        [Parameter(Mandatory=$true)]
        [System.String]
        [ValidateNotNullOrEmpty()]
        $ClientId,
        [Parameter(Mandatory=$true)]
        [System.String]
        [ValidateNotNullOrEmpty()]
        $PasswordClient,
        [Parameter(Mandatory=$true)]
        [System.String]
        [ValidateNotNullOrEmpty()]
        $resource
    )
    # Request a token via the OAuth2 client-credentials flow
    $tokenresult = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$TENANTID/oauth2/token?api-version=1.0" -Method Post -Body @{
        "grant_type"    = "client_credentials"
        "resource"      = "https://$resource/"
        "client_id"     = "$ClientId"
        "client_secret" = "$PasswordClient"
    }
    $token = $tokenresult.access_token
    $Header = @{
        'Authorization' = "Bearer $token"
        'Host'          = "$resource"
        'Content-Type'  = 'application/json'
    }
    return $Header
}
```

Step 3: Invoke the API to retrieve the authorization token using the function created above

Note: Replace $TenantId, $ClientId and $ClientSecret with the values captured in Step 1.

```powershell
$AzureApiheaders = Get-apiHeader -TENANTID $TenantId -ClientId $ClientId -PasswordClient $ClientSecret -resource "management.azure.com"
```

Step 4: Extract the list of storage accounts across accessible subscriptions

```powershell
$TableData = @()
$subscriptionIdList = (Get-AzSubscription).Id
foreach ($subscriptionId in $subscriptionIdList) {
    $resourceURL = "https://management.azure.com/subscriptions/$($subscriptionId)/providers/Microsoft.Storage/storageAccounts?api-version=2021-01-01"
    $resourcedetails = Invoke-RestMethod -Uri $resourceURL -Headers $AzureApiheaders -Method GET
    # Accumulate the resource IDs so accounts from every subscription are kept
    $TableData += $resourcedetails.value.id
}
```

Step 5: Enable secure access to the storage accounts

```powershell
foreach ($Data in $TableData) {
    $resourceid = $Data
    $resourceURL = "https://management.azure.com$($resourceid)?api-version=2021-02-01"
    $resourcedetails = Invoke-RestMethod -Uri $resourceURL -Headers $AzureApiheaders -Method GET
    $resourcelocation = $resourcedetails.location
    $permissions = $resourcedetails.properties.allowBlobPublicAccess
    if ($permissions -eq $false) {
        Write-Output "Public access to Storage Account: $($resourcedetails.name) is already disabled"
    }
    else {
        Write-Output "Changing ACL for Storage Account: $($resourcedetails.name)"
        $body = @"
{
    "location": "$($resourcelocation)",
    "properties": {
        "allowBlobPublicAccess": false
    }
}
"@
        Invoke-RestMethod -Uri $resourceURL -Method Put -Headers $AzureApiheaders -Body $body
    }
}
```

Overall script:

```powershell
# Replace $TenantId, $ClientId and $ClientSecret with the values from Step 1.
function Get-apiHeader {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory=$true)][System.String][ValidateNotNullOrEmpty()]$TENANTID,
        [Parameter(Mandatory=$true)][System.String][ValidateNotNullOrEmpty()]$ClientId,
        [Parameter(Mandatory=$true)][System.String][ValidateNotNullOrEmpty()]$PasswordClient,
        [Parameter(Mandatory=$true)][System.String][ValidateNotNullOrEmpty()]$resource
    )
    $tokenresult = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$TENANTID/oauth2/token?api-version=1.0" -Method Post -Body @{
        "grant_type"    = "client_credentials"
        "resource"      = "https://$resource/"
        "client_id"     = "$ClientId"
        "client_secret" = "$PasswordClient"
    }
    $Header = @{
        'Authorization' = "Bearer $($tokenresult.access_token)"
        'Host'          = "$resource"
        'Content-Type'  = 'application/json'
    }
    return $Header
}

$AzureApiheaders = Get-apiHeader -TENANTID $TenantId -ClientId $ClientId -PasswordClient $ClientSecret -resource "management.azure.com"

foreach ($subscriptionId in (Get-AzSubscription).Id) {
    # List all storage accounts in the current subscription
    $resourceURL = "https://management.azure.com/subscriptions/$($subscriptionId)/providers/Microsoft.Storage/storageAccounts?api-version=2021-01-01"
    $resourcedetails = Invoke-RestMethod -Uri $resourceURL -Headers $AzureApiheaders -Method GET
    foreach ($Data in $resourcedetails.value.id) {
        $resourceURL = "https://management.azure.com$($Data)?api-version=2021-02-01"
        $accountdetails = Invoke-RestMethod -Uri $resourceURL -Headers $AzureApiheaders -Method GET
        if ($accountdetails.properties.allowBlobPublicAccess -eq $false) {
            Write-Output "Public access to Storage Account: $($accountdetails.name) is already disabled"
        }
        else {
            Write-Output "Changing ACL for Storage Account: $($accountdetails.name)"
            $body = @"
{
    "location": "$($accountdetails.location)",
    "properties": {
        "allowBlobPublicAccess": false
    }
}
"@
            Invoke-RestMethod -Uri $resourceURL -Method Put -Headers $AzureApiheaders -Body $body
        }
    }
}
```

References:
https://docs.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-configure?tabs=powershell
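For teams working outside PowerShell, the same two REST calls can be assembled in any language. Below is a minimal Python sketch using only the standard library that builds the token request and the PUT which disables public blob access. The function names here are illustrative, not part of any SDK, and actually sending the requests (via urllib.request.urlopen) still requires a real tenant and credentials:

```python
import json
import urllib.parse
import urllib.request

MGMT = "https://management.azure.com"

def token_request(tenant_id, client_id, client_secret):
    """Build the OAuth2 client-credentials POST mirroring Steps 2-3."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token?api-version=1.0"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "resource": f"{MGMT}/",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")

def disable_public_access_request(resource_id, location, token):
    """Build the PUT mirroring Step 5: set allowBlobPublicAccess to false."""
    url = f"{MGMT}{resource_id}?api-version=2021-02-01"
    body = json.dumps({
        "location": location,
        "properties": {"allowBlobPublicAccess": False},
    }).encode()
    return urllib.request.Request(
        url, data=body, method="PUT",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
```

Each returned Request object would be sent with urllib.request.urlopen inside the same subscription/account loops shown in the PowerShell steps above; the sketch only covers request construction, not paging or error handling.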