Data + Storage
**Azure ADF ServiceNow connector can't retrieve table columns, but the same login can via the REST API**
I tried to create a pipeline with a copy activity to extract data from a table on our ServiceNow dev platform. I first used the latest version of the ServiceNow connector, but it didn't work. When I tried to import the schema, it showed the error message below:

```
Failed to load. The API request to ServiceNow failed. Request Url: https://airtrunkautemp.service-now.com/api/now/table/sys_dictionary?sysparm_query=name%3dfacilities_request^ORname%3dsm_order^ORname%3dtask, Status Code: Forbidden, Error message: {"error":{"message":"Insufficient rights to query records","detail":"Field(s) present in the query do not have permission to be read"},"status":"failure"} Activity ID: 5a99e871-893d-4426-809e-0b22654248f8
```

I then tried the legacy version of the ServiceNow connector, extracting the full table with a query. After executing the pipeline, only one column, sys_id, was returned. I contacted ServiceNow support about the issue; they checked and got back to me that the login's access is fine. I then wrote a Python script that retrieves data from the same table via the REST API, and it works: I can extract all table columns without the insufficient-rights issue. Has anyone run into this before? How did you solve it?
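For comparison, here is a minimal sketch of the kind of direct Table API call that worked outside ADF. The instance URL, table name, and credentials are placeholders:

```python
import requests

# Placeholders: substitute your own instance, table, and credentials.
INSTANCE = "https://your-instance.service-now.com"
TABLE = "facilities_request"

resp = requests.get(
    f"{INSTANCE}/api/now/table/{TABLE}",
    params={"sysparm_limit": 10},           # fetch a small sample
    auth=("username", "password"),          # basic auth; OAuth also works
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

for record in resp.json()["result"]:
    print(sorted(record.keys()))            # all columns come back, not just sys_id
```

Note that the failing request in the error above is not against the data table itself but against sys_dictionary, the table that stores column definitions, which the connector queries to import the schema. So an account can read the data rows while still lacking read access to sys_dictionary, which would explain why the direct Table API call succeeds where the connector's schema import fails.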
**Unable to process AAS model connecting to Azure SQL with Service Account**

Hello, I have built a demo SSAS model that I am hosting on an Azure Analysis Services server. The model connects to an Azure SQL database in my tenant (the database is the default AdventureWorks provided by Azure when creating your first DB). To connect to Azure SQL, I created an app (service principal) and granted it reader access to my Azure SQL DB. If I log in to the Azure SQL DB from SSMS with this account, using Microsoft Entra service principal authentication with ClientId@TenantId as the username and the secret value as the password, I can log in and SELECT from the tables.

However, when I try to process the SSAS model, I get an error. For reference, below is the TMSL script that sets the data source part of the model after deployment via YAML pipelines (variables are replaced at run time). I think the issue lies in the "AuthenticationKind" value I have provided in the credential, but I can't figure out what to use. When I create the data source like this and process, I get the error: Failed to save modifications to the server. Error returned: '<ccon>Windows authentication has been disabled in the current context.</ccon>'. I don't understand why, since I am not using the Windows authentication kind. Every other keyword I tried in the "AuthenticationKind" field returns the error "AuthenticationKind not supported". Any help on how to change this script would be appreciated.

```json
{
  "createOrReplace": {
    "object": {
      "database": "$(AAS_DATABASE)",
      "dataSource": "$(AZSQLDataSourceName)"
    },
    "dataSource": {
      "type": "structured",
      "name": "$(AZSQLDataSourceName)",
      "connectionDetails": {
        "protocol": "tds",
        "address": {
          "server": "$(AZSQLServer)"
        },
        "initialCatalog": "$(AZSQLDatabase)"
      },
      "credential": {
        "AuthenticationKind": "ServiceAccount",
        "username": "$(AZSQL_CLIENT_ID)@$(AZSQL_TENANT_ID)",
        "password": "$(AZSQL_CLIENT_SECRET)"
      }
    }
  }
}
```
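As a cross-check outside both AAS and SSMS, here is a minimal sketch of the same service principal login using pyodbc's ActiveDirectoryServicePrincipal mode. The server, database, and app registration values are placeholders, and SalesLT.Customer assumes the AdventureWorksLT sample schema:

```python
import pyodbc

# Placeholders: substitute your server, database, and app registration values.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:yourserver.database.windows.net,1433;"
    "Database=AdventureWorks;"
    "Authentication=ActiveDirectoryServicePrincipal;"
    "UID=<client-id>;"        # the app's client ID
    "PWD=<client-secret>;"    # the app's secret value
)
row = conn.cursor().execute("SELECT TOP 1 * FROM SalesLT.Customer").fetchone()
print(row)
```

If this succeeds, the principal and its database permissions are sound, which narrows the problem down to how the AAS credential object expresses that identity.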
**Troubleshooting Azure Function App Proxy with Private Blob Container Access for Static Web App**

Recently, I shared a problem I'm facing in my testing environment with a friend. I've decided to bring this issue to an open forum discussion to gather additional insights. I hope you can help me figure out what might be missing in my configuration.

**Context:** I'm trying to replicate a solution in my test environment but encountering difficulties in a specific scenario.

**Scenario:** I have a Function App acting as a proxy for a Static Web App hosted in a Blob Container. This Blob Container is set to private access, meaning public access is disabled.

**The Problem:** The goal is for my Function App to authorize users and direct them correctly to the Static Web App. However, it's not working as expected.

**What I've tried so far:**
1. Configured Managed Identity for the Function App and granted the necessary permissions to the Blob Container.
2. Properly set up authentication and created the App Registration, which works flawlessly.
3. Verified that the proxy functions correctly when the Blob Container's public access is enabled.

**Current behavior:**
- When public access to the Blob Container is enabled, everything works fine.
- When public access is disabled, even with the proxy configured, access fails and a "resource not found" error is returned.

**My questions are:**
1. Do I need to configure something additional in the proxy definition file?
2. Is there a specific setting, like a private endpoint or something similar, that I should implement to resolve this issue?

**Additional considerations:** I haven't configured a private endpoint yet, but I'm considering whether this would be the most appropriate solution for my case. My initial expectation was that granting the necessary permissions to the Function App via Managed Identity would solve the issue, but it hasn't. I appreciate any guidance or suggestions you can provide!
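For reference, a minimal sketch of the read-and-relay pattern the proxy likely needs once public access is off: the function fetches the blob over the data plane with its managed identity and returns the bytes itself, instead of redirecting the browser to a blob URL it cannot authenticate against. This assumes the identity holds a data-plane role such as Storage Blob Data Reader (a control-plane Reader role is not enough), and all names are placeholders:

```python
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"  # placeholder
CONTAINER = "<static-site-container>"                            # placeholder

app = func.FunctionApp()

# DefaultAzureCredential resolves to the Function App's managed identity in Azure.
blob_service = BlobServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())

@app.route(route="site/{*path}", auth_level=func.AuthLevel.ANONYMOUS)
def serve_static(req: func.HttpRequest) -> func.HttpResponse:
    # Map the request path to a blob; fall back to the site's index page.
    path = req.route_params.get("path") or "index.html"
    downloader = blob_service.get_blob_client(CONTAINER, path).download_blob()
    return func.HttpResponse(
        downloader.readall(),
        mimetype=downloader.properties.content_settings.content_type
        or "application/octet-stream",
    )
```

If the current proxy only rewrites or redirects URLs, that would match the observed behavior: the storage service sees an anonymous request and refuses it once public access is disabled, regardless of the roles granted to the Function App's identity.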
**Azure support team not responding to support request**

I am posting here because I have not received a response to my support request, despite my plan stating that I should hear back within 8 hours. It has now gone a day beyond that limit, and I am still waiting for assistance with this urgent matter. This issue is critical for my operations, and the delay is unacceptable. The ticket/reference number for my original support request was 2410100040000309, and I have created a brand new service request with ID 2412160040010160. I need this addressed immediately.
**PowerShell Script to remove all Blobs from Storage account**

With a large number of blobs in a storage account, manual cleanup from the portal is complicated and time-consuming, as it works in sets of 10,000. This script is simple and can be executed in the background to delete all items from a given blob container. You have to specify the storage account connection string and the blob container name.

```powershell
[string]$myConnectionString = "DefaultEndpointsProtocol=https;AccountName=YourStorageAccountName;AccountKey=YourKeyFromStorageAccountConnectionString;EndpointSuffix=core.windows.net"
[string]$ContainerName = "YourBlobContainerName"

$context = New-AzStorageContext -ConnectionString $myConnectionString

[int]$blobCountBefore = (Get-AzStorageBlob -Container $ContainerName -Context $context).Count
Write-Host "Total number of blobs in the container before deletion: $blobCountBefore" -ForegroundColor Yellow

# Delete every blob in the container (Remove-AzStorageBlob is the Az cmdlet;
# the old Remove-AzureStorageBlob belongs to the retired AzureRM module and
# does not mix with Get-AzStorageBlob).
Get-AzStorageBlob -Container $ContainerName -Context $context | ForEach-Object {
    $_ | Remove-AzStorageBlob
}

[int]$blobCountAfter = (Get-AzStorageBlob -Container $ContainerName -Context $context).Count
Write-Host "Total number of blobs in the container after deletion: $blobCountAfter" -ForegroundColor Green
```

It was used on a large blob storage container with more than 5 million blob items.

Sources:
- https://learn.microsoft.com/en-us/powershell/module/az.storage/new-azstoragecontext?view=azps-13.0.0#examples
- https://learn.microsoft.com/en-us/answers/questions/1637785/what-is-the-easiest-way-to-find-the-total-number-o
- https://stackoverflow.com/questions/57119087/powershell-remove-all-blobs-in-a-container

Fab
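If the one-at-a-time pipeline above is too slow for millions of blobs, a batched variant along these lines may help: it pages through the listing lazily instead of materializing every blob in memory, and deletes up to 256 blobs per service call. Connection string and container name are placeholders:

```python
from azure.storage.blob import ContainerClient

# Placeholders: substitute your connection string and container name.
container = ContainerClient.from_connection_string(
    "<connection-string>", container_name="<container>"
)

deleted = 0
batch = []
for blob in container.list_blobs():        # lazily pages through the listing
    batch.append(blob.name)
    if len(batch) == 256:                  # blob batch API caps at 256 per call
        container.delete_blobs(*batch)
        deleted += len(batch)
        batch.clear()
if batch:                                  # flush the final partial batch
    container.delete_blobs(*batch)
    deleted += len(batch)

print(f"Deleted {deleted} blobs")
```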
**Adding users to an AD group with Azure Functions/Logic Apps**

I want to add users to an Entra ID/Azure AD group. The list of users will be retrieved from a REST API call with Azure Functions, and then saved into a database, probably Azure SQL. I'm planning on then using Azure Logic Apps to connect the database to the AD group. How can I make the script run every time the REST API changes? Can I add users to the AD group from SQL? Is there a better way to go about this?
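For the group-update step specifically, here is a minimal sketch of the Microsoft Graph call that adds one member to a group, assuming an app registration with the GroupMember.ReadWrite.All application permission. Every ID and credential below is a placeholder:

```python
import msal
import requests

# Placeholders: substitute your tenant, app registration, and object IDs.
TENANT = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
GROUP_ID = "<group-object-id>"
USER_ID = "<user-object-id>"

# Acquire an app-only token for Microsoft Graph.
token = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT}",
    client_credential=CLIENT_SECRET,
).acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])["access_token"]

# Add the user as a group member; succeeds with 204 No Content.
resp = requests.post(
    f"https://graph.microsoft.com/v1.0/groups/{GROUP_ID}/members/$ref",
    headers={"Authorization": f"Bearer {token}"},
    json={"@odata.id": f"https://graph.microsoft.com/v1.0/directoryObjects/{USER_ID}"},
)
resp.raise_for_status()
```

In a Logic App, I believe the Azure AD connector offers an equivalent "Add user to group" action. As for running "every time the REST API changes": unless the API can push notifications via a webhook, the usual substitute is a Recurrence trigger that polls the API and diffs the result against what is already stored in the SQL table.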
**Digital event: Microsoft Data and Analytics Forum**

Join the Microsoft Data and Analytics Forum on October 30, 2024, at 8:00 AM Pacific Time to ensure your data is organized, secure, and ready for AI innovation. This one-day digital event will give attendees the insight they need to unify data and analytics on an open and governed foundation and streamline data transformation, business intelligence, and generative AI using Microsoft solutions.

By joining the Microsoft Data and Analytics Forum, you'll:
- Hear real-world success stories and industry best practices from other organizations using Microsoft analytics tools to drive business growth and efficiency.
- Future-proof your skills and stay ahead of the curve with future trends and developments in data analytics and cloud computing.
- Gain insight into the latest advancements and innovations across the Microsoft Intelligent Data Platform.
- Learn about product updates, best practices, and cost-saving programs straight from Microsoft leaders, experts, and partners.
- See the products in action through breakout sessions and live demos, which helps you understand how to apply these tools to your own business scenarios.

Register now to make sure you're a part of the Microsoft Data and Analytics Forum.

Microsoft Data and Analytics Forum
Wednesday, October 30, 2024
8:00 AM-10:00 AM Pacific Time (UTC-7)
**Setting up Azure for Adobe Analytics File Retrieval**

I'm not sure if this is the right forum for this, or if I'm going to word this correctly, but I will give it a try. Adobe Analytics has a feature where you can import what they call classification files: basically additional data you can import to augment your analytics data. To do this you need to set up an account in their interface, and we are looking to use Azure SAS. Below is the information this account setup asks for. [screenshot not included]

Once this account is created, you set up a Location in Adobe Analytics. The Location Account is the one set up in the previous step. [screenshot not included]

This is all fine, but you will notice that you don't tell Adobe where the file is (at least I don't think you do). Adobe doesn't provide any guidance as to what needs to be done on the Azure side. They provide links to Azure documentation, but I am told that the documentation is not great. Or maybe we are just overlooking something.

I am wondering how you would set up Azure to be able to store the file and allow Adobe to access it? I hope this makes sense. If you need additional details, I'll be happy to get them.
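In case it helps: my understanding is that the account form above is asking for a SAS-based credential, so the Azure side reduces to creating a container and issuing a SAS token with whatever rights Adobe needs. Here is a sketch of generating a container SAS with azure-storage-blob; the permissions are an assumption (Adobe's docs don't spell them out) and all account values are placeholders:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

# Placeholders: substitute your storage account values.
ACCOUNT = "<storage-account-name>"
CONTAINER = "<container-for-adobe-files>"
ACCOUNT_KEY = "<account-key>"

sas = generate_container_sas(
    account_name=ACCOUNT,
    container_name=CONTAINER,
    account_key=ACCOUNT_KEY,
    # Assumed permissions; trim to whatever Adobe actually requires.
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=365),
)

# The container URL plus the SAS query string is the credential Adobe would use.
print(f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}?{sas}")
```

That would also explain why the Adobe form never asks where the file is: a container-scoped SAS URL identifies both the storage location and the access rights in one string.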
**Creating Logic App to Identify Low Storage Devices from Intune**

Hello everyone, I'm seeking some assistance with creating a Logic App. I need to identify devices in Intune that have 5 GB or less of available space and receive an email with the details of these devices, including their names. Is this achievable?
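This should be achievable: Intune exposes free disk space through Microsoft Graph, where each deviceManagement/managedDevices entry carries a freeStorageSpaceInBytes property (readable with the DeviceManagementManagedDevices.Read.All permission). A Logic App HTTP action can issue the same query; the sketch below shows it in Python, filtering client-side since I am not certain the property supports $filter. Tenant and app values are placeholders:

```python
import msal
import requests

# Placeholders: substitute your tenant and app registration values.
TENANT = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
THRESHOLD = 5 * 1024**3   # 5 GB in bytes

# Acquire an app-only token for Microsoft Graph.
token = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT}",
    client_credential=CLIENT_SECRET,
).acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])["access_token"]

url = (
    "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices"
    "?$select=deviceName,freeStorageSpaceInBytes"
)
low = []
while url:                                 # follow @odata.nextLink paging
    page = requests.get(url, headers={"Authorization": f"Bearer {token}"}).json()
    low += [d for d in page["value"] if d["freeStorageSpaceInBytes"] <= THRESHOLD]
    url = page.get("@odata.nextLink")

for d in low:
    print(d["deviceName"], d["freeStorageSpaceInBytes"])
```

From there, a Recurrence trigger plus an Office 365 Outlook "Send an email" action in the Logic App covers the notification half.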