Microsoft Information Governance
Unlocking the Power of Microsoft Purview for ChatGPT Enterprise
In today's rapidly evolving technology landscape, data security and compliance are key. Microsoft Purview offers a robust solution for managing and securing interactions with AI-based solutions. This integration not only enhances data governance but also ensures that sensitive information is handled with the appropriate controls. Let's dive into the benefits of this integration and outline the steps to integrate with ChatGPT Enterprise specifically. The integration works for Entra-connected users on the ChatGPT workspace; if you have needs that go beyond this, please tell us why and how it impacts you.

Benefits of Integrating ChatGPT Enterprise with Microsoft Purview

Enhanced Data Security: By integrating ChatGPT Enterprise with Microsoft Purview, organizations can ensure that interactions are securely captured and stored within their Microsoft 365 tenant. This includes user text prompts and AI app text responses, providing a comprehensive record of communications.

Compliance and Governance: Microsoft Purview offers a range of compliance solutions, including Insider Risk Management, eDiscovery, Communication Compliance, and Data Lifecycle & Records Management. These tools help organizations meet regulatory requirements and manage data effectively.

Customizable Detection: The integration allows detection using built-in and custom classifiers for sensitive information, which can be tailored to the specific needs of the organization, helping ensure that sensitive data is identified and protected. The audit data streams into Advanced Hunting and the Unified Audit events, which can generate visualizations of trends and other insights.

Seamless Integration: The ChatGPT Enterprise integration uses the Purview API to push data into compliant storage, ensuring that external data sources cannot access and push data directly. This provides an additional layer of security and control.

Step-by-Step Guide to Setting Up the Integration
1. Get the Object ID for the Purview account in your tenant: Go to portal.azure.com and search for "Microsoft Purview" in the search bar. Click on "Microsoft Purview accounts" in the search results. Select the Purview account you are using and copy the account name. Go back to portal.azure.com and search for "Enterprise" in the search bar. Click on Enterprise applications, remove the Application type filter, select All applications under Manage, search for the account name, and copy the Object ID.

2. Assign Graph API roles to your managed identity application: Assign Purview API roles to your managed identity application by connecting to Microsoft Graph using Cloud Shell in the Azure portal. Open a PowerShell window in portal.azure.com, run Connect-MgGraph, and authenticate and sign in to your account. Run the following cmdlet to get the service principal ID for the Purview API app in your organization:

```powershell
(Get-MgServicePrincipal -Filter "AppId eq '9ec59623-ce40-4dc8-a635-ed0275b5d58a'").Id
```

The next command grants the Purview.ProcessConversationMessages.All permission to the Microsoft Purview account, allowing classification processing. Update PrincipalId (and the -ServicePrincipalId argument) with the Object ID retrieved in step 1, and ResourceId with the service principal ID retrieved in the previous step:

```powershell
$bodyParam = @{
    "PrincipalId" = "{ObjectId}"
    "ResourceId"  = "{ResourceId}"
    "AppRoleId"   = "a4543e1f-6e5d-4ec9-a54a-f3b8c156163f"
}
New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId '{ObjectId}' -BodyParameter $bodyParam
```

We also need to add permission for the application to read user accounts so that ChatGPT Enterprise users are correctly mapped to Entra accounts. First, run the following command to get the service principal ID for the Graph app in your organization:

```powershell
(Get-MgServicePrincipal -Filter "AppId eq '00000003-0000-0000-c000-000000000000'").Id
```

The following step adds the User.Read.All permission to the Purview application. Again, update PrincipalId with the Object ID from step 1 and ResourceId with the service principal ID from the previous step:

```powershell
$bodyParam = @{
    "PrincipalId" = "{ObjectId}"
    "ResourceId"  = "{ResourceId}"
    "AppRoleId"   = "df021288-bdef-4463-88db-98f22de89214"
}
New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId '{ObjectId}' -BodyParameter $bodyParam
```

3. Store the ChatGPT Enterprise API key in Key Vault: The steps for setting up Key Vault integration for the Data Map can be found here: Create and manage credentials for scans in the Microsoft Purview Data Map | Microsoft Learn. Once set up, the secret will appear in Key Vault.

4. Integrate the ChatGPT Enterprise workspace with Purview: Create a new data source in the Purview Data Map that connects to the ChatGPT Enterprise workspace. Go to purview.microsoft.com and select Data Map (search for it if you do not see it on the first screen). Select Data sources, then Register. Search for ChatGPT Enterprise, select it, and provide your ChatGPT Enterprise ID. Create the first scan by selecting Table view and filtering on ChatGPT. Add your Key Vault credentials to the scan, test the connection, and once complete, click Continue. On the screen that follows, if everything looks correct, click Save and run. Validate progress by clicking on the scan name; completion of the first full scan may take an extended period of time, and depending on size it may take more than 24 hours. Clicking on the scan name expands all the runs for that scan. When the scan completes, you can start to use the DSPM for AI experience to review interactions with ChatGPT Enterprise. The mapping to users is based on the ChatGPT Enterprise connection to Entra, with prompts and responses stored in the user's mailbox.
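The two role assignments above can also be combined into a single script. This is a sketch only: replace the placeholder Object ID with the value from step 1, and note that the scopes passed to Connect-MgGraph are an assumption about what your admin account needs.

```powershell
# Sketch: assign both the Purview API and Microsoft Graph app roles to the
# Purview managed identity in one pass. Requires a Graph session with
# sufficient privileges (the scopes below are an assumption).
Connect-MgGraph -Scopes "AppRoleAssignment.ReadWrite.All","Application.Read.All"

$purviewObjectId = "{ObjectId}"   # Object ID of the Purview enterprise application (step 1)

# App ID -> app role ID pairs: Purview API (Purview.ProcessConversationMessages.All)
# and Microsoft Graph (User.Read.All), as used in step 2 above.
$assignments = @(
    @{ AppId = "9ec59623-ce40-4dc8-a635-ed0275b5d58a"; AppRoleId = "a4543e1f-6e5d-4ec9-a54a-f3b8c156163f" },
    @{ AppId = "00000003-0000-0000-c000-000000000000"; AppRoleId = "df021288-bdef-4463-88db-98f22de89214" }
)

foreach ($a in $assignments) {
    # Resolve the resource service principal for each app in this tenant
    $resourceId = (Get-MgServicePrincipal -Filter "AppId eq '$($a.AppId)'").Id
    $bodyParam = @{
        "PrincipalId" = $purviewObjectId
        "ResourceId"  = $resourceId
        "AppRoleId"   = $a.AppRoleId
    }
    New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $purviewObjectId -BodyParameter $bodyParam
}
```

Running it twice will report an error for already-existing assignments, which is harmless for this scenario.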
5. Review and monitor data: Please see this article for required permissions and guidance around Microsoft Purview Data Security Posture Management (DSPM) for AI: Microsoft Purview data security and compliance protections for Microsoft 365 Copilot and other generative AI apps | Microsoft Learn. Use Purview DSPM for AI analytics and Activity Explorer to review interactions and classifications, and expand individual prompts and responses from ChatGPT Enterprise.

6. Microsoft Purview Communication Compliance: Communication Compliance (hereafter CC) is a feature of Microsoft Purview that allows you to monitor and detect inappropriate or risky interactions with ChatGPT Enterprise. You can monitor and detect requests and responses that are inappropriate based on ML models, regular sensitive information types, and other classifiers in Purview. This can help you identify jailbreak and prompt injection attacks and flag them to IRM and for case management. Detailed steps to configure CC policies and supported configurations can be found here.

7. Microsoft Purview Insider Risk Management: We believe that Microsoft Purview Insider Risk Management (hereafter IRM) can serve a key role in protecting your AI workloads long term. With its adaptive protection capabilities, IRM dynamically adjusts user access based on evolving risk levels. In the event of heightened risk, IRM can enforce Data Loss Prevention (DLP) policies on sensitive content, apply tailored Entra Conditional Access policies, and initiate other necessary actions to effectively mitigate potential risks. This strategic approach helps you apply more stringent policies where they matter, avoiding a boil-the-ocean approach and allowing your team to get started using AI. To get started, use the signals available to you, including CC signals, to raise IRM tickets and enforce adaptive protection. You should create your own custom IRM policy for this, and include Defender signals as well.
Based on elevated risk, you may choose to block users from accessing certain assets such as ChatGPT Enterprise. Please see this article for more detail: Block access for users with elevated insider risk - Microsoft Entra ID | Microsoft Learn.

8. eDiscovery: eDiscovery of AI interactions is crucial for legal compliance, transparency, accountability, risk management, and data privacy protection. Many industries must preserve and discover electronic communications and interactions to meet regulatory requirements. Including AI interactions in eDiscovery ensures organizations comply with these obligations and preserves relevant evidence for litigation. This process also helps maintain trust by enabling the review of AI decisions and actions, demonstrating due diligence to regulators. See Microsoft Purview eDiscovery solutions | Microsoft Learn.

9. Data Lifecycle Management: Microsoft Purview offers robust solutions to manage AI data from creation to deletion, including classification, retention, and secure disposal. This ensures that AI interactions are preserved and retrievable for audits, litigation, and compliance purposes. Please see this article for more information: Automatically retain or delete content by using retention policies | Microsoft Learn.

Closing: By following these steps, organizations can leverage the full potential of Microsoft Purview to enhance the security and compliance of their ChatGPT Enterprise interactions. This integration not only provides peace of mind but also empowers organizations to manage their data more effectively. The integration is still in preview and some of the features listed are not fully integrated; please reach out to us if you have any questions or additional requirements.

Microsoft Security in Action: Zero Trust Deployment Essentials for Digital Security
The Zero Trust framework is widely regarded as a key security model and a commonly referenced standard in modern cybersecurity. Unlike legacy perimeter-based models, Zero Trust assumes that adversaries will sometimes get access to some assets in the organization, and you must build your security strategy, architecture, processes, and skills accordingly. Implementing this framework requires a deliberate approach to deployment, configuration, and integration of tools.

What is Zero Trust? At its core, Zero Trust operates on three guiding principles:

Assume Breach (Assume Compromise): Assume attackers can and will successfully attack anything (identity, network, device, app, infrastructure, etc.) and plan accordingly.

Verify Explicitly: Protect assets against attacker control by explicitly validating that all trust and security decisions use all relevant available information and telemetry.

Use Least Privileged Access: Limit access of a potentially compromised asset, typically with just-in-time and just-enough-access (JIT/JEA) and risk-based policies like adaptive access control.

Implementing a Zero Trust architecture is essential for organizations to enhance security and mitigate risks. Microsoft's Zero Trust framework focuses on six key technological pillars: Identity, Endpoints, Data, Applications, Infrastructure, and Networks. This blog provides a structured approach to deploying each pillar.

1. Identity: Secure Access Starts Here

Ensure secure and authenticated access to resources by verifying and enforcing policies on all user and service identities. Here are some key deployment steps to get started:

Implement Strong Authentication: Enforce Multi-Factor Authentication (MFA) for all users to add an extra layer of security. Adopt phishing-resistant methods, such as passwordless authentication with biometrics or hardware tokens, to reduce reliance on traditional passwords.
Leverage Conditional Access Policies: Define policies that grant or deny access based on real-time risk assessments, user roles, and compliance requirements. Restrict access from non-compliant or unmanaged devices to protect sensitive resources.

Monitor and Protect Identities: Use tools like Microsoft Entra ID Protection to detect and respond to identity-based threats. Regularly review and audit user access rights to ensure adherence to the principle of least privilege. Integrate threat signals from diverse security solutions to enhance detection and response capabilities.

2. Endpoints: Protect the Frontlines

Endpoints are frequent attack targets. A robust endpoint strategy ensures secure, compliant devices across your ecosystem. Here are some key deployment steps to get started:

Implement Device Enrollment: Deploy Microsoft Intune for comprehensive device management, including policy enforcement and compliance monitoring. Enable self-service registration for BYOD to maintain visibility.

Enforce Device Compliance Policies: Set and enforce policies requiring devices to meet security standards, such as up-to-date antivirus software and OS patches. Block access from devices that do not comply with established security policies.

Utilize and Integrate Endpoint Detection and Response (EDR): Deploy Microsoft Defender for Endpoint to detect, investigate, and respond to advanced threats on endpoints, and integrate it with Conditional Access. Enable automated remediation to quickly address identified issues.

Apply Data Loss Prevention (DLP): Leverage DLP policies alongside Insider Risk Management (IRM) to restrict sensitive data movement, such as copying corporate data to external drives, and address potential insider threats with adaptive protection.

3. Data: Classify, Protect, and Govern

Data security spans classification, access control, and lifecycle management.
Here are some key deployment steps to get started:

Classify and Label Data: Use Microsoft Purview Information Protection to discover and classify sensitive information based on predefined or custom policies. Apply sensitivity labels to data to dictate handling and protection requirements.

Implement Data Loss Prevention (DLP): Configure DLP policies to prevent unauthorized sharing or transfer of sensitive data. Monitor and control data movement across endpoints, applications, and cloud services.

Encrypt Data at Rest and in Transit: Ensure sensitive data is encrypted both when stored and during transmission. Use Microsoft Purview Information Protection for data security.

4. Applications: Manage and Secure Application Access

Securing access to applications ensures that only authenticated and authorized users interact with enterprise resources. Here are some key deployment steps to get started:

Implement Application Access Controls: Use Microsoft Entra ID to manage and secure access to applications, enforcing Conditional Access policies. Integrate SaaS and on-premises applications with Microsoft Entra ID for seamless authentication.

Monitor Application Usage: Deploy Microsoft Defender for Cloud Apps to gain visibility into application usage and detect risky behaviors. Set up alerts for anomalous activities, such as unusual download patterns or access from unfamiliar locations.

Ensure Application Compliance: Regularly assess applications for compliance with security policies and regulatory requirements. Implement measures such as Single Sign-On (SSO) and MFA for application access.

5. Infrastructure: Securing the Foundation

It's vital to protect both the assets you rely on today and the business-critical services your organization creates each day. Cloud and on-premises infrastructure hosts crucial assets that are frequently targeted by attackers.
Here are some key deployment steps to get started:

Implement Security Baselines: Apply secure configurations to VMs, containers, and Azure services using Microsoft Defender for Cloud.

Monitor and Protect Infrastructure: Deploy Microsoft Defender for Cloud to monitor infrastructure for vulnerabilities and threats. Segment workloads using Network Security Groups (NSGs).

Enforce Least Privilege Access: Implement Just-In-Time (JIT) access and Privileged Identity Management (PIM). JIT mechanisms grant privileges on demand only when required, reducing the time that rarely used privileges remain exposed. Regularly review access rights to align with current roles and responsibilities.

6. Networks: Safeguard Communication and Limit Lateral Movement

Network segmentation and monitoring are critical to Zero Trust implementation. Here are some key deployment steps to get started:

Implement Network Segmentation: Use Virtual Networks (VNets) and Network Security Groups (NSGs) to segment and control traffic flow.

Secure Remote Access: Deploy Azure Virtual Network Gateway and Azure Bastion for secure remote access. Require device and user health verification for VPN access.

Monitor Network Traffic: Use Microsoft Defender for Endpoint to analyze traffic and detect anomalies.

Taking the First Step Toward Zero Trust

Zero Trust isn't just a security model; it's a cultural shift. By implementing the six pillars comprehensively, organizations can potentially enhance their security posture while enabling seamless, secure access for users. Implementing Zero Trust can be complex and may require additional deployment approaches beyond those outlined here. Cybersecurity needs vary widely across organizations and deployment isn't one-size-fits-all, so these steps might not fully address your organization's specific requirements.
However, this guide is intended to provide a helpful starting point or checklist for planning your Zero Trust deployment. For a more detailed walkthrough and additional resources, visit Microsoft Zero Trust Implementation Guidance. The Microsoft Security in Action blog series is an evolving collection of posts that explores practical deployment strategies, real-world implementations, and best practices to help organizations secure their digital estate with Microsoft Security solutions. Stay tuned for our next blog on deploying and maximizing your investments in Microsoft Threat Protection solutions.

Optimizing OneDrive Retention Policies with Administrative Units and Adaptive Scopes
A special thank-you note to Ashwini_Anand for contributing to the content of this blog. In today's digital landscape, efficient data retention management is a critical priority for organizations of all sizes. Organizations can optimize their OneDrive retention policies, ensuring efficient and compliant data management tailored to their unique user base and licensing arrangements.

Scenario: Contoso Org encountered a distinct challenge: managing data retention for their diverse user base of 200,000 employees, which includes 80,000 users with F3 licenses and 120,000 users with E3 and E5 licenses. Per Microsoft licensing, F3 users are allocated only 2 GB of OneDrive storage, whereas E3 and E5 users are provided with a much larger allocation of 5 TB. This difference required creating separate retention policies for these user groups. The challenge was further complicated by the fact that retention policies use the same storage quota for preserving deleted data. If a unified retention policy were applied to all users, such as retaining data for 6 years before deletion, F3 users' OneDrive storage could fill up within a year or less (depending on usage patterns). This would leave F3 users unable to delete or save new files, severely disrupting productivity and data management. To address this, it is essential to create a separate retention policy for E3 and E5 users, ensuring that the policy applies only to these users and excludes F3 users. This blog discusses the process of designing and implementing such a policy for a large user base segmented by license type, ensuring efficient data management and uninterrupted productivity.

Challenges with Retention Policy Configuration for Large Organizations

1. Adaptive Scope: Adaptive scopes in Microsoft Purview allow you to dynamically target policies based on specific attributes or properties such as department, location, email address, custom Exchange attributes, etc.
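For example, an adaptive scope can key off any supported attribute using Entra dynamic membership rule syntax. The fragment below is purely illustrative (the department value is hypothetical); it uses the same syntax as the license-based rule shown later in this post:

```
user.department -eq "Finance" -and user.accountEnabled -eq true
```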
Refer to this link for the list of supported attributes: Adaptive scopes | Microsoft Learn. Limitation: Although adaptive scopes can filter by user properties, Contoso, being a large organization, had already utilized all 15 custom attributes for other purposes. Additionally, user attributes could not be used to segregate users based on licenses. This made it challenging to repurpose any attribute as the filter criterion for applying the retention policy to a specific set of users. Furthermore, refinable strings used in SharePoint do not work for OneDrive sites.

2. Static Scope: Static scope refers to manually selected locations (e.g., specific users, mailboxes, or sites) where the policy is applied. The scope remains fixed and does not automatically adjust. Limitation: Static scope allows the inclusion or exclusion of mailboxes and sites but is limited to 100 sites and 1,000 mailboxes, making it challenging to use for large organizations.

Proposed Solution: Administrative Units with Adaptive Scope

Addressing the above challenges requires combining Administrative Units with adaptive scopes to create a retention policy targeting E3- and E5-licensed users. An Admin Unit is a container within an organization that can hold users, groups, or devices; it helps manage and organize users more efficiently, especially in large or complex environments. This approach allows organizations to selectively apply retention policies based on user licenses, enhancing both efficiency and governance.

Prerequisites: For the Administrative Unit, a Microsoft Entra ID P1 license. For the retention policy, refer to this link: Microsoft 365 guidance for security & compliance - Service Descriptions | Microsoft Learn.

Configuration Steps

Step 1: Create an Administrative Unit: Navigate to the Microsoft Entra admin center at https://entra.microsoft.com/#home. Click on 'Identity' and then 'Show more'. Expand 'Roles & admins'. Proceed to 'Admin units' -> Add.
Define a name for the Administrative Unit and click on 'Next: Assign roles'. No role assignment is required, so click on 'Next: Review + create', then click 'Create'. For more information about creating administrative units, refer to this link: Create or delete administrative units - Microsoft Entra ID | Microsoft Learn.

Step 2: Update Dynamic Membership: Select the Administrative Unit created in Step 1. Navigate to 'Properties'. Choose 'Dynamic User' as the membership type. Click on 'Add a dynamic query' for dynamic user members, then click 'Edit' for the rule syntax. To include E3- and E5-licensed users who use OneDrive, you need to include users with the SharePoint Online (Plan 2) service plan enabled. Use the query below to define the dynamic membership:

```
user.assignedPlans -any (assignedPlan.servicePlanId -eq "5dbe027f-2339-4123-9542-606e4d348a72" -and assignedPlan.capabilityStatus -eq "Enabled")
```

Click 'Save' to update the dynamic membership rules, then 'Save' again to update the Administrative Unit. Open the Administrative Unit and click on the 'Users' tab to check whether users have started to populate. Note: It may take some time to replicate all users, depending on the size of your organization. Wait a few minutes and then check again.

Step 3: Create an Adaptive Scope in the Purview portal: Access https://purview.microsoft.com. Navigate to 'Settings'. Expand 'Roles & scopes' and click on 'Adaptive scopes'. Create a new adaptive scope, providing a name and description. Proceed to select the Administrative Unit created earlier. (It takes time for the Administrative Unit to become visible; wait a while if it does not appear immediately.) Click on 'Add' and 'Next'. Select 'Users' and 'Next'. Once the Admin Unit is selected, specify the criteria used to select users within it (this is the second level of filtering available).
However, in this case, since we needed to select all users of the Admin Unit, the criterion below was used. Click 'Add attribute' and form the following query:

Email addresses is not equal to $null

Note: You can apply any other filter if you need to select a subset of users within the Admin Unit based on your business use case. Click on 'Next', then review and 'Submit' the adaptive scope.

Step 4: Create a Retention Policy using the Adaptive Scope: Access https://purview.microsoft.com/datalifecyclemanagement/overview. Navigate to 'Policies' and then 'Retention policies'. Create a 'New retention policy', providing a name and description. Proceed to select the Administrative Unit created earlier by clicking on 'Add or remove admin units'. Choose 'Adaptive' and click on 'Next'. Click on 'Add scopes' and select the previously created adaptive scope. Click on 'Next' to proceed, select the desired retention settings, then click Next and Finish.

Outcome: By implementing Admin Units with adaptive scopes, organizations can effectively overcome the challenges of applying OneDrive retention policies to distinct and large sets of users. This approach facilitates the dynamic addition of required users, eliminating the need for custom attributes and manual user management. Users are dynamically added to or removed from the policy based on license status, ensuring seamless compliance management.

FAQ:

Why is it important to differentiate retention policies based on user licensing tiers? Differentiating retention policies by licensing tier ensures that each user group has policies tailored to its specific needs and constraints, avoiding issues such as storage limitations for users with lower-tier licenses like F3.

How many Exchange custom attributes are typically available? There are typically 15 Exchange custom attributes available, which can limit scalability when dealing with a large user base.
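A quick way to check which of the 15 Exchange custom attributes are already populated in your tenant (useful when deciding whether any can be repurposed for adaptive scopes) is a short Exchange Online PowerShell sketch. This assumes Connect-ExchangeOnline has already been run with sufficient permissions, and results will vary per tenant:

```powershell
# Sketch: count how many mailboxes populate each Exchange custom attribute
# (CustomAttribute1 through CustomAttribute15).
$mailboxes = Get-Mailbox -ResultSize Unlimited

1..15 | ForEach-Object {
    $name  = "CustomAttribute$_"
    $inUse = @($mailboxes | Where-Object { $_.$name }).Count
    "{0}: {1} mailboxes in use" -f $name, $inUse
}
```

Any attribute that reports zero mailboxes in use is a candidate for scoping, though as noted earlier, no attributes could be repurposed in Contoso's case.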
What challenge does adaptive scoping face when including a large number of OneDrive sites? Adaptive scoping is constrained by the limited number of custom attributes available. While custom attributes help categorize and manage OneDrive sites, the finite number of attributes can restrict scalability and flexibility.

Why are refinable strings a limitation for adaptive scoping in OneDrive? Refinable strings are a limitation for adaptive scoping in OneDrive because their usage is restricted to SharePoint only.

What are the limitations of static scoping for OneDrive sites? Static scoping for OneDrive sites is limited by the strict cap of including or excluding only 100 sites, which makes it of limited use in larger environments.

Do we need any licenses to create an administrative unit with dynamic membership? Yes, a Microsoft Entra ID P1 license is required for all members of the group.

Safely activate your data estate with Microsoft Purview
60% of CDOs cite data integration challenges as a top pain point due to a lack of knowledge of where relevant data resides [1]. Companies operate on multi-platform, multi-cloud data estates, making it harder than ever to seamlessly discover, secure, govern, and activate data. This increases the overall complexity of enabling users to responsibly derive insights and drive business value from data. In the era of AI, data governance is no longer an afterthought; data security and data governance are now both table stakes. Data governance is not a new concept, but with the proliferation of AI and an evolving regulatory landscape, it is critical for safeguarding data related to AI-driven business innovation. With 95% of organizations implementing or developing an AI strategy [2], customers are facing emerging governance challenges, such as:

False signals: The lack of clean, accurate data can cause false signals in AI, which can trigger consequential business outcomes or lead to incorrect reported forecasting and regulatory fines.

Time to insight: Data scientists and analysts spend 60-80% of their time on data access and preparation to feed AI initiatives, which leads to staff frustration, increased OPEX, and delays in critical AI innovation priorities.

Shadow innovation: Data innovation outside governance can increase business risks around data leakage, oversharing, or inaccurate outcomes.

This is why federated governance has surfaced as a top priority for security and data leaders: it unlocks data innovation while maintaining appropriate data oversight to help minimize risks. Customers are seeking more unified solutions that enable data security and governance seamlessly across their complex data estates. To help customers better respond to these needs, Microsoft Purview unifies data security, data governance, and data compliance solutions across the heterogeneous data estate for the era of AI.
Microsoft Purview also works closely with Microsoft Fabric to integrate capabilities that help seamlessly secure and govern data, reducing the risks associated with data activation across the Microsoft Intelligent Data Platform and the Microsoft Cloud portfolio. Microsoft Fabric delivers a pre-integrated and optimized SaaS environment for data teams to work faster together over secure and governed data within the Fabric environment. Combining the strengths of Microsoft Purview and Microsoft Fabric enables organizations to more confidently leverage Fabric to unlock data innovation across data engineers, analysts, data scientists, and developers, while Purview enables data security teams to extend its advanced data security value, and the central data office to extend its advanced data governance value, across Fabric, Azure, M365, and the heterogeneous data estate. Furthering this vision, today Microsoft is announcing:

1. A new name for the Purview data governance solution, Purview Unified Catalog, to better reflect its growing catalog capabilities
2. Integration with the new OneLake catalog
3. A new data quality scan engine
4. Purview Analytics in OneLake
5. Expanded Data Loss Prevention (DLP) capabilities for Fabric lakehouse and semantic models

Introducing Unified Catalog: a new name for the visionary solution

The Microsoft Purview data governance solution, made generally available in September, delivers comprehensive visibility, data confidence, and responsible innovation, for greater business value in the era of AI. The solution streamlines metadata from disparate catalogs and sources, like OneLake, Databricks Unity, and Snowflake Polaris, into a unified experience.
To better reflect these comprehensive customer benefits, Microsoft Purview Data Catalog is being renamed to Microsoft Purview Unified Catalog, exemplifying growing catalog capabilities such as deeper data quality support for more cloud sources and Purview Analytics in OneLake. A data catalog serves as a comprehensive inventory of an organization's data assets. As the Microsoft Purview Unified Catalog continues to add capabilities within curation, data quality, and third-party platform integration, the new Unified Catalog name reflects its current cross-cloud capability. This cross-cloud capability is illustrated in the figure below: the data product shown contains data assets from multiple different sources, including a Fabric lakehouse table, a Snowflake table, and an Azure Databricks table. With the proper curation of analytics into data products, data users can govern data assets more easily than ever.

Figure 1: Curation of a data product from disparate data sources within Purview's Unified Catalog

Introducing OneLake catalog (Preview)

As announced in the Microsoft Fabric blog earlier today, the OneLake catalog is a solution purpose-built for data engineers, data scientists, developers, analysts, and data consumers to explore, manage, and govern data in Fabric. The new OneLake catalog works with Purview by seamlessly connecting data assets governed by the OneLake catalog into the Purview Unified Catalog, enabling the central data office to centrally govern and manage data assets. The Purview Unified Catalog offers data stewards and data owners advanced capabilities for data curation, advanced data quality, end-to-end data lineage, and an intuitive global catalog that spans the data estate. For data leaders, Unified Catalog offers built-in reports for actionable insights into data health and risks, and the ability to confidently govern data across the heterogeneous data estate.
In Figure 2, you can see how Fabric data is seamlessly curated into the Corporate Emissions Created by AI for CY2024 data product, built with data assets from OneLake.

Figure 2: Data product curated with Fabric assets

Introducing a new data quality scan engine for deeper data quality (Preview)

Purview offers deeper data quality support through a new data quality scan engine for big data platforms, including Microsoft Fabric, Databricks Unity Catalog, Snowflake, Google BigQuery, and Amazon S3, supporting open standard file and table formats. In short, this new scan engine allows businesses to centrally perform rich data quality management from within the Purview Unified Catalog. In Figure 3, you can see how users can run different data quality rules on a particular asset, in this case a table hosted in OneLake. When a user clicks "run quality scan", the scanner runs a deep scan on the data itself, applying the data quality rules in real time and updating the quality score for that asset.

Figure 3: Running a data quality scan on an asset living in OneLake

Introducing Purview Analytics in OneLake (Preview)

To further an organization's data quality management practice, data stewards can now leverage a new Purview Analytics in OneLake capability, in preview, to extract tenant-specific metadata from the Purview Unified Catalog and publish it to OneLake. This new capability enables deeper data quality and lineage investigation using the rich capabilities of Power BI within Microsoft Fabric.

Figure 4: In Unified Catalog settings, a user can add self-serve analytics to Microsoft Fabric

Figure 5: Curated metadata from Purview within Fabric

Expanded Data Loss Prevention (DLP) capabilities for Fabric lakehouses and semantic models

To broaden Purview data security features for Fabric, today we are announcing that the restrict access action in Purview DLP policies now extends to Fabric semantic models.
With the restrict access action, DLP admins can configure policies to detect sensitive information in semantic models and limit access to only internal users or data owners. This control is valuable when a Fabric tenant includes guest users and you want to limit unnecessary access to internal proprietary data. The addition of the restrict access action for Fabric semantic models augments the existing ability to detect the upload of sensitive data to Fabric lakehouses, announced earlier this year. Learn more about the new Purview DLP capabilities for Fabric lakehouses and semantic models in the DLP blog.

Figure 6: Example of restricted access to a Fabric semantic model enforced through a Purview DLP policy.

Summary

With these investments in security and governance, Microsoft Purview is delivering on its vision to extend data protection customer value and innovation across your heterogeneous data estate for reduced complexity and improved risk mitigation. Together, Purview and Fabric set the foundations for a modern intelligent data platform with seamless security and governance to drive AI innovation you can trust.

Learn more

As we continue to innovate our products to expand the security and governance capabilities, check out these resources to stay informed:

https://aka.ms/Try-Purview-Governance
https://www.microsoft.com/en-us/security/business/microsoft-purview
https://aka.ms/try-fabric
This blog is Part 1 of our multi-part series on managing Endpoint DLP rules using PowerShell. In Part 1, we will demonstrate how to use PowerShell to create Endpoint DLP rules with the AdvancedRule, AlertProperties, and EndpointDlpRestrictions parameters. In Part 2, we will cover the same for EndpointDlpBrowserRestrictions.

Step 1: Create the text file with the complex condition as per your requirements and save it. Here is a sample for reference:

```json
{
  "Version": "1.0",
  "Condition": {
    "Operator": "And",
    "SubConditions": [
      {
        "ConditionName": "ContentContainsSensitiveInformation",
        "Value": [
          {
            "Groups": [
              {
                "Name": "Default",
                "Operator": "Or",
                "Sensitivetypes": [
                  {
                    "Name": "Credit Card Number",
                    "Mincount": 1,
                    "Maxcount": 5,
                    "Confidencelevel": "Low"
                  },
                  {
                    "Name": "U.S. Bank Account Number",
                    "Mincount": 5,
                    "Confidencelevel": "Medium"
                  }
                ]
              }
            ],
            "Operator": "And"
          }
        ]
      }
    ]
  }
}
```

In the above example, we are using the condition Content contains sensitive information with the SITs Credit Card Number and U.S. Bank Account Number. You can choose to add or remove SITs and conditions as needed, along with the desired operator. You can also change the confidence level to Low/Medium/High as per your requirements and update the Min/Max counts. We have saved it as advancedrule.txt in our example.

Note: If you do not specify the Min/Max attribute, the value is taken as Any by default. In our example we have not specified the Max attribute for the Bank Account Number, so it takes the default value, i.e. Any.
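Because the AdvancedRule parameter expects well-formed JSON, it can help to sanity-check the saved file before using it. Here is a minimal sketch (the file path matches the example above; ConvertFrom-Json ships with PowerShell 7):

```powershell
# Parse the saved condition file; ConvertFrom-Json throws if the JSON is malformed
try {
    Get-Content -Raw "C:\temp\advancedrule.txt" | ConvertFrom-Json | Out-Null
    Write-Host "advancedrule.txt parsed successfully"
}
catch {
    Write-Host "advancedrule.txt contains invalid JSON: $_"
}
```

A few seconds spent here can save a confusing failure later when the rule is created.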
Here is another example:

```json
{
  "Version": "1.0",
  "Condition": {
    "Operator": "And",
    "SubConditions": [
      {
        "ConditionName": "ContentContainsSensitiveInformation",
        "Value": [
          {
            "Groups": [
              {
                "Name": "Default",
                "Operator": "Or",
                "Labels": [
                  {
                    "Name": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
                    "Id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
                    "Type": "Sensitivity"
                  }
                ]
              }
            ],
            "Operator": "And"
          }
        ]
      },
      {
        "ConditionName": "ContentFileTypeMatches",
        "Value": [
          "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
        ]
      }
    ]
  }
}
```

In this example we are using the conditions Content contains sensitivity label (with a specific label) and Content matches a specific file type. Please ensure you replace the IDs with the appropriate values before saving the file.

Step 2: Define the parameters for EndpointDlpRestrictions, or create a text file for complex restrictions. Here is an example of a simple restriction:

```powershell
$endpointDlpRestrictions = @(@{"Setting"="Print"; "Value"="Block"},@{"Setting"="RemovableMedia"; "Value"="Warn"})
```

In this case we are setting the Print action to Block and Copy to removable USB device to Warn. We can configure each value to Block/Warn/Audit as per our requirements.
Here is an example of creating a text file with a complex restriction:

```json
[
  {
    "defaultmessage": "none",
    "setting": "Print",
    "value": "Block",
    "appgroup": "none",
    "networkLocation": [
      { "priority": "1", "type": "vpn", "action": "Audit" }
    ],
    "printerGroup": [
      { "priority": "1", "id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "action": "Audit" }
    ]
  },
  {
    "setting": "RequireBusinessJustification",
    "value": "Required"
  },
  {
    "setting": "RemovableMedia",
    "defaultmessage": "none",
    "value": "Warn",
    "appgroup": "none"
  },
  {
    "setting": "CloudEgress",
    "defaultmessage": "none",
    "cloudEgressGroup": [
      { "priority": "1", "id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "action": "Audit" }
    ],
    "value": "Warn",
    "appgroup": "none"
  },
  {
    "setting": "PasteToBrowser",
    "defaultmessage": "none",
    "pasteSensitiveDomainsGroup": [
      { "priority": "1", "id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "action": "Audit" }
    ],
    "value": "Block",
    "appgroup": "none"
  },
  {
    "setting": "CopyPaste",
    "defaultmessage": "none",
    "value": "Warn",
    "appgroup": "none",
    "networkLocation": [
      { "priority": "1", "type": "corporateNetwork", "action": "Audit" }
    ]
  }
]
```

We are setting the restrictions below in the above example. The actions and restrictions can be modified as per your requirements. The custom group IDs can be retrieved from the Endpoint DLP settings using PowerShell; make sure to update them before saving the file. We have saved it as endpointdlprestrictions.txt in our example.

| Activity | Action | Network restrictions | Group restrictions |
| --- | --- | --- | --- |
| Print | Block | VPN is set to Audit | A custom printer group with action set to Audit |
| Copy to removable USB device | Warn | | |
| Upload to restricted cloud service domain | Warn | | A custom sensitive service domain group with action set to Audit |
| Paste to browser | Block | | A custom sensitive service domain group with action set to Audit |
| Copy to clipboard | Warn | CorporateNetwork is set to Audit | |

Step 3: Define the parameters:

```powershell
# Define the parameters to read the complex condition from the file we created in Step 1
$data = Get-Content -Path "C:\temp\advancedrule.txt" -ReadCount 0
$AdvancedRuleString = $data | Out-String

# Define the parameters for the DLP rule with the simple restriction
$ruleName = "Endpoint Rule - Restrict Financial Information Sharing Rule"
$PolicyName = "Endpoint Policy - Restrict Financial Information Sharing"
$endpointDlpRestrictions = @(@{"Setting"="Print"; "Value"="Block"},@{"Setting"="RemovableMedia"; "Value"="Block"})
$Notifyendpointuser = @{NotificationContent = "default:The sharing is blocked, please contact the helpdesk for more details" ; NotificationTitle = "default:Restricted"}
$alertProperties = @{AggregationType = "SimpleAggregation" ; VolumeThreshold = "5" ; AlertBy = "Tenant"; Threshold = "15"; TimeWindow = "60"}
```

Note: The notification content and title can be changed as per the notification you would like to configure. Similarly, the values in the alert properties can be changed to meet different requirements.
Step 4: Create the DLP rule:

```powershell
# Create the DLP rule
New-DlpComplianceRule -Name $ruleName -Policy $PolicyName -GenerateAlert admin@xxxx.onmicrosoft.com -ReportSeverityLevel "Medium" -Notifyendpointuser $Notifyendpointuser -EndpointDlpRestrictions $endpointDlpRestrictions -AlertProperties $alertProperties -AdvancedRule $AdvancedRuleString
```

You can use the below if you want to create a DLP rule with a complex Endpoint DLP restriction:

```powershell
# Define the parameters to read the complex condition from the file we created in Step 1
$data = Get-Content -Path "C:\temp\advancedrule.txt" -ReadCount 0
$AdvancedRuleString = $data | Out-String

# Define the parameters for the DLP rule
$ruleName = "Endpoint Rule - Restrict Financial Information Sharing Rule"
$PolicyName = "Endpoint Policy - Restrict Financial Information Sharing"
$Notifyendpointuser = @{NotificationContent = "default:The sharing is blocked, please contact the helpdesk for more details" ; NotificationTitle = "default:Restricted"}
$alertProperties = @{AggregationType = "SimpleAggregation" ; VolumeThreshold = "5" ; AlertBy = "Tenant"; Threshold = "15"; TimeWindow = "60"}

# Create the DLP rule using the EndpointDlpRestrictions file we created in Step 2
New-DlpComplianceRule -Name $ruleName -Policy $PolicyName -GenerateAlert admin@xxxx.onmicrosoft.com -ReportSeverityLevel "Medium" -AlertProperties $alertProperties -Notifyendpointuser $Notifyendpointuser -AdvancedRule $AdvancedRuleString -EndpointDlpRestrictions (Get-Content -Raw ("C:\temp\endpointdlprestrictions.txt") | ConvertFrom-Json -AsHashtable)
```

Note: PowerShell 7 is a must for this to work.
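The New-DlpComplianceRule cmdlet runs in Security & Compliance PowerShell. As a sketch of a typical session (assuming the ExchangeOnlineManagement module is installed and your account holds the required compliance roles; output property names may vary slightly by module version):

```powershell
# Connect to Security & Compliance PowerShell from PowerShell 7
Import-Module ExchangeOnlineManagement
Connect-IPPSSession -UserPrincipalName admin@xxxx.onmicrosoft.com

# After creating the rule, confirm it exists and review its restrictions
Get-DlpComplianceRule -Identity "Endpoint Rule - Restrict Financial Information Sharing Rule" |
    Format-List Name, EndpointDlpRestrictions
```

Verifying the rule this way also makes it easy to spot whether the AdvancedRule condition was accepted as intended.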
Updating the Endpoint DLP (eDLP) settings can be a tedious task when managing an extensive list of service domains, file path exclusions, unallowed apps and browsers, unallowed Bluetooth apps, and network path exclusions. In this blog, we will demonstrate how to efficiently bulk import these settings and maintain an ongoing list.

Prerequisites: Visual Studio Code with an extension that converts CSV to JSON (we are using such an extension in our example).

Step 1: Create a CSV file with the required parameters and values. Here is a sample table with all the parameters for the eDLP global settings:

| Setting | Value | Executable |
| --- | --- | --- |
| CloudAppMode | Block | |
| CloudAppRestrictionList | yahoo.com | |
| CloudAppRestrictionList | hotmail.com | |
| PathExclusion | /Users/*/Desktop/Folder1 | |
| PathExclusion | /Users/*/Desktop/Folder2 | |
| MacPathExclusion | /Users/*/Downloads/Folder1 | |
| MacPathExclusion | /Users/*/Downloads/Folder2 | |
| UnallowedApp | testapp1 | testapp1.exe |
| UnallowedApp | testapp2 | testapp2.exe |
| UnallowedBrowser | Avast Secure Browser | avastbrowser.exe |
| UnallowedBrowser | Firefox | firefox.exe |
| UnallowedBluetoothApp | bluetoothapp1 | bluetoothapp1.exe |
| UnallowedBluetoothApp | bluetoothapp2 | bluetoothapp2.exe |
| UnallowedCloudSyncApp | Notepad++ | notepad++.exe |
| EvidenceStoreSettings | { "FileEvidenceIsEnabled": true, "NumberOfDaysToRetain": 30, "StorageAccounts": [ { "Name": "Test", "BlobUri": "https://test.blob.windows.core.net/" } ], "Store": "CustomerManaged" } | |
| VPNSettings | { "serverAddress": [ "test.vpnus.contoso.com", "test.vpnin.contoso.com" ] } | |
| serverDlpEnabled | TRUE | |
| CustomBusinessJustificationNotification | 1 | |
| MacDefaultPathExclusionsEnabled | TRUE | |
| AdvancedClassificationEnabled | TRUE | |
| BandwidthLimitEnabled | TRUE | |
| DailyBandwidthLimitInMB | 1000 | |
| IncludePredefinedUnallowedBluetoothApps | TRUE | |
| NetworkPathEnforcementEnabled | TRUE | |
| NetworkPathExclusion | \\TestShare\MyFolder | |
| NetworkPathExclusion | \\TestShare\MyFolder1 | |

You can make the necessary changes and add additional rows to add more values per setting as needed. Copy the table to a CSV file, make the necessary changes, and save it.
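If you prefer to stay in PowerShell rather than use a VS Code extension, the CSV-to-JSON conversion can be sketched with built-in cmdlets (the file names here are assumptions matching this example; rows with an empty Executable column will produce empty properties that still need to be cleaned up, similar to Step 3 below):

```powershell
# Convert the settings CSV to JSON using built-in cmdlets
Import-Csv -Path "C:\temp\eDLPGlobalSettings.csv" |
    ConvertTo-Json -Depth 5 |
    Out-File -FilePath "C:\temp\eDLPGlobalSettings.json"
```

The -Depth parameter matters if any cell contains nested JSON, as the EvidenceStoreSettings and VPNSettings rows do.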
Step 2: Convert the CSV to JSON.

1. Open the CSV file in Visual Studio Code.
2. Press Ctrl + Shift + P.
3. Select "Convert CSV to JSON" in the pop-up that appears.
4. A new file will be created in VS Code in JSON format.

Step 3: Remove the unwanted values. Remove values such as the below using the Find and Replace All option (replacing with blank) in VS Code, and save the file in JSON format. We have saved it as eDLPGlobalSettings.json in our case.

```
, "Executable": "\n"
, "Executable\r": "\r\n"
, "Executable\r": "\r"
\r
```

Step 4: Validate that the value TRUE is lower-case in the JSON file; if not, replace it with lower-case using a text editor and save the file.

Step 5: Run the below command to update the eDLP global settings:

```powershell
Set-PolicyConfig -EndpointDlpGlobalSettings (Get-Content -Raw ("C:\temp\eDLPGlobalSettings.json") | ConvertFrom-Json -AsHashtable)
```

Note: Set-PolicyConfig will always override the existing data, hence the recommendation is to keep a running CSV that can be edited, converted, and imported every time.

PS: Please ensure you test this in a test environment before executing it in prod, and always take a backup of the current settings before importing the new one.
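Since the import always overrides what is already configured, one way to take that backup first is sketched below (an assumption-laden sketch: it presumes Get-PolicyConfig in your tenant exposes the current configuration through an EndpointDlpGlobalSettings property; verify the property name in your session before relying on it):

```powershell
# Export the current Endpoint DLP global settings before importing new ones
$backup = (Get-PolicyConfig).EndpointDlpGlobalSettings
$backup | ConvertTo-Json -Depth 10 | Out-File "C:\temp\eDLPGlobalSettings-backup.json"
```

Keeping a dated copy of this export alongside the running CSV makes it straightforward to roll back a bad import.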
This blog is Part 2 of our multi-part series on managing Endpoint DLP rules using PowerShell. In Part 1, we demonstrated how to use PowerShell to create Endpoint DLP rules with the AdvancedRule, AlertProperties, and EndpointDlpRestrictions parameters. In this blog, we will cover the same for EndpointDlpBrowserRestrictions.

Step 1: Create a text file with the condition to restrict browser access. Here is a sample for reference:

```json
{
  "Version": "1.0",
  "Condition": {
    "Operator": "And",
    "SubConditions": [
      {
        "ConditionName": "RestrictBrowserAccess",
        "Value": true
      }
    ]
  }
}
```

We have saved the file as advancedrule.txt in our example.

Step 2: Create a text file with the Endpoint DLP browser restrictions. Here is an example:

```json
[
  {
    "setting": "WebPagePrint",
    "defaultmessage": "none",
    "sitegroup": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "value": "Block"
  },
  {
    "setting": "WebPageCopyPaste",
    "defaultmessage": "none",
    "sitegroup": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "value": "Warn"
  },
  {
    "setting": "WebPageSaveToLocal",
    "defaultmessage": "none",
    "sitegroup": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "value": "Audit"
  },
  {
    "setting": "WebPagePrint",
    "defaultmessage": "none",
    "sitegroup": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "value": "Block"
  },
  {
    "setting": "WebPageCopyPaste",
    "defaultmessage": "none",
    "sitegroup": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "value": "Warn"
  },
  {
    "setting": "WebPageSaveToLocal",
    "defaultmessage": "none",
    "sitegroup": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "value": "Audit"
  }
]
```

We are setting the sensitive site restrictions below in the above example. The actions and groups can be modified as per your requirements; we can also choose to add more groups or remove one of the two. We have saved the file as EndpointDlpbrowserRestrictions.txt in our example.

Note: Please ensure you replace the site group IDs before saving the file.
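One way to find the site group IDs referenced above is to inspect the tenant's Endpoint DLP settings from PowerShell. This is a hedged sketch (it assumes Get-PolicyConfig exposes the global settings, including the custom sensitive service domain group definitions, in your tenant):

```powershell
# Dump the Endpoint DLP global settings and look for the custom group definitions and their IDs
(Get-PolicyConfig).EndpointDlpGlobalSettings | Format-List
```

Copy the GUIDs from the group definitions in the output into the sitegroup fields of the restrictions file.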
| Activity | CustomSensitiveGroup1 action | CustomSensitiveGroup2 action |
| --- | --- | --- |
| Print the site | Block | Block |
| Copy the data from the site | Warn | Warn |
| Save the site as local files (Save As) | Audit | Audit |

Step 3: Define the parameters:

```powershell
# Define the parameters to read the condition from the file we created in Step 1
$data = Get-Content -Path "C:\temp\advancedrule.txt" -ReadCount 0
$AdvancedRuleString = $data | Out-String

# Define the parameters for the DLP rule
$ruleName = "Endpoint Rule - Sensitive Site Restrictions"
$PolicyName = "Endpoint Policy - Sensitive Site Restrictions"
$alertProperties = @{AggregationType = "SimpleAggregation" ; VolumeThreshold = "5" ; AlertBy = "Tenant"; Threshold = "15"; TimeWindow = "60"}
$Notifyendpointuser = @{NotificationContent = "default:The sharing is blocked, please contact the helpdesk for more details" ; NotificationTitle = "default:Restricted"}
```

The notification content can be changed as per the notification you would like to configure. Similarly, the values in the alert properties can be changed to meet different requirements.

Step 4: Create the DLP rule:

```powershell
New-DlpComplianceRule -Name $ruleName -Policy $PolicyName -GenerateAlert admin@xxxx.onmicrosoft.com -ReportSeverityLevel "Medium" -Notifyendpointuser $Notifyendpointuser -AlertProperties $alertProperties -AdvancedRule $AdvancedRuleString -EndpointDlpbrowserRestrictions (Get-Content -Raw ("C:\temp\EndpointDlpbrowserRestrictions.txt") | ConvertFrom-Json -AsHashtable)
```

Note: PowerShell 7 is a must for this to work.
In this blog we'll explain how you can now use adaptive policy scopes to include or exclude shared, resource, and inactive mailboxes from Microsoft 365 retention policies and labels.
Explore the intersection of AI and security in our latest feature, where Microsoft Purview meets M365 Copilot. Dive into the critical role of sensitivity labels, advanced data classification, and encryption in shaping a secure digital workspace. Gain expert insights from industry professionals and discover practical strategies for balancing innovative AI tools with rigorous security protocols.