Blog Series: Charting Your Path to Cyber Resiliency
Part 1: What Is Cyber Resiliency and How Do I Get It?

Recently I was on a call with some security leaders who were interested in how we at Microsoft could help them with cyber resiliency. But when I asked the questions "What does cyber resiliency mean to you?" and "What specific aspects of cyber resilience are you interested in improving?", they struggled to answer. If you're having difficulty with those questions yourself, don't worry; you're not alone.

Cyber resiliency – being able to successfully continue business operations in the face of destructive cyberattacks – is having a Moment these days. It's The New Zero Trust, you might say. But what is cyber resilience really, beyond an industry buzzword or a sales play? What does an organization need to do to become cyber resilient? To understand more, let's start with a look at the history of cyber resiliency and how it has evolved over the last 15 years.

MITRE (best known for its ATT&CK frameworks) was an early leader in the cyber resilience movement. MITRE's 2010 publication Building Secure, Resilient Architectures for Cyber Mission Assurance explained the need for cyber resiliency by emphasizing the operational impact of cyberattacks and the financial cost of recovery, also noting that "the cyber adversary continues to have an asymmetric advantage as we fruitlessly play Whac-A-Mole in response to individual attacks." (Sound familiar?) One year later, MITRE released the first version of its Cyber Resiliency Engineering Framework (CREF). In subsequent years, MITRE followed up with revisions to CREF, along with additional papers on methods and metrics for effectively measuring cyber resiliency. MITRE also developed the CREF Navigator, an online tool to help define and graphically represent cyber resiliency goals, objectives, and techniques as defined by NIST (the National Institute of Standards and Technology).
NIST's 2021 publication SP 800-160 Volume 2 (Rev. 1), Developing Cyber-Resilient Systems, is a comprehensive cyber resiliency framework that builds on CREF. It also gives us the most widely used definition of cyber resiliency: "the ability to anticipate, withstand, recover from, and adapt to adverse conditions, stresses, attacks, or compromises that use or are enabled by cyber resources." Like MITRE's early work, this publication is rooted in systems and software engineering principles and in how engineers in national defense and critical infrastructure need to build resiliency into mission-critical systems. Today, however, we commonly apply this definition and this understanding of cyber resiliency to any organization concerned with minimizing the impact of cyberattacks on its business-critical systems.

The extension of cyber resiliency principles beyond government and critical infrastructure is also evident in the EU's Cybersecurity Strategy for the Digital Decade, presented in December 2020. Although this strategy was chiefly concerned with "EU institutions, bodies and agencies," it also emphasized the increasing dependency of both public and private sectors on digital systems and cybersecurity, noting that financial services, digital services, and manufacturing were among the hardest hit by cybercrime.

Microsoft echoed this idea in our 2022 Digital Defense Report, which featured a special section on cyber resiliency, calling it "a crucial foundation of a connected society." The report emphasized three key cyber resiliency themes:

- the critical link between cyber resiliency and business risk
- the importance of adapting security practices and technologies to keep up with a continuously evolving threat landscape
- the challenges of attaining cyber resiliency when using legacy technologies

Microsoft also maintains a list of 24 key issues impacting cyber resiliency, spanning everything from legacy on-premises resources to cloud technologies and frameworks.
We'll come back to this guidance in Part 2 of our series.

Conclusion

Cyber resiliency is more than the latest industry buzzword. In the first part of this series, we looked at the origins of the cyber resiliency movement, with a focus on two common cyber resiliency frameworks developed by MITRE and NIST. We also looked briefly at Microsoft's approach and some resources we offer customers wanting to improve the resilience of critical business operations in the face of destructive cyberattacks. In the second part of this series, we'll take a closer look at Microsoft's approach to cyber resiliency, from its origins in the days of Trustworthy Computing to present-day guidance on designing security solutions to mitigate the effects of ransomware. Finally, in Part 3 of the series, we'll examine how we can use AI to help with some of the most challenging components of cyber resiliency.
Introducing the Secure Future Initiative: Tech Tips show!

This show provides bite-sized technical tips from Microsoft security experts about how you can implement recommendations from one of the six engineering pillars of Microsoft's Secure Future Initiative in your own environment to uplift your security posture. Hosted by Sarah Young, Principal Security Advocate (_@sarahyo), and Michael Howard, Senior Director, Microsoft Red Team (@michael_howard), the series interviews a range of Microsoft security experts, giving you practical advice about how to implement SFI controls in your organization's environment. The first episode, about phishing-resistant credentials, is live on YouTube and MS Learn.

Upcoming episodes include:

- Using managed identities
- Using secure vaults to store secrets
- Applying ingress and egress control
- Scanning for credentials in code and push protection
- Enabling audit logs for cloud and developing threat detections

Keep up to date with the latest Secure Future Initiative news at aka.ms/sfi

Microsoft Security in Action: Zero Trust Deployment Essentials for Digital Security
The Zero Trust framework is widely regarded as a key security model and a commonly referenced standard in modern cybersecurity. Unlike legacy perimeter-based models, Zero Trust assumes that adversaries will sometimes get access to some assets in the organization, and that you must build your security strategy, architecture, processes, and skills accordingly. Implementing this framework requires a deliberate approach to deployment, configuration, and integration of tools.

What is Zero Trust?

At its core, Zero Trust operates on three guiding principles:

- Assume Breach (Assume Compromise): Assume attackers can and will successfully attack anything (identity, network, device, app, infrastructure, etc.) and plan accordingly.
- Verify Explicitly: Protect assets against attacker control by explicitly validating that all trust and security decisions use all relevant available information and telemetry.
- Use Least Privileged Access: Limit access of a potentially compromised asset, typically with just-in-time and just-enough-access (JIT/JEA) and risk-based policies like adaptive access control.

Implementing a Zero Trust architecture is essential for organizations seeking to enhance security and mitigate risks. Microsoft's Zero Trust framework focuses on six key technological pillars: Identity, Endpoints, Data, Applications, Infrastructure, and Networks. This blog provides a structured approach to deploying each pillar.

1. Identity: Secure Access Starts Here

Ensure secure and authenticated access to resources by verifying and enforcing policies on all user and service identities. Here are some key deployment steps to get started:

- Implement Strong Authentication: Enforce Multi-Factor Authentication (MFA) for all users to add an extra layer of security. Adopt phishing-resistant methods, such as passwordless authentication with biometrics or hardware tokens, to reduce reliance on traditional passwords.
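An MFA requirement like the one above is typically enforced through a Conditional Access policy. As a minimal sketch, here is a policy payload shaped after Microsoft Graph's conditionalAccessPolicy resource; the display name and rollout state are illustrative, and the actual POST to `/identity/conditionalAccess/policies` (along with token acquisition and break-glass account exclusions you would add in practice) is omitted:

```python
import json

def build_mfa_policy(display_name: str) -> dict:
    """Build an illustrative Conditional Access policy payload requiring MFA
    for all users, following the Microsoft Graph conditionalAccessPolicy shape.
    In a real rollout you would start in report-only mode and exclude
    emergency-access accounts before enforcing."""
    return {
        "displayName": display_name,
        # Report-only first; switch to "enabled" once the impact is understood.
        "state": "enabledForReportingButNotEnforced",
        "conditions": {
            "users": {"includeUsers": ["All"]},
            "applications": {"includeApplications": ["All"]},
            "clientAppTypes": ["all"],
        },
        "grantControls": {
            "operator": "OR",
            "builtInControls": ["mfa"],
        },
    }

policy = build_mfa_policy("Require MFA for all users")  # hypothetical name
print(json.dumps(policy, indent=2))
```

Starting in report-only mode is a deliberate choice here: it lets you observe which sign-ins the policy would have blocked before any user is locked out.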
- Leverage Conditional Access Policies: Define policies that grant or deny access based on real-time risk assessments, user roles, and compliance requirements. Restrict access from non-compliant or unmanaged devices to protect sensitive resources.
- Monitor and Protect Identities: Use tools like Microsoft Entra ID Protection to detect and respond to identity-based threats. Regularly review and audit user access rights to ensure adherence to the principle of least privilege. Integrate threat signals from diverse security solutions to enhance detection and response capabilities.

2. Endpoints: Protect the Frontlines

Endpoints are frequent attack targets. A robust endpoint strategy ensures secure, compliant devices across your ecosystem. Here are some key deployment steps to get started:

- Implement Device Enrollment: Deploy Microsoft Intune for comprehensive device management, including policy enforcement and compliance monitoring. Enable self-service registration for BYOD to maintain visibility.
- Enforce Device Compliance Policies: Set and enforce policies requiring devices to meet security standards, such as up-to-date antivirus software and OS patches. Block access from devices that do not comply with established security policies.
- Utilize and Integrate Endpoint Detection and Response (EDR): Deploy Microsoft Defender for Endpoint to detect, investigate, and respond to advanced threats on endpoints, and integrate it with Conditional Access. Enable automated remediation to quickly address identified issues.
- Apply Data Loss Prevention (DLP): Leverage DLP policies alongside Insider Risk Management (IRM) to restrict sensitive data movement, such as copying corporate data to external drives, and address potential insider threats with adaptive protection.

3. Data: Classify, Protect, and Govern

Data security spans classification, access control, and lifecycle management.
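To give a feel for what classification means in practice, here is a toy sketch of pattern-based sensitive-data detection. The regexes below are simplistic stand-ins invented for illustration; Purview's built-in sensitive information types use far more robust patterns, checksums, and confidence scoring:

```python
import re

# Illustrative detectors only; real classifiers validate checksums (e.g. Luhn
# for card numbers) and score confidence rather than matching raw patterns.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),  # 16 digits, optional separators
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitive-info labels detected in the text."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

print(classify("Card: 4111 1111 1111 1111, SSN: 123-45-6789"))
```

Once content is classified, the resulting labels are what downstream controls (sensitivity labels, DLP policies, encryption requirements) key off.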
Here are some key deployment steps to get started:

- Classify and Label Data: Use Microsoft Purview Information Protection to discover and classify sensitive information based on predefined or custom policies. Apply sensitivity labels to data to dictate handling and protection requirements.
- Implement Data Loss Prevention (DLP): Configure DLP policies to prevent unauthorized sharing or transfer of sensitive data. Monitor and control data movement across endpoints, applications, and cloud services.
- Encrypt Data at Rest and in Transit: Ensure sensitive data is encrypted both when stored and during transmission. Use Microsoft Purview Information Protection for data security.

4. Applications: Manage and Secure Application Access

Securing access to applications ensures that only authenticated and authorized users interact with enterprise resources. Here are some key deployment steps to get started:

- Implement Application Access Controls: Use Microsoft Entra ID to manage and secure access to applications, enforcing Conditional Access policies. Integrate SaaS and on-premises applications with Microsoft Entra ID for seamless authentication.
- Monitor Application Usage: Deploy Microsoft Defender for Cloud Apps to gain visibility into application usage and detect risky behaviors. Set up alerts for anomalous activities, such as unusual download patterns or access from unfamiliar locations.
- Ensure Application Compliance: Regularly assess applications for compliance with security policies and regulatory requirements. Implement measures such as Single Sign-On (SSO) and MFA for application access.

5. Infrastructure: Securing the Foundation

It's vital to protect both the assets you have today and the business-critical services your organization is creating each day. Cloud and on-premises infrastructure hosts crucial assets that are frequently targeted by attackers.
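Network Security Groups (NSGs), which figure in both the infrastructure and network steps that follow, are at heart ordered lists of allow/deny rules. As a sketch, here are two ARM-template-style rules that admit SSH to a management subnet only from a named admin range; the rule names, CIDRs, and subnet address are invented for illustration:

```python
import json

def make_rule(name: str, priority: int, access: str, source: str, dest_port: str) -> dict:
    """Build an illustrative NSG security rule in ARM-template style.
    All names and address ranges here are hypothetical."""
    return {
        "name": name,
        "properties": {
            "priority": priority,          # lower number = evaluated first
            "direction": "Inbound",
            "access": access,              # "Allow" or "Deny"
            "protocol": "Tcp",
            "sourcePortRange": "*",
            "destinationPortRange": dest_port,
            "sourceAddressPrefix": source,
            "destinationAddressPrefix": "10.0.1.0/24",  # hypothetical mgmt subnet
        },
    }

# Allow SSH from the admin range first, then deny SSH from everywhere else.
rules = [
    make_rule("AllowAdminSSH", 100, "Allow", "203.0.113.0/24", "22"),
    make_rule("DenyAllInboundSSH", 200, "Deny", "*", "22"),
]
print(json.dumps(rules, indent=2))
```

The ordering matters: because the allow rule has the lower priority number, it is evaluated before the broad deny, which is what makes the segmentation effective.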
Here are some key deployment steps to get started:

- Implement Security Baselines: Apply secure configurations to VMs, containers, and Azure services using Microsoft Defender for Cloud.
- Monitor and Protect Infrastructure: Deploy Microsoft Defender for Cloud to monitor infrastructure for vulnerabilities and threats. Segment workloads using Network Security Groups (NSGs).
- Enforce Least Privilege Access: Implement Just-In-Time (JIT) access and Privileged Identity Management (PIM). JIT mechanisms grant privileges on demand, only when they are required, reducing the exposure time of privileges that people need but use only rarely. Regularly review access rights to align with current roles and responsibilities.

6. Networks: Safeguard Communication and Limit Lateral Movement

Network segmentation and monitoring are critical to Zero Trust implementation. Here are some key deployment steps to get started:

- Implement Network Segmentation: Use Virtual Networks (VNets) and Network Security Groups (NSGs) to segment and control traffic flow.
- Secure Remote Access: Deploy Azure Virtual Network Gateway and Azure Bastion for secure remote access. Require device and user health verification for VPN access.
- Monitor Network Traffic: Use Microsoft Defender for Endpoint to analyze traffic and detect anomalies.

Taking the First Step Toward Zero Trust

Zero Trust isn't just a security model; it's a cultural shift. By implementing the six pillars comprehensively, organizations can strengthen their security posture while enabling seamless, secure access for users. Implementing Zero Trust can be complex and may require additional deployment approaches beyond those outlined here. Cybersecurity needs vary widely across organizations, and deployment isn't one-size-fits-all, so these steps might not fully address your organization's specific requirements.
However, this guide is intended to provide a helpful starting point or checklist for planning your Zero Trust deployment. For a more detailed walkthrough and additional resources, visit Microsoft Zero Trust Implementation Guidance.

The Microsoft Security in Action blog series is an evolving collection of posts that explores practical deployment strategies, real-world implementations, and best practices to help organizations secure their digital estate with Microsoft Security solutions. Stay tuned for our next blog on deploying and maximizing your investments in Microsoft Threat Protection solutions.

Safely activate your data estate with Microsoft Purview
60% of CDOs cite data integration challenges as a top pain point, due to lack of knowledge of where relevant data resides [1]. Companies operate on multi-platform, multi-cloud data estates, making it harder than ever to seamlessly discover, secure, govern, and activate data. This increases the overall complexity of enabling users to responsibly derive insights and drive business value from data. In the era of AI, data governance is no longer an afterthought; data security and data governance are now both table stakes.

Data governance is not a new concept, but with the proliferation of AI and an evolving regulatory landscape, it is critical for safeguarding data related to AI-driven business innovation. With 95% of organizations implementing or developing an AI strategy [2], customers are facing emerging governance challenges, such as:

- False signals: The lack of clean, accurate data can cause false signals in AI, which can trigger consequential business outcomes or lead to incorrectly reported forecasting and regulatory fines.
- Time to insight: Data scientists and analysts spend 60-80% of their time on data access and preparation to feed AI initiatives, which leads to staff frustration, increased OPEX, and delays in critical AI innovation priorities.
- Shadow innovation: Data innovation outside governance can increase business risks around data leakage, oversharing, or inaccurate outcomes.

This is why federated governance has surfaced as a top priority across security and data leaders: it unlocks data innovation while maintaining appropriate data oversight to help minimize risks. Customers are seeking more unified solutions that enable data security and governance seamlessly across their complex data estates. To help customers better respond to these needs, Microsoft Purview unifies data security, data governance, and data compliance solutions across the heterogeneous data estate for the era of AI.
Microsoft Purview also works closely with Microsoft Fabric to integrate capabilities that help seamlessly secure and govern data, reducing the risks associated with data activation across the Microsoft Intelligent Data Platform and the broader Microsoft Cloud portfolio. Microsoft Fabric delivers a pre-integrated, optimized SaaS environment where data teams can work faster together over secure and governed data. Combining the strengths of Microsoft Purview and Microsoft Fabric enables organizations to more confidently use Fabric to unlock data innovation across data engineers, analysts, data scientists, and developers, while Purview lets data security teams extend its advanced data security value, and the central data office its advanced data governance value, across Fabric, Azure, Microsoft 365, and the heterogeneous data estate.

Furthering this vision, today Microsoft is announcing:

1. a new name for the Purview data governance solution, Purview Unified Catalog, to better reflect its growing catalog capabilities;
2. integration with the new OneLake catalog;
3. a new data quality scan engine;
4. Purview Analytics in OneLake; and
5. expanded Data Loss Prevention (DLP) capabilities for Fabric lakehouses and semantic models.

Introducing Unified Catalog: a new name for the visionary solution

The Microsoft Purview data governance solution, made generally available in September, delivers comprehensive visibility, data confidence, and responsible innovation for greater business value in the era of AI. The solution streamlines metadata from disparate catalogs and sources, like OneLake, Databricks Unity, and Snowflake Polaris, into a unified experience.
To better reflect these comprehensive customer benefits, Microsoft Purview Data Catalog is being renamed Microsoft Purview Unified Catalog, exemplifying growing catalog capabilities such as deeper data quality support for more cloud sources and Purview Analytics in OneLake. A data catalog serves as a comprehensive inventory of an organization's data assets. As the Microsoft Purview Unified Catalog continues to add capabilities in curation, data quality, and third-party platform integration, the new Unified Catalog name reflects its current cross-cloud capability.

This cross-cloud capability is illustrated in the figure below: the data product shown contains data assets from multiple sources, including a Fabric lakehouse table, a Snowflake table, and an Azure Databricks table. With proper curation of analytics into data products, data users can govern data assets more easily than ever.

Figure 1: Curation of a data product from disparate data sources within Purview's Unified Catalog

Introducing OneLake catalog (Preview)

As announced in the Microsoft Fabric blog earlier today, the OneLake catalog is a solution purpose-built for data engineers, data scientists, developers, analysts, and data consumers to explore, manage, and govern data in Fabric. The new OneLake catalog works with Purview by seamlessly connecting data assets governed by the OneLake catalog into the Purview Unified Catalog, enabling the central data office to centrally govern and manage those assets. The Purview Unified Catalog offers data stewards and data owners advanced capabilities for data curation, advanced data quality, end-to-end data lineage, and an intuitive global catalog that spans the data estate. For data leaders, Unified Catalog offers built-in reports for actionable insights into data health and risks, and the ability to confidently govern data across the heterogeneous data estate.
In figure 2, you can see how Fabric data is seamlessly curated into the Corporate Emissions Created by AI for CY2024 data product, built with data assets from OneLake.

Figure 2: Data product curated with Fabric assets

Introducing a new data quality scan engine for deeper data quality (Preview)

Purview offers deeper data quality support through a new data quality scan engine for big data platforms, including Microsoft Fabric, Databricks Unity Catalog, Snowflake, Google BigQuery, and Amazon S3, supporting open-standard file and table formats. In short, this new scan engine allows businesses to centrally perform rich data quality management from within the Purview Unified Catalog. In figure 3, you can see how users can run different data quality rules on a particular asset, in this case a table hosted in OneLake. When a user clicks "run quality scan," the scanner runs a deep scan on the data itself, executing the data quality rules in real time and updating the quality score for that asset.

Figure 3: Running a data quality scan on an asset living in OneLake

Introducing Purview Analytics in OneLake (Preview)

To further an organization's data quality management practice, data stewards can now leverage a new Purview Analytics in OneLake capability, in preview, to extract tenant-specific metadata from the Purview Unified Catalog and publish it to OneLake. This new capability enables deeper data quality and lineage investigation using the rich capabilities of Power BI within Microsoft Fabric.

Figure 4: In Unified Catalog settings, a user can add self-serve analytics to Microsoft Fabric

Figure 5: Curated metadata from Purview within Fabric

Expanded Data Loss Prevention (DLP) capabilities for Fabric lakehouses and semantic models

To broaden Purview data security features for Fabric, today we are announcing that the restrict access action in Purview DLP policies now extends to Fabric semantic models.
With the restrict access action, DLP admins can configure policies to detect sensitive information in semantic models and limit access to only internal users or data owners. This control is valuable when a Fabric tenant includes guest users and you want to limit unnecessary access to internal proprietary data. The addition of the restrict access action for Fabric semantic models augments the existing ability to detect uploads of sensitive data to Fabric lakehouses, announced earlier this year. Learn more about the new Purview DLP capabilities for Fabric lakehouses and semantic models in the DLP blog.

Figure 6: Example of restricted access to a Fabric semantic model enforced through a Purview DLP policy

Summary

With these investments in security and governance, Microsoft Purview is delivering on its vision to extend data protection value and innovation across your heterogeneous data estate, reducing complexity and improving risk mitigation. Together, Purview and Fabric set the foundation for a modern intelligent data platform with seamless security and governance to drive AI innovation you can trust.

Learn more

As we continue to innovate our products to expand their security and governance capabilities, check out these resources to stay informed:

- https://aka.ms/Try-Purview-Governance
- https://www.microsoft.com/en-us/security/business/microsoft-purview
- https://aka.ms/try-fabric

[1] Top 7 Challenges in Data Integration and How to Solve Them | by Codvo Marketing | Medium
[2] Microsoft internal research, May 2023, N=638