data collection
Cribl or Logstash vs AMA CEF: What's the Best Choice for Ingesting Firewall Logs?
Hi everyone, what are the advantages of using Cribl or Logstash over a CEF log collector via AMA for ingesting firewall logs (from Palo Alto, for example) into Microsoft Sentinel? In a typical scenario, how would you configure the ingestion to optimize performance, scalability, and cost? Let's discuss and share experiences!
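For context on the Logstash option: Microsoft ships a DCR-based Sentinel output plugin for Logstash, and a minimal pipeline could look like the sketch below. This is a sketch, not a tested config; the listener port, all IDs, and the stream name are placeholders to swap for your own DCE/DCR values, and it assumes the microsoft-sentinel-log-analytics-logstash-output-plugin is installed.

```
# Minimal sketch: firewall syslog/CEF in, DCR-based Sentinel ingestion out.
input {
  syslog {
    port => 514                                   # placeholder listener port
  }
}
output {
  microsoft-sentinel-log-analytics-logstash-output-plugin {
    client_app_Id => "<APP_ID>"                   # Entra ID app registration
    client_app_secret => "<APP_SECRET>"
    tenant_id => "<TENANT_ID>"
    data_collection_endpoint => "<DCE_INGESTION_URI>"
    dcr_immutable_id => "<DCR_IMMUTABLE_ID>"
    dcr_stream_name => "Custom-FirewallLogs"      # must match a stream in your DCR
  }
}
```

One commonly cited trade-off: Cribl and Logstash let you filter, enrich, and route events before ingestion (which can cut cost at volume), while CEF via AMA is simpler to operate and supported out of the box.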
Fetching alerts from Sentinel using logic apps
Hello everyone, I have a requirement to archive alerts from Sentinel. To do that I need to:
- Retrieve the alerts from Sentinel
- Send the data to an external file share
As a solution, I decided to proceed with Logic Apps, where I will run a script to automate this process. My questions are the following:
- Which API endpoints in Sentinel are relevant for retrieving alerts or running KQL queries to get the needed data?
- I know I will need some sort of permissions to interact with the API endpoint. What type of service account should I create in Azure, and what permissions should I provision for it?
- Are there any existing examples of Logic Apps interacting with Microsoft Sentinel? That would be helpful, as I am new to Azure.
Any help is much appreciated!
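One possible building block (a sketch, not the only approach): alerts land in the workspace's SecurityAlert table, which you can query through the Log Analytics Query API using a service principal that holds the Log Analytics Reader role on the workspace; a Logic App can do the same via the built-in Azure Monitor Logs connector. The Python sketch below assumes the azure-identity and azure-monitor-query packages and placeholder IDs:

```python
from datetime import timedelta

from azure.identity import ClientSecretCredential
from azure.monitor.query import LogsQueryClient

# Service principal with Log Analytics Reader on the Sentinel workspace (placeholder IDs).
credential = ClientSecretCredential(
    tenant_id="<TENANT_ID>",
    client_id="<CLIENT_ID>",
    client_secret="<CLIENT_SECRET>",
)
client = LogsQueryClient(credential)

# Pull the last day of alerts from the SecurityAlert table.
response = client.query_workspace(
    workspace_id="<WORKSPACE_ID>",
    query="SecurityAlert | where TimeGenerated > ago(1d)",
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)  # archive each alert row, e.g. write it to the file share
```

For incident-level data there is also the Microsoft Sentinel REST API (the Microsoft.SecurityInsights provider), which a Logic App can reach through the Microsoft Sentinel connector's built-in actions.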
Comprehensive coverage and cost-savings with Microsoft Sentinel's new data tier
Microsoft is excited to announce the public preview of a new data tier, Auxiliary Logs, along with Summary Rules in Microsoft Sentinel, to further increase security coverage for high-volume data at an affordable price.
Enhancing Security Monitoring: Integrating GitLab Cloud Edition with Microsoft Sentinel
Maximize your security operations by combining GitLab Cloud Edition with Microsoft Sentinel. This blog covers how to fill the gap left by the missing native GitLab connector in Sentinel. Utilize GitLab's API endpoints, Azure Monitor Data Collection Rules and Data Collection Endpoints, as well as Azure Logic Apps and Key Vault, to simplify log collection and improve immediate threat identification. Our detailed guide will help you integrate smoothly and strengthen your security defences.
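As a rough illustration of the pattern the blog describes (GitLab API on one side, the Azure Monitor Logs Ingestion API on the other), a Python sketch could look like the following. All names are placeholders: the token, DCE endpoint, DCR ID, and stream name are assumptions, and GET /api/v4/audit_events requires an administrator-level token on the GitLab instance.

```python
import requests
from azure.identity import ClientSecretCredential
from azure.monitor.ingestion import LogsIngestionClient

# 1. Pull audit events from the GitLab REST API (placeholder token).
events = requests.get(
    "https://gitlab.com/api/v4/audit_events",
    headers={"PRIVATE-TOKEN": "<GITLAB_TOKEN>"},
    timeout=30,
).json()

# 2. Push them to a custom table through the DCE/DCR pair (placeholder IDs).
credential = ClientSecretCredential("<TENANT_ID>", "<CLIENT_ID>", "<CLIENT_SECRET>")
client = LogsIngestionClient(
    endpoint="https://<DCE_NAME>.<REGION>.ingest.monitor.azure.com",
    credential=credential,
)
client.upload(
    rule_id="<DCR_IMMUTABLE_ID>",
    stream_name="Custom-GitLabAuditEvents_CL",  # hypothetical stream name
    logs=events,
)
```

In the blog's architecture the same two calls are orchestrated by a Logic App, with the secrets held in Key Vault rather than inline.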
Cannot stop CEF duplication to syslog when both processed by same Linux VM
We have a situation where we send CEF records from a FortiGate firewall to Microsoft Sentinel via the Common Event Format (CEF) via AMA data connector, and we also use the Syslog via AMA data connector (both on the same Ubuntu Linux VM using rsyslog). The result is that we get duplicates of the CEF records in the Syslog table. I've read a lot of articles about the duplication and possible ways to fix it, but I've had no success. My most recent attempt was to create a file /etc/rsyslog.d/05-filter-CEF.conf with the following entries:

if $programname == 'CEF' then @@127.0.0.1:28330
& stop

Unfortunately we still get duplicates. One article I read said to use @@127.0.0.1:25226, but then we don't get CEF records in either CommonSecurityLog or Syslog. Is there anyone that can help?
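Two things commonly resolve this kind of duplication (offered as a sketch, not a verified fix). First, FortiGate CEF messages often don't arrive with programname "CEF", so a filter keyed on the raw message content tends to be more reliable; the AMA CEF pipeline listens on localhost TCP 28330 by default (25226 was the legacy OMS/MMA agent port, which is why that variant broke ingestion entirely):

```
# /etc/rsyslog.d/05-filter-CEF.conf - sketch; match on message content, not programname
if $rawmsg contains "CEF:" then {
    action(type="omfwd" target="127.0.0.1" port="28330" protocol="tcp")
    stop
}
```

Second, check the Syslog via AMA data collection rule: if it collects the same facility the FortiGate sends CEF on (often local4), the agent ingests the message a second time into the Syslog table regardless of rsyslog filtering, so removing that facility from the Syslog DCR is usually part of the fix.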
Integrating Fluent Bit with Microsoft Sentinel
This guide will walk you through the steps required to integrate Fluent Bit with Microsoft Sentinel. Be aware that this article assumes you already have a Sentinel workspace, a Data Collection Endpoint, a Data Collection Rule, an Entra ID application, and a Fluent Bit installation. The Log Ingestion API supports ingestion both into custom tables and into built-in tables such as CommonSecurityLog, Syslog, WindowsEvent, and more. If you need to check which tables are supported, see the following article: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables

Prerequisites: Before beginning the integration process, ensure you have the following:
- An active Azure subscription with Microsoft Sentinel enabled;
- A Microsoft Entra ID application, taking note of its Client ID, Tenant ID, and client secret. To create one, check this article: https://learn.microsoft.com/en-us/entra/identity-platform/quickstart-register-app?tabs=certificate
- A Data Collection Endpoint (DCE). To create a data collection endpoint, check this article: https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-endpoint-overview?tabs=portal
- A Data Collection Rule (DCR). Fields in the Data Collection Rule need to match exactly the table columns and the fields coming from the log source. To create a DCR, check this article: https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-rule-create-edit?tabs=cli Depending on the source, this may require creating a custom table or using an existing Log Analytics workspace table;
- Fluent Bit installed on your server or container. If you haven't installed Fluent Bit yet, the following article gives instructions per operating system: https://docs.fluentbit.io/manual/installation/getting-started-with-fluent-bit

High level architecture: (diagram in the original post)

Step 1: Setting up the Fluent Bit configuration file
Before stepping into the configuration: Fluent Bit has numerous output plugins, and one of them ships logs through the Log Analytics Ingestion API, both to supported Sentinel tables and to custom tables. You can find more information in the Fluent Bit documentation: https://docs.fluentbit.io/manual/pipeline/outputs/azure_logs_ingestion

Moving forward: to configure Fluent Bit to send logs to the Sentinel Log Analytics workspace, take note of the specific input plugin you are using (or intend to use) to receive logs, and how it can be matched in the output plugin. For example, most Fluent Bit input plugins let you set a "tag" key, which the output plugin can match so that only the intended logs are sent. In a scenario where multiple input plugins are used and all of them must send logs to Sentinel, a wildcard match ("*") can be used instead. As another example, where there are multiple input plugins of type "HTTP" and you want to send only a specific one to Sentinel, the "match" field must be set according to the position of that input plugin, for example "match http.2" if it is the 3rd in the list of HTTP inputs. If nothing is specified in the "match" field, it assumes "http.0" by default.
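Since the post's screenshots don't reproduce here, below is a minimal sketch of what such a config file could look like, using the dummy input described next. It is an illustration only: all IDs are placeholders, the table name is hypothetical, and the azure_logs_ingestion parameter names follow the Fluent Bit documentation linked above.

```
# /etc/fluent-bit/fluent-bit.conf - minimal sketch with placeholder values

[INPUT]
    name   dummy
    tag    dummy.log
    dummy  {"Message":"this is a sample message for testing fluent bit integration to Sentinel","Activity":"fluent bit dummy input plugin","DeviceVendor":"Ubuntu"}

[OUTPUT]
    # local echo of the same tag, useful while testing
    name   stdout
    match  dummy.log
    format json_lines

[OUTPUT]
    # DCR-based ingestion into the Sentinel workspace
    name            azure_logs_ingestion
    match           dummy.log
    tenant_id       <TENANT_ID>
    client_id       <CLIENT_ID>
    client_secret   <CLIENT_SECRET>
    dce_url         https://<DCE_NAME>.<REGION>.ingest.monitor.azure.com
    dcr_id          <DCR_IMMUTABLE_ID>
    table_name      FluentBit_CL
    time_generated  true
```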
For better understanding, the sketch above shows how a Fluent Bit config file could look. The configuration file is located at /etc/fluent-bit/fluent-bit.conf. The first part is the definition of all input plugins; then follow the filter plugins, which you can use, for example, to rename fields from the source so they match what exists in the Data Collection Rule schema and the Sentinel table columns; and finally there are the output plugins.

INPUT plugins section: In this example we use the "dummy" input to send sample messages to Sentinel. In your scenario you could leverage other input plugins within the same config file. After everything is configured in the input section, complete the FILTER section if needed, and then move on to the output plugin section.

OUTPUT plugins section: In this section we have output plugins that write to a local file based on two tags, "dummy.log" and "logger", an output plugin that prints the output in JSON format, and the output plugin responsible for sending data to Microsoft Sentinel. This last one matches the tag "dummy.log", for which we set up the message {"Message":"this is a sample message for testing fluent bit integration to Sentinel", "Activity":"fluent bit dummy input plugin", "DeviceVendor":"Ubuntu"}. Make sure you insert the correct parameters in the output plugin, in this scenario the "azure_logs_ingestion" plugin.

Step 2: Fire up Fluent Bit
When the file is ready to be tested, execute the following:

sudo /opt/fluent-bit/bin/fluent-bit -c /etc/fluent-bit/fluent-bit.conf

Fluent Bit will start by initializing all the plugins in the config file. Your access token should then be retrieved if everything is set up correctly in the output plugin (app registration details, Data Collection Endpoint URL, Data Collection Rule ID, Sentinel table, and, importantly, making sure the name of the output plugin is actually "azure_logs_ingestion"). Within a couple of minutes you should see the data in your Microsoft Sentinel table, either an existing table or a custom table created for the specific log source.

Summary: Integrating Fluent Bit with Microsoft Sentinel provides a powerful solution for log collection and analysis. By following this guide, we hope you can set up a seamless integration that enhances your organization's ability to monitor and respond to security threats. Just carefully ensure that all fields processed in Fluent Bit map exactly to the fields in the Data Collection Rule and the Sentinel table within the Log Analytics workspace. Special thanks to Bindiya Priyadarshini, who collaborated with me on this blog post. Cheers!
Help Ingesting PingID Logs into Microsoft Sentinel
Hello, Microsoft Sentinel has a data connector for PingFederate; however, this does not capture other Ping Identity products, namely PingID logs. I'm making this post to ask if there are any good ways to implement ingesting PingID logs into Sentinel, as I am unable to find any documentation from Ping Identity or for Sentinel that would help me come up with a solution. Thank you for all comments and ideas.
Help Protect your Exchange Environment With Microsoft Sentinel
TL;DR: Sentinel + Exchange Servers or Exchange Online = better protected. Announcing a new Microsoft Sentinel security solution for Exchange Online and on-premises servers: Microsoft Exchange Security! This content is very useful for any organization concerned with keeping the highest possible security posture and being alerted in case of suspicious activities on those critical items.
Sentinel and Amazon Web Services S3 WAF
Hello, I'm using Sentinel to fetch AWS WAF logs using the new Amazon Web Services S3 WAF connector. I set up a first collection using the ARN role and SQS queue (Frankfurt region, eu-central-1):

arn:aws:iam::XXXXXXXXX:role/OIDC_MicrosoftSentinel
https://sqs.eu-central-1.amazonaws.com/XXXXXXX/sqs-aws-cloudwatch-sentinel

I then added a new collection using the ARN role and SQS queue (Paris region, eu-west-3):

arn:aws:iam::XXXXXXXXX:role/OIDC_MicrosoftSentinel
https://sqs.eu-west-3.amazonaws.com/XXXXXXX/sqs-aws-cloudwatch-sentinel

Adding the second collection erases the first one! Is it a bug?

Regards, HA