cost management
Cribl or Logstash vs. AMA CEF: What’s the Best Choice for Ingesting Firewall Logs?
Hi everyone, what are the advantages of using Cribl or Logstash over a CEF log collector via AMA for ingesting firewall logs (Palo Alto, for example) into Microsoft Sentinel? In a typical scenario, how would you configure the ingestion to optimize performance, scalability, and cost? What do you think? Let’s discuss and share experiences!
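Whichever collector you choose, much of the cost question comes down to what gets dropped before it lands in the workspace. As one hedged illustration (not a recommendation for any specific product), an ingestion-time transformation on the CommonSecurityLog table can discard low-value firewall events; the vendor string and filter values below are placeholders you would tune to your own Palo Alto policy:

```kusto
// Sketch of a transformKql body for a DCR/workspace transformation on
// CommonSecurityLog. The values below are illustrative placeholders,
// not a recommended baseline for any environment.
source
| where not(
    DeviceVendor == "Palo Alto Networks"
    and DeviceAction == "allow"
    and Activity == "TRAFFIC"      // drop routine allowed-traffic events
  )
```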
Feed data location to run against Sentinel's KQL function
Hi, we have a feed of around 250,000-300,000 entries that will be imported daily. We do not intend to store this data in Sentinel as a table and would like to keep it somewhere else (Cosmos DB, a storage account, etc.) from which we can pull it and run it against one of our Sentinel KQL functions to generate alerts. We plan to use Logic Apps/Functions for these steps, but we would like to know what the right solution is so that comparing the feed data against the KQL function results is fast and not too costly. Thank you!
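One commonly discussed pattern for a feed of this kind is to land it in blob storage and pull it into the query at run time with externaldata. The storage URL, SAS token, column names, and the function name MySentinelFunction below are placeholders, and externaldata has size limits, so treat this as a sketch rather than a verified fit for 300,000 rows:

```kusto
// Sketch: join an external feed (hosted in a storage account) against the
// results of a saved Sentinel function. URL, SAS token, column names, and
// MySentinelFunction are placeholders for your own resources.
let Feed = externaldata (Indicator: string, FeedSource: string)
    [@"https://<storageaccount>.blob.core.windows.net/<container>/feed.csv?<SAS-token>"]
    with (format = "csv", ignoreFirstRecord = true);
MySentinelFunction
| join kind=inner (Feed) on $left.RemoteIP == $right.Indicator
```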
Defender advanced hunting, data grant from Defender for Servers licensing
Hi, when configuring Defender for Servers P2 in Defender for Cloud, it states that you are granted 500 MB per day of free ingestion to a Log Analytics workspace, such as the one backing Sentinel. However, when looking into the supported data sources I do not find the advanced hunting data that would be my first go-to data source when setting up Sentinel, how come? Here is a screenshot of how data ingestion changed once I turned on the XDR connector; am I to understand that the 500 MB per device per day we're paying for will do nothing to cover this cost? The E5 grant of 5 MB/user/day is nowhere near this amount of data. Is there a way to use the 500 MB per device ingestion grant for the advanced hunting data?
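Since the screenshot did not make it into this thread, one way to quantify the same change is to break billable ingestion down per table and see which of the advanced hunting (Device*) tables are driving the volume. This is a generic Log Analytics query, not a statement about what any licensing benefit covers:

```kusto
// Billable ingestion per table over the last 30 days, in GB.
// Quantity in the Usage table is reported in MB.
Usage
| where TimeGenerated > ago(30d)
| where IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1024.0 by DataType
| order by IngestedGB desc
```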
Linux AMA log ingestion filtering specific logs
I had previously applied ingestion-time data transformations to a few incoming logs in the Syslog table when I was using the MMA agent for Linux. Now I am moving to AMA for Linux servers. How do I apply specific log filtering to AMA Linux log sources? For example, if the IP is 1.1.1.1 and the message is an "err" log, drop it. I know this is possible in a Windows DCR, but how can I build the same DCR for Linux with AMA to filter these out?
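For reference, a minimal sketch of what the transformKql of such a DCR (or a workspace transformation on the Syslog table) could look like. The IP and severity simply mirror the example in the question; the exact columns to match on depend on how your sources populate Syslog:

```kusto
// Sketch of a transformKql body applied to the Syslog stream/table.
// Drops "err"-severity events coming from 1.1.1.1; everything else passes.
source
| where not(HostIP == "1.1.1.1" and SeverityLevel == "err")
```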
AMA agent DCR log filtering
Hi, I previously created KQL queries for ingestion-time transformation and was filtering out certain event IDs and a few other logs, e.g. | where not(EventID == 4799 and CallerProcessName contains "C:\\Program Files\\Qualys\\QualysAgent\\QualysAgent.exe"). By now I have 80+ filtering KQL queries applied to the SecurityEvent table to drop specific logs. I have moved my servers from the MMA agent to the AMA agent; AMA has its own DCR, and my existing ingestion-time transformations won't apply now. I would need to create XPath queries in the new DCR. Is there any way I can reuse all of the existing ingestion-time transformation KQL (like the example above), or do I need to create separate DCRs for AMA to filter out the 80+ specific events?
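For what it's worth, a hedged sketch of how several of these per-event filters could be folded into a single transformation query; whether that query lives in the AMA DCR's transformKql or in a workspace transformation on SecurityEvent depends on your setup. The first filter comes from the question; the second is a made-up placeholder just to show the chaining pattern:

```kusto
// Sketch: several negative filters combined into one transformation body
// for the SecurityEvent stream. The 4688/example.exe filter is a
// placeholder to illustrate chaining multiple exclusions.
source
| where not(EventID == 4799 and CallerProcessName contains "C:\\Program Files\\Qualys\\QualysAgent\\QualysAgent.exe")
| where not(EventID == 4688 and NewProcessName contains "\\example.exe")
```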
RE: Commitment Tiers in Microsoft Sentinel
If you choose a commitment tier of 100 GB per day, are you charged the fixed rate per day, or the amount of GB you actually use per day, say 50 GB? So, let's say I ingest 50 GB per day on average for 30 days while on that commitment tier... how are my estimated costs calculated?
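As a rough illustration of the arithmetic (with placeholder prices, since tier rates vary by region and change over time): a commitment tier is billed as a flat daily rate, so ingesting 50 GB/day on a 100 GB/day tier still bills the full tier price each day, and only ingestion above 100 GB/day would add overage. A minimal sketch:

```kusto
// Placeholder prices for illustration only; look up the actual tier rate
// for your region on the Microsoft Sentinel pricing page.
let TierPricePerDay = 150.0;    // hypothetical flat daily price of the 100 GB/day tier
let AvgDailyIngestGB = 50.0;    // average ingestion, below the 100 GB commitment
print BilledPerDay = TierPricePerDay,              // flat, even though only 50 GB was ingested
      ThirtyDayEstimate = TierPricePerDay * 30,
      UnusedHeadroomGB = 100.0 - AvgDailyIngestGB
```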
SAP Data Connector - Sentinel
Hi Community, we have been using the SAP data connector for Sentinel for one month. According to Microsoft, the connector charges $2 per hour for production environments after 1 May. Our SAP environment is a demo, which can also be verified in the T000 table. We have seen that the connector started charging us for three days (which is also hard to understand, because it should have charged us from the beginning of the month if the environment type had been read as Production, and we have not changed anything in the infrastructure). It is also displayed on the connector page as Demo. As a result, I had to stop the agent, and the charges stopped. I could not find the reason. Is there anybody who uses this connector with a demo SAP environment? I appreciate your answers. Thank you in advance.
Analytic rules, KQL queries and UEBA pricing
Hi, I am interested in whether there is any additional cost for a Log Analytics workspace (without Sentinel) when it comes to running KQL queries. Are there any "data processing" costs, or is querying free in that sense? On https://azure.microsoft.com/en-us/pricing/details/monitor/ I didn't see any mention of "data processing costs"; Microsoft only lists the "Log data processing" feature named "Log data ingestion and transformation", but writing KQL queries is not data transformation in that sense -> https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-transformations

When talking about Sentinel, should I expect a larger bill if I enable 50-500 analytic rules from Sentinel templates or the content hub? Do these, or custom analytic rules, incur any additional "processing" costs? On https://azure.microsoft.com/en-us/pricing/details/microsoft-sentinel/ Microsoft only mentions "Search jobs". I assume analytic rules and ad-hoc KQL queries fall into the search jobs category. What if someone is not using Sentinel but only a Log Analytics workspace and writing KQL queries? Since search jobs are not mentioned on https://azure.microsoft.com/en-us/pricing/details/monitor/, is the documentation just not up to date, and does the same search job price apply to KQL queries in Log Analytics deployments without Sentinel?

Microsoft states that UEBA doesn't cost any additional money. Is it truly no additional cost, or will some cost occur since it processes data from the Audit Logs, Azure Activity, Security Events and Sign-in Logs tables, in the way described for "search jobs"?
Microsoft Sentinel custom parsers
Dear All, according to the Microsoft website there are charges for creating custom columns during parsing. Please let me know the following: What is the charge exactly? How much will I be charged if I parse and create a single custom column? If I do the parsing but use an already existing column, for example "Account", is there any charge for that? Kindly share any supporting documents or links from Microsoft. Regards, Sammy. https://techcommunity.microsoft.com/t5/microsoft-sentinel/latest-costing-billing-changes/m-p/3679568
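To make the question concrete, here is a hedged sketch of a query-time parser over the existing "Account" column; the derived column names are illustrative, and this is not a statement about how Microsoft bills either approach. A query-time parser computes new columns on the fly each time the function runs, whereas an ingestion-time transformation would write extra columns into the table itself:

```kusto
// Sketch of a query-time parser: could be saved as a workspace function.
// AccountDomain/AccountName are derived on the fly from the existing
// "Account" column when the query runs; nothing extra is stored.
SecurityEvent
| extend AccountDomain = tostring(split(Account, "\\")[0]),
         AccountName   = tostring(split(Account, "\\")[1])
| project TimeGenerated, Computer, Account, AccountDomain, AccountName
```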
Sentinel Cost Optimization Series - Part 1 - Data prioritization
* There are graphs in this post, but I can't seem to upload/insert them; please visit the link in each part to see the pictures.

Problem statement
Data prioritization is an issue that any SIEM or data gathering and analysis solution must consider. The logs we collect into a SIEM are typically security-related and capable of directly creating alerts based on their events, such as EDR alerts. However, not all logs carry equal weight. For example, a proxy log only contains the connections of internal users, which is very useful for investigation, but it does not directly create alerts and has a very high volume. To demonstrate this, we categorize logs into primary and secondary logs based on their security value and volume.

[Graph: Data categorization]

The primary log sources used for detection frequently contain the metadata and context of what was discovered. However, secondary log sources are sometimes required to present a complete picture of a security incident or breach. Unfortunately, many of these secondary log sources are high-volume, verbose logs with little relevance for security detection. They aren't useful unless a security issue or threat hunt requires them. In a traditional on-premise solution, we use the SIEM alongside a data lake that stores secondary logs for later use.

[Graph: On-premise architecture]

Because we have complete control over everything, we can use any technology or solution, making it simple to set up (e.g. QRadar for the SIEM and ELK for the data lake). However, for a cloud-native SIEM this becomes more difficult, particularly with Microsoft Sentinel. Microsoft Sentinel is a cloud-native security information and event management (SIEM) platform that includes artificial intelligence (AI) to help with data analysis across an enterprise. To store and analyze everything for Sentinel, we typically use Log Analytics with the Analytics Logs data plan. However, this is prohibitively expensive, costing roughly $2.00 to $2.50 per GB ingested, depending on the Azure region used.

Current Solution: Storage Account (Blob Storage)
To store this secondary data, the present approach uses Blob Storage. Blob storage is designed to hold large volumes of unstructured data, meaning data that does not follow a particular data model or specification, such as text or binary data. This is a low-cost option for storing large amounts of data. The architecture for this solution is as follows:

[Graph: Blob Storage architecture]

However, Blob Storage has a limitation that is hard to ignore: the data in Blob Storage is not searchable. We can work around this by adding a search service on top, as demonstrated in Search over Azure Blob Storage content, but this adds another layer of complexity and cost that we would prefer to avoid. The alternative is KQL externaldata, but that operator is designed to retrieve small amounts of data (up to 100 MB) from external storage, not massive volumes.

Our Solution
[Graph: High-level architecture]

Our solution uses Basic Logs to tackle this problem. Basic Logs is a less expensive option for ingesting large amounts of verbose log data into your Log Analytics workspace, and it supports a subset of KQL, which keeps the data searchable. To get logs into a Basic Logs table, we need a custom table created with the Data Collection Rule (DCR)-based Logs ingestion API.
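Before getting into the architecture and the numbers, it is worth showing what querying such a table can look like. Once the data lands in a Basic Logs table, it can still be queried interactively (billed per GB scanned, as noted under Limitation below) using the restricted operator set. A minimal sketch, assuming a raw-payload column; the column name is a placeholder, since the actual schema is whatever your DCR defines:

```kusto
// Sketch: interactive query against a Basic Logs custom table.
// Cb_logs_Raw_CL comes from this post's setup; RawData is a placeholder
// column name, since the actual schema is defined by the DCR.
Cb_logs_Raw_CL
| where TimeGenerated > ago(7d)
| where RawData has "powershell.exe"
| project TimeGenerated, RawData
```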
The structure is as follows:

[Graph: Our solution architecture]

Our Experiment
In our experiment, we used the following components for the architecture:

| Component | Solution | Description |
| --- | --- | --- |
| Source Data | VMware Carbon Black EDR | Carbon Black EDR is an endpoint activity data capture and retention solution that allows security professionals to chase attacks in real time and observe the whole attack kill chain. This means it captures not only data for alerting, but also informative data such as binary or host information. |
| Data Processor | Cribl Stream | Cribl processes machine data in real time - logs, instrumentation data, application data, metrics, and so on - and delivers it to a preferred analysis platform. It supports sending logs to Log Analytics, but only with the Analytics plan. |

To send logs to the Basic plan, we need to set up a data collection endpoint and rule; please see Logs ingestion API in Azure Monitor (Preview) for additional information on how to set this up. We also use a Logic App as a webhook to collect the logs and send them to the data collection endpoint.

The environment we use for log generation is as follows:
- Number of hosts: 2
- Operating system: Windows Server 2019
- Demo duration: 7 days

The volume of logs collected in our test environment:
- Basic Logs generated: 30.2 MB
- Alerts generated: 16.6 MB

Data Ingestion
The cost is based on the East US region, the currency is USD, and the Pay-As-You-Go tier was used. The savings were projected from the generated data, scaled to 1,000 hosts and a 30-day retention period.

Using only Analytics Logs:

| Table | Ingestion Volume (GB) | Cost per GB (USD) | Total cost per day (USD) | Total cost per retention period (USD) | Host number | Retention (days) |
| --- | --- | --- | --- | --- | --- | --- |
| Cb_logs_Raw_CL | 2.16 | 2.3 | 4.96 | 148.84 | 1000 | 30 |
| Cb_logs_alert_CL | 1.19 | 2.3 | 2.73 | 81.81 | 1000 | 30 |
| Total | | | 7.69 | 230.66 | | |

Using Analytics Logs with a Storage Account:

| Table | Ingestion Volume (GB) | Cost per GB (USD) | Total cost per day (USD) | Total cost per retention period (USD) | Host number | Retention (days) |
| --- | --- | --- | --- | --- | --- | --- |
| Cb_logs_Raw_CL | 2.16 | 0.02 | 0.04 | 1.29 | 1000 | 30 |
| Cb_logs_alert_CL | 1.19 | 2.3 | 2.73 | 81.81 | 1000 | 30 |
| Total | | | 2.77 | 83.11 | | |

Using Analytics Logs with Basic Logs:

| Table | Ingestion Volume (GB) | Cost per GB (USD) | Total cost per day (USD) | Total cost per retention period (USD) | Host number | Retention (days) |
| --- | --- | --- | --- | --- | --- | --- |
| Cb_logs_Raw_CL | 2.16 | 0.5 | 1.08 | 32.36 | 1000 | 30 |
| Cb_logs_alert_CL | 1.19 | 2.3 | 2.73 | 81.81 | 1000 | 30 |
| Total | | | 3.81 | 114.17 | | |

Now let's compare the three approaches side by side:

| | Only Analytics Logs | Analytics Logs with Storage Account | Analytics Logs with Basic Logs |
| --- | --- | --- | --- |
| Cost calculated | $230.66 | $83.11 | $114.17 |
| Searchable | Yes | No | Yes, but at $0.005 per GB scanned |
| Retention | Up to 2,556 days (7 years) | 146,000 days (400 years) | Up to 2,556 days (7 years) |

Limitation
Even though Basic Logs is an excellent choice for ingesting hot data, it has some limitations that are difficult to overlook:
- The retention period is only 8 days and cannot be increased; after that, the data is either deleted or archived.
- KQL language access is limited; for a list of supported operators, please see here.
- There is a charge for interactive queries ($0.005 per GB scanned).

This is the first post in the Sentinel Cost Optimization series. I hope it gives you another option to consider when setting up and sending your custom logs to Sentinel.