Forum Discussion

Zorghost
Copper Contributor
Feb 22, 2025

Fetching alerts from Sentinel using Logic Apps

Hello everyone,

I have a requirement to archive alerts from Sentinel. To do that, I need to do the following:

  • Retrieve the alerts from Sentinel
  • Send the data to an external file share

As a solution, I decided to proceed with Logic Apps, where I will run a script to automate this process. My questions are the following:

-> Which API endpoints in Sentinel are relevant for retrieving alerts or running KQL queries to get the needed data?

-> I know that I will need some sort of permissions to interact with the API endpoint. What type of service account should I create in Azure, and what permissions should I provision for it?

-> Are there any existing examples of Logic Apps interacting with Microsoft Sentinel? That would be helpful for me as I am new to Azure.

Any help is much appreciated!
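On the first question: the documented way to run KQL from a script or Logic App is the Log Analytics Query REST API. Below is a minimal Python sketch; the workspace ID and token are placeholders, and the calling identity (e.g. an app registration) would need a reader-level role such as Log Analytics Reader on the workspace.

```python
# Minimal sketch: query the SecurityAlert table through the Log Analytics
# Query API. WORKSPACE_ID and the token are placeholders you must supply.
import json
import urllib.request

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

def build_query_request(workspace_id, kql, token):
    """Build the POST request for the Log Analytics Query API."""
    url = f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"
    body = json.dumps({"query": kql}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# KQL pulling the last day of alerts from the SecurityAlert table.
kql = "SecurityAlert | where TimeGenerated > ago(1d)"
req = build_query_request(WORKSPACE_ID, kql, token="<access-token>")
# urllib.request.urlopen(req) would return JSON with a "tables" array
# of columns and rows.
```

Inside a Logic App you can skip the raw HTTP and use the built-in Azure Monitor Logs connector ("Run query and list results") against the same workspace.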

 

  • Laurie_Rhodes
    Brass Contributor

    It's actually a lot easier than that!

    If you find the Log Analytics workspace Sentinel is installed on, you can use the built-in Data Export service to export the tables you want to a Storage Account in near-real time.

    You simply specify the tables you want to export, and in the export rule you specify the Storage Account the logs should go to.

    This is a lot easier and more reliable than creating custom automation. :)
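The export rule Laurie describes is just an ARM resource, so it can also be created programmatically. A sketch of the request an automation script would build, assuming the `2020-08-01` API version of the `Microsoft.OperationalInsights` `dataExports` resource and placeholder subscription/resource names (verify the current API version against the Azure REST reference before using):

```python
# Sketch of the ARM call behind a Data Export rule; all names are placeholders.
import json

SUB, RG, WS = "<subscription-id>", "<resource-group>", "<workspace-name>"
STORAGE_ID = (f"/subscriptions/{SUB}/resourceGroups/{RG}"
              "/providers/Microsoft.Storage/storageAccounts/<storage-account>")

# PUT .../dataExports/{ruleName}?api-version=2020-08-01 (assumed version)
url = (f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
       f"/providers/Microsoft.OperationalInsights/workspaces/{WS}"
       "/dataExports/archive-alerts?api-version=2020-08-01")

payload = {
    "properties": {
        "destination": {"resourceId": STORAGE_ID},   # target Storage Account
        "tableNames": ["SecurityAlert", "SecurityIncident"],  # tables to export
        "enable": True,
    }
}
body = json.dumps(payload)
# Sending this PUT with an ARM bearer token creates the export rule.
```

The Azure CLI wraps the same API (`az monitor log-analytics workspace data-export create`), which avoids hand-building the request.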


    • Zorghost
      Copper Contributor

      Thank you very much for your reply, Laurie_Rhodes!

      This is indeed a great approach, and I will proceed with it for now. I have one more question, if you don’t mind:

      I need to include the events that triggered the alerts in the SentinelAlert table in the archived data. In your opinion, what is the best approach to tackle this?

      • Laurie_Rhodes
        Brass Contributor

        Hi Zorghost,

        This is a lot tougher than you might think. The flexibility of Sentinel means that we might be alerting on data in any table in Sentinel, or receiving alerts directly from Defender products.

        To get your head around it: Incidents have an "AlertId" property which matches the SystemAlertId field in the SecurityAlert table. Have a look at the 'ExtendedProperties' in those alerts and you will see a lot of the KQL that's being used to trigger the alerts... and a lot of Defender alerts that don't really give you more than metadata that an alert has been triggered. To make it worse, those KQL rules can be really complex, with joins and object statements that can make it almost impossible to parse the KQL and retrieve the complete raw event object that actually caused the alert. What this does tell you is that if you were trying to externalise and preserve data about events, or produce archival reports, you are going to need to execute the KQL from the ExtendedProperties field to at least get the bits of the events that are causing the alerts.

        This is a bit of a problem. If I were tasked with this, I'd be thinking about what the business outcome was that I had to achieve (and the budget the organisation had for a solution).


        There are probably two different outcomes your organisation might be trying to achieve with what you have been tasked to do.

        Management Reporting Outcome

        I'll guess that you are only talking about the SecurityAlerts that result in Incidents, rather than preserving the data behind every SecurityAlert event, as there can be enormous quantities of alerts. With this scenario you'll need some form of scripting that supports KQL. This approach might make sense if your task is to produce Power BI reports for management over Incidents. You would probably need to grab the AlertId from the Incident object, then use that to search the SecurityAlert table against SystemAlertIds. That will then let you get the KQL from ExtendedProperties and run it to show what data tripped the Analytics rule and created the alert. If you go down this path you'll need to look at the Entities as well. It isn't a small job.
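The Incident-to-alert walk described above can be sketched as follows. The KQL join and the `Query` key inside `ExtendedProperties` reflect the standard `SecurityIncident`/`SecurityAlert` schemas for scheduled analytics rules, but both are assumptions to verify against your own data:

```python
import json

# Step 1 (KQL, run via the Query API): walk Incident -> AlertIds -> SecurityAlert.
# AlertIds is stored as a JSON-array string on SecurityIncident, hence todynamic().
INCIDENT_TO_ALERTS_KQL = """
SecurityIncident
| mv-expand AlertId = todynamic(AlertIds)
| extend AlertId = tostring(AlertId)
| join kind=inner (SecurityAlert) on $left.AlertId == $right.SystemAlertId
| project IncidentNumber, Title, SystemAlertId, ExtendedProperties
"""

# Step 2: pull the analytics-rule KQL out of each alert's ExtendedProperties.
def extract_rule_query(extended_properties):
    """Return the rule KQL from an ExtendedProperties JSON string, or None.

    Scheduled analytics rules usually store it under a "Query" key; Defender
    alerts often carry only metadata, so None is a normal outcome.
    """
    try:
        props = json.loads(extended_properties)
    except (TypeError, ValueError):
        return None
    if not isinstance(props, dict):
        return None
    return props.get("Query")
```

Note that re-running an extracted query later only works while the underlying source tables are still retained.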

        Forensics / Future Hunting Outcome

        It's more likely that the business outcome recognises that significant incursions can be discovered a year down the track, and management wants at least the data from alerts preserved for future use. I guess you could look at your Sentinel Analytics rules, work out which tables you actually alert on, and set up exports on just those tables, but this is not likely to be a lot of help in investigating future incursions.

        Typically the SOC only has scraps of information when trying to investigate events today. Its visibility of events is like shining a torch in the darkness. If you were to preserve only some tables, or only some events from individual tables, the chances of having enough information in a year's time to piece together some activity that doesn't create an incident today are zero.

        If forensics is management's concern, you want to preserve all data for probably 18 months. That's going to mean a financial impact. Smaller firms will probably look at Sentinel's archive log capability, which is easy to implement and cheap; most small firms are very unlikely to ever need to restore logs from archive, but it's there if needed. Bigger organisations will look at Azure Data Explorer, as it allows everything to be stored long term in a searchable state... far more data than we would consider sending to Sentinel. ADX has its own costs and management overheads, though, that would be difficult for small firms.

        Best Regards

        Laurie
