Forum Discussion
Pontus T
Apr 27, 2017 Iron Contributor
Help with parameter for Search-UnifiedAuditLog
Hi, Disclaimer: I am new to PowerShell, which is why I'm turning here for your input. Background: I'm creating a Power BI dashboard based on data exported from the O365 Audit Log. For the moment, I'...
- Apr 28, 2017
NarasimaPerumal Chandramohan thanks for pointing me in the right direction. I managed to solve it by using SessionID and SessionCommand. All I needed was a while loop that kept running until the variable holding the audit data returned null, appending to the export file on every loop run.
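In short, the loop looks something like this (a simplified sketch of the approach rather than my exact script; the dates and output path are placeholders, and it assumes an Exchange Online session is already open):

# Keep calling Search-UnifiedAuditLog with the same SessionId until it returns nothing more
$StartDate = (Get-Date).AddDays(-3).ToString("MM/dd/yyyy")
$EndDate = (Get-Date).AddDays(-2).ToString("MM/dd/yyyy")
$AuditOutput = 1   # any non-null value so the loop starts
while ($AuditOutput) {
    $AuditOutput = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -SessionId "1" -SessionCommand ReturnLargeSet
    # Append each batch of records to the same CSV file
    $AuditOutput | Export-Csv "C:\Temp\AuditLog.csv" -NoTypeInformation -Append
}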
TonyRedmond
Apr 28, 2017 MVP
I am unaware of a way to exclude accounts from a cmdlet pull for audit records. There might be an easier way for you to do this... Cogmotive Reports has a Discover and Audit module that works against the audit log records (in other words, the same data), and it comes with a pivot table capability that might let you do what you want. They are also able to provide more data than 5,000 records at a time because they store the audit log data in their own stores. You could try that...
Pontus T
Apr 28, 2017 Iron Contributor
TonyRedmond Thank you for the tip about Cogmotive. I will have a look at the product for sure. However, I like the customizability that I get in Power BI, building interactive reports for different stakeholders. And I guess it's something about the challenge of building it yourself :)
- TonyRedmond Apr 28, 2017 MVP
Remember that if you build a solution, you have to maintain the code... and things break, like APIs. That's one of the reasons why it is sometimes better to go with a commercial solution and let the ISV have the headache of maintaining the code and making sure that it works, even when the APIs are in flux (as does happen inside Office 365).
- Rajiv Chokshi Jul 06, 2017
Microsoft
Is this possible with the native PowerShell command search-unifiedauditlog?
I am trying to figure out if it is possible to get a SINGLE report for all activities on a particular SPO site collection using the PowerShell command search-unifiedauditlog.
For example, I want to know everything that happened on the site collection below in the last 30 days (https://contoso.sharepoint.com/sites/projectabc), and this should include all of these:
File and page activities
Folder activities
Sharing and access request activities
Synchronization activities
Site administration activities
Thanks in advance. I was able to do this with AdminDroid but am looking to see whether the native command can accomplish this.
- Pontus T Jul 20, 2017 Iron Contributor
Rajiv Chokshi Sorry for the delayed reply. Yes, this should be possible. Anything you can query in the Audit log in the Security & Compliance Center can be pulled using Search-UnifiedAuditLog and exported to CSV. So start by going to the Office 365 Audit log and confirming that you can find the actions you are looking for.
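Just to illustrate the basic idea (a minimal sketch, assuming you are already connected to Exchange Online PowerShell; the output path is a placeholder), a single call is enough to confirm the records are there before you build the full export loop:

# Pull up to 5,000 audit records from the last 30 days and export them to CSV
$Results = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) -ResultSize 5000
$Results | Export-Csv "C:\Temp\AuditSample.csv" -NoTypeInformation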
Then you can start designing the script. To get you started, below is a copy of the scripts that I am using. It loops the cmdlet using the session parameter until all the records are exported to the CSV file. When converting from JSON, and when exporting to CSV, I am selecting certain objects that are of value for my use case, but there are more objects holding data that might be valuable for you. Taking away the selection should bring it all in. Also, I am searching 3 days back in time as I have noticed that some records take that long before they are saved to the audit log. You can experiment with this and compare the number of rows you get depending on how far back you set the interval.

I am far from an expert on PowerShell, but with patience and help from our database engineer at work, we have created a Power BI dashboard that can display all audit log actions across the whole tenant in a user-friendly and graphical way. It gets new data from the script on a daily basis, and as the data is exported you will have no limits in searching back in time, compared to the (newly announced) 90 days that MSFT keeps for the Audit log.
Please note that you should start by using a script to securely save (encrypt) your credentials, which are then read at the beginning of the second script (you only need to do this once). You should probably create a service account for this and set it up with the minimal permissions needed (more under the "Before you begin" tab under this link).
First script to store your credentials
param(
    [Parameter(Mandatory=$true)]
    [string]$Path,
    [Parameter(Mandatory=$true)]
    [string]$Password
)

# Convert the password to a secure string
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force

# Store the credential in the path
$SecurePassword | ConvertFrom-SecureString | Out-File $Path

# Write what we did
Write-Host "Wrote password to $Path"

<#
.SYNOPSIS
Stores a password in a file on the local computer for retrieval by scripts.
.DESCRIPTION
Used for securely storing a password on a machine for use with automated scripts.
Takes a password and encrypts it using the local account, then stores that password in a file you specify.
Only the account that creates the output file can decrypt the password stored in the file.
.PARAMETER Path
Path and file name for the password file that will be created.
.PARAMETER Password
Plain text version of the password.
.OUTPUTS
The file specified in the Path parameter will contain an encrypted version of the password.
.EXAMPLE
.\New-StoredCredential.ps1 -Path c:\scripts\O365Account.txt -Password "Password123"
Puts the encrypted version of Password123 into the c:\scripts\O365Account.txt file
#>
Second script to pull data from the audit log (you need to define your own variables in some places as I have removed mine from this copy)
# Set constants
$Username = "[YOUR ACCOUNT, RECOMMENDED TO USE THE PRINCIPLE OF LEAST PRIVILEGE]"
$PasswordPath = "[YOUR PATH DEFINED IN SCRIPT ABOVE]"

# Read the password from the file and convert it to a SecureString
Write-Host "Getting password from $PasswordPath"
$SecurePassword = Get-Content $PasswordPath | ConvertTo-SecureString

# Build a credential object from the password file and the $Username constant
$Credential = New-Object System.Management.Automation.PSCredential -ArgumentList $Username, $SecurePassword

# Open a session to Exchange Online, where the audit log cmdlet is located
$O365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Credential -Authentication Basic -AllowRedirection
Import-PSSession $O365Session

# Define dates for the search. MSFT claims the Windows short date format on the machine running the script should be used, but testing proved "MM/dd/yyyy" must be used
$StartDate = (Get-Date).AddDays(-3).ToString("MM/dd/yyyy")
$EndDate = (Get-Date).AddDays(-2).ToString("MM/dd/yyyy")

# Define file name and path for the CSV export
$FileAppend = (Get-Date).AddDays(-3).ToString("yyyyMMdd")
# [string]$FileAppend = (Get-Date -Format yyyyMMdd)
$OutputFile = "[DEFINE OUTPUT FILE PATH]" + $FileAppend + ".csv"

# Set a variable value to trigger the loop below (can be anything)
$AuditOutput = 1

# The loop runs until $AuditOutput returns nothing, which means no more event objects exist for the specified date(s)
while ($AuditOutput) {
    # Search the defined date(s). SessionId + SessionCommand in combination with the loop return and append 100 objects per iteration until all objects are returned (ReturnLargeSet caps a session at 50,000 objects)
    $AuditOutput = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -SessionId "1" -SessionCommand ReturnLargeSet

    # Select and expand the nested object (AuditData) as it holds the relevant reporting data, and convert it from its default JSON format to enable export to CSV
    $ConvertedOutput = $AuditOutput | Select-Object -ExpandProperty AuditData | ConvertFrom-Json

    # Export the results excluding type information. Append rather than overwrite if the file exists in the destination folder
    $ConvertedOutput | Select-Object CreationTime,UserId,Operation,Workload,ObjectID,SiteUrl,SourceFileName,ClientIP,UserAgent | Export-Csv $OutputFile -NoTypeInformation -Append
}

# Close the session to Exchange Online
Remove-PSSession $O365Session
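For your single site collection, one way to narrow the export (a sketch only; the URL is a placeholder) would be to filter on the SiteUrl property after the ConvertFrom-Json step, just before the Export-Csv line inside the loop:

# Keep only events belonging to one site collection (placeholder URL)
$SiteFilter = "https://contoso.sharepoint.com/sites/projectabc"
$ConvertedOutput = $ConvertedOutput | Where-Object { $_.SiteUrl -like "$SiteFilter*" }

Note that not every record type populates SiteUrl, so you may also want to check the ObjectID column for events tied to your site.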
I hope this helps to get you started. You can read more about the audit log and how you can limit its scope if needed here.