Custom Logs and Custom Fields
18 Topics

Azure Monitor AMA Migration helper workbook question for subscriptions with AKS clusters
Hi, in an ongoing project I've been helping a customer migrate their agents from the Microsoft Monitoring Agent (MMA) to the new Azure Monitor Agent (AMA), which consolidates the previous Log Analytics agent, the Telegraf agent, and the diagnostics extension (for Azure Event Hubs, Storage, etc.) into a single installation, and then configuring Data Collection Rules (DCRs) to collect data with the new agent. One of the first steps is of course to identify which resources are affected and need to be migrated. There are multiple tools for this, such as this PowerShell script as well as the built-in AMA Migration workbook in Azure Monitor, which is what I used as the initial option at the start of the migration. When run, the workbook lists all VMs, VM scale sets, etc. in the subscription that either still have the old MMA installed or do not yet have the AMA installed (e.g., through an Azure Policy or automatically via a configured DCR), and thus need to be migrated.

Azure Kubernetes Service (AKS) is a rather specific hosting service, almost its own mini-ecosystem with regard to networking, storage, scaling, etc. It exposes the underlying infrastructure that makes up the cluster, giving IT administrators and power users potentially fine-grained control of those resources. In most typical use cases, however, the underlying AKS infrastructure resources should not be modified, as that could break configured SLOs. The problem: the AMA migration workbook by default includes every resource that does not already have AMA installed, regardless of its type, including the cluster infrastructure resources AKS creates in its "MC_" resource group(s), such as the virtual machine scale sets that handle the creation and scaling of the cluster's nodes and node pools.
Perhaps the underlying AKS infrastructure resources could be excluded from the workbook's migration results by default, or, when non-migrated AKS infrastructure resources are found, they could be accompanied by a note describing remediation steps for migrating AKS cluster infrastructure. Has anyone encountered the same issue, and if so, how did you work around it? It would be great to hear some input, and whether there are already solutions/workarounds out there (if not, I've been thinking of proposing a PR with a filter and exclusion added to the default workbook, e.g. here https://github.com/microsoft/AzureMonitorCommunity/tree/master/Azure%20Services/Azure%20Monitor/Agents/Migration%20Tools/Migration%20Helper%20Workbook). Thanks!
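One possible interim workaround for the question above: filter the AKS-managed resource groups out of an Azure Resource Graph query yourself. A minimal sketch, assuming the conventional `MC_` prefix for AKS-managed resource groups (the actual workbook query is more involved than this):

```kusto
// Azure Resource Graph: VMs/VMSS in scope for AMA migration,
// excluding AKS-managed "MC_" resource groups
resources
| where type in~ ("microsoft.compute/virtualmachines",
                  "microsoft.compute/virtualmachinescalesets")
| where resourceGroup !startswith "mc_"   // skip AKS node-pool infrastructure
| project name, type, resourceGroup, subscriptionId
```

Note that clusters created with a custom node resource group name would slip past this prefix filter, so it is a heuristic rather than a guarantee.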
how to parse logs in DCR if RawMessage is in JSON
Dear Fellow Members, I am going through the tutorial on ingesting logs through the Azure Log Ingestion API. At the moment I am at the point where I need to create a DCR for ingesting the logs. I managed to upload the sample logs, and now I have to set up the schema/transformation rules for the log ingestion. My problem is that the RawData part of the ingested logs is basically a JSON document:

[ { "RawData": "{\"SourceName\":\"Microsoft-Windows-DNSServer\",\"ProviderGuid\":\"{EB79061A-A566-4698-9119-3ED2807060E7}\",\"EventID\":256,\"Version\":0,\"ChannelID\":16,\"Channel\":\"Microsoft-Windows-DNS-Server/Analytical \",\"LevelValue\":4,\"Level\":\"Information \",\"OpcodeValue\":0,\"TaskValue\":1,\"Category\":\"LOOK_UP \",\"Keywords\":\"9223372036854775809\",\"EventTime\":\"2023-04-13T10:22:14.043901+02:00\",\"ExecutionProcessID\":6624,\"ExecutionThreadID\":4708,\"EventType\":\"INFO\",\"SeverityValue\":2,\"Severity\":\"INFO\",\"Hostname\":\"windns\",\"Domain\":\"NT AUTHORITY\",\"AccountName\":\"SYSTEM\",\"UserID\":\"S-1-5-18\",\"AccountType\":\"User\",\"Flags\":\"256\",\"TCP\":\"0\",\"InterfaceIP\":\"172.18.88.20\",\"Source\":\"172.18.88.20\",\"RD\":\"1\",\"QNAME\":\"v10.events.data.microsoft.com.\",\"QTYPE\":\"1\",\"XID\":\"21030\",\"Port\":\"59130\",\"ParsedPacketData\":{\"dns.id\":21030,\"dns.flags.recursion_desired\":\"true\",\"dns.flags.truncated_response\":\"false\",\"dns.flags.authoritative\":\"false\",\"dns.opcode\":\"QUERY\",\"dns.flags.query_or_response\":\"false\",\"dns.response.code\":\"NOERROR\",\"dns.flags.checking_disabled\":\"false\",\"dns.flags.authentic_data\":\"false\",\"dns.flags.recursion_available\":\"false\",\"dns.query\":[{\"dns.query.name\":\"v10.events.data.microsoft.com\",\"dns.query.type\":\"A\",\"dns.query.class\":\"IN\"}]},\"PacketData\":\"0x52260100000100000000000003763130066576656E74730464617461096D6963726F736F667403636F6D0000010001\",\"AdditionalInfo\":\".\",\"GUID\":\"{B021826E-78B1-4574-8B19-0FF06408A144}\",\"EventReceivedTime\":\"2023-04-13T10:22:16.140231+02:00\",\"SourceModuleName\":\"in_windowsdns_auditanalytics_sentinel_windows\",\"SourceModuleType\":\"im_etw\",\"HostIP\":\"172.18.88.20\",\"BufferSize\":\"N/A\"}", "Time": "2023-04-19T07:30:08.5953753Z", "Application": "LogGenerator" } ]

That is already in a structured format, which should be reasonably easy to parse. However, I haven't seen any examples of doing so. I have only encountered JSON parsing examples where the JSON text was contained in some field and the result of the parsing was assigned to a different/new field. In this case the JSON content is filled with key-value pairs that should belong to different fields in the new table. Have any of you encountered a similar situation? If yes, how did you manage to solve it? Is anything like this even possible in a DCR?

source | parse RawData as json

Thanks, János
Kusto Query for troubleshooting the Network Security Group
Hi Team, I need some help with a Kusto query for troubleshooting Network Security Group connectivity between a source IP and a destination IP. Can someone please help with a Kusto query that checks the NSG logs to verify whether connectivity is allowed between the source and the destination? I'm very new to Kusto, so I'm posting here; any help is appreciated. Source IP: 10.226.16.165 Destination: 159.123.12.3
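A sketch of one way to answer this, assuming NSG flow logs with Traffic Analytics are enabled so the data lands in the AzureNetworkAnalytics_CL table (FlowStatus_s is "A" for allowed, "D" for denied):

```kusto
// Flows between the two IPs and whether the NSG allowed or denied them
AzureNetworkAnalytics_CL
| where SubType_s == "FlowLog"
| where SrcIP_s == "10.226.16.165" and DestIP_s == "159.123.12.3"
| project TimeGenerated, NSGRule_s, FlowStatus_s, DestPort_d, L4Protocol_s
| order by TimeGenerated desc
```

If Traffic Analytics is not enabled, this table will not exist and the flow logs would have to be read from the storage account instead.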
Editing Custom Fields for syslog message extraction
Hi, I am currently creating new custom fields to extract data from a syslog data source. Having initially set up the three fields I need, I've now found a set of messages that do not parse correctly. How can I update the wizard for the custom field to include this new extraction? Right now the only option I can see is to delete the custom field and start again. This is going to cause me all sorts of problems if we need to check every single possible message from a data source before we create a custom field. Or, alternatively, am I just missing something and there is a much easier way to do this?
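One common workaround (it does not fix the wizard itself) is to skip custom fields and do the extraction at query time, where the pattern can be revised whenever a new message shape turns up. A hedged sketch with a made-up message format — the regex and field name are illustrative, not from the original post:

```kusto
Syslog
| where Facility == "auth"
// hypothetical pattern: pull the user name out of "... user=alice ..."
| extend User = extract(@"user=(\w+)", 1, SyslogMessage)
| where isnotempty(User)
| summarize count() by User
```

The trade-off is that the extraction runs per query instead of at ingestion, but nothing has to be deleted and recreated when the pattern changes.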
Unable to set Custom logs with log path configured to c:\users\%username%\...
I'm trying to set up a Windows Custom Log with a file path under c:\users\%username%\, but the agent doesn't seem to recognize the %username% Windows variable. Is there any way to set the custom path using Windows 10 environment variables? My path is currently defined as: C:\Users\%username%\AppData\Roaming\ICAClient\wfcwin32.log File type: ANSI
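The agent runs as SYSTEM, so per-user environment variables like %username% are generally not expanded. A commonly suggested alternative, worth verifying against the documentation for your agent version since wildcard support in directory segments varies, is a wildcard in place of the user segment:

```
C:\Users\*\AppData\Roaming\ICAClient\wfcwin32.log
```

This would pick up the log for every profile on the machine rather than a single user, which may or may not be acceptable here.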
Log Analytics Query for computer last login/active date and time
Hi, I am looking for a query to get the last login/active date and time for computers in a separate column. I am already using the below query for Windows Update:

WaaSDeploymentStatus
| where UpdateCategory == "Quality" and TimeGenerated > ago(60d)
| summarize arg_max(ReleaseName, DeploymentStatus, DetailedStatus, DetailedStatusLevel, ExpectedInstallDate) by Computer

Please suggest what we should add to this query to get a new column with the last login/active date and time for the computers.
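A sketch of one approach: join the latest Heartbeat per computer onto the update results. The assumption here is that "last active" can be read as the agent's last heartbeat, which reflects the machine being up and reporting rather than an interactive login:

```kusto
WaaSDeploymentStatus
| where UpdateCategory == "Quality" and TimeGenerated > ago(60d)
| summarize arg_max(ReleaseName, DeploymentStatus, DetailedStatus,
                    DetailedStatusLevel, ExpectedInstallDate) by Computer
| join kind=leftouter (
    Heartbeat
    | summarize LastActive = max(TimeGenerated) by Computer
  ) on Computer
| project-away Computer1   // drop the duplicate join key
```

For actual interactive logins, SecurityEvent 4624 would be the place to look instead, if security events are being collected.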
Query could not be parsed at 'SecurityEvent' on line.....
We upgraded to the standard tier, but this still isn't working. I can query events, but any query that involves SecurityEvent fails. Basically I'm trying to follow this: https://pixelrobots.co.uk/2019/07/query-active-directory-security-events-using-azure-log-analytics-on-the-cheap/
CPU utilization for VMs in past 3 months in different time zone (PST) for Specific working hours
Hi Team, I need help getting the average CPU utilization for VMs over the last 3 months in a different time zone (PST), only for a specific time range. I have written the query to fetch the average CPU utilization for the last 3 months, and I set the time range with the portal option:

Heartbeat
| where SubscriptionId != ''
| summarize by TenantId, SubscriptionId, Computer, ResourceGroup=tolower(ResourceGroup), ResourceId=tolower(ResourceId)
| where ResourceGroup == "azrg-oc-ame-tds-vm"
| join kind=inner (
    Perf
    | where (ObjectName == "Processor" and CounterName == "% Processor Time")
    | summarize CPUAvg = (avg(CounterValue)) by Computer
) on Computer
| project Computer, CPUAvg

How can I filter the CPU utilization to only 12 hours/day in the PST time zone for the last three months? Thanks in advance.
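A sketch of the hour filter for the question above: convert TimeGenerated (UTC) to Pacific time and keep only the working-hours rows before averaging. datetime_utc_to_local handles the PST/PDT switch; the 07:00–19:00 window is an assumed example of the 12-hour range:

```kusto
Perf
| where TimeGenerated > ago(90d)
| where ObjectName == "Processor" and CounterName == "% Processor Time"
| extend LocalTime = datetime_utc_to_local(TimeGenerated, "US/Pacific")
| where hourofday(LocalTime) between (7 .. 18)   // 07:00-18:59 Pacific
| summarize CPUAvg = avg(CounterValue) by Computer
```

This subquery can replace the Perf branch of the join in the original query; the resource-group filtering via Heartbeat stays as it is.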
Windows Event Forwarding
We are trying to use Windows Event Forwarding to get logs into Log Analytics. We have configured the security log to forward to a central server. This works fine and I can see the entries. We have set up Log Analytics to collect the "ForwardedEvents" log. After a restart of the Monitoring Agent service I can see the following:

The Windows Event Log Provider has resumed processing the ForwardedEvents event log on computer 'fqdn' after recovering from errors. One or more workflows were affected by this.

This indicates that it should be collecting the logs fine. I cannot, however, for love nor money find these events in Log Analytics. Is there anything I am missing? Is this supported? I've googled forwarding events into LA and found the UserVoice post asking for this to work, but haven't actually found anything on making it work. Thanks
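A quick check sketch: Windows event logs collected by the agent land in the Event table with the source log's name in EventLog, so this should show whether anything from the forwarded log has arrived at all (note that forwarded Security events are a special case and may be filtered out of plain event-log collection):

```kusto
Event
| where TimeGenerated > ago(1d)
| where EventLog == "ForwardedEvents"
| summarize count() by Computer, Source
```

If this returns nothing while other logs from the same collector machine do appear, the collection configuration for that specific log name is the first thing to re-verify.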