As a wise man once said, never ask a goat to install software; they'll just end up eating the instructions. If you've tried configuring the CEF and Azure Connected Machine Agents, it may appear that the pesky goats have eaten some of those instructions, or eaten too many sticker bushes to keep up with recent Microsoft Sentinel changes. This guide is for you whether you have spent considerable time trying to get these agents to work or are just dabbling in the Sentinel waters!
If you want to avoid this headache, just use Cribl Stream. We have a much simpler way, and we'll give you up to 1 TB/day for free through Cribl.Cloud! This is particularly relevant for Sentinel, since the agent collection process can be confusing to configure and install. Here are some of the advantages of using Cribl over native Sentinel agents:
The goats here at Cribl have also helped dozens of clients integrate Sentinel data from many vendors including Palo Alto Networks, ExtraHop, Fortinet, and Cisco.
In addition, Cribl has added a native Sentinel Destination. With a product that helps easily get data into Microsoft Sentinel and the expertise that Cribl brings to help you on your Sentinel journey, what do you have to lose? Reach out to one of our sales folks for a customized demo if you're ready to take the next step! We also have cloud-hosted sandboxes if you want to get your hands on the product right now.
OK fine! You didn't fall for my shameless sales pitch. That's OK. I'm not getting paid by the hour, and I'm more of an engineer type, so my sales pitches tend to fall flat. At Cribl, we want to demystify the process of installing and configuring the Microsoft Sentinel AMA and CEF collectors so you can get started quickly. And I get to pad my fragile ego with a technical blog.
To get familiar with common Sentinel terms, check out the glossary at the end of this blog.
Here’s a high-level overview of the components and configurations for the Arc agent.
Some prerequisites before you begin:
OK! We are finally ready to install the Azure Monitor Linux agent!
In order for syslog collection to work on our Linux host, we need to set up a syslog listener on port 514. You can use either Rsyslog or Syslog-ng. In my case, I'll be using Rsyslog, and the config file should be at /etc/rsyslog.conf. If you do not see it, or Rsyslog is not installed, you can install it via the instructions here: https://www.rsyslog.com/ubuntu-repository/.
My Ubuntu 22.04 host came with Rsyslog already installed. I will check the config at /etc/rsyslog.conf and make sure that the UDP and TCP syslog listeners are enabled. In my case, I had to uncomment the module and input lines for UDP and TCP.
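A quick grep will confirm whether those listener lines are active; this is just a sanity check and assumes the newer RainerScript-style config syntax that ships with Ubuntu 22.04:
# The uncommented lines should look roughly like module(load="imudp") / input(type="imudp" port="514") and the imtcp equivalents
grep -E 'imudp|imtcp' /etc/rsyslog.conf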
Next, enable Rsyslog so it starts every time the host is started.
systemctl enable rsyslog
Start it.
systemctl start rsyslog
Confirm Rsyslog is listening on the correct ports by running:
netstat -tulpn
You should see Rsyslog listeners on both ports UDP/TCP 514.
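If netstat isn't available on your host (it comes from the net-tools package, which isn't always installed by default on Ubuntu 22.04), ss shows the same information:
ss -tulpn | grep 514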
Now that Rsyslog is running, it’s time to connect our Linux host to Azure Arc.
In the Azure Portal, search for "Azure Arc", and within the "Azure Arc" pane, select "Servers" on the left rail.
Click the “+ Add” button on the top rail.
We'll perform a single-server installation. Other options are available for different deployment types, but in this guide we'll just run the install script on a single host.
Under “Add a single server” click “Generate Script”.
Fill in the appropriate resource details, location, and operating system. In this case, select "Linux" since we're onboarding a Linux host.
Then click “Next” so that you can add any tags (if necessary), then copy and/or download the bash script.
SSH into your Linux host, upload the install script, and run it:
sh install_arc.sh
Once the script runs, you must authenticate with a browser and enter a device code so Azure Arc can connect to the VM.
After logging into Azure and entering the auth code at https://microsoft.com/devicelogin, you should see a message that the server has successfully authenticated.
The installation can also be verified within your terminal.
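If you prefer the shell, the connected machine agent ships with the azcmagent CLI; a quick status check looks roughly like this (a sketch, assuming a default agent install):
# Show the Arc agent's connection status and the Azure resource it maps to
azcmagent show
# The agent's local hybrid instance metadata service should also be running
systemctl status himds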
In the Azure Portal UI, go back to the Azure Arc "Servers" page, and you should now see that your VM is connected!
Now that the Azure Arc agent is connected and configured, we can create a Data Collection Rule (DCR) to collect syslog data and route it to the CommonSecurityLog table in Sentinel. This will auto-deploy the Azure Monitor Linux agent extension. You could configure a basic DCR to collect plain syslog data, but we'll deploy the CEF connector to collect CEF-based data via syslog.
In the search bar, search for “Microsoft Sentinel” and click on your Sentinel workspace name. Next, click on “Data Connectors”.
Click on the "Content Hub" link under "More content at".
Search for “Common Event Format” in the search box, select “Common Event Format” and then click on “Install” in the bottom right.
Once installed, select "Common Event Format (CEF) via AMA" and click "Open connector page".
Click “+Create data collection rule”.
Give it a name and define the Subscription details.
On the next page, click "+Add Resource(s)", find your Azure Arc server by its instance ID, and add it to the servers to be configured. Once finished, click "Next: Collect".
Now we need to tell the DCR what data sources to collect. We do this by clicking “+ Add Data source”.
We'll be collecting syslog-based data, so we need to define which facilities to collect from. In my case, I just want the "Syslog" and "User" facilities, since that is where my other devices will be sending data. You can turn on the kitchen sink if you like, but keep in mind that we really only want CEF-based data sources.
Define the minimum log level you'd like to collect and add the data source.
Hit "Next: Review + create".
Whew! We are almost done. After this step, your DCR has been successfully deployed to Azure.
Let's first check whether the DCR deployed the "AzureMonitorLinuxAgent" extension. Go back to the Azure Arc "Servers" page, select your VM, and then click "Extensions". You should see the install status of the "AzureMonitorLinuxAgent" extension, and hopefully it says "Succeeded".
Let’s make sure that the agent has successfully connected to our workspace.
We can verify even further by selecting "See them in logs" under "X Linux computers connected".
By default, it will run a summarize query and give you details of the connected agents, which should include the server you just configured. You should see your VM; verify by examining the name, server IP, etc.
Moment of truth! At this point, you should be good to start configuring your applications to send CEF-based syslog messages to the Azure Monitor Linux agent we've just configured on our Arc server. First, let's do a quick test to make sure it's successfully receiving and ingesting our CEF data into the CommonSecurityLog table.
We can use Netcat to send test data into Sentinel to make sure it's being collected and parsed correctly into the CommonSecurityLog table. In the following example, I'm sending a sample Fortinet CEF syslog message from the host into my Log Analytics workspace.
SSH into your host and run the following command (you can use your own CEF-based message if you'd prefer):
echo "Dec 27 14:36:15 FGT-A-LOG CEF: 0|Fortinet|Fortigate|v6.0.3|61002|utm:ssh ssh-command passthrough|3|deviceExternalId=FGT5HD3915800610 FTNTFGTlogid=1600061002 cat=utm:ssh FTNTFGTsubtype=ssh FTNTFGTeventtype=ssh-command FTNTFGTlevel=notice FTNTFGTvd=vdom1 FTNTFGTeventtime=1545950175 FTNTFGTpolicyid=1 externalId=12921 duser=bob FTNTFGTprofile=test-ssh src=10.1.100.11 spt=56698 dst=172.16.200.55 dpt=22 deviceInboundInterface=port12 FTNTFGTsrcintfrole=lan deviceOutboundInterface=port11 FTNTFGTdstintfrole=wan proto=6 act=passthrough FTNTFGTlogin=root FTNTFGTcommand=ls FTNTFGTseverity=low" | nc -q0 127.0.0.1 514
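If you also want to exercise the UDP listener, the same test works over UDP. This is just a sketch of the netcat invocation: -u switches to UDP, and -w1 gives it a second to send before exiting.
echo "Dec 27 14:36:15 FGT-A-LOG CEF: 0|Fortinet|Fortigate|v6.0.3|61002|utm:ssh ssh-command passthrough|3|deviceExternalId=FGT5HD3915800610 src=10.1.100.11 spt=56698 dst=172.16.200.55 dpt=22 proto=6 act=passthrough" | nc -u -w1 127.0.0.1 514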
We should now see messages coming in by running a simple query looking for the message contents in our workspace.
CommonSecurityLog | where DeviceVendor has("Fortinet")
Looks like my CEF messages are being sent successfully!
Hopefully, things went well, but if not, here are a few troubleshooting steps.
Microsoft's article "Use Azure Monitor Troubleshooter – Azure Monitor | Microsoft Learn" covers this in depth, but here are a few good starting points.
The Azure Monitor agent drops its rsyslog forwarding configs into /etc/rsyslog.d/ alongside the standard Ubuntu ones; confirm that files like these are present:
5-azuremonitoragent-loadomuxsock.conf
10-azuremonitoragent.conf
20-ufw.conf
21-cloudinit.conf
50-default.conf
The Arc and extension logs are also worth checking:
/var/lib/GuestConfig/arc_policy_logs/gc_agent.log
/var/lib/GuestConfig/extension_logs/Microsoft.Azure.Monitor.AzureMonitorLinuxAgent-xxxxx
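A few shell checks tie these together; this is a minimal sketch that assumes the standard azuremonitoragent service name and the log paths listed above:
# Confirm the Azure Monitor agent service is running
systemctl status azuremonitoragent
# Verify the agent's rsyslog forwarding configs landed in /etc/rsyslog.d/
ls -l /etc/rsyslog.d/
# Watch the Arc guest configuration log for policy or extension errors
tail -f /var/lib/GuestConfig/arc_policy_logs/gc_agent.log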
Now let's set up the Windows connected machine agent.
In Azure search for “Azure Arc” and within the “Azure Arc” pane, select “Servers”.
Select the “+ Add” button on the top rail.
After hitting "Add", click "Generate Script".
Fill in the appropriate resource details for your VM.
Click "Next", then copy and/or download the PowerShell script to onboard your Arc server.
Copy and/or upload the script from step 1 onto your Windows host, open a PowerShell terminal, and run it.
The PowerShell script will automatically open a browser to authenticate your Azure Arc server. Log in to Azure with your credentials, and you should see a confirmation returned after you authenticate.
Check the PowerShell terminal after you run the script and check the status. You should see a "Connected machine to Azure" message.
The next step is to create a simple Data Collection Rule (DCR) to start collecting Windows events. Search for "Data Collection Rules" and hit "Create" at the top.
Add in the resource details.
In the Resources tab, select "Add Resource" and find your VM by instance ID.
Click "+ Add a source" and choose the type of data to collect. In my example, I want to collect "Windows Event Logs", and I'm configuring it to collect all log levels and types. Click "Next: Destination" once finished.
Under "Destination", select the workspace(s) you'd like to send this data to.
Finally, select “Add Data Source”.
Click “Create”.
You should see the deployment succeed!
Let’s check our analytics workspace to see if the agent is registered.
Go to your Log Analytics workspace and select “Agents” on the left rail.
Looks like we have a few agents registered, but let’s be 100% confident the VM we just configured is registered with our workspace. Click the “See them in Logs” link.
This will open a pane running a default KQL query. Verify that your agent is in the list.
From my end, it looks good as I see my VM name and IP.
We can also check for standard "Events" coming in from the AMA agent by running a simple KQL query with your Windows host name.
Event | where Computer has("EC2AMAZ-8RJR126")
Woohoo! We are now getting basic Windows events.
We've got Arc installed and are getting basic Windows event data into our Sentinel analytics workspace. Let's enhance this by using the Data connector to tell the AMA agent to collect Windows Security events and send them to the "SecurityEvent" Sentinel table for richer security data.
First, go to your Microsoft Sentinel instance and select the “Data connectors” on the left rail.
Click on “Content Hub” under “More content at”.
Search for "Security Events" and select the "Windows Security Events" connector. In the bottom right, if it is your first time adding this solution, it should say "Install". In my case, I've already installed it, so I will hit "Manage".
In the “Windows Security Events” connector pane, select the “Windows Security Events via AMA” and select “Open Connector page”.
Now we need to create a DCR. You can either create a new rule or modify an existing one to start pulling Security events into the SecurityEvent Sentinel table.
In my case, I am going to create a new one.
Similar to our previous DCR creation example, fill in the resource details, then click "Next: Resources".
Add our resource by hitting “+Add resource(s)”.
Find the VM you installed the AMA agent on and add it.
Select the type of security events. I am going to collect all of them. Hit “Review and create”.
Hit “Create”.
Generate some Windows Security events. For example, logging off and back in, starting a process, or adding a user will all produce common security events.
Let's run a simple KQL query to check whether our host is successfully sending Windows Security events to the SecurityEvent table. Go to your Log Analytics workspace and run the following query.
SecurityEvent | where Computer has("EC2AMAZ-8RJR126")
Looks like we are getting SecurityEvent data back!
If you aren't seeing the agent connect to your Log Analytics workspace, here are a few things to check:
Microsoft has also provided a very useful troubleshooting document.
Whew! That was A LOT of steps, and hopefully, at this point, you've got the hang of onboarding Windows and Linux hosts using Azure Arc to send data into Microsoft Sentinel. Please reach out to me via email if you have any questions, comments, and/or feedback. As we said at the beginning, Cribl Stream makes this process a lot easier, and we've got a much simpler guide to get you started.
Alright, this section is kind of boring, but I feel it’s necessary to go over some common terms so that you have an understanding of them when they are referenced in the guide.
As mentioned in the previous term definition, the OMS agent (also known as the Log Analytics agent) is being deprecated and should NOT be used. It will still work; however, support for this agent ends in August of 2024.
Making matters worse are the old documentation and lingering references to the OMS agent within the Microsoft Sentinel portal and UI.
Cribl, the Data Engine for IT and Security, empowers organizations to transform their data strategy. Customers use Cribl’s suite of products to collect, process, route, and analyze all IT and security data, delivering the flexibility, choice, and control required to adapt to their ever-changing needs.
We offer free training, certifications, and a free tier across our products. Our community Slack features Cribl engineers, partners, and customers who can answer your questions as you get started and continue to build and evolve. We also offer a variety of hands-on Sandboxes for those interested in how companies globally leverage our products for their data challenges.
Experience a full version of Cribl Stream and Cribl Edge in the cloud with pre-made sources and destinations.