Cribl Packs a Punch: Unpacking the Microsoft Azure Sentinel Integration with Cribl Source and Destination Packs

May 20, 2024

With IT modernization and increased cloud usage, more organizations are looking to Software-as-a-Service offerings for their security and data needs. Microsoft Azure Sentinel is a cloud-based SIEM that security operations centers rely on for data analytics. Cribl makes it easier for Microsoft Azure Sentinel customers to get data into their security analytics platform.

Leveraging Cribl Packs, organizations can ingest data from a variety of vendors and formats with little effort. Let’s take a deeper dive into Cribl Packs, how they work, and how they ease the burden of data ingestion for Microsoft Azure Sentinel customers.

Cribl Packs allow Cribl Stream customers to build and share configuration models, including pipelines, lookups, data samples, and knowledge objects. These prebuilt configurations allow for faster time to value and prevent redundancy when creating pipelines. The Cribl Packs Dispensary has a variety of pre-built packs for common use cases, data sources, and destinations.

To make data ingestion easier for Microsoft Sentinel, we’ll take advantage of pre- and post-processing capabilities, which provide data transformations at both the Cribl source and the Cribl destination, in this case Microsoft Azure Sentinel. The Cribl documentation covers the event processing order in Cribl Stream, but the TL;DR is this: when data arrives at a source, an optional pre-processing pipeline can normalize it before it proceeds further, and just before data is delivered to a destination, an optional post-processing pipeline can normalize it again. We will use Cribl source packs to normalize data to a Common Schema that we can then map to the Microsoft Azure Sentinel Common Security Log schema.
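To put that in context, here’s a simplified sketch of where the pre- and post-processing pipelines sit, condensed from the event processing order in the Cribl docs:

```
Source → pre-processing pipeline (optional) → Routes → processing Pipelines → post-processing pipeline (optional) → Destination
```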

Cribl Source Packs

Let’s look at two source packs that map data into a Common Schema: the CEF Source and Palo Alto Networks Source packs. Firewall vendors often provide options for the format in which logs are delivered, and Palo Alto Networks is no different: data can be delivered in CSV or Common Event Format (CEF). The Palo Alto Networks Source pack assumes the data arrives in CSV format; the CEF Source pack, you guessed it, handles data in Common Event Format. While both packs map data into a Common Schema, the CEF Source pack also creates an internal field with the CEF data, which provides more flexibility and options for the Microsoft Azure Sentinel Common Security Log destination pack. But first, let’s look at the internal fields the packs create.
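For reference, a CEF event carries a pipe-delimited header followed by key=value extensions. Here’s a synthetic Palo Alto Networks traffic event for illustration (not actual pack sample data):

```
CEF:0|Palo Alto Networks|PAN-OS|10.2.0|end|TRAFFIC|1|src=10.1.1.5 dst=203.0.113.10 spt=51000 dpt=443 proto=tcp act=allow
```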

By extracting information from the Palo Alto Networks firewall traffic event, the source pack maps that data into an internal field named __schema. To see it, select the gear icon in the Data Preview pane and choose Show Internal Fields. The Common Schema structure captures details about source and destination IP addresses, ports, and interfaces, among other information. Instead of creating an individual pack for each source and destination combination, this normalization approach allows a single source pack to feed any number of destination packs. And while this blog post focuses on Microsoft Sentinel, other destination packs are also compatible with the CEF and Palo Alto Networks source packs.
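To make that concrete, the normalized event might carry an internal object along these lines. This is a sketch only; the field names are assumptions for illustration, not the pack’s exact schema:

```javascript
// Hypothetical shape of the internal Common Schema field.
// Check the pack's pipeline in your environment for the real field names.
__schema = {
  src_ip: '10.1.1.5',
  src_port: 51000,
  dst_ip: '203.0.113.10',
  dst_port: 443,
  in_interface: 'ethernet1/1',
  out_interface: 'ethernet1/2',
  action: 'allow'
}
```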

In addition to the internal __schema field, the CEF Source pack also extracts the CEF fields into an internal __cef field, as shown below.
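Using the synthetic CEF event from earlier, the extraction might produce something like this. The standard CEF header and extension names are shown; the pack’s exact key names may differ:

```javascript
// Sketch of the internal __cef field after extraction.
__cef = {
  deviceVendor: 'Palo Alto Networks',
  deviceProduct: 'PAN-OS',
  deviceVersion: '10.2.0',
  signatureId: 'end',
  name: 'TRAFFIC',
  severity: '1',
  src: '10.1.1.5',
  dst: '203.0.113.10',
  spt: '51000',
  dpt: '443',
  act: 'allow'
}
```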

The __cef internal field offers an alternative to the Common Schema that is helpful for many destinations. Microsoft Azure Sentinel is one of those, and the documentation for the Common Security Log table schema outlines the CEF mapping.
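A few representative mappings from that documentation give the flavor:

```
src → SourceIP
dst → DestinationIP
spt → SourcePort
dpt → DestinationPort
act → DeviceAction
```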

It’s also important to note that the source packs are designed to be applied at the source for pre-processing. With Palo Alto Networks firewall traffic arriving via syslog, we’ll add the source pack as a pre-processing pipeline: navigate to Sources > Syslog, select the Syslog input, open Processing Settings > Pre-Processing, and select the source pack.
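If you manage configurations as files rather than through the UI, the same change lands in the source’s entry in inputs.yml. A minimal sketch, assuming the pre-processing pipeline is set via a pipeline key and pack pipelines are referenced with a pack: prefix; verify both against your Cribl Stream version, and prefer the UI steps above if in doubt:

```yaml
# inputs.yml (sketch; key names are assumptions)
inputs:
  in_syslog:
    type: syslog
    host: 0.0.0.0
    udpPort: 9514
    sendToRoutes: true
    pipeline: pack:your-source-pack-id   # pre-processing pipeline
```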

Cribl Destination Packs

Now that we’ve seen two ways to normalize data to schemas as part of pre-processing when data arrives, let’s look at the Microsoft Sentinel Common Security Log destination pack, which takes advantage of those schemas to map data fields correctly.

Some data may arrive in the internal __cef field, some in the internal __schema field, and some already in the destination’s format, so the Microsoft Sentinel Common Security Log destination pack maps data based on an order of precedence. This is highlighted in the Eval function in the common_security_log pipeline included in the pack.

When mapping into the Common Security Log table fields, the pack first looks for an existing field, then the corresponding CEF field in the internal __cef field, and last, the corresponding field in the internal __schema field. This makes it possible to override a field as part of a traditional pipeline, and Cribl users can easily change the order to meet additional requirements and use cases.
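Since Cribl pipeline expressions are JavaScript, that precedence boils down to a chain of fallbacks. Here’s a minimal sketch of one Eval assignment, with field names assumed for illustration rather than copied from the pack:

```javascript
// Eval function, writing the Common Security Log field SourceIP:
// 1. keep an already-populated top-level field,
// 2. else fall back to the extracted CEF key,
// 3. else fall back to the Common Schema field.
SourceIP || __cef.src || __schema.src_ip
```

Swapping the order of the fallbacks, or prepending a new one, is a one-line change to the expression, which is what makes the precedence easy to adapt.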

Wrap Up

In conclusion, integrating Cribl Stream with Microsoft Azure Sentinel brings a powerful solution to the challenges of data ingestion and normalization in modern IT environments. Cribl Packs are pivotal in simplifying the process, allowing organizations to ingest data from diverse vendors in various formats effortlessly. The pre-built configurations offered through Cribl Packs significantly reduce the time required to set up pipelines, enhancing efficiency and preventing redundancy.

Cribl Source Packs, exemplified by the CEF Source and Palo Alto Networks Source packs, showcase the platform’s ability to normalize data into a Common Schema. This not only streamlines the data integration process but also provides flexibility. Normalizing data at both the source and the destination, with pre- and post-processing pipelines, is a comprehensive strategy for optimizing data flow.


 

Cribl, the Data Engine for IT and Security, empowers organizations to transform their data strategy. Customers use Cribl’s suite of products to collect, process, route, and analyze all IT and security data, delivering the flexibility, choice, and control required to adapt to their ever-changing needs.

We offer free training, certifications, and a free tier across our products. Our community Slack features Cribl engineers, partners, and customers who can answer your questions as you get started and continue to build and evolve. We also offer a variety of hands-on Sandboxes for those interested in how companies globally leverage our products for their data challenges.

 
