Talk to an Expert ›For so many, the unknown sucks. Knowing or knowing what to expect is best. Why? Because it puts us at ease, and peace and gives us a calm sense of knowing without having experienced it yet. That’s part of my mission here at Cribl. I talk to a lot of people and the one consistent part of these conversations is the unknown. Cribl and Cribl Stream forging new territory and doing things that no one is doing or has taken the time to do for other people places a tremendous focus on me and my peers ensuring that everyone we spend time with talking about Cribl Stream knows what to expect.
If you have talked to me or my peers, or have been paying attention to our media releases, blog posts, or any of our other communications, you may know that we focus on many aspects of observing data as it streams from its source to its destination. A variety of transformations can occur before data arrives at that destination: events can be reduced, enriched, normalized, or reshaped along the way.
Putting this kind of choice, control, and flexibility in your hands means that things change. For Cribl Stream users, it means setting the expectation that each Source's events will not always look the same at the Destination as they did when they left that Source, and that's okay. If we know what to expect, then we remove the unknown and give ourselves the opportunity to set our expectations properly and prepare accordingly.
So what guidance or awareness can we offer so that you, too, know what to expect when you are expecting Cribl data at your Cribl Destinations?
The first recommendation is a nod to my friend, peer, and partner-in-good, Jordan Perks, and his blog post from November: document your destination's expectations. Cribl Stream's array of helpful, easy-to-use functions makes it simple to deliver a pipeline chock-full of hyper-transformation. In fact, quickly pulling up stream and event statistics to see how much you've streamlined a data source can feel like trying to beat your high score on your favorite '80s video game. But don't forget that the downstream system you're sending those streams and events to may have expectations of its own. If you've dived in and followed Jordan's best practices for documenting your observability data, then you will have those expectations at your fingertips.
Before you get carried away with building out that pipeline, spend some time with the consumers (the people who use that downstream destination system) to learn what they expect, and document those expectations.
Once you have this information, you are armed with what you need to put the finishing touches on that pipeline masterpiece in Cribl Stream. Cribl Stream functions like Serialize, Flatten, and Eval can ensure that downstream destination systems continue to receive data events in the shape your consumers documented and expect.
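As a rough illustration (a sketch, not a drop-in config), a pipeline that shapes events with these functions might look something like the following. The function IDs are real Stream functions, but the field names and values here are hypothetical:

```yaml
# Hypothetical Cribl Stream pipeline sketch: shape events for a downstream system.
functions:
  # Add or normalize fields the destination's consumers documented
  - id: eval
    filter: "true"
    conf:
      add:
        - name: sourcetype        # illustrative field the consumers asked for
          value: "'app:json'"
  # Flatten nested JSON so field names are predictable downstream
  - id: flatten
    conf:
      prefix: ""
      depth: 5
  # Re-serialize the event body into the format the destination expects
  - id: serialize
    conf:
      type: json
      dstField: _raw
```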
Many of Cribl's Packs have already taken your destination into consideration. Case in point: the Microsoft Windows Events Pack includes the following information in its README:
“This pack may be incompatible with some Splunk dashboards that depend on specific field extractions. The Windows-TA will also not work with this pack as all events are in a clean universal format.
Please review various Splunk add-ons and configuration files such as props.conf or transforms.conf and make adjustments as necessary. The final output is JSON, but you can use Serialize to change to other formats if necessary. JSON or KV formats can be auto-extracted in Splunk
In Splunk:
Step 1: Disable the Windows-TA
Step 2: If events are transformed to JSON set kv_mode=json
Step 3: Evaluate the fields and dashboards and see if you need to make alias in Splunk or add a Rename function in Stream.”
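For example, step 2 above maps to a props.conf stanza along these lines on the Splunk side; the sourcetype name here is illustrative, so use whichever sourcetype your Cribl-processed Windows events actually arrive under:

```ini
# props.conf (search-time setting; illustrative sourcetype name)
[cribl:winevent:json]
KV_MODE = json
```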
Having this kind of clear, concise guidance for keeping your downstream systems operating the way their consumers expect is very valuable. As you review the Packs at your disposal, be sure to read the README and dig into the details of how to prepare appropriately for your downstream destination.
It truly is all about the destination…configuration. Many of Cribl Stream's Destinations can be configured to deliver what your downstream systems expect. For example, if you are sending your data events to CrowdStrike Falcon LogScale, you can specify whether to send them in JSON or raw format from within the Destination configuration.
As another example, if you are sending your data events to Google Security Operations, you can specify which Log Type value to send with your events, including custom log types. Always check the Destination configuration in Cribl Stream first to see if there are any expectations that you can set and apply before sending your data to that destination.
And while you are there, have a look at the Post-Processing configuration section. It provides additional options to set expected fields, and to run the Cribl Stream-transformed data events through another pipeline's functions before the data is sent to the Destination. These small configurations can produce exactly the data event output your downstream systems expect.
Before we wrap this blog up, there is one concern that often comes up when discussing Cribl Stream's data output: what do you do if you need to capture and retain the full fidelity of the original Source's event? For legal, compliance, or governance reasons, being able to retain and recall the original, unmodified event can be a genuine concern. In some cases, certain business units or departments (such as Cybersecurity) will be wary of making any change to the stream of events coming from a source. That is totally understandable and reasonable, and it is exactly where Cribl Stream's Replay functionality fits.
Cribl Stream Destinations such as object stores (Azure Blob Storage, S3, etc.) or filesystems (NFS) make good, inexpensive, long-term storage and retention targets for original copies of your data events. Using an early, higher-priority Route in Cribl Stream to pass the original source event data through to one of these destinations brings the comfort of knowing those events are "archived" while Cribl Stream transforms a copy of them for downstream use. Routing the data this way is simple; what can be more challenging is recalling and "Replaying" those events and streams of data when they are needed again. Using Archiving and Replay with Cribl Stream will set you free (learn more about this capability in this blog HERE).
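To make the idea concrete, here is a minimal sketch of what that kind of routing could look like. The route, pipeline, and output names are hypothetical, and the exact fields depend on your deployment:

```yaml
# Hypothetical Cribl Stream routes sketch: archive raw events first, then shape for the SIEM.
routes:
  # Early route: send everything, untouched, to cheap object storage for Replay later
  - name: archive_raw
    filter: "true"            # match all events
    pipeline: passthru        # passthrough pipeline, no transformations
    output: s3_archive        # hypothetical S3 archive destination
    final: false              # let events continue to the routes below
  # Later route: the transformed copy goes to the downstream analysis system
  - name: shaped_for_siem
    filter: "true"
    pipeline: shape_for_siem  # hypothetical pipeline with Eval/Flatten/Serialize
    output: downstream_siem   # hypothetical downstream destination
    final: true
```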
What I've shared here is not an exhaustive list of guidance, but I hope you realize that part of your success with Cribl Stream in your environment is communicating, discovering, gathering, and knowing what your downstream systems, and the people who use them, expect in order to keep operating. Doing nothing is the quickest way for systems that depend on data events to incur issues, downtime, and unusable solutions. A proactive approach using some of the guidance and resources above can change that and produce successful deployments and use of Cribl Stream in your environments.
The fastest way to get started with Cribl Stream, Edge, and Search is to try the Free Cloud Sandboxes.
Experience a full version of Cribl Stream and Cribl Edge in the cloud with pre-made sources and destinations.