Goal:
Confidently migrate existing applications and tooling to the cloud (or to multiple clouds) on time and under budget.
Challenge:
Reconfiguring architectures and data flows to ensure parity and visibility in the cloud (or multiple clouds), while keeping a handle on ingress and egress charges.
Example:
You are migrating a widely deployed application from an on-premises deployment to the cloud, with the primary goals of optimizing performance, reducing management overhead, and streamlining costs. This is your opportunity to address some ongoing challenges of your on-premises deployment and to confirm parity between the old deployment and the new cloud deployment before fully switching over.
How Can Cribl Help?
By routing data from your existing sources to multiple destinations, you can confirm data parity in your new cloud destinations before turning off your on-premises (or legacy) analytics, monitoring, storage, or database products and tooling. Cribl can also reduce your costs significantly: by running Worker Nodes inside your cloud, it compresses and moves data efficiently, reducing egress charges.
To do this, you will test and deploy several Cribl Stream technical use cases:
Before You Begin:
What You’ll Achieve:
From your existing collectors and agents, set up Destinations and Pipelines for your new cloud destinations. (If you need new collectors or agents, look into Cribl Edge, a vendor-neutral, small-footprint agent that lets you choose which data to send from the edge to your destination. Edge also provides a clean UI that eases fleet management.)
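If you prefer to script this setup rather than click through the UI, Cribl Stream also exposes a REST API on the Leader. The snippet below is only a sketch: the Leader URL, token, Destination ID, and payload fields are placeholders, and the exact endpoint path and schema should be verified against the Cribl API reference for your version.

```python
# Illustrative sketch only -- endpoint path and payload fields are assumptions;
# verify against the Cribl API reference for your Stream version.
import requests

LEADER = "https://leader.example.com:9000"  # hypothetical Leader URL
TOKEN = "<api-token>"                       # e.g., from the auth endpoint or a Cloud API credential

headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# Hypothetical S3 Destination definition; required fields vary by Destination type.
s3_destination = {
    "id": "cloud_archive_s3",
    "type": "s3",
    "bucket": "my-migration-archive",
    "region": "us-east-1",
    "compress": "gzip",
}

# On a distributed deployment, Destinations live under a Worker Group
# (e.g., /api/v1/m/<group>/system/outputs); a single instance is shown here.
resp = requests.post(f"{LEADER}/api/v1/system/outputs",
                     json=s3_destination, headers=headers)
resp.raise_for_status()
print(resp.json())
```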
Identify the data being sent to each destination. For each type of data you will accomplish the following:
For data that requires shaping or normalization, create a Pipeline or use the out-of-the-box Packs.
For data that requires reduction, create a Pipeline or use the out-of-the-box Packs. (Most Packs help reduce data volumes by up to 30%.) A conceptual sketch of typical reduction logic follows this list.
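To make "reduction" concrete, here is a small, hypothetical Python sketch of the kind of logic a reduction Pipeline typically applies: dropping empty or redundant fields and sampling noisy debug events. In Cribl Stream you would express this with built-in Functions (for example Eval, Drop, or Sampling) rather than code, and the event fields below are invented purely for illustration.

```python
# Conceptual illustration of reduction logic; in Cribl Stream you'd use
# built-in Functions (Eval, Drop, Sampling) instead of writing code.
# Event field names below are invented for illustration.
import random

def reduce_event(event: dict, debug_sample_rate: int = 10):
    """Return a slimmer event, or None if the event should be dropped."""
    # Keep roughly 1 in N chatty DEBUG events as a sample; drop the rest.
    if event.get("severity") == "DEBUG" and random.randrange(debug_sample_rate) != 0:
        return None
    # Remove empty fields and fields your downstream tool never uses.
    unused = {"ephemeral_id", "beat_hostname", "agent_version"}
    return {k: v for k, v in event.items()
            if v not in (None, "", "-") and k not in unused}

sample = {"severity": "INFO", "message": "login ok", "ephemeral_id": "abc", "extra": ""}
print(reduce_event(sample))  # -> {'severity': 'INFO', 'message': 'login ok'}
```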
Spec out each source:
For your QuickStart, we recommend no more than 2 Sources.
Where does your data need to go?
For your QuickStart, we recommend no more than 2 Destinations.
As part of the exercise to prove out your use case, we recommend limiting your evaluation to one or two Sources and one or two Destinations.
Note: As an alternative to setting up Sources and Destinations, you can use Cribl Packs and their included sample data for your evaluation; see step 9 for using Packs and the sample data.
Another way you can get started quickly with Cribl is with QuickConnect or Routes.
Cribl QuickConnect lets you visually connect Cribl Stream Sources to Destinations using a simple drag-and-drop interface. If all you need are independent connections that link parallel Source/Destination pairs, Cribl Stream’s QuickConnect rapid visual configuration tool is a useful alternative to configuring Routes.
For maximum control, you can use Routes to filter, clone, and cascade incoming data across a related set of Pipelines and Destinations. If you simply need to get data flowing fast, use QuickConnect.
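To make the distinction concrete: a Route is essentially a filter expression plus a Pipeline and a Destination, evaluated in order, and a non-final Route clones matching events onward so the same data can reach both your legacy and your new cloud Destinations. That cloning behavior is what makes Routes a natural fit for proving parity during a migration. The sketch below imitates that evaluation order in Python purely for illustration; real Routes are configured in the Cribl Stream UI, their filters are expressions rather than Python lambdas, and all names here are invented.

```python
# Toy model of Route evaluation, for illustration only. Real Routes are
# configured in Cribl Stream; filters there are expressions, not lambdas.
routes = [
    # Non-final Route: clones firewall events to the legacy SIEM *and* lets
    # them continue to later Routes -- this is how you prove parity.
    {"name": "fw_to_legacy", "filter": lambda e: e.get("sourcetype") == "fw",
     "pipeline": "firewall_reduce", "output": "legacy_siem", "final": False},
    {"name": "fw_to_cloud",  "filter": lambda e: e.get("sourcetype") == "fw",
     "pipeline": "firewall_reduce", "output": "cloud_siem",  "final": True},
    {"name": "default",      "filter": lambda e: True,
     "pipeline": "passthru",        "output": "cloud_archive", "final": True},
]

def route(event: dict):
    """Return the (pipeline, output) pairs this event would be sent through."""
    sent = []
    for r in routes:
        if r["filter"](event):
            sent.append((r["pipeline"], r["output"]))
            if r["final"]:
                break
    return sent

print(route({"sourcetype": "fw", "msg": "deny tcp"}))
# -> [('firewall_reduce', 'legacy_siem'), ('firewall_reduce', 'cloud_siem')]
```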
Capture a sample data set for each sourcetype.
Capturing a sample data set lets Cribl Pipelines and Packs validate their logic against your data and show a before-and-after view, proving that your reduction and enrichment use cases are working.
As an alternative to capturing sample data at the Source, use QuickConnect to capture a sample dataset.
In the QuickConnect UI, hover over the Destination and click Capture to collect a sample of the data flowing through the Source.
As an alternative to capturing sample data at the Source, use Routes to capture a sample dataset:
For your use cases you will:
Streamline the number of fields or volume of data you send to your analysis tool:
Convert logs to metrics (a conceptual sketch follows this list):
Enrich data with third-party sources:
Transform data to prepare it with Common Information Model fields (for Splunk) or Elastic Common Schema (ECS for Elastic):
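To illustrate the logs-to-metrics case: a Pipeline can aggregate many verbose events into a handful of compact metric events before they reach your cloud destination. In Cribl Stream this is done with Functions such as Aggregations or Publish Metrics; the hypothetical Python below only sketches the idea, and the field and metric names are invented.

```python
# Hypothetical logs-to-metrics aggregation, for illustration only.
# In Cribl Stream you'd use the Aggregations / Publish Metrics Functions.
from collections import defaultdict

access_logs = [
    {"status": 200, "bytes": 512,  "host": "web-01"},
    {"status": 200, "bytes": 2048, "host": "web-01"},
    {"status": 500, "bytes": 0,    "host": "web-02"},
]

def logs_to_metrics(events):
    counts = defaultdict(int)
    total_bytes = defaultdict(int)
    for e in events:
        key = (e["host"], e["status"])
        counts[key] += 1
        total_bytes[key] += e["bytes"]
    # Emit one compact metric event per host/status pair instead of raw logs.
    return [{"metric": "http.requests", "host": h, "status": s,
             "count": counts[(h, s)], "sum_bytes": total_bytes[(h, s)]}
            for (h, s) in counts]

for m in logs_to_metrics(access_logs):
    print(m)
```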
Packs enable Cribl Stream administrators to pack up and share Pipelines and Functions across organizations, and include sample data for testing. The following Packs might be helpful:
As an alternative to Packs and the out-of-the-box Pipelines they contain, you can create your own Pipelines. Pipelines are Cribl’s main way to manipulate events. See Cribl Tips and Tricks for additional examples and best practices, and look for the sections labeled Try This at Home for Pipeline examples: https://docs.cribl.io/stream/usecase-lookups-regex/#try-this-at-home
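The lookups walkthrough linked above boils down to enriching each event with context from a lookup table. The hypothetical Python below shows the equivalent logic with invented field names; in Cribl Stream you would use the Lookup Function against a CSV lookup file rather than writing code.

```python
# Conceptual equivalent of lookup-based enrichment, for illustration only.
# In Cribl Stream, the Lookup Function does this against a CSV lookup file.
import csv, io

# Hypothetical lookup table mapping host to business context.
lookup_csv = """host,datacenter,owner
web-01,us-east,payments
web-02,eu-west,checkout
"""
lookup = {row["host"]: row for row in csv.DictReader(io.StringIO(lookup_csv))}

def enrich(event: dict) -> dict:
    match = lookup.get(event.get("host"), {})
    # Add business/service context without altering the original fields.
    return {**event,
            "datacenter": match.get("datacenter", "unknown"),
            "owner": match.get("owner", "unknown")}

print(enrich({"host": "web-01", "message": "login ok"}))
# -> {'host': 'web-01', 'message': 'login ok', 'datacenter': 'us-east', 'owner': 'payments'}
```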
Please note: If you are working with an existing data source that is already being sent to your downstream systems, routing it through Cribl Stream may break existing dependencies on the original format of the data, even if you do nothing to the output. Be sure to consult this Best Practices blog, or the users and owners of your downstream systems, before committing any data source to a destination from within Cribl Stream.
Technical Use Cases Tested:
Also select Monitoring > Data > Pipelines and examine slicendice.
For additional examples, see:
When you’re convinced that Stream is right for you, reach out to your Cribl team and we can work with you on advanced topics like architecture, sizing, pricing, and anything else you need to get started!