Cribl Stream Use Case:
Route From Any Source to Any Destination

The easiest way to get all observability data into the destinations that matter most.

The Challenge:

Observability requires you to analyze the widest set of data possible to truly understand the security, performance, and general health of your environment. You need an easier way to collect all data, at all times, and move it into any tool that requires it – collect once, use it everywhere.

The Solution:

Stream acts as a universal receiver and collector of log and metrics data. With it, you can retrieve, transform, analyze, and correlate data from any source and send it to any destination, or even to multiple destinations, without adding any tooling.

Stream can receive push data from sources such as Splunk, HTTP, Elastic Beats, Kinesis, Kafka, and TCP JSON, and can pull data from Kafka, Kinesis Streams, Azure Event Hubs, SQS, S3, Microsoft Office 365, or even external inputs such as weather and air-quality data, and anything else your business needs to drive better decisions.

With Stream it’s finally possible to send the exact data your organization needs, in the precise format required, to the optimal location to leverage it most effectively.

Send data through Stream to Splunk, AWS Kinesis Streams, SQS, CloudWatch Logs, Elasticsearch, Honeycomb, TCP JSON, Syslog, Kafka, Azure Event Hubs, Azure Monitor Logs, StatsD and StatsD Extended, Graphite, InfluxDB, Wavefront, SignalFx, and more, as well as to destinations that support batch or non-streaming output, such as S3-compatible stores, filesystem/NFS, MinIO, Google Cloud Storage, and Azure Blob Storage.

Stream goes beyond just collecting data from any source and delivering it anywhere: it maximizes the value of observability data by transforming it and adding context from other sources in real time, enhancing all of your analytics tools.

Key Features of Stream

Route Data from Any Source to Any Destination

Use Cribl Stream to send data to the most effective destinations, including low-cost storage locations like S3. Quickly route data to the best tool for the job – or all the tools for the job – by translating and formatting data into the tooling schemas you need.
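
As an illustration only, the sketch below (plain Python, not Cribl Stream's actual configuration or API) shows the idea behind schema-aware routing: each route pairs a filter with a formatter that reshapes an event into its destination's expected envelope. The route conditions, field names, and envelope shapes are assumptions made up for the example.

```python
# Hypothetical sketch of schema-aware routing, not Cribl Stream's actual API:
# each route pairs a filter with a formatter for its destination's schema.
import json
import time
from typing import Any, Callable

Event = dict[str, Any]

def to_splunk_hec(event: Event) -> str:
    """Wrap the event in a Splunk HEC-style envelope (illustrative only)."""
    return json.dumps({"time": event.get("_time", time.time()),
                       "sourcetype": event.get("sourcetype", "generic"),
                       "event": event})

def to_elastic_bulk(event: Event) -> str:
    """Emit an Elasticsearch bulk-API pair: action line plus document line."""
    action = json.dumps({"index": {"_index": "observability"}})
    return action + "\n" + json.dumps(event)

# Routes are evaluated in order; an event can match more than one.
ROUTES: list[tuple[Callable[[Event], bool], Callable[[Event], str], str]] = [
    (lambda e: e.get("source") == "firewall", to_splunk_hec, "splunk"),
    (lambda e: True,                          to_elastic_bulk, "elasticsearch"),
]

def route(event: Event) -> list[tuple[str, str]]:
    """Return (destination, payload) pairs for every route the event matches."""
    return [(dest, fmt(event)) for match, fmt, dest in ROUTES if match(event)]

if __name__ == "__main__":
    sample = {"_time": 1700000000, "source": "firewall", "action": "blocked"}
    for dest, payload in route(sample):
        print(dest, payload)
```

The point of the sketch is that one incoming event can fan out to several destinations, each receiving the format it expects.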

Use Summary Data for Faster Insights

Easily extract fields of interest and publish the results as metrics. Aggregation yields a major reduction in event counts and data volume, freeing up space in your analytics tools. Send the resulting metrics to your analytics tool, or route them to a dedicated time-series database for further analysis and better insight into your data sources.
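
For a sense of how log-to-metrics aggregation cuts volume, here is a minimal, hypothetical sketch in Python (the field names, window size, and metric shape are assumptions, not Cribl's implementation): events are bucketed by time window and status, and each bucket collapses into a single metric event carrying a count and an average.

```python
# Hypothetical log-to-metrics sketch: extract a numeric field of interest,
# bucket events into time windows, and emit one metric event per bucket
# instead of every raw log line.
from collections import defaultdict
from statistics import mean

raw_events = [
    {"ts": 1700000001, "status": 200, "resp_ms": 42},
    {"ts": 1700000002, "status": 200, "resp_ms": 55},
    {"ts": 1700000003, "status": 500, "resp_ms": 310},
    {"ts": 1700000061, "status": 200, "resp_ms": 47},
]

WINDOW = 60  # seconds per aggregation bucket (assumed)

buckets: dict[tuple[int, int], list[int]] = defaultdict(list)
for e in raw_events:
    window_start = e["ts"] - e["ts"] % WINDOW
    buckets[(window_start, e["status"])].append(e["resp_ms"])

# Each bucket collapses into a single metric event: count + average latency.
metrics = [
    {"ts": ts, "status": status, "count": len(vals), "avg_resp_ms": mean(vals)}
    for (ts, status), vals in sorted(buckets.items())
]

print(f"{len(raw_events)} raw events -> {len(metrics)} metric events")
for m in metrics:
    print(m)
```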

Reduce Data Volume to Free Up Resources

Stream can reduce ingested log volume by as much as 50%. Easily eliminate duplicate fields, null values, and any elements that provide little analytical value. Filter and screen events for dynamic sampling, or aggregate log data into metrics for volume reduction at scale while keeping a full-fidelity copy in low-cost storage, improving system performance and freeing up valuable resources for digital transformation.
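
The sketch below is a simplified, hypothetical illustration of the reduction levers described above, written in plain Python rather than as Cribl Stream functions: strip null and low-value fields, then sample routine events while keeping every error. The field names and sample rate are assumptions.

```python
# Hypothetical reduction sketch (illustrative only): strip null and low-value
# fields, then sample routine events while keeping every error event.
import random

DROP_FIELDS = {"_raw_copy", "debug_info"}   # assumed low-value fields
SAMPLE_RATE = 10                            # keep roughly 1 in 10 routine events

def slim(event: dict) -> dict:
    """Remove null values and fields with little analytical value."""
    return {k: v for k, v in event.items()
            if v is not None and k not in DROP_FIELDS}

def keep(event: dict) -> bool:
    """Always keep errors; sample everything else."""
    if event.get("level") == "error":
        return True
    return random.randrange(SAMPLE_RATE) == 0

def reduce_stream(events: list[dict]) -> list[dict]:
    return [slim(e) for e in events if keep(e)]

if __name__ == "__main__":
    random.seed(0)
    inbound = [{"level": "info", "msg": "ok", "debug_info": "...", "user": None}] * 1000
    inbound += [{"level": "error", "msg": "timeout", "debug_info": "...", "user": None}] * 20
    outbound = reduce_stream(inbound)
    print(f"in={len(inbound)} out={len(outbound)} "
          f"({100 * (1 - len(outbound) / len(inbound)):.0f}% fewer events)")
```

Keeping all errors while sampling routine traffic is a common way to cut volume without losing the events that matter most for troubleshooting.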

Monitor Pipelines to Inform Critical Business Decisions

Reduce management overhead with a robust, easy-to-use GUI-based configuration and testing interface. Capture live data and monitor your observability pipelines in real time, gaining the visibility to inform critical business decisions.
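
As a rough illustration of pipeline monitoring (not Cribl's GUI, metrics, or API), the hypothetical sketch below wraps a processing stage with counters so event rates and output volume can be watched while data flows through.

```python
# Hypothetical monitoring sketch: wrap a pipeline stage with counters so
# throughput and output volume can be observed in real time.
import json
import time
from typing import Callable, Iterable, Iterator, Optional

def monitored(stage: Callable[[dict], Optional[dict]],
              events: Iterable[dict],
              report_every: int = 1000) -> Iterator[dict]:
    """Run events through a stage, tallying in/out counts and bytes emitted."""
    seen = emitted = bytes_out = 0
    started = time.time()
    for event in events:
        seen += 1
        out = stage(event)
        if out is not None:
            emitted += 1
            bytes_out += len(json.dumps(out))
            yield out
        if seen % report_every == 0:
            rate = seen / (time.time() - started or 1e-9)
            print(f"in={seen} out={emitted} bytes_out={bytes_out} rate={rate:.0f}/s")

if __name__ == "__main__":
    def drop_debug(event: dict) -> Optional[dict]:
        """Example stage: drop debug-level events, pass everything else."""
        return None if event.get("level") == "debug" else event

    feed = ({"level": "debug" if i % 3 else "info", "i": i} for i in range(3000))
    processed = list(monitored(drop_debug, feed))
    print(f"kept {len(processed)} events")
```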

BlueVoyant Offers Next-Generation Cybersecurity Services, Backed By Cribl

Stream helps keep BlueVoyant ahead of a 1000x increase in attacks since March 2020.