Route observability data where it has the most value. Slash costs, improve performance, and get the right data to the right destinations, in the right formats, at the right time.
Put data where it has the most value
Send data to the most effective destinations, including low-cost storage locations like S3 for long-term retention. Route data to the best tool for the job – or all the tools for the job – by translating and formatting data into any tooling schema you require. Let different departments choose different analytics environments without having to deploy new agents or forwarders.
Eliminate data to control costs
Reduce ingested log volume by as much as 50% to control costs and improve system performance. Eliminate duplicate fields, null values, and any elements that provide little analytical value. Filter and screen events for dynamic sampling, or aggregate log data into metrics for massive volume reduction. Do all of this without worry: you can keep a full-fidelity copy in a low-cost destination and replay it back if needed.
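The reduction techniques described above – dropping null and low-value fields, sampling events, and aggregating logs into metrics – can be sketched in generic terms. This is an illustrative Python sketch, not LogStream's actual pipeline functions:

```python
import random

def reduce_event(event, keep_fields=None):
    """Drop null values and, optionally, any fields outside an allowlist."""
    reduced = {k: v for k, v in event.items() if v is not None}
    if keep_fields is not None:
        reduced = {k: v for k, v in reduced.items() if k in keep_fields}
    return reduced

def sample(events, rate=0.1, seed=None):
    """Keep roughly `rate` of events; the rest can go to cheap full-fidelity storage."""
    rng = random.Random(seed)
    return [e for e in events if rng.random() < rate]

def aggregate_to_metric(events, field):
    """Collapse many log events into a single count/sum metric."""
    values = [e[field] for e in events if field in e]
    return {"count": len(values), "sum": sum(values)}
```

For example, `reduce_event({"host": "a", "msg": None, "level": "info"})` drops the null `msg` field, and `aggregate_to_metric` turns thousands of per-request log lines into one metric event.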
LogStream is the best way to get multiple data formats into your analytics tools. Use the LogStream universal receiver to collect from any observability data source – and even to schedule batch collection from multiple APIs. You can also recall data from low-cost storage and replay logs to your analytics tools for later, ad-hoc investigations.
Shape all of the data you need to drive decisions about your environment. Translate and transform data from all of your sources to the tools you choose. Get a more complete picture of your data by enriching logs with third-party data. LogStream collects data from all of your sources and shapes it into actionable logs and metrics for analysis.
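Enrichment of the kind described above amounts to a keyed join between events and a lookup table of third-party context. A minimal sketch, where the lookup contents and field names are assumptions for illustration:

```python
# Hypothetical lookup table mapping a join key (here, hostname) to third-party context.
GEO_LOOKUP = {
    "web-01": {"region": "us-east-1", "owner": "platform-team"},
    "web-02": {"region": "eu-west-1", "owner": "platform-team"},
}

def enrich(event, lookup, key="host"):
    """Merge matching lookup context into the event; unmatched events pass through unchanged."""
    context = lookup.get(event.get(key), {})
    return {**event, **context}
```

For example, `enrich({"host": "web-01", "msg": "GET /"}, GEO_LOOKUP)` yields an event that also carries `region` and `owner`, giving downstream tools more context for each log line.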
Deploy observability pipelines within minutes with our cloud-based platform
Get a private-infrastructure-optimized solution for deployments of every size
The Packs framework provides a way for LogStream customers to build and share configuration models – including pipelines, lookups, data samples, and knowledge objects – across distributed LogStream deployments. Reduce the overhead of building and sharing LogStream content while cutting the cost, complexity, and time needed to manage observability pipelines.
Packs also give LogStream users an isolated and secure space to build, test, and share their work, whether within their own company or among the 2000+ Cribl Community members. Add new Packs directly from our public GitHub repository or via URL, or manage your own Packs library privately within your organization.
LogStream features several new integrations, so you can continue to route your data to and from even more sources and destinations in your toolkit. Check out the LogStream docs for the complete list.
Never deal with dropped data again. Notifications for LogStream let you know immediately when a destination is experiencing problems, improving data reliability. You’ll also get a heads-up when your LogStream license is about to expire, so you’re not caught off guard.
Get notified within LogStream, via PagerDuty, or leverage Webhooks to integrate with the system of your choice.
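A webhook integration of this kind is, at bottom, an HTTP POST with a JSON body to the receiving system. A minimal sketch; the payload shape and field names here are assumptions for illustration, not LogStream's actual notification schema:

```python
import json
import urllib.request

def build_payload(message, severity="warning"):
    """Construct an illustrative JSON alert body (hypothetical schema)."""
    return json.dumps({"message": message, "severity": severity}).encode("utf-8")

def send_notification(webhook_url, message, severity="warning"):
    """POST the alert to the receiving system's webhook endpoint."""
    req = urllib.request.Request(
        webhook_url,
        data=build_payload(message, severity),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Because the contract is just JSON over HTTP, the same pattern works for PagerDuty, chat tools, or an in-house alerting system.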
The Secret Store gives you a way to centrally manage and reuse secrets for all sources and destinations, making it easier and safer than ever to integrate with external systems. LogStream supports both authorization tokens and key/value secrets, like API keys. Already using a certificate manager? LogStream can integrate directly with the certificate manager of your choice.
By centrally managing secrets with the Secret Store, you reduce repetitive UI input, decrease the odds of misconfiguration, and roll out security improvements from one location.
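The value of that one location shows up in a toy model: many destinations reference a secret by name rather than embedding its value, so rotating the secret in one place updates all of them. The class and method names below are hypothetical, not LogStream's API:

```python
class SecretStore:
    """Toy central secret store: destinations reference secrets by name."""

    def __init__(self):
        self._secrets = {}

    def put(self, name, value):
        self._secrets[name] = value

    def get(self, name):
        return self._secrets[name]

store = SecretStore()
store.put("analytics_api_key", "key-v1")

# Two destinations reference the same named secret instead of embedding its value.
destinations = [
    {"name": "metrics-tool", "secret_ref": "analytics_api_key"},
    {"name": "logs-tool", "secret_ref": "analytics_api_key"},
]

def resolve(dest, store):
    """Look up the live secret value at send time."""
    return store.get(dest["secret_ref"])

# Rotating the key once updates every destination that references it.
store.put("analytics_api_key", "key-v2")
```

After the single `put` that rotates the key, every call to `resolve` returns the new value, with no per-destination reconfiguration.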