Learn about the new best practice for observability: batch log processing
Log analytics solutions used to require ingesting all of your data, regardless of whether it was worth analyzing. Cribl LogStream changed this by helping you determine which data was analytics-worthy and which data was better off resting in a lower-cost storage system, or dropped altogether. But what happens when you need to analyze that data at rest at a later point in time? With the introduction of Cribl LogStream 2.2, we have you covered.
LogStream helps you route the most valuable security and observability data to the tool you need in real time. With 2.2, you can also collect data from storage locations like S3, data lakes, or file systems and "replay" it by routing it to an analytics tool. If you need to analyze data at a later date, LogStream 2.2 can collect it, shape it, and send it to the right tool. Keeping log data in lower-cost locations until you need to analyze it, if ever, can dramatically reduce the volume you ingest into analytics tools and substantially shrink what those tools must retain.
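To make the collect-shape-replay pattern concrete, here is a minimal sketch in Python. It is purely illustrative and not Cribl's API: it assumes archived logs are gzipped newline-delimited JSON objects with `ts` and `msg` fields (an archive like one you might pull from S3), collects the events in a time window, and "shapes" them by dropping fields the downstream tool does not need.

```python
import gzip
import io
import json

def replay_archive(archive_bytes, start_ts, end_ts):
    """Collect events from a gzipped NDJSON archive and keep only those
    whose timestamp falls in [start_ts, end_ts) -- the same collect,
    shape, and route idea applied to data at rest. Field names here
    (ts, msg, debug) are illustrative assumptions, not a Cribl schema."""
    events = []
    with gzip.open(io.BytesIO(archive_bytes), "rt") as f:
        for line in f:
            event = json.loads(line)
            if start_ts <= event["ts"] < end_ts:
                # "Shape" step: keep only the fields worth sending on.
                events.append({"ts": event["ts"], "msg": event["msg"]})
    return events

# Build a tiny in-memory archive to stand in for an object fetched from S3.
raw = b"".join(json.dumps(e).encode() + b"\n" for e in [
    {"ts": 100, "msg": "login ok", "debug": "x"},
    {"ts": 200, "msg": "login failed", "debug": "y"},
    {"ts": 300, "msg": "logout", "debug": "z"},
])
archive = gzip.compress(raw)
print(replay_archive(archive, 150, 250))
# → [{'ts': 200, 'msg': 'login failed'}]
```

In a real replay, the shaped events would then be forwarded to an analytics destination; the point of the sketch is only that filtering and trimming happen at replay time, so the analytics tool ingests just the slice it needs.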
Join this webinar to learn more about this new best practice for observability, batch log processing, including:
- How Cribl LogStream 2.2 collects data from archival storage and “replays” it to an analytics tool
- Strategies for dramatically reducing real-time log data volume and routing full-fidelity data to lower-cost storage locations
- Tactics to investigate security breaches long after they occur by replaying security log data when breaches are discovered
- Ways to dramatically shorten the retention period in your analytics tools by keeping data in low-cost storage and analyzing it only when you need it, while still meeting compliance goals