“We needed to shape our logs into a common language to make the data make sense,” said Shane Huston, Senior Security Specialist at Bank of New Zealand. “Stream puts the structure around the unstructured data, which enhances data quality, and quality data means quality reporting, quality alerting, and from our perspective, quality incident handling.”
Whether you’re new to the world of Observability or a seasoned veteran in the space, one thing is certain: Observability requires you to analyze the widest set of data possible to truly understand the security, performance, and general health of your environment. And with data volumes growing at close to a 25% CAGR, you need an easier way to collect your data and move it into the right (and most affordable) tool for analysis, action, or compliance.
That was one of the challenges facing the Bank of New Zealand security team: how could they get the right data, in the right format, to the right location in an efficient and cost-effective way?
They were building out the notion of a “data ocean,” but even so, managing it for security and compliance was complex and cumbersome.
“Cribl simplifies the way we’re managing data,” said Huston. “It takes away a lot of the administrative overhead of building out indexes and assigning permissions to various parts of our data ocean.”
The team had to manage a vast number of data inputs to monitor their environment from a security and performance perspective. With Cribl Stream, they can now send the exact data they need, in the precise format required, to the optimal location to analyze and act on it most effectively. They can retrieve, transform, analyze, and enrich data from any source and send it to any destination, or to multiple destinations at once, using an intuitive GUI. Cribl doesn’t require them to stitch together and manage multiple open-source tools with a command-line interface and trial and error to get data where it needs to go.
“Stream steered us down the path of taking a critical look at how we onboard data,” said Huston. “We work with the data owner to determine: What is the required format? Why are you pushing data here–is it retention, regulatory, security? Having the ability to carve data off and push data to S3 cold storage is a huge cost savings. We’re managing our data and costs a lot better.”
Whether you’re trying to send data to a data lake (or ocean) for the low-cost storage benefits, enrich an event before sending it to a security information and event management (SIEM) tool, or route data to multiple tools for different teams and retention periods, Stream is the way to go.
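To make that “carve data off to cold storage” idea concrete, here is a minimal sketch in Python of the kind of routing decision described above. It is purely illustrative: the field names (sourcetype, retention_only) and destination labels are hypothetical, and this is not Cribl Stream’s actual configuration syntax, which is driven through the GUI.

```python
import json

# Hypothetical sample events from different feeds.
events = [
    {"sourcetype": "firewall", "action": "deny", "retention_only": False},
    {"sourcetype": "dns", "query": "example.com", "retention_only": True},
]

def route(event):
    """Decide where an event should land: cheap archive vs. analysis tool."""
    if event.get("retention_only"):
        # Full-fidelity copy kept cheaply for compliance/retention.
        return "s3_cold_storage"
    # Security-relevant events continue on to the analysis tier.
    return "siem"

for event in events:
    print(f"{route(event)}: {json.dumps(event)}")
```

The point of the sketch is the shape of the decision, not the code itself: retention-only events peel off to inexpensive object storage, while the events analysts actually work with keep flowing to the analysis tools.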
Cribl Stream can receive pushed data from sources such as Splunk, HTTP, Elastic Beats, Kinesis, Kafka, and TCP JSON, and pull data from Kafka, Kinesis Streams, Azure Event Hubs, SQS, S3, and Microsoft Office 365, as well as external inputs such as threat intelligence feeds, asset management, identity systems, even weather data, or anything else your business wants or needs to drive better decisions. And you can split data across multiple pipelines to your desired destinations, including AWS, Azure, Google Cloud Platform, Datadog, Elastic, Splunk, MinIO, and many others.
With Stream, the team feels confident that they are getting the quality data they need to better protect the bank and serve their customers, both internal and external. “By doing some pre-work in Cribl by normalizing and enriching data, it speeds up incident response,” said Huston.
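Here is a small Python sketch of the kind of normalization and enrichment Huston describes, again purely illustrative: the common schema, the vendor field names, and the asset lookup are all hypothetical stand-ins, not the bank’s schema or Cribl’s implementation.

```python
# Hypothetical asset-management data used to enrich events.
ASSET_LOOKUP = {
    "10.0.4.17": {"hostname": "web-prod-01", "owner": "payments-team"},
}

# Map vendor-specific field names onto a common schema.
FIELD_MAP = {"src": "source_ip", "dst": "dest_ip", "msg": "message"}

def normalize(event):
    """Rename vendor-specific fields to the common schema."""
    return {FIELD_MAP.get(key, key): value for key, value in event.items()}

def enrich(event):
    """Attach asset context so analysts don't have to look it up mid-incident."""
    asset = ASSET_LOOKUP.get(event.get("source_ip"), {})
    return {**event, **asset}

raw = {"src": "10.0.4.17", "dst": "203.0.113.9", "msg": "outbound connection blocked"}
print(enrich(normalize(raw)))
```

Doing this shaping before events land in the SIEM is what “quality data” buys you: when an alert fires, the hostname and owning team are already on the event instead of being chased down during the incident.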
But don’t take it from me; hear more gems of wisdom from Bank of New Zealand directly. In this Stream Life podcast, hear how Shane Huston and his team at Bank of New Zealand used Stream to reduce costs while increasing the value of the data as it landed in their analytics systems.
Ready to try it yourself? Jump into this on-demand Data Shaping sandbox to see how you can get started today!
Experience a full version of Cribl Stream and Cribl Edge in the cloud with pre-made sources and destinations.