Talk to an Expert ›Brendan Dalpe is the Chief Technology Officer and Co-Founder of SOI Solutions. Prior to h... Read Moreis role at SOI Solutions, he spent three years at Cribl in different roles including Solutions Architecture and Technical Marketing. Brendan has a strong background in technology, with specializations in security, cloud, and containerization. Read Less
Recently, a customer brought me a challenging use case: They were looking to enforce quotas on their internal customers, i.e. other teams in the organization. The analytics team provides services such as searching and reporting capabilities to those other teams, which subscribe to the services through a chargeback model. Each team that subscribes is supposed to limit its ingestion of data to a quota: a maximum permitted ingest per 24-hour period.
What made the situation urgent for my customer was that some of these teams had figured out that they could subscribe to the minimum daily ingestion while forwarding 10 times that amount or more – completely disregarding their quota and racking up insane amounts of usage! The resulting cost overruns and unexpected infrastructure charges caused my customer and their analytics team real heartburn. They needed to find a way to enforce the quotas ASAP.
As part of Cribl’s Proof of Value process, we built a Pipeline that pulls the quota information from the customer’s internal application and strictly enforces daily ingestion limits. What follows is a simplified account of how we did this.
First, we needed to determine how to categorize the incoming data. Luckily for the customer, almost all of their applications are deployed in Kubernetes, so we could simply use the metadata already embedded in each event: application labels and annotations. For this example, we’ll assume the events contain a field called app.
Now that we know how to group the incoming data, each app needs a quota. For this example, we’ll use a Lookup file with two columns: the first for the app, the second for its quota (in bytes). We’ll assign a 1 KB quota to the goat_farm app, and a 2 KB quota to the acme_web app.
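For reference, a minimal version of that Lookup file might look like the following (the column names are our choice, and we’re assuming 1 KB = 1024 bytes; adjust both to your environment):
app,quota
goat_farm,1024
acme_web,2048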
Instead of a Lookup file, you could opt to store your app/quota key/value pairs in Redis. We’ll use Redis anyway to persist the state of usage across the Worker Group, as storing usage as an aggregation would limit the value to a single Worker process.
Enforcing our quotas will be a five-step process:
1. Get the quota for each app key.
2. Get the usage value from Redis.
3. If there’s no usage yet, set a time-to-live (TTL) period.
4. If TTL has already been set, increment the usage.
5. Check usage against the quota, and if the quota has been exceeded, start dropping events.
Note that we’ve opted to set a TTL period from the current time until midnight. The idea is that each app can send up to its quota worth of events within that time period; then the TTL resets and the app is allowed another quota’s worth of events.
For simplicity, we’ve chosen to simply drop events once the app exceeds its quota. Of course there are lots of other things you can do instead!
Let’s go through the process in detail:
1. Get the quota for each app key.
The Lookup function inserts the quota value into the event as an internal field called __quota, which is removed before Cribl forwards the event to the destination.
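As a rough sketch, the Lookup function for this step might be configured along these lines (the file name and column names are assumptions carried over from the example Lookup file above):
Lookup file path: app_quotas.csv
Lookup field name in event: app -> corresponding field in lookup file: app
Output field from lookup: quota -> written to the event as: __quota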
2. Get the usage value from Redis.
The Redis GET command will return the usage value if it exists; otherwise it will set the field to null. We’ll place this value into an internal field called __usage. We’ll also prepend a static string, cribl_usage:, to the key, to avoid any collisions in our Redis store. The prefix is optional, and you can customize the string as desired. We also need to get the TTL of the key, to check whether it has already expired, in order to prevent a race condition when we update the key in another function later in the Pipeline.
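Here’s a minimal sketch of how the Redis function rows for this step might look (the key expression and the __ttl field name are our assumptions, consistent with the example above):
Result field: __usage | Command: GET | Key: 'cribl_usage:' + app
Result field: __ttl | Command: TTL | Key: 'cribl_usage:' + app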
3. If there’s no usage yet, set a time-to-live (TTL) period.
This is the most important step of all. If the __usage internal field is not null, meaning that a value was returned from the previous step, we skip this step. But if there is no usage yet, we use the SETEX command to set an expiring value in Redis. SETEX takes two arguments: the time-to-live (TTL) and the value. We’ll set the value stored in Redis to the size of the event we received in Cribl Stream. You’ll need to determine how to calculate this for your events, but for simplicity, we’ll take the length of the _raw field. You’ll also need to enter the SETEX command manually, as it does not populate from the drop-down.
Now we need to determine the TTL for this key in Redis. My customer wanted to reset the usage every day at midnight. You can customize this however you wish (setting TTL to every hour, every week, etc.)
When we calculate the TTL, we can’t assume that usage starts at midnight – therefore we need to find the offset from the current time to midnight. To do this, we first find the timestamp (in epoch milliseconds) of the next midnight with this code:
new Date().setHours(24, 0, 0, 0)
Then we get the current timestamp in milliseconds:
new Date().getTime()
Take the difference of the two timestamps – then you have the number of milliseconds to midnight. Since Redis accepts this value in seconds, we need to divide by 1000 and then round.
Math.round(offset / 1000)
Let’s put the entire calculation into a single expression:
Math.round((new Date().setHours(24, 0, 0, 0) - new Date().getTime()) / 1000)
Finally, we’ll put that expression into an array along with the initial usage value we want to store in Redis:
[Math.round((new Date().setHours(24, 0, 0, 0) - new Date().getTime()) / 1000), _raw.length]
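Putting step 3 together, the Redis function row might look roughly like this (the filter and key expressions are our assumptions; note that SETEX takes the TTL in seconds followed by the value):
Filter: __usage === null
Command: SETEX
Key: 'cribl_usage:' + app
Args: [Math.round((new Date().setHours(24, 0, 0, 0) - new Date().getTime()) / 1000), _raw.length]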
4. If TTL has already been set, increment the usage.
Recall that the previous step runs only when there has been no usage yet. If there has been usage, we skip that step and run this one instead. We’ll use the Redis INCRBY command to increment the usage value by the size of the current event. The INCRBY command does not alter the TTL of the key, so the TTL keeps counting down as midnight approaches.
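A sketch of the corresponding Redis function for step 4 (again, the filter, key expression, and field names are assumptions):
Filter: __usage !== null
Result field: __usage
Command: INCRBY
Key: 'cribl_usage:' + app
Args: _raw.length
Because INCRBY returns the value after the increment, writing the result back to __usage gives us the up-to-date total for the quota check in the next step.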
5. Check usage against the quota, and if the quota has been exceeded, start dropping events.
We’ll create a two-part filter to implement this logic: drop the event once the app’s recorded usage exceeds its quota.
Note the plus sign to the left of the __quota variable. This is called a unary plus operator, and it attempts to convert a value to a number (if it is not one already). We need the unary plus here because the values that the Lookup function returns to events are always strings. You can see this in the Event Preview window, where a script alpha sign appears to the left of the key.
If the usage is greater than the quota, we will continue to drop events until midnight. This works regardless of which Worker or Worker process receives the data, because Redis, as a shared, external key-value store, overcomes some of the limitations of a highly scalable shared-nothing architecture.
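For reference, that two-part filter on the Drop function might look roughly like this (a sketch using our assumed field names, not necessarily the customer’s exact expression):
__usage != null && __usage > +__quota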
By building Redis Functions into a Pipeline, we were able to set the appropriate usage quota for each team sending data through a customer application, and to make sure that those quotas are enforced. The analytics team managing the application has been rescued from an ongoing stressful situation, and has the application’s data usage nicely under control now.
Cribl Stream lets you unlock the value of all of your observability data. For more in-depth, practically-oriented courses and content to help you become a Cribl Stream expert, don’t forget to check out Cribl University.