Kafka

aiven

Metrics for Apache Kafka with Datadog and Aiven | Ryan Martin

Using managed services is all very well, but how do you get the data you need from the different services into Datadog so you can see it all in one place? This session will walk through the configuration for bringing your Aiven-managed Apache Kafka service metrics into your Datadog explorer. You’ll see how to filter the metrics to focus on specific topics or consumer groups, and how to use the Aiven client to create a repeatable, scriptable setup. This session is recommended for anyone living in the as-a-Service world who cares about data and is interested in using metrics to optimize their Kafka clusters.
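
To give a flavour of that repeatable, scriptable setup, here is a minimal sketch that drives the Aiven client from Python. The `avn` subcommands are drawn from Aiven's documented integration commands, but the exact flags, project, and endpoint names here are assumptions to verify against `avn --help`.

```python
# Sketch: script the Aiven -> Datadog metrics integration with the avn CLI.
# Assumptions: `avn` is installed and authenticated; project and endpoint
# names are placeholders; verify subcommand flags against `avn --help`.
import subprocess

PROJECT = "my-project"  # hypothetical Aiven project name

def avn(*args: str) -> str:
    """Run an avn subcommand and return its stdout."""
    result = subprocess.run(["avn", "--project", PROJECT, *args],
                            check=True, capture_output=True, text=True)
    return result.stdout

# 1. Register Datadog as an integration endpoint (API key is a placeholder).
avn("service", "integration-endpoint-create",
    "--endpoint-name", "datadog-metrics",
    "--endpoint-type", "datadog",
    "-c", "datadog_api_key=YOUR_DATADOG_API_KEY")

# 2. List endpoints to pick up the new endpoint's id, then attach the Kafka
#    service to it with `avn service integration-create`.
print(avn("service", "integration-endpoint-list"))
```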

nexthink

Adapting Legacy Systems to a Stream-Based Platform

In a perfect world, small pieces of software scale up and down, with a high-performance message broker acting like a heart, pumping data in a controlled, efficient manner. In the real world, however, at some point in every single system an application starts pumping out large data requests in a sporadic, uncontrollable manner. In this article, we address the problem of adapting a legacy system to play nicely with a stream-based platform.
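
As a flavour of one such adaptation, here is a minimal sketch that splits a legacy application's large, bursty request into a stream of small Kafka messages. It assumes the kafka-python client, a local broker, and a hypothetical `legacy-requests` topic.

```python
# Sketch: chunk one large legacy payload into many small Kafka messages,
# letting the broker absorb the burst instead of a downstream service.
# Assumes kafka-python, a broker on localhost:9092, a hypothetical topic.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    linger_ms=50,  # batch small messages for throughput
    max_in_flight_requests_per_connection=1,  # preserve per-partition order
)

def stream_legacy_request(request_id: str, records: list, chunk_size: int = 500):
    """Split one large legacy request into chunked messages on one topic."""
    for i in range(0, len(records), chunk_size):
        producer.send("legacy-requests", value={
            "request_id": request_id,
            "chunk": i // chunk_size,
            "records": records[i:i + chunk_size],
        })
    producer.flush()  # block until the broker has acknowledged the burst
```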

logdna

Apache Kafka Tutorial: Use Cases and Challenges of Logging at Scale

Enterprises typically run many servers, firewalls, databases, mobile devices, API endpoints, and other infrastructure that powers their IT. Because of this, organizations must dedicate resources to managing logged events across the environment. Logging is a key factor in detecting and blocking cyber-attacks, and organizations use log data for auditing during post-incident investigations. A broker such as Apache Kafka can ingest logging data in real time, then process, store, and route it.
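
To illustrate that ingestion path, here is a minimal sketch of a standard-library logging handler that ships each log record to a Kafka topic; kafka-python, the local broker address, and the `app-logs` topic name are all assumptions.

```python
# Sketch: route standard-library log records into Kafka for central handling.
# Assumes kafka-python and a broker on localhost:9092; topic name is made up.
import json
import logging
from kafka import KafkaProducer

class KafkaLogHandler(logging.Handler):
    def __init__(self, topic: str, bootstrap: str = "localhost:9092"):
        super().__init__()
        self.topic = topic
        self.producer = KafkaProducer(
            bootstrap_servers=bootstrap,
            value_serializer=lambda v: json.dumps(v).encode("utf-8"),
        )

    def emit(self, record: logging.LogRecord) -> None:
        self.producer.send(self.topic, value={
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            "ts": record.created,
        })

logger = logging.getLogger("audit")
logger.addHandler(KafkaLogHandler("app-logs"))
logger.warning("failed login attempt")  # lands in Kafka for routing/storage
```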

cloudera

Operating Apache Kafka with Cruise Control

There are two big gaps in the Apache Kafka project when it comes to operating a cluster: the first is monitoring the cluster efficiently, and the second is managing failures and changes in the cluster. The Kafka project itself offers no solutions for these, but there are many good third-party tools for both problems. Cruise Control is one of the earliest open-source tools to address failure management, and lately it has addressed the monitoring problem as well.
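
Cruise Control is driven over a REST API; the sketch below checks cluster state and asks for a dry-run rebalance proposal. The endpoint paths follow the Cruise Control documentation, while the host and port are assumptions.

```python
# Sketch: poke Cruise Control's REST API for state, then dry-run a rebalance.
# Assumes Cruise Control on localhost:9090; verify paths against its docs.
import requests

BASE = "http://localhost:9090/kafkacruisecontrol"

# Monitoring side: current load and anomaly-detector state.
state = requests.get(f"{BASE}/state", params={"json": "true"})
print(state.json())

# Failure-management side: ask for a rebalance proposal without executing it.
proposal = requests.post(f"{BASE}/rebalance",
                         params={"dryrun": "true", "json": "true"})
print(proposal.text)
```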

epsagon

A Complete Guide to Monitoring Apache Kafka - Part 2

In the first part of this blog, we covered the basics of the Kafka ecosystem and explored the options for exporting Kafka metrics, first using the Jolokia JVM agent and then via the Prometheus JMX agent. In this post, we'll go through some key Kafka metrics that are available in Grafana for building visualizations and alerts. Although Kafka provides hundreds of metrics, we are going to cover the most important ones to monitor.
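
If you followed part 1 and the metrics land in Prometheus, a quick way to sanity-check a metric before building a Grafana panel is the Prometheus HTTP API. In the sketch below, the Prometheus address is an assumption, and the metric name depends entirely on your JMX exporter rules, so treat it as a placeholder.

```python
# Sketch: query Prometheus for a Kafka broker metric before charting it.
# Assumes Prometheus at localhost:9090; the metric name below depends on
# your JMX exporter config, so treat it as a placeholder.
import requests

PROM = "http://localhost:9090/api/v1/query"
query = 'rate(kafka_server_brokertopicmetrics_messagesin_total[5m])'

resp = requests.get(PROM, params={"query": query})
resp.raise_for_status()
for series in resp.json()["data"]["result"]:
    print(series["metric"], series["value"])
```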

epsagon

A Complete Guide for Monitoring Apache Kafka - Part 1

Apache Kafka is an open-source platform for distributed data streaming that provides highly reliable and fault-tolerant capabilities to process large numbers of events using the publish-subscribe model. Kafka can also store and reprocess events according to a given use case and requirements, and it can run as a single node or scale up to a cluster of nodes.
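
The publish-subscribe model is easy to see in code; below is a minimal sketch using kafka-python against a single-node broker, where the broker address and the `events` topic are assumptions.

```python
# Sketch: one publisher, one subscriber, decoupled by a Kafka topic.
# Assumes kafka-python and a single-node broker on localhost:9092.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("events", b"order-created:42")  # publish, fire-and-forget
producer.flush()

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    group_id="billing",            # each group gets its own copy of the stream
    auto_offset_reset="earliest",  # replay stored events: Kafka retains them
    consumer_timeout_ms=10000,     # stop iterating if nothing arrives
)
for msg in consumer:
    print(msg.offset, msg.value)
    break
```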

lenses

Event-Driven Architecture is unblocking data-driven decisions in shipping

In March 2021, a 200,000-tonne ship got stuck in the Suez Canal, and the global shipping industry suddenly caught the world's attention. It made us realize that ships play an important role in our daily lives. Really important, in fact: 90% of the things we consume arrive by ship. By visualizing vessel routes over time, the traffic itself traces a map of the earth. Note the lack of vessels travelling close to the coast of Somalia, where piracy is common.

lenses

Assessing security risks with Kafka audits

Suppose that you work for the infosec department of a government agency in charge of tax collection. You recently noticed that some tax fraud incident records went missing from a certain Apache Kafka topic. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data. But for Kafka in particular, this can prove challenging.
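
One concrete first check when records go missing is whether the topic's own retention settings allow silent deletion; the sketch below reads those configs with kafka-python's admin client. The broker address and `tax-fraud-incidents` topic are hypothetical, and the response layout may vary across kafka-python versions.

```python
# Sketch: audit a topic's config for settings that let records disappear.
# Assumes kafka-python and a broker on localhost:9092; topic name is made up.
from kafka.admin import KafkaAdminClient, ConfigResource, ConfigResourceType

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
resource = ConfigResource(ConfigResourceType.TOPIC, "tax-fraud-incidents")

for response in admin.describe_configs(config_resources=[resource]):
    # resources[0] is (error_code, error_msg, type, name, config_entries);
    # each entry starts with (config_name, config_value, ...).
    for entry in response.resources[0][4]:
        name, value = entry[0], entry[1]
        if name in ("retention.ms", "retention.bytes", "cleanup.policy"):
            print(f"{name} = {value}")  # short retention.ms can explain losses
```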

lenses

Increase compliance with Kafka audits

Suppose that you work for a government tax agency. You recently noticed that some tax fraud incident records have been leaked on the darknet. This information is held in a Kafka topic, and the incident response team wants to know who has accessed this data over the last six months. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application's data, precisely so they can respond to this kind of situation.
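
Absent a proper audit trail, one crude approximation is to enumerate the consumer groups that have committed offsets on the sensitive topic; a sketch with kafka-python follows (broker and topic name are hypothetical). Note that this misses any consumer that read without committing offsets, which is exactly why the article argues for real audits.

```python
# Sketch: list consumer groups that have committed offsets on a sensitive
# topic -- a rough "who has read this" approximation, not a real audit log.
# Assumes kafka-python and a broker on localhost:9092; topic name is made up.
from kafka.admin import KafkaAdminClient

TOPIC = "tax-fraud-incidents"
admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

for group_id, _protocol in admin.list_consumer_groups():
    offsets = admin.list_consumer_group_offsets(group_id)
    if any(tp.topic == TOPIC for tp in offsets):
        print(f"group {group_id!r} has committed offsets on {TOPIC}")
```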

lenses

How to create a Kafka topic (the safe way)

We live in a dynamic world. It is safe to say that companies aim to speed up time-to-market and out-innovate their competition with Kafka, but at the same time they struggle with some limitations. These can range from compliance-related setbacks under regulations such as GDPR, CCPA and HIPAA, to self-service slip-ups that could see a whole Kafka cluster go down. Even something as seemingly innocuous as configuring and creating a Kafka topic can lead to operational U-turns, slowdowns and even downtime.
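
In the spirit of creating topics the safe way, the sketch below sets durability-related configs explicitly and asks the brokers to validate the request before actually creating anything. It assumes kafka-python, a local broker, and hypothetical topic settings.

```python
# Sketch: create a topic with explicit durability settings, after a
# validate-only dry run. Assumes kafka-python and a broker on localhost:9092.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
topic = NewTopic(
    name="payments",           # hypothetical topic
    num_partitions=6,
    replication_factor=3,      # survive the loss of a broker
    topic_configs={
        "min.insync.replicas": "2",  # pair with acks=all on producers
        "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # explicit 7 days
        "cleanup.policy": "delete",
    },
)

# Dry run first: brokers validate the request without creating anything.
admin.create_topics(new_topics=[topic], validate_only=True)
admin.create_topics(new_topics=[topic])
print("topic created")
```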