Using Kafka for JUnit with Spring Kafka

The last articles gave a couple of examples of how to write Kafka-enabled integration tests at various levels of abstraction using Kafka for JUnit. For component tests, we kept the scenarios quite simple and built a minimal producer and consumer on top of the official kafka-clients library for Java. This is perfectly fine, and my personal recommendation is that you stick to this approach if you have any requirements that aren't particularly standard. Oftentimes, though, another abstraction layer on top of kafka-clients that integrates well with your chosen application framework will suffice. We will take a look at the Spring ecosystem in that regard. Hence, the question is: What do I need to do if I want to write integration tests with Kafka for JUnit in the context of a Spring-based application that leverages Spring Kafka for its messaging capabilities?

more ...

Writing system tests for a Kafka-enabled microservice

In this article, we'll shift our attention away from component tests and take a look at how you can leverage Kafka for JUnit to write system tests at a higher level of abstraction. A couple of years ago, I was exploring Spring for Apache Kafka and implemented a couple of showcases along the way. One of these showcases is a small system of microservices that implements a solution for managing todos according to David Allen's Getting Things Done method. This system of microservices follows a CQRS-style architecture with one dedicated microservice - call it the command service - that is concerned with altering data, and another - call it the query service - that serves the read model. We are going to use Kafka for JUnit to verify that the command service integrates properly with Apache Kafka.

more ...

Writing component tests for Kafka consumers

In the last article, we saw how easy it is to write concise and readable component tests for Kafka producers. In this installment, we will focus on the read side and write component tests for a Kafka consumer. The example is centered around the same small lifecycle event service that we saw last time.

more ...

Writing component tests for Kafka producers

Kafka for JUnit makes it easy to write integration tests for custom-built Kafka-enabled components as well as for entire systems that integrate with a Kafka cluster. In the course of this article, I'd like to demonstrate how you can leverage this testing library to write white-box tests for software components that integrate with Kafka. We'll start off with the write path of a small lifecycle event service and implement a custom publisher on top of the Kafka Clients library that ought to be tested.

more ...

Using KaDeck for exploring and manipulating data in Kafka topics

I'd like to talk a bit about a tool that I've used during past and present projects that employ Apache Kafka: KaDeck. KaDeck is developed by Xeotek GmbH, a software company based in Frankfurt, Germany. Their website states that it is a holistic monitoring solution for Apache Kafka and Amazon Kinesis. I'm not going to recite the product brochure here; you can get the necessary details from the folks at Xeotek. Instead, I want to show you how you can use a tool like KaDeck to play around with the data you have, manipulate it, and gain further insights from it. KaDeck comes in two flavors: a desktop version with basic functionality that supports only Apache Kafka, and a web service with the full feature set, including Amazon Kinesis integration. The single-user license for both versions is free of charge. Get it here. I'll be using the desktop version in the course of this article.

more ...