Emailing microservice with Apache Kafka and Spring Boot made easy

20 min read
Last update: 6 October 2020

This step-by-step guide will help you set up a messaging feature using Apache Kafka for your Spring Boot micro-service application.


Providing feedback to users is of tremendous importance in a modern Web application. E-mail remains the best way to reach a user who is not active on the platform. Sending e-mail can, however, take time and may fail. In a modern micro-service architecture, the need to give feedback can arise in any of the many services. A robust design therefore calls for a dedicated messaging service.

I faced this issue on an ambitious project for one of our customers. The customer had an Apache Kafka infrastructure on site. I figured it could be the ideal tool to asynchronously send e-mail from various parts of our software stack.

Apache Kafka: what it is and why?


As described on its website, "Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications." In other words, Kafka is message-queue software on steroids. Its flexibility makes it a good fit for both simple and complex projects, and it has therefore gained vast popularity in the IT departments of influential companies.

The goal of this article is not to explain how Kafka works; that is by itself a topic for a whole book. I recommend "Kafka: The Definitive Guide" by Narkhede, Shapira, and Palino. If you have no experience with Kafka, here are a few definitions of its core concepts:

  • A topic is a queue to which messages can be pushed. Each topic is divided into several partitions, which allow messages to be processed in parallel.
  • A producer is a client that pushes messages to a topic on a group of servers called brokers.
  • A consumer is a client that processes messages from a topic; it is assigned one or more partitions. Each message is processed by exactly one consumer of each consumer group.

The messaging micro-service


Our application is composed of several micro-services. Each of them has its own responsibility, whether it is managing permissions, customer data, or products. The strength of using Kafka is that it permits several of our micro-services to send notifications by pushing messages to a single Kafka topic. On the other end of the queue, a single Spring Boot application is responsible for handling the e-mail requests of our whole application.

Now, let's set up the project

The distributed messaging system is composed of the following elements:

  • The reverse proxy Nginx
  • A Spring Boot micro-service, called messaging-api
  • Kafka, and its dependency Zookeeper
  • Mailhog: a simple mail catcher that is quite handy for debugging our e-mailing application.

I combine readily available Docker images into a ready-to-use Docker Compose file:


version: '3'

services:
  # Our messaging micro-service
  # (the nginx service definition is omitted from this excerpt)
  messaging-api:
    build:
      context: ./messaging-api/
      dockerfile: ./docker/Dockerfile
    depends_on:
      - nginx
      - kafka
    command: mvn spring-boot:run
    environment:
      - JAVA_TOOL_OPTIONS="-Xmx512m"
    volumes:
      - ./configuration:/tmp/configuration:delegated
      - ./messaging-api:/tmp/app:delegated
      - ~/.m2:/home/deploy/.m2:cached
    networks:
      - default

  # Zookeeper: required by Kafka
  zookeeper:
    image: 'bitnami/zookeeper:3'
    ports:
      - '2181:2181'
    volumes:
      - 'zookeeper_data:/bitnami'

  # Kafka itself
  kafka:
    image: 'bitnami/kafka:2'
    ports:
      - '9092:9092'
    volumes:
      - 'kafka_data:/bitnami'
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
    depends_on:
      - zookeeper

  # Mailhog: mail catcher for local debugging
  mailhog:
    image: mailhog/mailhog
    ports:
      - 1025:1025 # SMTP server port
      - 8025:8025 # Web UI port

volumes:
  zookeeper_data:
  kafka_data:


Concerning the Spring Boot application itself, I generated a pom.xml from the automated generation tool (Spring Initializr), including the Kafka, Mail, and Thymeleaf dependencies. Alternatively, you can include them in the dependencies section of your pom.xml:
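For reference, the corresponding Maven entries look roughly like this (versions are managed by the Spring Boot parent POM; the web starter is listed here as well since the demo controller below relies on it):

```xml
<dependencies>
    <!-- HTTP layer for the demo controller -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <!-- Spring for Apache Kafka -->
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <!-- JavaMail support -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-mail</artifactId>
    </dependency>
    <!-- Thymeleaf templating -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-thymeleaf</artifactId>
    </dependency>
</dependencies>
```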


A trivial producer for the demonstration


In an actual application using a micro-service architecture, the request for an e-mail notification could come from any service. In this example, I set up a simple Spring Boot controller directly in the messaging service. Using Kafka in such a situation is, of course, ridiculous, but will serve my demonstration purpose.

To create a Kafka producer, I inject a KafkaTemplate and use its send method. In this case, I call it from a trivial HTTP route:

@Slf4j
@RequiredArgsConstructor
@RestController
public class DemoController {
    private final KafkaTemplate<String, ProjectStatusChangeDto> kafkaProducer;
    private final KafkaProperties kafkaProperties;

    @PostMapping("/demo/project-status-changed") // illustrative route
    public void sendProjectStatusEmail(@RequestBody ProjectStatusChangeDto statusChange) {
        log.info("Sending mailing request: " + statusChange.toString());
        kafkaProducer.send(kafkaProperties.getTopics().getProjectStatusChanged(), statusChange);
    }
}

The Kafka consumer


Configuring our application with the application file

The Spring Kafka library is configured via the application file. The spring.kafka section can be configured the following way:

spring:
  kafka:
    bootstrap-servers: kafka:9092
    properties:
      security.protocol: 'PLAINTEXT'
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      group-id: messaging_api
      auto-offset-reset: earliest
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.json.trusted.packages: '*'
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
    listener:
      missing-topics-fatal: false

Note: I used the YAML format for this project, but it works likewise with a properties file. In the local environment, I use the plain-text security protocol. In production, of course, it is advisable to use SSL with a proper configuration of security keys.
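For illustration, a production profile enabling SSL could look roughly like this; the file paths and environment variables are placeholders, not this project's actual values:

```yaml
spring:
  kafka:
    security:
      protocol: SSL
    ssl:
      trust-store-location: file:/etc/kafka/truststore.jks
      trust-store-password: ${KAFKA_TRUSTSTORE_PASSWORD}
      key-store-location: file:/etc/kafka/keystore.jks
      key-store-password: ${KAFKA_KEYSTORE_PASSWORD}
```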

The serialization formats are set in the spring.kafka.producer section. I use simple string keys and JSON for the body of the messages. For deserialization, we must set matching formats. Please note that instead of directly using the StringDeserializer and JsonDeserializer, we use the ErrorHandlingDeserializer class and configure the delegate classes in its dedicated section. This is extremely effective at avoiding poison-pill situations: if a corrupted object is pushed to the Kafka topic, deserialization will fail with an exception; the consumer would then retry, get the same result, and loop forever on the same record. The ErrorHandlingDeserializer handles deserialization errors for us. More detail about poison pills and deserialization error management can be found in the following article.

Configuring e-mail settings

The Spring SMTP client can also be configured through the application.yml file. In this case, for local development, we configure it to send e-mail to the mail catcher and deactivate SSL. In production, those parameters are of course overridden.

Mailhog can be accessed in the Web browser at http://localhost:8025.

spring:
  mail:
    host: mailhog
    port: 1025
    properties:
      mail.smtp.auth: false
      mail.smtp.starttls.enable: false

Writing the Kafka consumer

The @KafkaListener annotation is particularly handy to construct a new Kafka consumer.

@KafkaListener(topicPattern = "${kafka.topics.project-status-changed}", autoStartup = "${kafka.enabled}")
public void listenToProjectStatusChange(ConsumerRecord<String, ProjectStatusChangeDto> record) {
    log.info("Request for project status change received: " + record.toString());

    ProjectStatusChangeDto payload = record.value();

    try {
        // emailService and the payload accessors below stand in for the real implementation
        emailService.sendEmail(
                payload.getRecipient(),
                "Votre demande",
                emailContentService.buildContent(payload));
    } catch (MailException e) {
        log.error("Could not send e-mail", e);
    }
}

The e-mailing service

Using the e-mail library from the Spring Framework, developing a service that sends e-mail takes only a few lines of code:

public void sendEmail(String recipient, String subject, EmailContentDto content) throws MailException {
    MimeMessagePreparator messagePreparator = mimeMessage -> {
        MimeMessageHelper messageHelper = new MimeMessageHelper(mimeMessage, true);
        messageHelper.setTo(recipient);
        messageHelper.setSubject(subject);
        // The two-argument setText sends a multipart message with plain-text and HTML bodies
        messageHelper.setText(content.getText(), content.getHtml());
    };
    mailSender.send(messagePreparator); // mailSender is an injected JavaMailSender
}

A word on templating

The generation of the content of the e-mail is out of the scope of this article. The Thymeleaf templating engine, which integrates natively with Spring, provides a practical way of generating both the text and HTML versions of an e-mail.
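As a rough sketch of what such a service could look like: the template names and the EmailContentDto constructor below are illustrative assumptions, not this project's actual code.

```java
// Sketch of a Thymeleaf-backed content service (assumed names throughout).
@Service
@RequiredArgsConstructor
public class EmailContentService {
    private final SpringTemplateEngine templateEngine;

    public EmailContentDto buildContent(ProjectStatusChangeDto statusChange) {
        // Expose the payload to the templates
        Context context = new Context();
        context.setVariable("statusChange", statusChange);
        // Render a plain-text and an HTML version from two resource templates
        String text = templateEngine.process("project-status-changed.txt", context);
        String html = templateEngine.process("project-status-changed.html", context);
        return new EmailContentDto(text, html);
    }
}
```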

A word on error management

The attentive reader will have noticed that, besides deserialization errors, I do not manage application errors in this example. A more robust design would handle thrown exceptions and potentially implement a retry strategy. This is outside the scope of this article, but I recommend the following article if you want to dive deeper into the topic of error management.


In this article, I explained how to set up a simple e-mailing micro-service with Spring Boot, running on a Kafka infrastructure. It allows applications in a micro-service architecture to asynchronously send e-mails to their users. If you want more details on the technical implementation, or want to bootstrap your own micro-service, the code for this project is hosted on my GitHub page.