Integrating Apache Kafka with React & Spring Boot: A Dockerized Tutorial


Enterprises and developers alike face big-data challenges, and real-time data processing is now a top priority for applications across many industries. Apache Kafka is a leading distributed streaming platform, designed for high-throughput real-time streams, and its features appeal to professionals who need to handle, analyze, and process data effortlessly.

In this Apache Kafka with React & Spring Boot tutorial, we’ll cover Apache Kafka’s integration with a modern tech stack. We use React for frontend and Spring Boot for backend development. Our goal? Develop a (near) real-time message monitor using Kafka’s powerful streaming. Additionally, we incorporate Docker. This ensures our application stack is replicable, isolated, and scalable. It also guarantees consistency across different development stages.

By this guide’s end, you’ll understand how these technologies work together. You’ll learn how to use them for creating efficient (near) real-time applications. We’ll examine architectural decisions, codebase structure, and integration details crucial to this system.

In this tutorial, like our others, we use IntelliJ IDEA for development. It is assumed you understand Java development and have set up a working environment for both Java and Node/NPM.

You can find the source code for this tutorial in our repository at: https://github.com/tucanoo/full_stack_kafka_queue_monitor

Project setup

We’ll work within three main folders: the root folder containing our docker-compose file for Docker containers, and two separate folders for the backend and frontend.

You might like to use different IDEs for the backend and the frontend, as many developers prefer VSCode for any JS-based development. For this project, I have established a single Git repository for both modules, with the docker-compose file in the outer directory, so I will use IntelliJ for managing the entire project.

Parent project setup

Using IntelliJ IDEA, we’ll initially create just an empty project, such as:

Creating full stack kafka monitor example application

Take care to uncheck “Create Git repository”, otherwise, as there is no .gitignore file at this stage, IntelliJ will annoyingly add its own ‘.idea’ folder to the repository. So unless you have a very good reason that defies common logic and all known understanding of the universe, you do not want this folder in your repository.

Once created, if you plan on using version control, go ahead and create a new .gitignore file (or use mine from the repository) in the root and exclude any IntelliJ files. You can then safely initialise a repository and add the top-level folder.
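As a rough sketch (assuming the folder layout used in this tutorial; the repository’s own file is more complete), a minimal .gitignore might look like:

```
# IntelliJ IDEA files
.idea/
*.iml

# Build output and dependencies
backend/build/
frontend/node_modules/
frontend/dist/
```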

Backend (Spring Boot) module

Right-click the top-level folder in the project window and choose New > Module:

New backend module

Select Spring Initializr, and enter the appropriate defaults to establish the API module.
Ensure you select Java and Gradle – Groovy, as our Dockerfile will expect this.

Backend module configuration

Next, we need to select what dependencies our project will rely on. Select the following;

  • Spring Web
  • Spring Boot DevTools
  • Spring for Apache Kafka

Hit Create, and once dependencies are downloaded and indexed, you should see the new module in your Project pane:

Project structure

Frontend React module

Once again, right-click the top-level project and select New > Module. This time we will select “Vite”, name the module “frontend”, ensure Template is set to React, and confirm the other defaults are filled.

Front-end module creation

As your (real) projects grow in size and component count, you’ll notice a dramatic improvement in Vite’s build performance.

If everything went smoothly, the system should have successfully created your frontend module. However, if IntelliJ faced issues with NPM and Vite, you might need to create the module from the command line:

npm create vite@latest frontend -- --template react

Run “npm install” from within the new frontend folder to pull in the required dependencies and node modules.

We are also going to use Twitter Bootstrap styles in this project, just to quickly make the UI neat and tidy. From the command line (make sure your current folder is frontend), execute:

npm i bootstrap

At this point, we have the two modules set up and are now ready for further development.

Backend API development

Configuration

As already mentioned, we are going to use Docker to launch a Kafka server, but we still need to tell our API how to connect to that service.

So in src/main/resources/application.properties, add the following line:

spring.kafka.bootstrap-servers=kafka:9092
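Note that the hostname kafka only resolves inside the Docker network we will define later. If you also want to run the API directly on your machine, one option (a sketch, assuming you have a broker reachable and advertised at localhost:9092) is a separate Spring profile:

```properties
# src/main/resources/application-local.properties (hypothetical local profile,
# activated with --spring.profiles.active=local)
spring.kafka.bootstrap-servers=localhost:9092
```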

Next, we’re going to need a number of beans to configure both the Kafka consumer and producer. Create a new class named KafkaConfig under src/main/java/<main package>/config and add the following bean definitions to establish a producer and a consumer for our Kafka broker.

@Configuration
public class KafkaConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String kafkaHost;

    // for message production
    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaHost);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }


    // for message consumption
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaHost);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "messageGroup");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

Next, we are going to need another configuration class, to permit our React app to make successful calls to our API without being blocked by CORS (Cross-Origin Resource Sharing) rules.

Create another class called WebConfig in the same ‘config’ package with the following content. This will allow requests from our React app, which will eventually be running on localhost port 3000.

@Configuration
public class WebConfig implements WebMvcConfigurer {

    @Override
    public void addCorsMappings(CorsRegistry registry) {
        registry.addMapping("/**")
            .allowedMethods("*")
            .allowedOrigins("http://localhost:3000"); // Change this to the domain of your React app if it's different
    }

}

Kafka message production

To push messages onto our Kafka topic, we’re going to create a simple scheduled task that produces a new message every ten seconds.

Create a new class named “KafkaProducerService” under src/main/java/<main package>/services and add the following content:

@Service
public class KafkaProducerService {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    private static final String TOPIC = "my_kafka_topic";
    private static final Logger log = LoggerFactory.getLogger(KafkaProducerService.class);

    @Scheduled(fixedRate = 10000)
    public void sendMessage() {
        String message = "Hello world : the time is " + LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME);
        kafkaTemplate.send(TOPIC, message).whenComplete((result, throwable) -> log.info("Message result {}", result));
    }
}

This uses the Spring scheduler to call the sendMessage method every 10 seconds. To enable scheduling, however, we also need to add an additional annotation to our main application class.
Navigate to the BackendApplication class and annotate it with @EnableScheduling, as such:

Enable Scheduling in main application class
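For reference, assuming the default class name generated by Spring Initializr, the annotated class looks something like this:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;

// @EnableScheduling switches on Spring's scheduled-task support,
// so methods annotated with @Scheduled are actually invoked
@EnableScheduling
@SpringBootApplication
public class BackendApplication {

    public static void main(String[] args) {
        SpringApplication.run(BackendApplication.class, args);
    }
}
```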

With that change, once our API is running, it will happily feed messages to our Kafka server.

Kafka message consumption

Now that we’re producing messages, we need to allow the React app to connect to our API and read any new ones. The first item on our list is a service that listens for new messages and temporarily holds them in memory until they are read, at which point we empty the store ready for the next batch from Kafka.

So create another service, named KafkaConsumerService and provide the following content:

@Service
public class KafkaConsumerService {
    // Accessed from both the Kafka listener thread and web request threads,
    // so use a thread-safe queue rather than a plain ArrayList
    private final Queue<String> messages = new ConcurrentLinkedQueue<>();

    private static final String TOPIC = "my_kafka_topic";

    @KafkaListener(topics = TOPIC, groupId = "messageGroup")
    public void listen(String message) {
        messages.add(message);
    }

    public List<String> getMessages() {
        List<String> currentMessages = new ArrayList<>();
        String message;
        while ((message = messages.poll()) != null) {
            currentMessages.add(message);
        }
        return currentMessages;
    }
}

In the above, the KafkaListener connects to the Kafka server and listens for new messages, adding them to a temporary in-memory store. When our API endpoint is called from React, it invokes getMessages(), which returns the current contents of the store and clears it ready for the next data.
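In isolation, the drain-on-read behaviour can be sketched like this (MessageBuffer is a hypothetical standalone name, not part of the tutorial code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Standalone sketch of the drain-on-read pattern the consumer service uses:
// messages accumulate in a buffer, and each read returns and clears them.
class MessageBuffer {
    private final Queue<String> queue = new ConcurrentLinkedQueue<>();

    void add(String message) {
        queue.add(message);
    }

    // Returns all buffered messages in arrival order, leaving the buffer empty
    List<String> drain() {
        List<String> out = new ArrayList<>();
        String m;
        while ((m = queue.poll()) != null) {
            out.add(m);
        }
        return out;
    }
}
```

A second call to drain() immediately after the first returns an empty list, which is exactly the behaviour the polling endpoint relies on.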

API endpoint controller

Finally, we need an endpoint for our React app to call. Create a new class named MessageController under src/main/java/<main package>/controllers.

This is a very light class; it only needs to call our consumer service and return the messages list, so add the following content:

@RestController
@RequestMapping("/api/messages")
public class MessageController {

    @Autowired
    private KafkaConsumerService kafkaConsumerService;

    @GetMapping("/all")
    public ResponseEntity<List<String>> getAllMessages() {
        return ResponseEntity.ok(kafkaConsumerService.getMessages());
    }

}

That is the majority of the code for the backend. Onwards!

Frontend development

Let’s begin first by ensuring our Bootstrap styles will be available to our components. Open main.jsx and replace the reference to index.css with an import for our bootstrap stylesheets. The content should look as follows:

import React from 'react'
import ReactDOM from 'react-dom/client'
import App from './App.jsx'
import 'bootstrap/dist/css/bootstrap.css';

ReactDOM.createRoot(document.getElementById('root')).render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
)

Next, we’ll create our component to call our API and display any pulled messages in a list.

Create a folder under src called components, and add a new file there named “KafkaMonitor.jsx”. We’re going to use ‘fetch’ to make an Ajax call to our API endpoint, with setInterval repeating the call every 10 seconds. Messages will then be displayed in a simple unordered list. Add the following code to the component:

import React, { useState, useEffect, startTransition } from 'react';

const KafkaMonitor = () => {
  const [messages, setMessages] = useState([]);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    const fetchMessages = async () => {
      try {
        let response = await fetch('http://localhost:8080/api/messages/all');
        if (response.ok) {
          let data = await response.json();
          startTransition(() => {
            setMessages(data);
            setLoading(false);
          });
        } else {
          console.error('Failed to fetch messages');
        }
      } catch (error) {
        console.error('Error fetching messages:', error);
      }
    };

    fetchMessages();

    const interval = setInterval(fetchMessages, 10000);  // Fetch every 10 seconds
    return () => clearInterval(interval);
  }, []);

  return (
    <div>
      {loading ?
        <div className="d-flex justify-content-center my-3">
          <div className="spinner-border" role="status">
            <span className="visually-hidden">Loading...</span>
          </div>
        </div>
        :
        <ul className="list-group">
          {messages.map((message, idx) => <li key={idx} className="list-group-item">{message}</li>)}
        </ul>
      }
    </div>
  );
};

export default KafkaMonitor;

Now, we simply need to call this Component from App.jsx, and remove the default code in there. We also apply a little styling and make use of some of the Bootstrap components for layout. Copy the following code over your existing App.jsx file:

import KafkaMonitor from "./components/KafkaMonitor.jsx";
function App() {
  return (
    <div>
      {/* Navbar */}
      <nav className="navbar navbar-dark bg-dark mb-5">
        <div className="container-fluid">
          <span className="navbar-brand mb-0 h1">Kafka Message Monitor App</span>
        </div>
      </nav>

      {/* Kafka Monitor inside a card */}
      <div className="container">
        <div className="card">
          <div className="card-header">Kafka Message Monitor</div>
          <div className="card-body">
            <KafkaMonitor />
          </div>
        </div>
      </div>
    </div>
  )
}

export default App

That is all we need for the front-end side. Our next task is to wire everything together into a runnable application.

Docker and deployment

By utilising Docker, we can take care of configuring and launching an Apache Kafka server, building both the front and backend modules and wiring them all together, ultimately exposing URLs that we can call in a browser and see everything working together.

Each module will contain its own Dockerfile, which will be responsible for establishing the necessary environment for building and exposing its own artifacts. In the top level of the project, we will have a docker-compose file which establishes the Kafka server environment and wires the services together.

Backend Dockerfile

Create a new file named Dockerfile in the backend folder, and add the following content:

# Step 1: Build the application using Gradle
FROM gradle:jdk17 AS build

WORKDIR /app
COPY . .

# Build the project using the Gradle Wrapper (ensure it is executable first)
RUN chmod +x gradlew && ./gradlew clean bootJar

# Step 2: Run the JAR in a new layer with a smaller base image for reduced image size
FROM eclipse-temurin:17-jre-alpine

# Make port 8080 available to the world outside this container
EXPOSE 8080

# Set the location of the JAR
ARG JAR_FILE=/app/build/libs/*.jar

# Copy the JAR from the build stage
COPY --from=build ${JAR_FILE} app.jar

# Run the JAR
ENTRYPOINT ["java","-jar","/app.jar"]

Frontend Dockerfile

Create a new file named Dockerfile in the frontend folder, and add the following content:

# Build stage
FROM node:16 AS build
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build

# Serve using an HTTP server
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html

Docker compose file

In the top-level folder, create a new file named docker-compose.yml and add the following content:

version: '3.8'

services:
  # Zookeeper service
  zookeeper:
    image: bitnami/zookeeper:latest
    ports:
      - "2181:2181"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  # Kafka service
  kafka:
    image: bitnami/kafka:latest
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9092
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181

  # Spring Boot backend service
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    ports:
      - "8080:8080"
    environment:
      # Spring Boot's relaxed binding maps this to spring.kafka.bootstrap-servers
      SPRING_KAFKA_BOOTSTRAP_SERVERS: kafka:9092
    depends_on:
      - kafka

  # React frontend service
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    ports:
      - "3000:80"
    depends_on:
      - backend

You can see in this file how we establish the Kafka services using official Bitnami Docker images and apply some basic configuration, defining the ports that our backend configuration refers to.

Running it all

Now we should have everything we need to launch our Docker containers. From a command line / Terminal prompt, run:

docker-compose up -d

If all is successful, you should see four containers launched. You can run “docker container ls” from the command line to check:

Docker containers for full stack kafka monitor

Finally, accessing localhost:3000 in your browser should load the React app. Every 10 seconds, you should see a new message appear with a fresh timestamp.

Apache Kafka with React & Spring Boot Tutorial running in browser

You can of course modify the frequency with which the frontend polls for messages, as well as the @Scheduled rate at which new messages are produced.

Additionally, if you monitor the logs for the backend container, you will see log output for every new message added to Kafka.
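For example, to tail those logs (assuming the compose service name backend from the file above):

```
# Follow the backend container's log output
docker-compose logs -f backend
```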

We hope this Apache Kafka with React & Spring Boot tutorial has proven useful. Again, you can find the full source code for reference in our repository, and please do not hesitate to contact us if you require any assistance with your Spring Boot or React development requirements.

Founder of Tucanoo Solutions Ltd, a Cloud / Web Application development company. AWS Cloud Solutions Architect. Specialties: Spring Boot, Java, Grails, React.JS, App Architecture, Agile, Scrum, Git, AWS, Javascript.
