
spring-projects / spring-kafka

Provides Familiar Spring Abstractions for Apache Kafka


Top Related Projects

  • Mirror of Apache Kafka (28,601)
  • Apache Flink (23,929)
  • Apache Pulsar - distributed pub-sub messaging system (14,185)
  • Apache Camel is an open source integration framework that empowers you to quickly and easily integrate various systems consuming or producing data. (5,496)
  • Alpakka Kafka connector - Alpakka is a Reactive Enterprise Integration library for Java and Scala, based on Reactive Streams and Akka.

Quick Overview

Spring for Apache Kafka (spring-kafka) is a project that provides integration between the Spring Framework and Apache Kafka. It offers a high-level abstraction for sending and receiving messages using Apache Kafka, simplifying the development of Kafka-based messaging solutions in Spring applications.

Pros

  • Seamless integration with Spring Framework and Spring Boot
  • Simplified configuration and usage of Kafka producers and consumers
  • Support for Kafka Streams and Kafka transactions
  • Comprehensive error handling and retry mechanisms
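To illustrate the error-handling point above, here is a minimal sketch (assuming spring-kafka 3.x, where DefaultErrorHandler replaced the older SeekToCurrentErrorHandler) that retries a failed record twice and then routes it to a dead-letter topic; the class name is illustrative:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.DefaultErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

@Configuration
public class ErrorHandlingConfig {

    @Bean
    public DefaultErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
        // After retries are exhausted, publish the failed record to "<topic>.DLT"
        // (the recoverer's default dead-letter topic naming).
        DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(template);
        // Retry the failed record twice, one second apart.
        return new DefaultErrorHandler(recoverer, new FixedBackOff(1000L, 2L));
    }
}
```

The handler is then attached to the listener container factory with factory.setCommonErrorHandler(errorHandler).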

Cons

  • Learning curve for developers new to both Spring and Kafka
  • Potential overhead compared to using the native Kafka client directly
  • Limited control over low-level Kafka configurations
  • May introduce additional dependencies to your project

Code Examples

  1. Configuring a Kafka listener:

@KafkaListener(topics = "myTopic")
public void listen(String message) {
    System.out.println("Received message: " + message);
}

  2. Sending a message using KafkaTemplate:

@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

public void sendMessage(String topic, String message) {
    kafkaTemplate.send(topic, message);
}

  3. Configuring a Kafka producer:

@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    return new DefaultKafkaProducerFactory<>(configProps);
}
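Note that KafkaTemplate.send() is asynchronous. In spring-kafka 3.x it returns a CompletableFuture<SendResult> (earlier versions returned a ListenableFuture), so a variant of example 2 can confirm delivery; this is a sketch, with the class name illustrative:

```java
import java.util.concurrent.CompletableFuture;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Service;

@Service
public class ConfirmingSender {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendWithConfirmation(String topic, String message) {
        CompletableFuture<SendResult<String, String>> future = kafkaTemplate.send(topic, message);
        future.whenComplete((result, ex) -> {
            if (ex == null) {
                // The broker acknowledged the record; log where it landed.
                System.out.println("Sent to partition " + result.getRecordMetadata().partition()
                        + " at offset " + result.getRecordMetadata().offset());
            }
            else {
                System.err.println("Send failed: " + ex.getMessage());
            }
        });
    }
}
```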

Getting Started

To get started with Spring Kafka, add the following dependency to your Maven pom.xml:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>3.0.7</version>
</dependency>

For Spring Boot applications there is no separate Kafka starter; add the same spring-kafka dependency and let Spring Boot's dependency management supply the version:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

Then, configure Kafka properties in your application.properties or application.yml:

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: myGroup

Now you can use @KafkaListener annotations and KafkaTemplate in your Spring components to consume and produce Kafka messages.
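The properties above are what Spring Boot auto-configuration consumes. In a plain Spring application (without Boot), the equivalent consumer side must be declared explicitly; a minimal sketch, with class and bean names illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "myGroup");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        // @KafkaListener methods are backed by containers created from this factory.
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
```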

Competitor Comparisons

Apache Kafka

Pros of Kafka

  • Core Kafka implementation with full feature set and direct control
  • Highly scalable and performant, designed for large-scale distributed systems
  • Extensive documentation and community support

Cons of Kafka

  • Steeper learning curve and more complex setup
  • Requires more manual configuration and management
  • Less integration with Spring ecosystem

Code Comparison

Kafka (Producer):

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
KafkaProducer<String, String> producer = new KafkaProducer<>(props);

Spring Kafka (Producer):

@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

public void sendMessage(String msg) {
    kafkaTemplate.send("topicName", msg);
}

Spring Kafka provides a higher-level abstraction and integrates seamlessly with Spring's dependency injection and configuration. Kafka offers more direct control but requires more boilerplate code. Spring Kafka simplifies Kafka usage in Spring applications, while Kafka provides the core functionality and is suitable for various environments and languages.
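The boilerplate gap is even clearer on the consumer side. The @KafkaListener method shown earlier replaces a hand-written poll loop like the following sketch against the plain client (topic name and group id are illustrative):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PlainConsumerLoop {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "myGroup");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("topicName"));
            while (true) {
                // Threading, error handling, and shutdown are the application's
                // problem here; Spring Kafka's listener containers manage them.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("Received: " + record.value());
                }
            }
        }
    }
}
```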

Apache Flink

Pros of Flink

  • More comprehensive distributed stream processing framework
  • Supports both batch and stream processing with a unified API
  • Offers advanced features like stateful computations and event time processing

Cons of Flink

  • Steeper learning curve due to its broader scope and complexity
  • Requires more setup and configuration for basic use cases
  • May be overkill for simple Kafka-based applications

Code Comparison

Spring Kafka:

@KafkaListener(topics = "myTopic")
public void listen(String message) {
    System.out.println("Received: " + message);
}

Flink:

DataStream<String> stream = env.addSource(new FlinkKafkaConsumer<>("myTopic", new SimpleStringSchema(), properties));
stream.map(message -> "Received: " + message)
      .print();

Summary

Spring Kafka is a lightweight library for integrating Kafka with Spring applications, focusing on simplicity and ease of use. Flink, on the other hand, is a full-fledged distributed processing engine that goes beyond Kafka integration, offering powerful stream and batch processing capabilities. While Spring Kafka is ideal for straightforward Kafka-centric applications, Flink shines in complex, large-scale data processing scenarios that may involve multiple data sources and advanced analytics.

Apache Pulsar

Pros of Pulsar

  • Multi-tenancy support with built-in isolation between tenants
  • Geo-replication capabilities for global data distribution
  • Tiered storage for cost-effective data retention

Cons of Pulsar

  • Steeper learning curve due to more complex architecture
  • Requires more resources to run and maintain
  • Less mature ecosystem compared to Kafka

Code Comparison

Spring Kafka:

@KafkaListener(topics = "myTopic")
public void listen(String message) {
    System.out.println("Received: " + message);
}

Pulsar, via the Spring for Apache Pulsar project (which supplies the analogous annotation):

@PulsarListener(topics = "myTopic")
public void listen(String message) {
    System.out.println("Received: " + message);
}

Summary

Spring Kafka is a Spring project that provides integration with Apache Kafka, offering a simpler programming model for Kafka consumers and producers. It's well-integrated with the Spring ecosystem and has a gentler learning curve.

Pulsar is a distributed messaging and streaming platform with a more complex architecture. It offers advanced features like multi-tenancy and geo-replication but requires more resources and expertise to operate.

Both projects have similar basic usage patterns, as shown in the code comparison. The choice between them depends on specific project requirements, scalability needs, and team expertise.

Apache Camel

Pros of Camel

  • More versatile, supporting integration with numerous technologies beyond just Kafka
  • Offers a wide range of Enterprise Integration Patterns (EIPs) for complex routing and mediation
  • Provides a Domain-Specific Language (DSL) for easier route definition

Cons of Camel

  • Steeper learning curve due to its extensive feature set
  • Can be overkill for simple Kafka-only integrations
  • Potentially higher resource overhead for basic use cases

Code Comparison

Spring Kafka:

@KafkaListener(topics = "myTopic")
public void listen(String message) {
    System.out.println("Received: " + message);
}

Camel:

from("kafka:myTopic")
    .process(exchange -> {
        String message = exchange.getIn().getBody(String.class);
        System.out.println("Received: " + message);
    });

Summary

Spring Kafka is more focused and lightweight, ideal for Spring-based applications primarily working with Kafka. Camel offers a broader integration platform with support for numerous technologies and complex routing scenarios. Spring Kafka has a gentler learning curve for Kafka-specific tasks, while Camel provides more flexibility at the cost of increased complexity. The choice between them depends on the specific project requirements and the scope of integration needs.

Alpakka Kafka

Pros of Alpakka Kafka

  • Built on Akka Streams, providing powerful reactive streaming capabilities
  • Offers more fine-grained control over message processing and backpressure
  • Integrates seamlessly with other Akka ecosystem components

Cons of Alpakka Kafka

  • Steeper learning curve, especially for developers unfamiliar with Akka
  • Smaller community and ecosystem compared to Spring Kafka
  • Less extensive documentation and fewer readily available examples

Code Comparison

Spring Kafka:

@KafkaListener(topics = "myTopic")
public void listen(String message) {
    System.out.println("Received: " + message);
}

Alpakka Kafka:

Consumer
  .plainSource(consumerSettings, Subscriptions.topics("myTopic"))
  .runForeach(record => println(s"Received: ${record.value}"))

Both Spring Kafka and Alpakka Kafka are popular libraries for working with Apache Kafka in JVM-based applications. Spring Kafka is part of the Spring ecosystem and offers seamless integration with Spring Boot, making it easier to set up and use for developers familiar with Spring. It provides a high-level abstraction layer and annotation-based configuration.

Alpakka Kafka, on the other hand, is built on top of Akka Streams and offers more advanced streaming capabilities. It provides greater flexibility and control over message processing, making it suitable for complex streaming scenarios. However, it may require more in-depth knowledge of Akka and reactive programming concepts.


README

Spring for Apache Kafka

Code of Conduct

Please see our Code of conduct.

Reporting Security Vulnerabilities

Please see our Security policy.

Checking out and Building

To check out the project and build from source, do the following:

git clone https://github.com/spring-projects/spring-kafka.git
cd spring-kafka
./gradlew build

Java 17 or later is recommended to build the project.

If you encounter out of memory errors during the build, change the org.gradle.jvmargs property in gradle.properties.

To build and install jars into your local Maven cache:

./gradlew install

To build API Javadoc (results will be in build/api):

./gradlew api

To build reference documentation (results will be in spring-kafka-docs/build/site):

./gradlew antora

To build complete distribution including -dist, -docs, and -schema zip files (results will be in build/distributions)

./gradlew dist

Using Eclipse

To generate Eclipse metadata (.classpath and .project files), do the following:

./gradlew eclipse

Once complete, you may then import the projects into Eclipse as usual:

File -> Import -> Existing projects into workspace

Browse to the 'spring-kafka' root directory. All projects should import free of errors.

Using IntelliJ IDEA

To generate IDEA metadata (.iml and .ipr files), do the following:

./gradlew idea

Resources

For more information, please see the Spring for Apache Kafka reference manual on the project website.

Contributing to Spring Kafka

Here are some ways for you to get involved in the community:

  • Get involved with the Spring community on the Spring Community Forums.
    Please help out on Stack Overflow by responding to questions and joining the debate.
  • Create GitHub issues for bugs and new features, and comment and vote on the ones that you are interested in.
  • GitHub is for social coding: if you want to write code, we encourage contributions through pull requests from forks of this repository.
    If you want to contribute code this way, please reference a GitHub issue covering the specific issue you are addressing.
  • Watch for upcoming articles on Spring by subscribing to springframework.org.

Before we accept a non-trivial patch or pull request we will need you to sign the contributor's agreement. Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team and given the ability to merge pull requests.

Code Conventions and Housekeeping

None of these is essential for a pull request, but they will all help. They can also be added after the original pull request but before a merge.

  • Use the Spring Framework code format conventions (import eclipse-code-formatter.xml from the root of the project if you are using Eclipse).
  • Make sure all new .java files have a simple Javadoc class comment with at least an @author tag identifying you, and preferably at least a paragraph on what the class is for.
  • Add the ASF license header comment to all new .java files (copy from existing files in the project).
  • Add yourself as an @author to the .java files that you modify substantially (more than cosmetic changes).
  • Add some Javadocs and, if you change the namespace, some XSD doc elements.
  • A few unit tests would help a lot as well - someone has to do it.
  • If no-one else is using your branch, please rebase it against the current main (or another target branch in the main project).

Getting Support

Use the spring-kafka tag on Stack Overflow to ask questions; include code and configuration, clearly explain your problem, and provide a minimal, complete, reproducible example (MCRE) if possible. Commercial support is also available.

License

Spring Kafka is released under the terms of the Apache Software License Version 2.0 (see LICENSE.txt).