
How To Install Apache Kafka on Fedora 39

In this tutorial, we will show you how to install Apache Kafka on Fedora 39. Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-throughput, low-latency messaging. Because Kafka runs on the Java Virtual Machine, we will install Java as part of the setup.

This article assumes you have at least basic knowledge of Linux, know how to use the shell, and, most importantly, host your site on your own VPS. The installation is quite simple and assumes you are running as the root account; if not, you may need to add ‘sudo’ to the commands to get root privileges. I will show you the step-by-step installation of Apache Kafka on Fedora 39.

Prerequisites

Before diving into the installation process, let’s ensure that you have everything you need:

  • A server running Fedora 39.
  • It’s recommended that you use a fresh OS install to prevent any potential issues.
  • You will need access to the terminal to execute commands. Fedora 39 provides the Terminal application for this purpose. It can be found in your Applications menu.
  • A network connection or internet access to download the Apache Kafka repository.
  • A non-root sudo user or access to the root user. We recommend acting as a non-root sudo user, however, because you can harm your system if you’re not careful when acting as root.

Install Apache Kafka on Fedora 39

Step 1. Before installing Java, it’s a good practice to update your Fedora system to ensure you have the latest packages. Open your terminal and enter the following commands:

sudo dnf clean all
sudo dnf update
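
If you want to confirm that you are on the expected release before continuing, you can print the version string:

cat /etc/fedora-release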

Step 2. Installing Java.

Apache Kafka runs on the Java Virtual Machine, so a Java runtime must be installed first. To install Java, use the dnf install command followed by the package name of the Java version you want to install. For example, to install OpenJDK 11, you would use the following command:

sudo dnf install java-11-openjdk.x86_64
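
If you would rather use a different Java release, you can list the OpenJDK packages available in the Fedora repositories and install one of those instead (Kafka 3.6 also runs on Java 17):

dnf list available 'java-*-openjdk'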

To confirm that Java has been successfully installed, you can check the Java version on your system with the following command:

java --version

Step 3. Installing Apache Kafka on Fedora 39.

You can download the latest version of Apache Kafka from the official Apache Kafka website. Choose the binary download built for the Scala version you prefer (the 2.13 build shown below works for most setups):

wget https://downloads.apache.org/kafka/3.6.1/kafka_2.13-3.6.1.tgz

This command downloads Kafka 3.6.1. Replace the version number in the URL with the version you want to download.
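
Optionally, you can verify the integrity of the archive before extracting it by computing its SHA-512 checksum and comparing it with the .sha512 file that Apache publishes alongside each release:

sha512sum kafka_2.13-3.6.1.tgz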

After downloading the Kafka tar.gz file, you need to extract it. From the directory where you downloaded the archive, extract it into /opt using the tar command:

sudo tar -xzf kafka_2.13-3.6.1.tgz -C /opt

Rename the extracted directory to kafka:

sudo mv /opt/kafka_2.13-3.6.1 /opt/kafka

Assign ownership of the Kafka directory to your current user:

sudo chown -R $USER:$USER /opt/kafka
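
Optionally, you can add Kafka’s bin directory to your PATH so the scripts can be run from any location (a small convenience step, assuming you use the default bash shell):

echo 'export PATH=$PATH:/opt/kafka/bin' >> ~/.bashrc
source ~/.bashrc

The rest of this tutorial runs the scripts from /opt/kafka/bin directly, so this step is not required.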

Step 4. Configure Kafka Server.

Kafka relies on Apache ZooKeeper for coordination between brokers, and the Kafka distribution bundles ZooKeeper for exactly this purpose. In this tutorial we will run a single standalone broker against a local ZooKeeper instance. Open server.properties:

nano /opt/kafka/config/server.properties

Make sure the broker points at the local ZooKeeper instance (this is the default value):

zookeeper.connect=localhost:2181

Also, set the directory where Kafka stores its partition data (the /tmp default is fine for testing, but keep in mind that /tmp is cleared on reboot):

log.dirs=/tmp/kafka-logs

Save and close the file when done.
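
Because zookeeper.connect points at localhost:2181, a ZooKeeper instance must be running before the broker is started. Kafka ships with a helper script and a sample zookeeper.properties configuration (which listens on port 2181 by default), so you can start a local ZooKeeper in the background first:

cd /opt/kafka/bin
./zookeeper-server-start.sh -daemon ../config/zookeeper.properties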

Step 5. Start Apache Kafka Server.

You can start the Kafka server by running the following command:

cd /opt/kafka/bin
./kafka-server-start.sh ../config/server.properties

This will start Kafka in the foreground. To start it in the background, use:

./kafka-server-start.sh -daemon ../config/server.properties
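
To confirm that the broker came up, check that something is listening on Kafka’s default port 9092; when started with the bundled scripts, the broker also writes its output to /opt/kafka/logs/server.log:

ss -tln | grep 9092
tail -n 20 /opt/kafka/logs/server.log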

Step 6. Creating Kafka Topics.

Kafka organizes data streams into topics. Let’s create a test topic named test:

./kafka-topics.sh --create --topic test --partitions 1 --replication-factor 1 --bootstrap-server localhost:9092

This creates a topic with 1 partition and 1 replica.

List the topics to confirm:

./kafka-topics.sh --list --bootstrap-server localhost:9092

You should see the test topic listed.
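
You can also inspect the topic in more detail with the --describe option, which prints the partition count, leader, and replica assignment:

./kafka-topics.sh --describe --topic test --bootstrap-server localhost:9092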

Step 7. Test Kafka.

To verify that Kafka is working correctly, start a console producer to publish some test messages:

./kafka-console-producer.sh --topic test --bootstrap-server localhost:9092

This will open an input prompt. Type a few messages, pressing Enter after each one to publish it:

Message 1
Message 2 
Message 3

Next, consume those messages:

./kafka-console-consumer.sh --topic test --from-beginning --bootstrap-server localhost:9092

You should see the published messages printed in the consumer output. Use Ctrl+C to stop the producer and consumer.
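
If you prefer a non-interactive test, you can also pipe messages into the producer from the shell, for example with a here-document:

./kafka-console-producer.sh --topic test --bootstrap-server localhost:9092 <<'EOF'
Message 4
Message 5
EOF

Running the console consumer again with --from-beginning should now show these messages as well.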

Step 8. Configure Systemd Service

To ensure Kafka starts automatically when the system boots up, we will create a systemd service file.

Create a file named kafka.service:

sudo nano /etc/systemd/system/kafka.service

Add the following (the User= value must be an existing account that can read /opt/kafka; either create a dedicated kafka user or replace kafka with the user you assigned ownership to in Step 3):

[Unit]
Description=Apache Kafka Server
Documentation=http://kafka.apache.org/documentation.html
Requires=network.target remote-fs.target
After=network.target remote-fs.target

[Service]
Type=simple
User=kafka
ExecStart=/opt/kafka/bin/kafka-server-start.sh /opt/kafka/config/server.properties
ExecStop=/opt/kafka/bin/kafka-server-stop.sh
Restart=on-abnormal

[Install]
WantedBy=multi-user.target

Save and close the file when done, then reload systemd to pick up the new service:

sudo systemctl daemon-reload

Now start Kafka and enable it to start on boot:

sudo systemctl start kafka
sudo systemctl enable kafka

Check the status with:

sudo systemctl status kafka
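
Keep in mind that, in ZooKeeper mode, the Kafka service can only start once ZooKeeper is running. If you want systemd to manage ZooKeeper as well, a minimal companion unit based on the bundled scripts could look like the sketch below, saved as /etc/systemd/system/zookeeper.service (adjust User= the same way as in kafka.service):

[Unit]
Description=Apache ZooKeeper Server
After=network.target remote-fs.target

[Service]
Type=simple
# Sketch: use the same account that owns /opt/kafka
User=kafka
ExecStart=/opt/kafka/bin/zookeeper-server-start.sh /opt/kafka/config/zookeeper.properties
ExecStop=/opt/kafka/bin/zookeeper-server-stop.sh
Restart=on-abnormal

[Install]
WantedBy=multi-user.target

With that in place, add Requires=zookeeper.service and After=zookeeper.service to the [Unit] section of kafka.service, run sudo systemctl daemon-reload again, and enable and start zookeeper before kafka. If either service fails to start, sudo journalctl -u kafka -e (or -u zookeeper -e) shows the recent log output.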

Congratulations! You have successfully installed Apache Kafka. Thanks for using this tutorial to install Apache Kafka on your Fedora 39 system. For additional or useful information, we recommend you check the official Apache Kafka documentation.
