Apache Kafka

Version: 3.9.1 | OS: Ubuntu 22.04

Overview

Apache Kafka is a distributed data streaming platform used to build real-time data pipelines and streaming applications. This single-node Kafka deployment provides a convenient development environment but is not recommended for production use.

Note: This instance is configured as a single-node cluster and does not offer high availability or fault tolerance. Ideal for development and testing environments only.

Included Software

  • Apache Kafka - 3.5.0
  • OpenJDK - 11

Firewall & Access

Port   Service        Purpose
22     SSH            Remote server access
9093   Apache Kafka   Secure Kafka endpoint
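
Before configuring clients, you can confirm these endpoints are reachable from your workstation. This is a minimal sketch assuming netcat (nc) is installed locally and $IPADDRESS is the public IP of the instance:

# Quick reachability check from your local machine (assumes netcat is installed)
nc -vz $IPADDRESS 22     # SSH
nc -vz $IPADDRESS 9093   # secure Kafka endpoint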

Initial Deployment

During initial deployment:

  • Kafka and Zookeeper services are installed
  • Shell credentials are stored in /root/.shell_user_passwords
  • The system sets up secure SSL and SCRAM-SHA-256 Kafka access

You will be logged out with this message:

Please wait until the installation is completed....
Connection to $IPADDRESS closed.

Wait at least 2 minutes before logging in again.

Accessing the Kafka Instance

Log in via SSH using your configured method (password or key). Then:

sudo su -
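
Once logged in as root, a quick way to confirm the deployment finished is to check the service units and the stored credentials (unit names and path as listed above):

# Confirm both services came up after the initial deployment
systemctl status kafka.service zookeeper.service --no-pager

# Review the generated shell credentials
cat /root/.shell_user_passwords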

Kafka Configuration Update

Before accessing Kafka over the public endpoint, replace the internal IP addresses in the following files with your public IP:

  1. Edit: /opt/kafka/ssl/example.librdkafka.config

    bootstrap.servers=$PUBLIC_IP:9093
  2. Edit: /opt/kafka/config/server.properties

    advertised.listeners=INTERNAL_PLAINTEXT://127.0.0.1:9092,PUBLIC_SSL://$PUBLIC_IP:9093
  3. Restart services:

systemctl restart kafka.service zookeeper.service
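
If you prefer to script these edits, a minimal sketch using sed is shown below. PUBLIC_IP is a placeholder for your server's public address, and you should back up both files first, since the shipped configs may be formatted slightly differently:

PUBLIC_IP=203.0.113.10   # example placeholder; replace with your public IP

sed -i "s|^bootstrap.servers=.*|bootstrap.servers=${PUBLIC_IP}:9093|" \
  /opt/kafka/ssl/example.librdkafka.config

sed -i "s|^advertised.listeners=.*|advertised.listeners=INTERNAL_PLAINTEXT://127.0.0.1:9092,PUBLIC_SSL://${PUBLIC_IP}:9093|" \
  /opt/kafka/config/server.properties

systemctl restart kafka.service zookeeper.service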

Available Kafka Listeners

  • localhost:9092: PLAINTEXT (internal access only)
  • 0.0.0.0:9093: TLS/SSL with SCRAM-SHA-256 authentication
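
To confirm both listeners are bound after a restart, you can check the listening sockets on the instance (run as root so process names are shown):

# Verify Kafka is listening on both ports
ss -ltnp | grep -E ':9092|:9093'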

Basic Kafka CLI Operations

Connect via SSH and run:

Create Topic

/opt/kafka/bin/kafka-topics.sh \
--create \
--topic quickstart \
--bootstrap-server localhost:9092

Produce Message

echo 'Hello, CloudPortal!' | /opt/kafka/bin/kafka-console-producer.sh \
--topic quickstart \
--bootstrap-server localhost:9092

Consume Message

/opt/kafka/bin/kafka-console-consumer.sh \
--from-beginning \
--max-messages 1 \
--topic quickstart \
--bootstrap-server localhost:9092
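
To verify the topic exists and inspect its partition layout, the same kafka-topics.sh tool can also list and describe topics:

# List all topics
/opt/kafka/bin/kafka-topics.sh --list --bootstrap-server localhost:9092

# Show partition and replication details for the quickstart topic
/opt/kafka/bin/kafka-topics.sh --describe --topic quickstart --bootstrap-server localhost:9092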

External TLS Access with kafkacat

On your local machine:

  1. Copy required files:
scp root@$IPADDRESS:'/opt/kafka/ssl/{ca.crt,example.librdkafka.config}' .
  2. Install kafkacat:
sudo apt install kafkacat -y  # Debian/Ubuntu
  3. Produce a message:
echo 'Hello, CloudPortal TLS!' | kafkacat -P -b $IPADDRESS:9093 \
-F ./example.librdkafka.config \
-X ssl.ca.location=./ca.crt \
-t quickstart
  4. Consume the message:
kafkacat -C -b $IPADDRESS:9093 \
-F ./example.librdkafka.config \
-X ssl.ca.location=./ca.crt \
-t quickstart

Press Ctrl+C to stop the consumer.
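
The exact contents of example.librdkafka.config are generated on the instance, but a SCRAM-SHA-256-over-TLS librdkafka client configuration typically looks like the sketch below. Treat the shipped file as authoritative and these values as placeholders:

# Illustrative sketch only; the file shipped on the instance may differ
bootstrap.servers=$PUBLIC_IP:9093
security.protocol=SASL_SSL
sasl.mechanisms=SCRAM-SHA-256
sasl.username=<kafka-user>
sasl.password=<kafka-password>
ssl.ca.location=./ca.crt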

SSL Certificate Files

Located in /opt/kafka/ssl:

File                         Description
ca.crt                       Self-signed CA certificate
ca.key                       CA private key
kafka.crt                    Kafka server certificate
kafka.keystore.jks           Java keystore with server and CA certificates
example.librdkafka.config    Sample secure client configuration
.keystore_password           Password for the Java keystore
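
To inspect certificate validity or the keystore contents, standard openssl and keytool commands work against these files. This sketch assumes .keystore_password holds just the password string:

cd /opt/kafka/ssl

# Check the CA certificate's subject and validity window
openssl x509 -in ca.crt -noout -subject -dates

# List the entries in the Java keystore (reads the stored keystore password)
keytool -list -keystore kafka.keystore.jks -storepass "$(cat .keystore_password)"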

Best Practices

  • Never expose PLAINTEXT listener to the public internet
  • Change default credentials
  • Use a valid certificate from a trusted CA in production
  • Avoid using this setup in production environments
  • Enable log rotation and monitoring if using for extended testing (a minimal sketch follows below)
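
A minimal logrotate sketch for the log rotation recommendation, assuming Kafka writes its logs under /opt/kafka/logs (adjust the path to match your instance):

# /etc/logrotate.d/kafka  (assumed log path; verify where your Kafka writes logs)
/opt/kafka/logs/*.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    copytruncate
}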

For help, contact Cloud4India Support.