Camel Kafka Connector: No Code, No Hassle Integrations
In this article, we discuss how combining Camel components with Kafka makes integrations with Kafka easier, more stable, and more versatile.
Hi,
In this article, we are going to discuss Camel Kafka Connectors. Apache Camel provides more than 300 components for integrating different endpoints and protocols. This combination of Camel components and Kafka makes integrations with Kafka even easier, more stable, and more versatile. Best of all, not a single line of code is required.
You can find more details about Apache Camel Kafka Connectors in the community documentation. There are two types of connectors: source and sink. As the documentation puts it:
Camel-Kafka Source Connector is a pre-configured Camel consumer which will perform the same action on a fixed rate and send the exchanges to Kafka, while a Camel-Kafka Sink Connector is a pre-configured Camel producer which will perform the same operation on each message exported from Kafka.
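To make the distinction concrete, here is what a minimal source connector configuration might look like, using the timer component as an example. The class and property names below are illustrative; always check the documentation of the specific connector you use for the exact names:

```properties
name=CamelTimerSourceConnector
connector.class=org.apache.camel.kafkaconnector.timer.CamelTimerSourceConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# Kafka topic the source writes exchanges to
topics=testTopic
# timer name and firing period in milliseconds (illustrative values)
camel.source.path.timerName=tick
camel.source.endpoint.period=5000
```

A source connector like this fires on a fixed rate and sends each exchange to the configured Kafka topic, while the sink connector we build below consumes from the topic instead.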
In this article, we will implement a Kafka sink connector using the Camel SSH component. This example is based on camel-kafka-connector-examples.
I have tested this on Fedora 33 with Apache Kafka 2.7.0 and Podman. We will use Podman to run the SSH server and the kafkacat utility for sending Kafka messages.
So let's get started.
1. Let us first download camel-ssh-kafka-connector. At the time of writing this article, the version I downloaded is camel-ssh-kafka-connector-0.7.0-package.zip.
2. I extracted it to my local disk.
[chandrashekhar@localhost camel-kafka-connectors]$ ls -ltrh |grep ssh
drwxr-xr-x. 2 chandrashekhar chandrashekhar 4.0K Dec 21 18:50 camel-ssh-kafka-connector
-rw-rw-r--. 1 chandrashekhar chandrashekhar 31M Dec 21 23:20 camel-ssh-kafka-connector-0.7.0-package.zip
[chandrashekhar@localhost camel-kafka-connectors]$
3. Download Apache Kafka. At the time of writing this article, the latest version I downloaded was Kafka 2.7.0.
4. Start Kafka and create a testTopic.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ ./bin/zookeeper-server-start.sh config/zookeeper.properties
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-server-start.sh config/server.properties
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic testTopic
Created topic testTopic.
5. Start SSH Server using Podman.
[chandrashekhar@localhost strimzi-0.20.1]$ podman run -d -P --name test_sshd rastasheep/ubuntu-sshd:14.04
# check ports mapped
[chandrashekhar@localhost strimzi-0.20.1]$ podman port test_sshd
22/tcp -> 0.0.0.0:21947
[chandrashekhar@localhost strimzi-0.20.1]$
6. Set up the plugin path of camel-ssh-kafka-connector in Kafka.
# Note the path where the connector is extracted.
[chandrashekhar@localhost camel-kafka-connectors]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors
[chandrashekhar@localhost camel-kafka-connectors]$ ls -ltr|grep ssh
drwxr-xr-x. 2 chandrashekhar chandrashekhar 4096 Dec 21 18:50 camel-ssh-kafka-connector
-rw-rw-r--. 1 chandrashekhar chandrashekhar 31543032 Dec 21 23:20 camel-ssh-kafka-connector-0.7.0-package.zip
[chandrashekhar@localhost camel-kafka-connectors]$
# In [KAFKA_HOME]/config/connect-standalone.properties, configure plugin.path with the path where the connector is extracted.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ vi config/connect-standalone.properties
plugin.path=/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors
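If you prefer to script this step rather than edit the file by hand, the change can be made non-interactively. A minimal sketch, using a stand-in properties file and an example path (substitute your own Kafka config file and extraction directory):

```shell
# stand-in for [KAFKA_HOME]/config/connect-standalone.properties (example content)
cat > /tmp/connect-standalone.properties <<'EOF'
bootstrap.servers=localhost:9092
offset.storage.file.filename=/tmp/connect.offsets
EOF
# append plugin.path pointing at the directory the connector was extracted into
echo 'plugin.path=/home/user/camel-kafka-connectors' >> /tmp/connect-standalone.properties
# verify the setting landed
grep '^plugin.path' /tmp/connect-standalone.properties
```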
7. Set up the connector with a CamelSshSinkConnector.properties file, which holds the SSH sink configuration.
[chandrashekhar@localhost config]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors/config
[chandrashekhar@localhost config]$ vi CamelSshSinkConnector.properties
name=CamelSshSinkConnector
connector.class=org.apache.camel.kafkaconnector.ssh.CamelSshSinkConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# kafka topic
topics=testTopic
camel.sink.path.host=localhost
#ssh port
camel.sink.path.port=21947
camel.sink.endpoint.username=root
camel.sink.endpoint.password=root
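For reference, if you later run Kafka Connect in distributed mode instead of standalone, the same configuration is submitted as JSON to the Connect REST API (a sketch; 8083 is the default REST port):

```json
{
  "name": "CamelSshSinkConnector",
  "config": {
    "connector.class": "org.apache.camel.kafkaconnector.ssh.CamelSshSinkConnector",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "topics": "testTopic",
    "camel.sink.path.host": "localhost",
    "camel.sink.path.port": "21947",
    "camel.sink.endpoint.username": "root",
    "camel.sink.endpoint.password": "root"
  }
}
```

This would be posted with something like curl -X POST -H "Content-Type: application/json" --data @connector.json localhost:8083/connectors.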
8. Run the connector in standalone mode. Since this is a POC on a single-node Kafka setup, we will run [KAFKA_HOME]/bin/connect-standalone.sh.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/connect-standalone.sh config/connect-standalone.properties ../camel-kafka-connectors/config/CamelSshSinkConnector.properties
9. Create a file of Linux commands that creates a file on the SSH server and then appends some records to it.
[chandrashekhar@localhost config]$ cat sshCommands.txt
touch sshexample.txt
echo 'apple is fruit' >> sshexample.txt
echo 'rose is flower' >> sshexample.txt
[chandrashekhar@localhost config]$
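Since the sink connector will execute these commands verbatim over SSH, it can be worth dry-running the file locally first in a scratch directory. A quick sketch (the directory name is arbitrary):

```shell
# recreate the commands file in a throwaway directory and run it locally
mkdir -p /tmp/ssh-dryrun
cd /tmp/ssh-dryrun
cat > sshCommands.txt <<'EOF'
touch sshexample.txt
echo 'apple is fruit' >> sshexample.txt
echo 'rose is flower' >> sshexample.txt
EOF
sh sshCommands.txt
cat sshexample.txt   # should print the two appended lines
```

Note that kafkacat will send each line of the file as one Kafka message, and the sink connector executes each message as a command on the SSH server.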
10. Send the records within sshCommands.txt to Kafka using the kafkacat utility. Here we use Podman to run a kafkacat container image for sending the messages.
[chandrashekhar@localhost config]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors/config
[chandrashekhar@localhost config]$ ls -ltr|grep ssh
-rwxrwxrwx. 1 chandrashekhar chandrashekhar 101 Jan 16 23:19 sshCommands.txt
[chandrashekhar@localhost config]$ sudo podman run -it --network=host --volume `pwd`/sshCommands.txt:/data/sshCommands.txt --security-opt label=disable edenhill/kafkacat:1.6.0 -P -b 0.0.0.0:9092 --name kafkacat -t testTopic -P -l /data/sshCommands.txt
[sudo] password for chandrashekhar:
[chandrashekhar@localhost config]$
11. Now that the messages have been sent to the Kafka testTopic and the camel-ssh-kafka-connector sink is already running, the SSH server we started earlier should have received these commands from Kafka via the Camel SSH sink connector. The username and password of this SSH server are both root.
[chandrashekhar@localhost config]$ ssh root@localhost -p 21947
root@localhost's password:
Last login: Sat Jan 16 17:59:21 2021 from localhost
root@30aebbbcdb82:~# ls
sshexample.txt
root@30aebbbcdb82:~# cat sshexample.txt
apple is fruit
rose is flower
root@30aebbbcdb82:~#
12. Once tested, we can stop the Podman container.
[chandrashekhar@localhost strimzi-0.20.1]$ podman ps -a|grep ssh
36ffa61633bd docker.io/rastasheep/ubuntu-sshd:14.04 /usr/sbin/sshd -D 8 minutes ago Up 8 minutes ago 0.0.0.0:26237->22/tcp test_sshd
[chandrashekhar@localhost strimzi-0.20.1]$
[chandrashekhar@localhost strimzi-0.20.1]$ podman container rm -f test_sshd
36ffa61633bd9c5fef110e550a9eb789660d31d0b4949b0da919b1f00269efe8
[chandrashekhar@localhost strimzi-0.20.1]$
[chandrashekhar@localhost strimzi-0.20.1]$ podman ps -a|grep ssh
[chandrashekhar@localhost strimzi-0.20.1]$
13. We can finally stop Kafka and ZooKeeper. The connector instance can be closed with Ctrl+C in its terminal, or by simply closing that terminal.
[chandrashekhar@localhost bin]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/kafka_2.13-2.7.0/bin
[chandrashekhar@localhost bin]$ ./kafka-server-stop.sh
[chandrashekhar@localhost bin]$ ./zookeeper-server-stop.sh
14. Another important point is to check the consumer group and offset details associated with testTopic.
# get a list of all consumer groups
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-consumer-groups.sh --list --bootstrap-server localhost:9092
connect-CamelSshSinkConnector
[chandrashekhar@localhost kafka_2.13-2.7.0]$
# check offset details of connect-CamelSshSinkConnector group.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-consumer-groups.sh --describe --group connect-CamelSshSinkConnector --bootstrap-server localhost:9092
Consumer group 'connect-CamelSshSinkConnector' has no active members.
GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG CONSUMER-ID HOST CLIENT-ID
connect-CamelSshSinkConnector testTopic 0 12 12 0 - - -
[chandrashekhar@localhost kafka_2.13-2.7.0]$
That's it! I hope you found this article interesting and informative.
Opinions expressed by DZone contributors are their own.