Install a Linux Distribution on Windows
The steps to run Confluent on Windows are provided below:
- Open PowerShell in Administrator mode and run the following command:
wsl --list --online
This lists the NAME and FRIENDLY NAME of each distribution available for installation.
NAME            FRIENDLY NAME
Ubuntu          Ubuntu
Debian          Debian GNU/Linux
kali-linux      Kali Linux Rolling
openSUSE-42     openSUSE Leap 42
SLES-12         SUSE Linux Enterprise Server v12
Ubuntu-16.04    Ubuntu 16.04 LTS
Ubuntu-18.04    Ubuntu 18.04 LTS
Ubuntu-20.04    Ubuntu 20.04 LTS
- To install the distribution you want, run the command below:
wsl --install -d <Distribution Name>
- After any one of the distributions has been installed successfully, it launches and prompts you to set up the user:
- Username: <a username of your choice>
- Password: <a password of your choice>
The distribution is now accessible from the Windows Start menu.
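For example, on a fresh Windows machine the whole installation sequence might look like the sketch below (Ubuntu-20.04 is only an illustrative choice; pick any NAME from the list above):
# Run in PowerShell as Administrator
wsl --list --online
wsl --install -d Ubuntu-20.04
# On first launch the distribution asks for the UNIX username and password described above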
Run Confluent on Windows
- Run the command below to download and install the Confluent CLI:
curl -L --http1.1 https://cnfl.io/cli | sh -s -- -b /usr/local/bin
- If it does not run successfully, change the permissions of the /usr/local/bin directory by running the commands below:
sudo chown -R $(whoami) /usr/local/bin/
sudo chmod -R u=rwX,go=rX /usr/local/bin/
- After that, run the installation command again:
curl -L --http1.1 https://cnfl.io/cli | sh -s -- -b /usr/local/bin
This installs the Confluent CLI in your Linux distribution on Windows.
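As a quick sanity check (assuming /usr/local/bin is on your PATH), you can confirm that the CLI is reachable from the shell:
which confluent
confluent version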
- After that, update the Confluent CLI with the command:
confluent update
- Now log in and save the credentials for Confluent:
confluent login --save
Provide the same credentials that you used to sign up for Confluent.
Once the credentials are saved, it shows an output message like the one below:
Wrote credentials to netrc file "/home/<your username>/.netrc"
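If you want to confirm that the file was actually written, a simple check from the shell is (the path matches the message above):
ls -l ~/.netrc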
- To list all the environments, run the command:
confluent environment list
- To select a specific environment, run the command:
confluent environment use <Environment Id>
- To list all the clusters in that environment, run the command:
confluent kafka cluster list
- To select a specific cluster, run the command:
confluent kafka cluster use <Cluster Id>
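Putting these steps together, a typical selection flow looks like the sketch below (env-12345 and lkc-67890 are placeholder IDs, not real values):
confluent environment list
confluent environment use env-12345
confluent kafka cluster list
confluent kafka cluster use lkc-67890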
- To create a new API key, run the command:
confluent api-key create --resource <Cluster Id>
It creates a new API key and prints it as shown below:
+---------+----------------+
| API Key | <API Key>      |
| Secret  | <Secret Value> |
+---------+----------------+
- To use the newly created API key with the specified cluster, run the command:
confluent api-key use <API Key> --resource <Cluster Id>
- To store an existing API key for that cluster, run the command:
confluent api-key store --resource <Cluster Id>
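As an illustrative sequence for a placeholder cluster ID lkc-67890 (substitute the key printed by the create command):
confluent api-key create --resource lkc-67890
confluent api-key use <API Key> --resource lkc-67890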
- To list the topics in that cluster, run the command:
confluent kafka topic list
- To create a new topic in that cluster, run the command:
confluent kafka topic create <Topic Name>
- To publish messages to the newly created topic, run the command:
confluent kafka topic produce <Topic Name> --parse-key
Then enter the values as:
1:"test"
2:"amik"
Here 1 and 2 are the keys and "test" and "amik" are the values, separated by a colon (:); the --parse-key flag tells the CLI to split each line into a key and a value.
- To consume the messages from the newly created topic, run the command:
confluent kafka topic consume -b <Topic Name>
Here -b is the short form of --from-beginning, which reads the topic from the beginning.
- If the consumer belongs to a specific consumer group, then the command to consume the messages is:
confluent kafka topic consume -b <Topic Name> --group <Consumer Group Name>
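Putting the topic commands together, a small end-to-end test might look like this (demo-topic and demo-group are placeholder names):
confluent kafka topic create demo-topic
confluent kafka topic produce demo-topic --parse-key
# type a few key:value lines such as 1:"test", then press Ctrl+C to stop the producer
confluent kafka topic consume -b demo-topic --group demo-group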
- To generate a client configuration for that cluster, the command is:
confluent kafka client-config create <Language> --api-key=<API Key> --api-secret=<API Secret>
Here the language can be Java, Python, Ruby, C++, C#, .NET, Scala, Node.js, Spring Boot, Go, Clojure, Ktor, Rust, Groovy, or REST API.
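For example, to generate a Java client configuration and save it to a file (the file name is arbitrary, and the key values are placeholders), the printed configuration can be redirected like this:
confluent kafka client-config create java --api-key=<API Key> --api-secret=<API Secret> > java.config
The generated properties can then be used by a Java or Spring Kafka client to connect to the cluster.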
Partitioning using the CLI
- To create a topic with a specific number of partitions, run the command:
confluent kafka topic create --partitions 1 <Topic Name>
confluent kafka topic create --partitions 4 <Topic Name>
- To publish messages to the newly created topic, run the command:
confluent kafka topic produce <Topic Name> --parse-key
1:"test"
2:"test2"
3:"test3"
The lines above are sample key:value messages published to the newly created topic.
Once the topic is created and messages are published to it, you can see the messages in the Confluent dashboard.
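As a sketch of the partitioned flow (demo-partitioned is a placeholder topic name), the commands can be combined as:
confluent kafka topic create demo-partitioned --partitions 4
confluent kafka topic produce demo-partitioned --parse-key
# messages that share the same key are routed to the same partition
confluent kafka topic consume -b demo-partitioned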
Other Useful Links
Spring Data using RDBMS (MySQL DB) and Spring REST
Spring Data using NoSQL DB and Spring REST
Spring Data using Cypher Neo4j DB and Spring REST
Spring Apache Kafka – Producer & Consumer
Spring Kafka Confluent – Producer & Consumer