
Commit fd288ba

Merge pull request #24 from sebastienblanc/gitpod

add gitpod support

2 parents fc042eb + 3833f0d

2 files changed: +95 −0 lines changed

.gitpod.yml

Lines changed: 7 additions & 0 deletions

```yaml
tasks:
  - init: pip install -r requirements.txt
    name: install requirements
  - name: Install avn CLI
    init: pip install aiven-client
  - name: Open Readme
    command: gp preview https://aiven.io/developer/create-your-own-data-stream-for-kafka-with-python-and-faker
```

README-gitpod.md

Lines changed: 88 additions & 0 deletions
# Quickstart with Gitpod

This workspace comes with some things pre-installed for you:

* Python requirements have already been installed
* the avn CLI has already been installed
* jq has been installed

First, make sure you have an Aiven account; otherwise you are just a few clicks away from creating one [here](https://console.aiven.io/signup?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=post).
Then make sure to get a personal access token; check this [video](https://www.youtube.com/watch?v=64G2QIMYOL4) to learn how to get one.

Open a terminal; you'll need to copy-paste or re-type all the bash commands below.
Now you can log in with your Aiven account email:

```bash
avn user login <your-email> --token
```
Create a `certs` folder:

```bash
mkdir certs
```
Set your variables:

```bash
KAFKA_INSTANCE_NAME=my-kafka-demo
CLOUD_REGION=aws-eu-south-1
AIVEN_PLAN_NAME=startup-2
DESTINATION_FOLDER_NAME=certs
```
If you haven't yet, create an Aiven for Apache Kafka service:

```bash
avn service create $KAFKA_INSTANCE_NAME \
  -t kafka \
  --cloud $CLOUD_REGION \
  -p $AIVEN_PLAN_NAME \
  -c kafka.auto_create_topics_enable=true \
  -c kafka_rest=true
```
Retrieve the Apache Kafka service host and port:

```bash
KAFKA_HOST=$(avn service get $KAFKA_INSTANCE_NAME --json | jq -r '.service_uri_params.host')
KAFKA_PORT=$(avn service get $KAFKA_INSTANCE_NAME --json | jq -r '.service_uri_params.port')
```
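If you're curious what those jq filters extract, here is a sketch against a made-up sample of the JSON shape returned by `avn service get --json` (the hostname and port below are invented; the real output contains many more fields):

```shell
# Hypothetical sample of the JSON shape returned by `avn service get --json`;
# host and port values here are made up for illustration.
sample='{"service_uri_params":{"host":"my-kafka-demo-myproject.aivencloud.com","port":"13041"}}'

# The same jq filters as above, applied to the sample
KAFKA_HOST=$(echo "$sample" | jq -r '.service_uri_params.host')
KAFKA_PORT=$(echo "$sample" | jq -r '.service_uri_params.port')

echo "$KAFKA_HOST:$KAFKA_PORT"
```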
You can wait for the newly created Apache Kafka instance to be ready with:

```bash
avn service wait $KAFKA_INSTANCE_NAME
```
Now get your certificates:

```bash
avn service user-creds-download $KAFKA_INSTANCE_NAME \
  -d $DESTINATION_FOLDER_NAME \
  --username avnadmin
```
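As a quick sanity check, you can verify the credential files are in place. The file names below (`ca.pem`, `service.cert`, `service.key`) are assumed from the typical `avn service user-creds-download` output:

```shell
# File names assumed from typical avn user-creds-download output:
# ca.pem (CA certificate), service.cert (client cert), service.key (client key).
for f in ca.pem service.cert service.key; do
  if [ -f "certs/$f" ]; then
    echo "found certs/$f"
  else
    echo "missing certs/$f"
  fi
done
```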
And finally run the demo:

```bash
python main.py \
  --security-protocol ssl \
  --cert-folder $DESTINATION_FOLDER_NAME \
  --host $KAFKA_HOST \
  --port $KAFKA_PORT \
  --topic-name pizza-orders \
  --nr-messages 0 \
  --max-waiting-time 2 \
  --subject pizza
```
88+
You should see a continuous flow of data being pushed to Apache Kafka, to the topic defined by the `--topic-name` parameter. You can either use the Aiven console, or tools like [kcat](https://docs.aiven.io/docs/products/kafka/howto/kcat) to browse the data.
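For example, a kcat consumer over SSL could look like the following sketch. This is connection configuration that needs your live service to actually run; the certificate file names are assumed from the usual `avn service user-creds-download` output:

```shell
# Sketch: consume the pizza-orders topic with kcat over SSL.
# Certificate file names (ca.pem, service.cert, service.key) are assumed
# from typical avn user-creds-download output.
kcat -C \
  -b $KAFKA_HOST:$KAFKA_PORT \
  -t pizza-orders \
  -X security.protocol=ssl \
  -X ssl.ca.location=certs/ca.pem \
  -X ssl.certificate.location=certs/service.cert \
  -X ssl.key.location=certs/service.key
```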
