# Quickstart with Gitpod

This workspace comes with some tools pre-installed for you:

* Python requirements have already been installed
* The avn CLI has already been installed
* jq has been installed

First, make sure you have an Aiven account; otherwise you are just a few clicks away from creating one [here](https://console.aiven.io/signup?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=post).

Then make sure you have a personal access token; check this [video](https://www.youtube.com/watch?v=64G2QIMYOL4) to learn how to get one.

Open a terminal; you'll need to copy-paste or re-type all the bash commands below.

Now you can log in:

```bash
avn user login --token
```

Create a `certs` folder:

```bash
mkdir certs
```

Set your variables:

```bash
KAFKA_INSTANCE_NAME=my-kafka-demo
CLOUD_REGION=aws-eu-south-1
AIVEN_PLAN_NAME=startup-2
DESTINATION_FOLDER_NAME=certs
```

If you haven't yet, create an Aiven for Apache Kafka service:

```bash
avn service create $KAFKA_INSTANCE_NAME \
  -t kafka \
  --cloud $CLOUD_REGION \
  -p $AIVEN_PLAN_NAME \
  -c kafka.auto_create_topics_enable=true \
  -c kafka_rest=true
```
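Provisioning takes a few minutes. If you want to check on it, the snippet below is a quick sketch that assumes the `avn service get` JSON output exposes a `state` field (it should report `RUNNING` once the service is ready):

```bash
# Print the current provisioning state of the service (assumes a "state" field in the JSON output)
avn service get $KAFKA_INSTANCE_NAME --json | jq -r '.state'
```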
Retrieve the Apache Kafka service host and port (from the service URI) and store them in two variables:

```bash
KAFKA_HOST=$(avn service get $KAFKA_INSTANCE_NAME --json | jq -r '.service_uri_params.host')
KAFKA_PORT=$(avn service get $KAFKA_INSTANCE_NAME --json | jq -r '.service_uri_params.port')
```
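As a quick sanity check, print both variables to make sure they were populated:

```bash
# Both values should be non-empty before moving on
echo "Apache Kafka service address: $KAFKA_HOST:$KAFKA_PORT"
```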
You can wait for the newly created Apache Kafka instance to be ready with:

```bash
avn service wait $KAFKA_INSTANCE_NAME
```

Now get your certificates:

```bash
avn service user-creds-download $KAFKA_INSTANCE_NAME \
  -d $DESTINATION_FOLDER_NAME \
  --username avnadmin
```
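You can verify the download worked by listing the folder; it should contain the CA certificate plus the client certificate and key (commonly named `ca.pem`, `service.cert` and `service.key`):

```bash
# List the downloaded credential files
ls $DESTINATION_FOLDER_NAME
```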
And finally run the demo:

```bash
python main.py \
  --security-protocol ssl \
  --cert-folder $DESTINATION_FOLDER_NAME \
  --host $KAFKA_HOST \
  --port $KAFKA_PORT \
  --topic-name pizza-orders \
  --nr-messages 0 \
  --max-waiting-time 2 \
  --subject pizza
```
You should see a continuous flow of data being pushed to Apache Kafka, to the topic defined by the `--topic-name` parameter. You can use either the Aiven console or tools like [kcat](https://docs.aiven.io/docs/products/kafka/howto/kcat) to browse the data.
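For example, here is a sketch of a kcat consumer reading from the `pizza-orders` topic; it assumes the certificate files downloaded above are named `ca.pem`, `service.cert` and `service.key` (adjust the paths if yours differ):

```bash
# Consume (-C) from the pizza-orders topic over SSL, starting at the end of the log (-o end)
kcat -b $KAFKA_HOST:$KAFKA_PORT \
  -X security.protocol=ssl \
  -X ssl.ca.location=$DESTINATION_FOLDER_NAME/ca.pem \
  -X ssl.certificate.location=$DESTINATION_FOLDER_NAME/service.cert \
  -X ssl.key.location=$DESTINATION_FOLDER_NAME/service.key \
  -C -t pizza-orders -o end
```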