1) Source connectors: start our Kafka cluster. This is the docker-compose.yml file:
```yaml
version: '2'
services:
  kafka-cluster:
    image: landoop/fast-data-dev:cp3.3.0
    environment:
      ADV_HOST: 127.0.0.1
      RUNTESTS: 0
    ports:
      - 2181:2181             # Zookeeper
      - 3030:3030             # Landoop UI
      - 8081-8083:8081-8083   # REST Proxy, Schema Registry, Kafka Connect
      - 9581-9585:9581-9585   # JMX Ports
      - 9092:9092             # Kafka Broker
```
Create and start the container:
docker-compose up kafka-cluster
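The cluster takes a little while to come up. A minimal readiness probe (a sketch: `wait_for_port` is a hypothetical helper, not part of the stack, and it relies on bash's `/dev/tcp` redirection) polls the Landoop UI port until it answers or gives up:

```shell
# Poll a TCP port until it accepts connections, or give up after N tries.
# Assumes bash (/dev/tcp is a bash feature, not a real file).
wait_for_port() {
  local host=$1 port=$2 tries=${3:-30}
  for i in $(seq "$tries"); do
    # Try to open a connection; bash's /dev/tcp fails fast if nothing listens.
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      echo "port $port on $host is open"
      return 0
    fi
    sleep 2
  done
  echo "gave up waiting for $host:$port"
  return 1
}

# Wait for the Landoop UI (port 3030 in the compose file above).
wait_for_port 127.0.0.1 3030 30 || true
```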
2) Create the topic we're going to write to:
```shell
docker run --rm -it --net=host landoop/fast-data-dev:cp3.3.0 bash
# inside the container:
kafka-topics --create --topic demo-distributed --partitions 3 --replication-factor 1 --zookeeper 127.0.0.1:2181
```
3) In a browser go to 127.0.0.1:3030 -> Connect UI.
Create a new connector -> File Source.
Paste the following configuration and click the Create button:
```properties
name=file-stream-demo-distributed
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
file=demo-file.txt
topic=demo-distributed
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
```
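The same connector can also be created without the UI by POSTing the configuration, in JSON form, to the Kafka Connect REST API on port 8083 (mapped in the compose file above). A sketch: the `/tmp/file-stream-demo.json` path is just an example, and the call is only attempted when the endpoint is actually reachable:

```shell
# Write the connector configuration in the JSON shape the Connect REST API expects.
cat > /tmp/file-stream-demo.json <<'EOF'
{
  "name": "file-stream-demo-distributed",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "demo-file.txt",
    "topic": "demo-distributed",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
  }
}
EOF

# POST it to the Connect REST API, but only if the endpoint answers.
if curl -fs http://127.0.0.1:8083/connectors >/dev/null 2>&1; then
  curl -s -X POST -H "Content-Type: application/json" \
       --data @/tmp/file-stream-demo.json \
       http://127.0.0.1:8083/connectors
else
  echo "Kafka Connect is not reachable on 127.0.0.1:8083 yet"
fi
```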
4) Create the source file:
Run docker ps to find the container ID, then:

```shell
docker exec -it <containerId> bash
# inside the container:
touch demo-file.txt
echo "add any content" >> demo-file.txt
```
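Because the connector is configured with the JsonConverter and schemas.enable=true, each line of the file is wrapped in a JSON envelope before it lands in the topic. The record for the line above should look roughly like this (printed here purely for illustration):

```shell
# The standard JsonConverter envelope with schemas enabled:
# a "schema" describing the type plus the "payload" carrying the file line.
cat <<'EOF'
{"schema":{"type":"string","optional":false},"payload":"add any content"}
EOF
```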
5) In a browser go to 127.0.0.1:3030 -> Kafka Topics UI.
Click the topic defined previously (demo-distributed) and validate the content.
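As an alternative to the Topics UI, the topic can be read from the command line with the console consumer that ships in the fast-data-dev image. A sketch, attempted only when the broker port is reachable; `--timeout-ms` makes the consumer exit instead of waiting forever:

```shell
# Probe the broker port (bash /dev/tcp) before starting a throwaway consumer container.
if (exec 3<>/dev/tcp/127.0.0.1/9092) 2>/dev/null; then
  docker run --rm --net=host landoop/fast-data-dev:cp3.3.0 \
    kafka-console-consumer --bootstrap-server 127.0.0.1:9092 \
    --topic demo-distributed --from-beginning --timeout-ms 10000
else
  echo "broker not reachable on 127.0.0.1:9092; start the cluster first"
fi
```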