I created an Elasticsearch and Kibana setup that runs outside the Kubernetes cluster, and I am using Fluentd to gather the Kubernetes logs and send them to Elasticsearch. I am running Fluentd as a DaemonSet in Kubernetes:
```yaml
apiVersion: apps/v1
kind: DaemonSet
metadata:
  namespace: efk
  name: fluentd
  labels:
    app: fluentd
spec:
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
    spec:
      serviceAccount: fluentd
      serviceAccountName: fluentd
      containers:
        - name: fluentd
          image: fluent/fluentd-kubernetes-daemonset:v1.4.2-debian-elasticsearch-1.1
          env:
            - name: FLUENT_ELASTICSEARCH_HOST
              value: "HOST_IP"
            - name: FLUENT_ELASTICSEARCH_PORT
              value: "9200"
            - name: FLUENT_ELASTICSEARCH_SCHEME
              value: "http"
            - name: FLUENTD_SYSTEMD_CONF
              value: disable
            - name: FLUENT_CONTAINER_TAIL_PARSER_TYPE
              value: /^(?<time>.+) (?<stream>stdout|stderr) [^ ]* (?<log>.*)$/
            - name: FLUENT_CONTAINER_TAIL_EXCLUDE_PATH
              value: /var/log/containers/fluent*
          resources:
            limits:
              memory: 512Mi
            requests:
              cpu: 100m
              memory: 200Mi
          volumeMounts:
            - name: varlog
              mountPath: /var/log
            - name: varlibdockercontainers
              mountPath: /var/lib/docker/containers
              readOnly: true
      terminationGracePeriodSeconds: 30
      volumes:
        - name: varlog
          hostPath:
            path: /var/log
        - name: varlibdockercontainers
          hostPath:
            path: /var/lib/docker/containers
```

With this manifest I am only able to gather Kubernetes node-level container logs. I also want to gather logs that my application pods write to a specific path inside the container, e.g. /tmp/logs/*; that is where my application generates its logs.
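As far as I can tell, the DaemonSet only tails the files under /var/log/containers on each node, i.e. the stdout/stderr of the containers, so files written to /tmp/logs inside an application container never reach it. For context, my application pod looks roughly like this (the pod name and image here are placeholders, not my real manifest):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: my-app                           # placeholder name
spec:
  containers:
    - name: app
      image: my-registry/my-app:latest   # placeholder image
      # The application writes files such as /tmp/logs/app.log here.
      # Nothing mounts this path from the node, so the files stay in
      # the container filesystem and never appear under /var/log on
      # the node where the DaemonSet is tailing.
```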
To collect those logs, is running Fluentd as a DaemonSet enough, or do we need to run Fluentd as a sidecar in every application pod?
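To make the sidecar option concrete, this is roughly what I mean: a second container in the same pod that shares /tmp/logs through an emptyDir (added so the sidecar can actually see the files the app writes) and tails that directory itself. The ConfigMap, volume, and pod/container names below are just placeholders I made up to illustrate the idea, not something I have deployed:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentd-sidecar-config           # placeholder name
data:
  fluent.conf: |
    # Tail the application's log files from the shared volume
    <source>
      @type tail
      path /tmp/logs/*.log
      pos_file /tmp/app-logs.pos
      tag app.logs
      <parse>
        @type none
      </parse>
    </source>
    # Ship everything to the external Elasticsearch
    <match app.**>
      @type elasticsearch
      host HOST_IP
      port 9200
      scheme http
    </match>
---
apiVersion: v1
kind: Pod
metadata:
  name: my-app-with-sidecar              # placeholder name
spec:
  containers:
    - name: app
      image: my-registry/my-app:latest   # placeholder; writes /tmp/logs/app.log
      volumeMounts:
        - name: app-logs
          mountPath: /tmp/logs
    - name: fluentd-sidecar
      image: fluent/fluentd-kubernetes-daemonset:v1.4.2-debian-elasticsearch-1.1
      volumeMounts:
        - name: app-logs
          mountPath: /tmp/logs
          readOnly: true
        - name: fluentd-config
          mountPath: /fluentd/etc        # replaces the image's default config
  volumes:
    - name: app-logs
      emptyDir: {}
    - name: fluentd-config
      configMap:
        name: fluentd-sidecar-config
```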