Config file example: Apache HTTP Server
Include these sample settings in the elastic-agent.yml configuration file of your standalone Elastic Agent to ingest data from Apache HTTP Server.
```yaml
outputs:
  default:
    type: elasticsearch
    hosts:
      - '{elasticsearch-host-url}'
    api_key: "my_api_key"
agent:
  download:
    sourceURI: 'https://artifacts.elastic.co/downloads/'
  monitoring:
    enabled: true
    use_output: default
    namespace: default
    logs: true
    metrics: true
inputs:
  - id: "insert a unique identifier here"
    name: apache-1
    type: logfile
    use_output: default
    data_stream:
      namespace: default
    streams:
      - id: "insert a unique identifier here"
        data_stream:
          dataset: apache.access
          type: logs
        paths:
          - /var/log/apache2/access.log*
          - /var/log/apache2/other_vhosts_access.log*
          - /var/log/httpd/access_log*
        tags:
          - apache-access
        exclude_files:
          - .gz$
      - id: "insert a unique identifier here"
        data_stream:
          dataset: apache.error
          type: logs
        paths:
          - /var/log/apache2/error.log*
          - /var/log/httpd/error_log*
        exclude_files:
          - .gz$
        tags:
          - apache-error
        processors:
          - add_locale: null
```
- For available output settings, refer to Configure outputs for standalone Elastic Agents.
- For settings specific to the Elasticsearch output, refer to Configure the Elasticsearch output.
- The URL of the Elasticsearch cluster where output should be sent, including the port number. For example, https://12345ab6789cd12345ab6789cd.us-central1.gcp.cloud.es.io:443.
- An API key used to authenticate with the Elasticsearch cluster.
- For available download settings, refer to Configure download settings for standalone Elastic Agent upgrades.
- For available monitoring settings, refer to Configure monitoring for standalone Elastic Agents.
- For available input settings, refer to Configure inputs for standalone Elastic Agents.
- Specify a unique ID for the input.
- For available input types, refer to Elastic Agent inputs.
- Learn about Data streams for time series data.
- Specify a unique ID for each individual input stream. Naming the ID by appending the associated data_stream dataset (for example, {{user-defined-unique-id}}-apache.access or {{user-defined-unique-id}}-apache.error) is a recommended practice, but any unique ID will work; see the example after this list.
- Refer to Logs in the Apache HTTP Server integration documentation for the logs available to ingest and exported fields.
- Path to the log files to be monitored.
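For illustration only, here is a minimal sketch of how the log input streams might look with the recommended ID naming applied, assuming a hypothetical user-defined identifier of my-apache-1; any other unique IDs work just as well.

```yaml
inputs:
  - id: my-apache-1
    name: apache-1
    type: logfile
    use_output: default
    data_stream:
      namespace: default
    streams:
      # Hypothetical stream IDs following the {{user-defined-unique-id}}-<dataset> pattern
      - id: my-apache-1-apache.access
        data_stream:
          dataset: apache.access
          type: logs
        paths:
          - /var/log/apache2/access.log*
      - id: my-apache-1-apache.error
        data_stream:
          dataset: apache.error
          type: logs
        paths:
          - /var/log/apache2/error.log*
```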
```yaml
outputs:
  default:
    type: elasticsearch
    hosts:
      - '{elasticsearch-host-url}'
    api_key: "my_api_key"
agent:
  download:
    sourceURI: 'https://artifacts.elastic.co/downloads/'
  monitoring:
    enabled: true
    use_output: default
    namespace: default
    logs: true
    metrics: true
inputs:
  - type: apache/metrics
    use_output: default
    data_stream:
      namespace: default
    streams:
      - id: "insert a unique identifier here"
        data_stream:
          dataset: apache.status
          type: metrics
        metricsets:
          - status
        hosts:
          - 'http://127.0.0.1'
        period: 30s
        server_status_path: /server-status
```
- For available output settings, refer to Configure outputs for standalone Elastic Agents.
- For settings specific to the Elasticsearch output, refer to Configure the Elasticsearch output.
- The URL of the Elasticsearch cluster where output should be sent, including the port number. For example, https://12345ab6789cd12345ab6789cd.us-central1.gcp.cloud.es.io:443.
- An API key used to authenticate with the Elasticsearch cluster.
- For available download settings, refer to Configure download settings for standalone Elastic Agent upgrades.
- For available monitoring settings, refer to Configure monitoring for standalone Elastic Agents.
- For available input settings, refer to Configure inputs for standalone Elastic Agents.
- For available input types, refer to Elastic Agent inputs.
- Learn about Data streams for time series data.
- Specify a unique ID for each individual input stream. Naming the ID by appending the associated data_stream dataset (for example, {{user-defined-unique-id}}-apache.status) is a recommended practice, but any unique ID will work; see the example after this list.
- A user-defined dataset. You can specify anything that makes sense to signify the source of the data.
- Refer to Metrics in the Apache HTTP Server integration documentation for the type of metrics collected and exported fields.
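As a minimal sketch under the same assumption of a hypothetical my-apache-1 identifier, the metrics stream ID could follow the same pattern:

```yaml
inputs:
  - type: apache/metrics
    use_output: default
    data_stream:
      namespace: default
    streams:
      # Hypothetical stream ID following the {{user-defined-unique-id}}-<dataset> pattern
      - id: my-apache-1-apache.status
        data_stream:
          dataset: apache.status
          type: metrics
        metricsets:
          - status
        hosts:
          - 'http://127.0.0.1'
        period: 30s
        server_status_path: /server-status
```

The status metricset reads the Apache server-status page, so mod_status must be enabled on the monitored server for these metrics to be collected.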