Wired streams
Applies to: Serverless, Elastic Stack
With wired streams, all logs are sent to a single /logs endpoint, from which you can route data into child streams based on partitioning rules you set up manually or with the help of AI suggestions.
For more on wired streams, refer to:
- Wired streams field naming
- Turn on wired streams
- Send data to wired streams
- View wired streams in Discover
Wired streams store and process data in a normalized OpenTelemetry (OTel)–compatible format. This format aligns Elastic Common Schema (ECS) fields with OTel semantic conventions so all data is consistently structured and OTTL-expressible.
When data is ingested into a wired stream, it’s automatically translated into this normalized format:
- Standard ECS documents are converted to OTel fields (`message` → `body.text`, `log.level` → `severity_text`, `host.name` → `resource.attributes.host.name`, and so on).
- Custom fields are stored under `attributes.*`.
To preserve backward-compatible querying, Streams creates aliases that mirror the behavior of existing `logs-*.otel-*` data streams. This allows queries to use either ECS or OTel field names interchangeably.
| ECS field | OTel field |
|---|---|
| `message` | `body.text` |
| `log.level` | `severity_text` |
| `span.id` | `span_id` |
| `trace.id` | `trace_id` |
| `host.name` | `resource.attributes.host.name` |
| `host.ip` | `resource.attributes.host.ip` |
| `custom_field` | `attributes.custom_field` |
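For example, because of these aliases, a search on an ECS field name and a search on its OTel counterpart should match the same documents. The following is a minimal sketch using the Python Elasticsearch client; the host, API key, and the "error" value are placeholders.

```python
# Minimal sketch: the aliases let queries use ECS or OTel field names
# interchangeably. Host, API key, and the "error" value are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://<elasticsearch-host>", api_key="<your-api-key>")

# Query by the ECS field name...
ecs_hits = es.search(index="logs", query={"match": {"log.level": "error"}})

# ...or by the normalized OTel field name; both match the same documents.
otel_hits = es.search(index="logs", query={"match": {"severity_text": "error"}})

print(ecs_hits["hits"]["total"], otel_hits["hits"]["total"])
```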
To turn on wired streams:
- Go to the Streams page using the navigation menu or the global search field, then open Settings.
- Turn on Enable wired streams.
To send data to wired streams, configure your shipper to send data to the /logs endpoint. Use the configuration that matches your shipper:
For the OpenTelemetry Collector, add a transform processor that sets the `elasticsearch.index` resource attribute to `logs`, and include it in your logs pipeline:

```yaml
processors:
  transform/logs-streams:
    log_statements:
      - context: resource
        statements:
          - set(attributes["elasticsearch.index"], "logs")

service:
  pipelines:
    logs:
      receivers: [myreceiver]          # works with any logs receiver
      processors: [transform/logs-streams]
      exporters: [elasticsearch, otlp] # works with either
```
For Filebeat, set the index to `logs` and disable template setup:

```yaml
filebeat.inputs:
  - type: filestream
    id: my-filestream-id
    index: logs
    enabled: true
    paths:
      - /var/log/*.log

# No need to install templates for wired streams
setup:
  template:
    enabled: false

output.elasticsearch:
  hosts: ["<elasticsearch-host>"]
  api_key: "<your-api-key>"
```

For Logstash, send events to the `logs` index with the create action:

```
output {
  elasticsearch {
    hosts => ["<elasticsearch-host>"]
    api_key => "<your-api-key>"
    index => "logs"
    action => "create"
  }
}
```

For Elastic Agent, use the Custom Logs (Filestream) integration to send data to wired streams:
1. Find Fleet in the navigation menu or use the global search field.
2. Select the Settings tab.
3. Under Outputs, find the output you want to use to send data to streams, and edit it.
4. Turn on Write to logs streams.
5. Add the Custom Logs (Filestream) integration to an agent policy.
6. Under Change defaults in the integration configuration, enable the Use the "logs" data stream setting.
7. Under Where to add this integration, select an agent policy that uses the output you configured in step 4.
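To sanity-check the setup without a shipper, you can also index a single test document directly into the logs stream. This is a minimal sketch using the Python Elasticsearch client; the host, API key, and field values are placeholders, and the create operation mirrors the `action => "create"` shown in the Logstash example.

```python
# Minimal sketch: index one test document into the wired streams root ("logs").
# Host, API key, and field values are placeholders.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("https://<elasticsearch-host>", api_key="<your-api-key>")

es.index(
    index="logs",
    op_type="create",  # mirrors the create action used in the Logstash example
    document={
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "message": "test event for wired streams",  # normalized to body.text
        "log.level": "info",                        # normalized to severity_text
        "host.name": "test-host",                   # normalized to resource.attributes.host.name
        "my_custom_field": "demo",                  # stored under attributes.my_custom_field
    },
)
```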
To view wired log streams in Discover, do one of the following:
- Manually create a data view for the wired streams index pattern (`logs,logs.*`).
- Add the wired streams index pattern (`logs,logs.*`) to the `observability:logSources` Kibana advanced setting, which you can open from the navigation menu or by using the global search field.
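If you'd rather script the first option, the data view can also be created through the Kibana data views API. This is a minimal sketch using the Python requests library; the Kibana host, API key, and data view name are placeholders.

```python
# Minimal sketch: create a Discover data view for the wired streams index
# pattern through the Kibana data views API. Host and API key are placeholders.
import requests

response = requests.post(
    "https://<kibana-host>/api/data_views/data_view",
    headers={
        "Authorization": "ApiKey <your-api-key>",
        "kbn-xsrf": "true",  # required for Kibana API requests
        "Content-Type": "application/json",
    },
    json={"data_view": {"title": "logs,logs.*", "name": "Wired streams"}},
)
response.raise_for_status()
print(response.json())
```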
After sending your data to wired streams:
- Partition data: Use the Partitioning tab to route data into meaningful child streams.
- Extract fields: Use the Processing tab to extract fields from your log data so you can filter and analyze it effectively.
- Map fields: Use the Schema tab to make fields easier to query.