Dear Elastic Team,
It looks like this issue is similar to the following:
We use the Elastic APM Python agent alongside Sentry, and in Sentry we see the following exception:
Failed to submit message: "Unable to reach APM Server: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) (url: http://127.0.0.1:8200/intake/v2/events)
Elasticsearch version: v8.10.2
APM Server version: v8.10.2
APM Agent language and version: 6.19.0
Original install method (e.g. download page, yum, deb, from source, etc.) and version:
- template.yaml Layers: arn:aws:lambda:eu-central-1:267093732750:layer:elastic-apm-extension-ver-1-0-2-x86_64:1
- pip requirements.txt: elastic-apm==6.19.0
I enabled ELASTIC_APM_LOG_LEVEL: debug in the Lambda template.yaml and captured the logs for the failed request.
Please note that the elastic-apm Python module is installed both as part of the layer and as a pip dependency during the Lambda build (hopefully that is fine).
Regarding the behaviour: everything is otherwise working fine and there are no misconfigurations. The APM transaction data is shipped properly. From time to time, however, this exception occurs (related to a crash in the Go extension; see the log). When such a crash happens, the transaction data for that invocation is not shipped.
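For context, here is a minimal Go sketch (illustrative only, not the extension's actual code; the `send` helper is hypothetical) of the panic class shown in the stack trace: writing to a channel after it has been closed panics with "send on closed channel", which matches the `http: panic serving 127.0.0.1:43610: send on closed channel` line in the logs.

```go
package main

import "fmt"

// send writes msg to ch, converting the runtime panic that occurs
// when ch has already been closed into an ordinary error.
func send(ch chan string, msg string) (err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("recovered: %v", r)
		}
	}()
	ch <- msg // panics with "send on closed channel" if ch is closed
	return nil
}

func main() {
	ch := make(chan string, 1)

	// Normal case: the buffered send succeeds.
	if err := send(ch, "agent data"); err != nil {
		fmt.Println("unexpected:", err)
	}

	// A shutdown path closes the channel...
	close(ch)

	// ...and a late send now panics; here it is recovered as an error,
	// whereas an unrecovered panic in an HTTP handler produces the
	// "http: panic serving ..." log seen above.
	if err := send(ch, "late agent data"); err != nil {
		fmt.Println(err)
	}
}
```

This suggests a race between the extension shutting down its data channel and the HTTP intake handler still accepting agent data; I have not verified the extension internals, only that the panic message matches this pattern.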
Provide logs and/or server output (if relevant):
[DEBUG] 2023-11-13T15:55:10.826Z some-guid-here forced flush
{ "log.level": "debug", "@timestamp": "2023-11-13T15:55:10.827Z", "log.origin": { "file.name": "extension/route_handlers.go", "file.line": 78 }, "message": "Handling APM Data Intake", "ecs.version": "1.6.0" }
{ "log.level": "debug", "@timestamp": "2023-11-13T15:55:10.827Z", "log.origin": { "file.name": "extension/apm_server_transport.go", "file.line": 238 }, "message": "Adding agent data to buffer to be sent to apm server", "ecs.version": "1.6.0" }
2023/11/13 15:55:10 http: panic serving 127.0.0.1:43610: send on closed channel
goroutine 86 [running]:
net/http.(*conn).serve.func1()
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/net/http/server.go:1825 +0xbf
panic({0x6d3840, 0x7a5990})
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/runtime/panic.go:844 +0x258
elastic/apm-lambda-extension/extension.handleIntakeV2Events.func1({0x7a8b38, 0xc0001b4ee0}, 0xc000287200)
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/src/github.com/elastic/apm-aws-lambda/apm-lambda-extension/extension/route_handlers.go:97 +0x497
net/http.HandlerFunc.ServeHTTP(0x70f27ac438?, {0x7a8b38?, 0xc0001b4ee0?}, 0xc0002938a0?)
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/net/http/server.go:2084 +0x2f
net/http.(*ServeMux).ServeHTTP(0xc0001db097?, {0x7a8b38, 0xc0001b4ee0}, 0xc000287200)
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/net/http/server.go:2462 +0x149
net/http.serverHandler.ServeHTTP({0x7a7d20?}, {0x7a8b38, 0xc0001b4ee0}, 0xc000287200)
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/net/http/server.go:2916 +0x43b
net/http.(*conn).serve(0xc00018c000, {0x7a8d90, 0xc000074600})
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/net/http/server.go:1966 +0x5d7
created by net/http.(*Server).Serve
    /var/lib/jenkins/workspace/ibrary_apm-aws-lambda-mbp_v1.0.2/.gvm/versions/go1.18.2.linux.amd64/src/net/http/server.go:3071 +0x4db
{ "log.level": "debug", "@timestamp": "2023-11-13T15:55:10.827Z", "log.origin": { "file.name": "apm-lambda-extension/main.go", "file.line": 125 }, "message": "Received event.", "ecs.version": "1.6.0" }

Thank you for your support!