
Conversation

@marclop
Contributor

@marclop marclop commented Jul 11, 2025

Wraps the error returned by the rate limiter with the gRPC codes package to ensure that the code is propagated to the client. For non-5xx errors, the code is set to `codes.ResourceExhausted`, which is the appropriate code for rate-limiting errors.

Part of elastic/hosted-otel-collector#991
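
A rough sketch of the idea (an illustration, not the code from this PR): wrap the rate limiter's error with `google.golang.org/grpc/status` so the gRPC code survives the round trip to the client. The `wrapRateLimitError` helper, the `isServerError` flag, and the 5xx mapping to `codes.Unavailable` are assumptions for illustration only.

```go
package ratelimit

import (
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// wrapRateLimitError attaches a gRPC status code to the error returned by the
// rate limiter so the client can see it. Non-5xx errors are reported as
// codes.ResourceExhausted, the conventional code for rate limiting; the
// 5xx mapping shown here (codes.Unavailable) is an assumption.
func wrapRateLimitError(err error, isServerError bool) error {
	if err == nil {
		return nil
	}
	if isServerError {
		return status.Error(codes.Unavailable, err.Error())
	}
	return status.Error(codes.ResourceExhausted, err.Error())
}
```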

@marclop marclop requested a review from a team as a code owner July 11, 2025 06:23
@marclop marclop added the bug Something isn't working label Jul 11, 2025
@marclop marclop requested review from a team, JonasKunz and rogercoll July 11, 2025 06:23
@marclop marclop merged commit 1ed1d17 into elastic:main Jul 11, 2025
13 checks passed
@marclop marclop deleted the b/ratelimiter-propagate-code-in-error branch July 11, 2025 07:38
@zmoog
Contributor

zmoog commented Jul 11, 2025

I see @axw already approved, but I ran a quick test anyway, and it now returns 429.

From `make run-local-otelbench`:

```
2025-07-11T07:42:51.464Z	error	internal/base_exporter.go:117	Exporting failed. Rejecting data. Try enabling sending_queue to survive temporary failures.	{"resource": {"service.instance.id": "3d49f0e7-eb0e-47ca-8002-edb4abd0e5dd", "service.name": "/ko-app/loadgen", "service.version": "0.0.1"}, "otelcol.component.id": "otlp", "otelcol.component.kind": "exporter", "otelcol.signal": "traces", "error": "not retryable error: Permanent error: rpc error: code = Code(429) desc = 429 Too Many Requests", "rejected_items": 5}
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*BaseExporter).Send
	go.opentelemetry.io/collector/exporter@v0.129.0/exporterhelper/internal/base_exporter.go:117
go.opentelemetry.io/collector/exporter/exporterhelper.NewTracesRequest.newConsumeTraces.func1
	go.opentelemetry.io/collector/exporter@v0.129.0/exporterhelper/traces.go:193
go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
	go.opentelemetry.io/collector/consumer@v1.35.0/traces.go:27
go.opentelemetry.io/collector/processor/processorhelper.NewTraces.func1
	go.opentelemetry.io/collector/processor/processorhelper@v0.129.0/traces.go:66
go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
	go.opentelemetry.io/collector/consumer@v1.35.0/traces.go:27
go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
	go.opentelemetry.io/collector/consumer@v1.35.0/traces.go:27
go.opentelemetry.io/collector/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces
	go.opentelemetry.io/collector/internal/fanoutconsumer@v0.129.0/traces.go:60
github.com/elastic/opentelemetry-collector-components/receiver/loadgenreceiver.(*tracesGenerator).Start.func1
	github.com/elastic/opentelemetry-collector-components/receiver/loadgenreceiver@v0.0.0-00010101000000-000000000000/traces.go:130
```

QQ: does it say "not retryable error: Permanent error" because the OTLP exporter in otelbench has `sending_queue.enabled: false`?
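
For reference, a minimal sketch (not otelbench or collector code) of how a Go gRPC client could inspect the status code carried by such an error. The assumption here is that a rate-limited request surfaces `codes.ResourceExhausted`, the conventional gRPC counterpart of HTTP 429; the synthetic error in `main` stands in for whatever the exporter actually returns.

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// inspectRateLimitError distinguishes a rate-limit rejection from other
// failures by looking at the gRPC status code attached to the error.
func inspectRateLimitError(err error) {
	st, ok := status.FromError(err)
	if !ok {
		fmt.Println("not a gRPC status error:", err)
		return
	}
	if st.Code() == codes.ResourceExhausted {
		fmt.Println("rate limited, back off and retry later:", st.Message())
		return
	}
	fmt.Println("other gRPC error:", st.Code(), st.Message())
}

func main() {
	// Synthetic example; in practice the error would come from the OTLP exporter.
	err := status.Error(codes.ResourceExhausted, "429 Too Many Requests")
	inspectRateLimitError(err)
}
```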
