#support

Sony Venkatesh

01/03/2024, 5:19 PM
Hello, I am getting a
"Permanent error: rpc error: code = Unknown desc = data refused due to high memory usage"
error while my OpenTelemetry exporter is trying to export to SigNoz. Also, this doesn't happen all the time. Could it be because of the max_recv_msg_size_mib set for the SigNoz OTLP receiver?
The detailed error looks like this:
2024-01-03T04:51:24.701Z	error	exporterhelper/retry_sender.go:145	Exporting failed. The error is not retryable. Dropping data.	{"kind": "exporter", "data_type": "logs", "name": "otlp", "error": "Permanent error: rpc error: code = Unknown desc = data refused due to high memory usage", "dropped_items": 823}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.87.0/exporterhelper/retry_sender.go:145
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.87.0/exporterhelper/logs.go:176
go.opentelemetry.io/collector/exporter/exporterhelper.(*queueSender).start.func1
	go.opentelemetry.io/collector/exporter@v0.87.0/exporterhelper/queue_sender.go:126
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).Start.func1
	go.opentelemetry.io/collector/exporter@v0.87.0/exporterhelper/internal/bounded_memory_queue.go:52
I don't see this as a memory limiter issue because it is failing at the exporter level.
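For reference, max_recv_msg_size_mib is a setting on the OTLP receiver's gRPC server that caps the size of a single incoming request; it is a different failure mode from memory-pressure refusals. A minimal sketch of where it would sit in a collector config (the 32 MiB value is only illustrative, not a SigNoz default):

receivers:
  otlp:
    protocols:
      grpc:
        # Caps the size of one incoming gRPC message; oversized requests are
        # rejected with a "message larger than max" error rather than a
        # high-memory-usage error.
        max_recv_msg_size_mib: 32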

Srikanth Chekuri

01/03/2024, 5:23 PM
"Could it be because of the max_recv_msg_size_mib set for the SigNoz OTLP receiver?"
No. What is your SigNoz collector config?

Sony Venkatesh

01/03/2024, 5:32 PM
Mostly base config.
config:
      receivers:
        otlp: 
          protocols: 
            http:
              # Since this collector needs to receive data from the web, enable cors for all origins
              # `allowed_origins` can be refined for your deployment domain
              cors:
                allowed_origins:
                  - "http://*"
                  - "https://*"
            grpc:

Srikanth Chekuri

01/03/2024, 5:34 PM
Share the full config, bro.

Sony Venkatesh

01/03/2024, 5:37 PM
Collector config 👇 Let me know if you need more.
otelCollector: 
    autoscaling:
      enabled: true
      keda: 
        enabled: true
        maxReplicaCount: "10"
    config:
      receivers:
        otlp: 
          protocols: 
            http:
              # Since this collector needs to receive data from the web, enable cors for all origins
              # `allowed_origins` can be refined for your deployment domain
              cors:
                allowed_origins:
                  - "http://*"
                  - "https://*"
            grpc:
    resources:
      requests:
        cpu: 512m
        memory: 1024Mi

Srikanth Chekuri

01/03/2024, 5:39 PM
You are probably sharing the override values part; this is not it. Run kubectl describe configmap for the otel collector pod and share the full config.

Sony Venkatesh

01/03/2024, 5:45 PM
signoz.txt

Srikanth Chekuri

01/03/2024, 5:47 PM
Do you have the memory limiter enabled in the agent collector?

Sony Venkatesh

01/03/2024, 5:48 PM
This is my otel collector config, the one that is exporting to SigNoz.

Srikanth Chekuri

01/03/2024, 5:49 PM
It's erroring because of the memory limiter.

Sony Venkatesh

01/03/2024, 5:50 PM
But that shouldn't happen at the exporter level, right? Also, it should be temporary, shouldn't it?

Srikanth Chekuri

01/03/2024, 6:33 PM
Yes, it should be temporary. The main reason is the memory limiter that is in place; I am not sure about the stack trace part. I might have to look into it in detail.
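For context, the memory_limiter processor on the receiving SigNoz collector starts refusing incoming data once memory usage crosses its configured limits, and the sending collector's exporter surfaces that refusal as the error above. A rough sketch of what such a processor block typically looks like; the values are illustrative, not the SigNoz defaults:

processors:
  memory_limiter:
    # How often current memory usage is checked against the limits.
    check_interval: 1s
    # Hard limit on the collector's memory usage, in MiB.
    limit_mib: 2000
    # Headroom subtracted from limit_mib to form the soft limit; once usage
    # crosses the soft limit, new incoming data starts being refused.
    spike_limit_mib: 400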

Sony Venkatesh

01/04/2024, 1:38 AM
Any suggestions on how we should handle this? Does not having the memory limiter help? I see that's the default.

Srikanth Chekuri

01/04/2024, 2:08 AM
You should estimate the potential memory usage based on the load you expect to generate and set better upper limits for memory, or flush the batches more often than every 2s, etc.
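One possible shape for those overrides, assuming the chart merges everything under otelCollector.config the same way the receivers block shared above is merged; the numbers are placeholders and should be sized against your actual load:

otelCollector:
  resources:
    requests:
      cpu: 512m
      memory: 1024Mi
    limits:
      # Give the collector more headroom so the memory limiter trips less often.
      memory: 2Gi
  config:
    processors:
      batch:
        # Flush batches more often than the 2s mentioned above so each
        # flush stays small and memory does not build up between sends.
        timeout: 500ms
        send_batch_size: 8192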