# support
r
Hi Team, I’ve been experimenting with SigNoz for quite a while, and hats off to an amazing product. I have two things that I needed help with:
1. Is there a way to provide headers when monitoring HTTP endpoints? One of the endpoints that I want to monitor requires an authorisation header.
2. I am constantly seeing these errors in my otel-collector logs. Is this concerning? Can these be disabled somehow so I don’t miss the important logs?
error	exporterhelper/queued_retry.go:394	Exporting failed. The error is not retryable. Dropping data.	{"kind": "exporter", "data_type": "metrics", "name": "clickhousemetricswrite", "error": "Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination; Permanent error: invalid temporality and type combination", "errorCauses": [{"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}, {"error": "Permanent error: invalid temporality and type combination"}], "dropped_items": 143}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	/go/pkg/mod/go.opentelemetry.io/collector@v0.66.0/exporter/exporterhelper/queued_retry.go:394
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
	/go/pkg/mod/go.opentelemetry.io/collector@v0.66.0/exporter/exporterhelper/metrics.go:135
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
	/go/pkg/mod/go.opentelemetry.io/collector@v0.66.0/exporter/exporterhelper/queued_retry.go:205
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1
	/go/pkg/mod/go.opentelemetry.io/collector@v0.66.0/exporter/exporterhelper/internal/bounded_memory_queue.go:61
s
1. Is there a way to provide headers in monitoring http endpoints? One of the endpoints that I want to monitor requires an authorisation header.
Not currently supported.
2. I am constantly seeing these errors in my otel-collector logs. Is this concerning? Can these be disabled somehow so I don’t miss the important logs?
What language SDKs are you using? This happens when the metric temporality is delta. It shouldn’t happen unless you explicitly configured the SDKs to prefer delta, since the spec-recommended default is cumulative. If you didn’t change any SDK configuration, let me know which SDK you are using; otherwise, you can switch to cumulative temporality to stop seeing this error.
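A hedged aside (not stated in the thread): if delta temporality is indeed coming from the application SDK, the OTLP exporter spec defines an environment variable for the temporality preference, which the Python SDK should honour alongside the other OTEL_* variables. Treat the exact name and value below as an assumption to verify against the SDK version in use ->
OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE=cumulative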
r
@Srikanth Chekuri
1. Okay, thank you. I’ll explore other solutions.
2. I am using FastAPI (Python); to be more specific, opentelemetry-instrumentation-fastapi = "^0.36b0". I wanted to understand whether this error is coming from metrics or from traces, since I’m seeing these errors in my otel-collector logs and not in the otel-collector-metrics logs. For metrics, I have implemented Prometheus-FastAPI-Instrumentator and am using Prometheus scrape configs in SigNoz to import the metrics.
s
Did you configure the temporality through env or constructor params?
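For illustration, a minimal sketch of the constructor-params route (not the setup from this thread): it assumes the OTLP HTTP metric exporter is created manually in code rather than through auto-instrumentation, and it reuses the endpoint mentioned later in the thread purely as a placeholder ->
# Minimal sketch. Assumptions: manual SDK setup instead of opentelemetry-instrument,
# and the opentelemetry-exporter-otlp-proto-http package installed.
from opentelemetry import metrics
from opentelemetry.exporter.otlp.proto.http.metric_exporter import OTLPMetricExporter
from opentelemetry.sdk.metrics import Counter, Histogram, MeterProvider, UpDownCounter
from opentelemetry.sdk.metrics.export import AggregationTemporality, PeriodicExportingMetricReader

# Ask the exporter to emit cumulative points for these instrument types,
# so the collector's clickhousemetricswrite exporter never receives delta data.
cumulative = {
    Counter: AggregationTemporality.CUMULATIVE,
    UpDownCounter: AggregationTemporality.CUMULATIVE,
    Histogram: AggregationTemporality.CUMULATIVE,
}

exporter = OTLPMetricExporter(
    endpoint="https://opentelemetryhttp.infra.kroozz.com/v1/metrics",  # placeholder endpoint
    preferred_temporality=cumulative,
)
metrics.set_meter_provider(
    MeterProvider(metric_readers=[PeriodicExportingMetricReader(exporter)])
)
With auto-instrumentation, the environment-variable route mentioned above is usually the less invasive option.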
r
Sorry for being a bit dumb here, but I didn’t really understand what you mean. This is my configuration. In my application startup code ->
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from prometheus_fastapi_instrumentator import Instrumentator

# OpenTelemetry auto-instrumentation for FastAPI (traces)
FastAPIInstrumentor.instrument_app(app)

# Prometheus metrics via prometheus-fastapi-instrumentator
prometheus_instrumentor = Instrumentator(
    should_group_status_codes=True,
    should_ignore_untemplated=True,
    should_respect_env_var=True,
    should_instrument_requests_inprogress=True,
    excluded_handlers=["/health", "/metrics"],
    env_var_name="ENABLE_METRICS",
    inprogress_name="http_requests_inprogress",
    inprogress_labels=True,
)
prometheus_instrumentor.instrument(app).expose(app, include_in_schema=False)
In my environment variables ->
OTEL_RESOURCE_ATTRIBUTES=service.name=inventory,deployment.environment=stage,version=1.0.0
OTEL_EXPORTER_OTLP_ENDPOINT=https://opentelemetryhttp.infra.kroozz.com
OTEL_METRICS_EXPORTER=none
OTEL_PYTHON_FASTAPI_EXCLUDED_URLS=health,metrics
ENABLE_METRICS=true
SERVICE_NAME=inventory
ENVIRONMENT=stage
In my Dockerfile ENTRYPOINT ->
ENTRYPOINT ["opentelemetry-instrument", "--traces_exporter", "otlp_proto_http", "--metrics_exporter", "otlp_proto_http", "gunicorn", "app.main:app", "--workers", "2", "--worker-class", "uvicorn.workers.UvicornWorker", "--bind", "0.0.0.0:8000"]
My OpenTelemetry HTTP exporter is placed behind a load balancer, which handles the SSL configuration.
@Srikanth Chekuri is there anything you could help me with here, please?
s
I am not sure what the issue is here. I will have to spend some time digging into this more deeply.