
J Gr

7 months ago
Sup y'all, I'm trying to follow this guide and I'm getting the following error from otelcol-contrib. Has anyone seen this before? https://signoz.io/docs/tutorial/opentelemetry-binary-usage-in-virtual-machine/ Error:
2024-12-24T10:58:18.630-0500	error	exporterhelper/retry_sender.go:145	Exporting failed. The error is not retryable. Dropping data.	{"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "Permanent error: rpc error: code = Unimplemented desc = unknown service opentelemetry.proto.collector.metrics.v1.MetricsService", "dropped_items": 2456}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.88.0/exporterhelper/retry_sender.go:145
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.88.0/exporterhelper/metrics.go:176
go.opentelemetry.io/collector/exporter/exporterhelper.(*queueSender).start.func1
	go.opentelemetry.io/collector/exporter@v0.88.0/exporterhelper/queue_sender.go:126
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).Start.func1
	go.opentelemetry.io/collector/exporter@v0.88.0/exporterhelper/internal/bounded_memory_queue.go:52
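For what it's worth, an `Unimplemented ... MetricsService` error like this typically means the gRPC `otlp` exporter is sending to an endpoint that doesn't speak OTLP over gRPC — for example the collector's OTLP/HTTP port (4318), or some other HTTP service, instead of the gRPC port (4317). A sketch of the two exporter shapes to compare against your config, with `<signoz-host>` as a placeholder for your backend address:

```yaml
exporters:
  # OTLP over gRPC: the endpoint must be a gRPC listener (port 4317 by default)
  otlp:
    endpoint: "<signoz-host>:4317"
    tls:
      insecure: true
  # OTLP over HTTP: use the otlphttp exporter with the HTTP port (4318)
  otlphttp:
    endpoint: "http://<signoz-host>:4318"
```

If the pipeline references `otlp` but the endpoint points at 4318, the gRPC call lands on an HTTP server and fails exactly like the log above.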

gopinath nagalla

11 months ago
@nitya-signoz I'm facing an issue with DNS for the otel-collector deployed in EKS with a private ingress LB enabled (self-hosted). When we hit the otel-collector DNS we get "404 page not found". Service:
signoz-otel-collector                          ClusterIP   172.20.237.48    <none>        14250/TCP,14268/TCP,8081/TCP,8082/TCP,8888/TCP,4317/TCP,4318/TCP   6d
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  annotations:
    alb.ingress.kubernetes.io/actions.ssl-redirect: '{"Type": "redirect", "RedirectConfig":
      {"Protocol": "HTTPS", "Port": "443", "StatusCode": "HTTP_301"}}'
    alb.ingress.kubernetes.io/group.name: private
    alb.ingress.kubernetes.io/listen-ports: '[{"HTTPS": 443}]'
    alb.ingress.kubernetes.io/tags: environment=development
    alb.ingress.kubernetes.io/target-type: ip
    kubernetes.io/ingress.class: alb
  creationTimestamp: "2024-08-08T10:26:50Z"
  finalizers:
  - group.ingress.k8s.aws/private
  generation: 5
  labels:
    app.kubernetes.io/component: otel-collector
    app.kubernetes.io/instance: dev-signoz
    app.kubernetes.io/managed-by: Helm
    app.kubernetes.io/name: signoz
    app.kubernetes.io/version: 0.51.0
    argocd.argoproj.io/instance: dev-signoz
    helm.sh/chart: signoz-0.49.0
  name: signoz-otel-collector
  namespace: scrapinghub
  resourceVersion: "899762941"
  uid: e5d0c057-f39e0d18
spec:
  rules:
  - host: otel.fev.engineering
    http:
      paths:
      - backend:
          service:
            name: signoz-otel-collector
            port:
              number: 4318
        path: /*
        pathType: ImplementationSpecific
status:
  loadBalancer:
    ingress:
    - hostname: internal-k8s-private-4ae926.us-west-2.elb.amazonaws.com
https://signoz-community.slack.com/archives/C01HWQ1R0BC/p1719997121751949 — please refer to this thread too.
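One thing worth checking here: a request to the collector's root path returns `404 page not found` even when everything is healthy, because the OTLP/HTTP receiver on port 4318 only serves `/v1/traces`, `/v1/metrics`, and `/v1/logs`. So testing the DNS name in a browser isn't conclusive. A sketch of the rule scoped to the OTLP paths (host and service name taken from the manifest above; the `/v1` prefix is the assumption to verify against your receiver config):

```yaml
spec:
  rules:
  - host: otel.fev.engineering
    http:
      paths:
      # OTLP/HTTP endpoints live under /v1/*; a GET on "/" is expected to 404
      - backend:
          service:
            name: signoz-otel-collector
            port:
              number: 4318
        path: /v1
        pathType: Prefix
```

Sending an actual OTLP POST to `https://otel.fev.engineering/v1/traces` is a better smoke test than hitting the bare hostname.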

Pavan Kumar Dinesh

about 1 year ago
Hi, I ran into issues setting the metrics total retention period in v0.46 and have now tried on v0.47 as well. I did see some logs in the query service:
signoz-with-clickhouse-query-service-0 signoz-with-clickhouse-query-service {"level":"INFO","timestamp":"2024-06-06T13:05:18.162Z","caller":"app/server.go:388","msg":"/api/v1/settings/ttl\ttimeTaken:42.476559ms","timeTaken":42,"path":"/api/v1/settings/ttl"}
signoz-with-clickhouse-query-service-0 signoz-with-clickhouse-query-service {"level":"ERROR","timestamp":"2024-06-06T13:05:18.171Z","caller":"clickhouseReader/reader.go:2406","msg":"error while setting ttl.","error":"code: 47, message: There was an error on [chi-signoz-with-clickhouse-cluster-0-0:9000]: Code: 47. DB::Exception: Missing columns: 'timestamp_ms' while processing query: 'toDateTime(toUInt32(timestamp_ms / 1000), 'UTC') + toIntervalSecond(604800)', required columns: 'timestamp_ms' 'timestamp_ms'. (UNKNOWN_IDENTIFIER) (version 24.1.2.5 (official build))","stacktrace":"go.signoz.io/signoz/pkg/query-service/app/clickhouseReader.(*ClickHouseReader).SetTTL.func2\n\t/home/runner/work/signoz/signoz/pkg/query-service/app/clickhouseReader/reader.go:2406"}
(the same ERROR line repeats three more times at 13:05:18.238Z, 13:05:18.250Z, and 13:05:18.264Z)
signoz-with-clickhouse-query-service-0 signoz-with-clickhouse-query-service {"level":"INFO","timestamp":"2024-06-06T13:05:19.224Z","caller":"clickhouseReader/reader.go:2499","msg":"checkTTLStatusItem query","query":"SELECT id, status, ttl, cold_storage_ttl FROM ttl_status WHERE table_name = ? ORDER BY created_at DESC","tableName":"signoz_metrics.samples_v4"}
signoz-with-clickhouse-query-service-0 signoz-with-clickhouse-query-service {"level":"INFO","timestamp":"2024-06-06T13:05:19.228Z","caller":"clickhouseReader/reader.go:2499","msg":"checkTTLStatusItem query","query":"SELECT id, status, ttl, cold_storage_ttl FROM ttl_status WHERE table_name = ? ORDER BY created_at DESC","tableName":"signoz_metrics.samples_v4"}
signoz-with-clickhouse-query-service-0 signoz-with-clickhouse-query-service {"level":"INFO","timestamp":"2024-06-06T13:05:19.229Z","caller":"clickhouseReader/reader.go:2591","msg":"Parsing TTL from: ","queryResp":"MergeTree PARTITION BY toDate(unix_milli / 1000) ORDER BY (env, temporality, metric_name, fingerprint, unix_milli) TTL toDateTime(unix_milli / 1000) + toIntervalSecond(2592000) SETTINGS ttl_only_drop_parts = 1, index_granularity = 8192"}
is this a known issue?
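For context on why that query fails: the parsed TTL in the last log line shows that `signoz_metrics.samples_v4` is partitioned and TTL'd on a `unix_milli` column, while the `SetTTL` code path is building its expression from the older `timestamp_ms` column, which doesn't exist in the v4 metrics schema — hence the `UNKNOWN_IDENTIFIER`. Purely as an illustration of the mismatch (not an endorsed workaround — table and column names are taken from the logs above, and the 7-day interval is just the value from the error), the TTL the API is trying to set would look like this against the v4 schema:

```sql
-- Illustrative only: a 7-day TTL on samples_v4 expressed with the
-- unix_milli column the v4 schema actually has, instead of the
-- missing timestamp_ms column the error complains about.
ALTER TABLE signoz_metrics.samples_v4
    MODIFY TTL toDateTime(unix_milli / 1000) + toIntervalSecond(604800);
```

In other words, the retention setting itself looks fine; the query-service version appears to be generating the TTL expression for the pre-v4 schema.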