Alcibiades Warlock
01/21/2025, 8:12 AM
#!/bin/bash
HOST="our.self.hosted.endpoint"
PORT="4317"
# Generate random trace/span IDs. grpcurl encodes requests with protojson, where
# bytes fields such as traceId and spanId carry base64-encoded raw bytes, hence
# the hex -> raw bytes -> base64 conversion below.
TRACE_ID_HEX=$(openssl rand -hex 16)
SPAN_ID_HEX=$(openssl rand -hex 8)
TRACE_ID=$(echo -n "$TRACE_ID_HEX" | xxd -r -p | base64)
SPAN_ID=$(echo -n "$SPAN_ID_HEX" | xxd -r -p | base64)
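# Optional sanity check (not in the original script): the base64 IDs should decode
# back to the generated hex, confirming the bytes round-trip correctly.
echo -n "$TRACE_ID" | base64 -d | xxd -p   # expect $TRACE_ID_HEX
echo -n "$SPAN_ID" | base64 -d | xxd -p    # expect $SPAN_ID_HEX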
# Timestamps in Unix epoch nanoseconds; the test span lasts one second.
START_TIME=$(date +%s)
START_TIME_NANO=$((START_TIME * 1000000000))
END_TIME_NANO=$((START_TIME_NANO + 1000000000))
# Create span payload
cat > span_request.json << EOF
{
  "resourceSpans": [{
    "resource": {
      "attributes": [{
        "key": "service.name",
        "value": { "stringValue": "test-service" }
      }]
    },
    "scopeSpans": [{
      "scope": {
        "name": "test-client",
        "version": "1.0.0"
      },
      "spans": [{
        "traceId": "${TRACE_ID}",
        "spanId": "${SPAN_ID}",
        "name": "test-span",
        "kind": "SPAN_KIND_SERVER",
        "startTimeUnixNano": "${START_TIME_NANO}",
        "endTimeUnixNano": "${END_TIME_NANO}",
        "attributes": [{
          "key": "test.attribute",
          "value": { "stringValue": "test-value" }
        }],
        "status": { "code": "STATUS_CODE_OK" }
      }]
    }]
  }]
}
EOF
# Create log payload
cat > log_request.json << EOF
{
  "resourceLogs": [{
    "resource": {
      "attributes": [{
        "key": "service.name",
        "value": { "stringValue": "test-service" }
      }]
    },
    "scopeLogs": [{
      "scope": {
        "name": "test-client",
        "version": "1.0.0"
      },
      "logRecords": [{
        "timeUnixNano": "${START_TIME_NANO}",
        "severityText": "INFO",
        "severityNumber": 9,
        "body": {
          "stringValue": "Test log message"
        },
        "traceId": "${TRACE_ID}",
        "spanId": "${SPAN_ID}",
        "attributes": [{
          "key": "test.attribute",
          "value": { "stringValue": "test-value" }
        }]
      }]
    }]
  }]
}
EOF
# Create metric payload
cat > metric_request.json << EOF
{
  "resourceMetrics": [{
    "resource": {
      "attributes": [{
        "key": "service.name",
        "value": { "stringValue": "test-service" }
      }]
    },
    "scopeMetrics": [{
      "scope": {
        "name": "test-client",
        "version": "1.0.0"
      },
      "metrics": [{
        "name": "test_counter",
        "description": "A test counter metric",
        "unit": "1",
        "sum": {
          "dataPoints": [{
            "startTimeUnixNano": "${START_TIME_NANO}",
            "timeUnixNano": "${END_TIME_NANO}",
            "asInt": "42",
            "attributes": [{
              "key": "test.attribute",
              "value": { "stringValue": "test-value" }
            }]
          }],
          "aggregationTemporality": "AGGREGATION_TEMPORALITY_CUMULATIVE",
          "isMonotonic": true
        }
      }]
    }]
  }]
}
EOF
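# Optional (assumes jq is installed): confirm all three payloads are well-formed
# JSON before sending anything.
jq empty span_request.json log_request.json metric_request.json && echo "Payloads parse OK"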
echo "Testing Span Export..."
grpcurl \
-d @ \
-format json \
-insecure \
-v \
-proto opentelemetry/proto/collector/trace/v1/trace_service.proto \
-import-path ./opentelemetry-proto \
"${HOST}:${PORT}" \
opentelemetry.proto.collector.trace.v1.TraceService/Export < span_request.json
echo -e "\nTesting Log Export..."
grpcurl \
-d @ \
-format json \
-insecure \
-v \
-proto opentelemetry/proto/collector/logs/v1/logs_service.proto \
-import-path ./opentelemetry-proto \
"${HOST}:${PORT}" \
opentelemetry.proto.collector.logs.v1.LogsService/Export < log_request.json
echo -e "\nTesting Metric Export..."
grpcurl \
-d @ \
-format json \
-insecure \
-v \
-proto opentelemetry/proto/collector/metrics/v1/metrics_service.proto \
-import-path ./opentelemetry-proto \
"${HOST}:${PORT}" \
opentelemetry.proto.collector.metrics.v1.MetricsService/Export < metric_request.json
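# Optional extra check: ask the endpoint which services it claims to implement.
# This relies on gRPC reflection, which the collector may not have enabled.
echo -e "\nListing services via reflection..."
grpcurl -insecure "${HOST}:${PORT}" list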
# Cleanup
rm span_request.json log_request.json metric_request.json
I have a self-hosted SigNoz deployment using the Helm chart with pretty much default settings, and it seems to be ingesting Kubernetes and infra logs just fine. Strangely though, when instrumenting an app I noticed that I could send traces and metrics, but if I tried using an OTel exporter for logs I got an "Unimplemented" error, so I wrote the above bash script to debug. Lo and behold:
Testing Log Export...
Resolved method descriptor:
// For performance reasons, it is recommended to keep this RPC
// alive for the entire life of the application.
rpc Export ( .opentelemetry.proto.collector.logs.v1.ExportLogsServiceRequest ) returns ( .opentelemetry.proto.collector.logs.v1.ExportLogsServiceResponse );
Request metadata to send:
(empty)
Response headers received:
(empty)
Response trailers received:
content-type: application/grpc
date: Tue, 21 Jan 2025 08:11:29 GMT
server: envoy
Sent 1 request and received 0 responses
ERROR:
Code: Unimplemented
Message:
Which seems unlikely/wrong? As far as I can tell the default config should be:
service:
  telemetry:
    logs:
      encoding: json
    metrics:
      address: 0.0.0.0:8888
  extensions: [health_check, zpages, pprof]
  pipelines:
    traces:
      receivers: [otlp, jaeger]
      processors: [signozspanmetrics/delta, batch]
      exporters: [clickhousetraces]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [clickhousemetricswrite]
    logs:
      receivers: [otlp, httplogreceiver/heroku, httplogreceiver/json]
      processors: [batch]
      exporters: [clickhouselogsexporter]
One would think that config should be working, and it is presumably how the other logs are already traversing SigNoz into ClickHouse.
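To double-check, the collector config actually deployed could be dumped from the cluster and compared against the defaults above. A rough sketch, assuming the chart was installed into the platform namespace (adjust as needed; the exact ConfigMap name depends on the release name):

kubectl -n platform get configmaps | grep otel-collector
kubectl -n platform get configmap <otel-collector-configmap-name> -o yaml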
I’m stumped here, any input appreciated.