Hi we have installed signoz in k8 recently, we are...
# support
p
Hi, we have installed SigNoz in Kubernetes recently. We are facing an issue: we are not getting the pod logs of our application in the SigNoz UI, but we are getting the logs of the SigNoz components. Can you please help me with this?
As per the documentation, when you deploy SigNoz to your Kubernetes cluster it will automatically start collecting all the pod logs.
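For context, log collection in the k8s-infra chart is handled by a per-node otel-agent that tails the container log files under /var/log/pods. A minimal sketch of how one might confirm that the agent is actually scheduled on every node (the namespace below is a placeholder for wherever the chart is installed):

kubectl get daemonset -n <signoz-namespace> | grep otel-agent
kubectl get pods -n <signoz-namespace> -o wide | grep otel-agent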
v
@nitya-signoz Can you look into this?
p
@Vishal Sharma is there anyone who can help me with this?
n
Are you running EKS or something else?
p
AKS
@nitya-signoz
n
I don't think we have tested this properly on AKS. But the only reason this might not be working is that the file reader cannot access files on the node. Can you manually check whether you can access the files present in
/var/log/pods/*/*/*.log
(https://github.com/SigNoz/charts/blob/5d322fbf515a3af07b4ea8102be6ca4b8f16654b/charts/k8s-infra/values.yaml#L86) through the collector pod? cc @Prashant Shahi
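A hedged way to run that check, assuming the agent runs as a DaemonSet pod (pod, namespace, and node names below are placeholders): if the collector image includes basic utilities, exec into the pod directly; otherwise inspect the node filesystem with kubectl debug, which mounts the node root at /host.

kubectl exec -n <signoz-namespace> <otel-agent-pod> -- ls /var/log/pods
kubectl debug node/<aks-node-name> -it --image=busybox -- ls /host/var/log/pods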
p
There is no file in the path /var/log/pods/*/*/*.log; I have checked inside the pods.
n
It means that AKS stores logs in a different path. I will have to check on this, or maybe more permissions are required.
p
But we are getting some logs of the SigNoz components. I have checked on those pods as well, and there is no file at /var/log/pods/*/*/*.log, yet we still get logs of the SigNoz components.
n
Interesting, @Prashant Shahi any idea on the above ^?
p
@prashant kumar do you have SigNoz and your application deployed in the same cluster? Which namespace?
p
yes
There are 3 namespaces: 1. zp-rara-backend, 2. zp-rara-frontend, 3. zp-devops-tools.
p
Okay, can you please share the logs of the otel-agent pods?
p
sharing
I am just sending you the JSON we are getting for a SigNoz component log:
{
  "timestamp": 1684758022447026700,
  "id": "2PyEEZGV7J0yMixQCHwdjkijkSk",
  "trace_id": "",
  "span_id": "",
  "trace_flags": 0,
  "severity_text": "",
  "severity_number": 0,
  "body": "10.20.193.90 - - [22/May/2023:12:20:22 +0000] \"GET /api/5QVY36-0856-7c2867 HTTP/1.1\" 400 30 \"-\" \"okhttp/2.7.5\" 147 0.003 [zp-rara-backend-zp-rara-ms-oms-brs-80] [] 10.20.192.192:8181 30 0.003 400 089ecf4890d6aebe1caec59720a10cf9",
  "resources_string": {
    "host_name": "aks-ondemand2c-23288688-vmss000007",
    "k8s_cluster_name": "",
    "k8s_container_name": "controller",
    "k8s_container_restart_count": "2",
    "k8s_namespace_name": "zp-devops-ingress",
    "k8s_node_name": "aks-ondemand2c-23288688-vmss000007",
    "k8s_pod_ip": "k8s_pod_name": "main-public-controller-74d9bc46-hhrcw",
    "k8s_pod_start_time": "2023-04-25 03:13:09 +0000 UTC",
    "k8s_pod_uid": "1f5b75c1-eddb-4fe6-a121-f2293f23fcdd",
    "os_type": "linux",
    "signoz_component": "otel-agent"
  },
  "attributes_string": {
    "log_file_path": "/var/log/pods/zp-devops-ingress_main-public-controller-74d9bc46-hhrcw_a80102e1-ff36-496d-9308-7ac18f6edd1b/controller/2.log",
    "log_iostream": "stdout",
    "logtag": "F",
    "time": "2023-05-22T12:20:22.447026621Z"
  },
  "attributes_int": {},
  "attributes_float": {}
}
signozlogs.odt
2023-05-18T14:41:05.902Z	info	exporterhelper/queued_retry.go:426	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "26.412791317s"}
2023-05-18T14:41:05.906Z	info	exporterhelper/queued_retry.go:426	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "23.285570238s"}
2023-05-18T14:41:06.808Z	error	exporterhelper/queued_retry.go:310	Dropping data because sending_queue is full. Try increasing queue_size.	{"kind": "exporter", "data_type": "logs", "name": "otlp", "dropped_items": 395}
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).send
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/queued_retry.go:310
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func2
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/logs.go:115
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.70.0/logs.go:36
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.70.0/batch_processor.go:338
go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.70.0/batch_processor.go:175
go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).startProcessingCycle
	go.opentelemetry.io/collector/processor/batchprocessor@v0.70.0/batch_processor.go:143
2023-05-18T14:41:06.813Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.024Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.225Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.429Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.630Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.830Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.913Z	info	fileconsumer/file.go:171	Started watching file	{"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-cert-manager-webhook-7fd5b5c95b-zhdc7_e9eada80-182a-4d77-8a76-d90afb6fb822/cert-manager-webhook/14.log"}
2023-05-18T14:41:08.033Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.235Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.434Z	error	exporterhelper/queued_retry.go:175	Exporting failed. No more retries left. Dropping data.	{"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "max elapsed time expired rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp 100.64.246.181:4317: connect: connection refused\"", "dropped_items": 855}
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).onTemporaryFailure
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/queued_retry.go:175
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/queued_retry.go:410
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/metrics.go:136
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/queued_retry.go:205
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1
	go.opentelemetry.io/collector@v0.70.0/exporter/exporterhelper/internal/bounded_memory_queue.go:61
2023-05-18T14:41:08.443Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.644Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.849Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.053Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.256Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.459Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.661Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.862Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.065Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.269Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.472Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.673Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.874Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.076Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.276Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.477Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.680Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.881Z	warn	batchprocessor@v0.70.0/batch_processor.go:177	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-22T04:38:24.713Z	info	fileconsumer/file.go:171	Started watching file	{"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-signoz-k8s-infra-otel-deployment-54cc957dd7-b26fj_42123d85-2ef3-4677-9fdd-73fa45712f69/zp-devops-signoz-k8s-infra-otel-deployment/0.log"}
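The errors above show the agent failing to reach the SigNoz otel-collector OTLP endpoint on port 4317 ("connection refused"), which in turn fills the sending_queue and causes log data to be dropped. A quick, hedged way to check whether that collector is up and whether its service has ready endpoints (namespace and resource names are placeholders, since they depend on the release name):

kubectl get pods -n <signoz-namespace> | grep otel-collector
kubectl get svc,endpoints -n <signoz-namespace> | grep otel-collector
kubectl logs -n <signoz-namespace> deploy/<release>-otel-collector --tail=100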
@nitya-signoz @Prashant Shahi any update?
p
Looks like something was wrong with the SigNoz otel-collector or the related ClickHouse writer earlier. To know more about this, logs of signoz-otel-collector or ClickHouse would be helpful. It seems to have been resolved at 2023-05-22T04:38, though only a few log files are being watched. It should work fine as long as you do not blacklist any pods or namespaces.
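If the application namespaces still do not appear once the exporter is healthy, it may be worth confirming that nothing is being filtered out by the chart's log-collection settings. A minimal sketch (release name and namespace are placeholders, and the exact keys under logsCollection can differ between chart versions, so compare against the values.yaml linked above):

helm get values <k8s-infra-release> -n <signoz-namespace>
# then review the logsCollection section of the output (or of your override file)
# for include paths or blacklist entries that could exclude the
# zp-rara-backend / zp-rara-frontend namespaces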