Vishal Sharma
08/25/2022, 6:56 AM
Alexei Zenin
08/25/2022, 2:57 PM
Raghu
08/25/2022, 9:12 PM
Raghu
08/25/2022, 9:13 PM
prometheus:
  config:
    scrape_configs:
      - job_name: otel-collector
        scrape_interval: 30s
        static_configs:
          - targets:
              - my-release-signoz-otel-collector:8889
              - localhost:8080
I use PromQL to check the metrics, but I do not see anything in the UI. Am I missing something?
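A quick sanity check on the scrape target above is to hit the collector's Prometheus exporter endpoint directly and confirm it is actually serving metrics, before debugging PromQL or the UI. A minimal sketch in Python, assuming the service name and port 8889 from the config are reachable from wherever this runs (for example after a kubectl port-forward to the service):

import requests

# Hit the collector's Prometheus exporter directly; it serves plain-text
# metrics on /metrics at the port referenced in the scrape config above.
# Swap the host for localhost:8889 if you port-forward the service first.
resp = requests.get("http://my-release-signoz-otel-collector:8889/metrics", timeout=5)
resp.raise_for_status()

# Print a handful of lines; if nothing comes back, Prometheus has nothing
# to scrape and the problem is upstream of this scrape config.
for line in resp.text.splitlines()[:20]:
    print(line)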
Craig Rodrigues
08/25/2022, 9:25 PM
Craig Rodrigues
08/25/2022, 10:38 PM
Ankit Nayan
Yash Vardhan
08/26/2022, 8:21 AM
주현태
08/28/2022, 12:22 PM
Pranay
sudhanshu dev
08/29/2022, 7:33 AM to 7:45 AM
(a series of messages from sudhanshu dev; no message text was captured in this export)
Craig Rodrigues
08/29/2022, 6:07 PM
nitya-signoz
08/30/2022, 7:20 AM
Reedam Ranjan
08/30/2022, 7:59 AM
Ashijit Pramanik
08/30/2022, 8:14 AM
SigNoz v0.10.2 was deployed on a GKE Autopilot cluster. I have added one Ingress for the frontend (443 -> 3313) and one for the collector (80 -> 4318) so that both are reachable over the internet.
To check whether I can send traces, I used this code <https://gist.github.com/dfrankow/f91aefd683ece8e696c26e183d696c29>, started it as follows, and made a POST and a GET request to it as shared in that gist -
❯ python3 -m venv simple_server
❯ source simple_server/bin/activate
❯ python -V
Python 3.9.13
❯ pip install opentelemetry-distro
❯ pip install opentelemetry-exporter-otlp
❯ wget <https://gist.githubusercontent.com/dfrankow/f91aefd683ece8e696c26e183d696c29/raw/016714b21cb2172b43611aabe03e53249027bd84/simple_server.py>
❯ OTEL_RESOURCE_ATTRIBUTES=service.name=test_apm OTEL_EXPORTER_OTLP_ENDPOINT="http://<sub.domain.here>" opentelemetry-instrument --traces_exporter otlp_proto_http python3 simple_server.py 7000 127.0.0.1
The traces do not get sent, and the logs seem to suggest that I still need to open the gRPC port even though I am specifying otlp_proto_http above. Please suggest if I am missing something.
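One way to narrow this down is to send a single span with the OTLP/HTTP exporter configured explicitly in code, bypassing opentelemetry-instrument. This is a minimal sketch, not the exact setup above: it assumes the ingress ultimately forwards to the collector's OTLP HTTP receiver (default port 4318, default traces path /v1/traces), reuses the <sub.domain.here> placeholder from the command above, and the span name is made up.

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Point the exporter at the OTLP/HTTP traces path behind the ingress.
# "<sub.domain.here>" is the same placeholder used above, not a real host.
exporter = OTLPSpanExporter(endpoint="http://<sub.domain.here>/v1/traces")

provider = TracerProvider(resource=Resource.create({"service.name": "test_apm"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Emit one span and flush so the request goes out before the script exits.
with trace.get_tracer(__name__).start_as_current_span("ingress-check"):
    pass
provider.force_flush()

If that span shows up in SigNoz, the HTTP path through the ingress works and the issue is in how opentelemetry-instrument picks up the exporter settings; if it does not, the ingress or the 4318 receiver is the place to look.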
Alexei Zenin
08/30/2022, 3:16 PM
We have 929234 time series at the moment. The query service needed its memory increased to 3 GB to keep the container from crashing due to OOM. Our PROD setup is much more extensive and has more traffic, so will the backend need dozens of GB to operate? Is there anything I am doing wrong? I know optimizations are coming, but this linearly increasing memory seems tough to handle operationally.
Reidi Qyrku
08/30/2022, 4:15 PM