# support
a
Now I am instrumenting another kind of Python application, one that runs as a Celery worker. The only documentation I could find was https://opentelemetry-python-contrib.readthedocs.io/en/latest/instrumentation/celery/celery.html, so I did the following Dockerfile:
RUN pip install \
    opentelemetry-distro==0.34b0 \
    opentelemetry-exporter-otlp==1.13.0 \
    opentelemetry-launcher==1.9.0 \
    opentelemetry-instrumentation-celery==0.34b0

RUN opentelemetry-bootstrap --action=install

CMD opentelemetry-instrument --traces_exporter otlp_proto_grpc celery -A wombo.celery_paint.celeryapp worker --loglevel=info --pool=threads
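
For reference, opentelemetry-instrument reads the standard OTEL_* environment variables for the exporter destination, so the host and port can be set in the Dockerfile itself. A minimal sketch, where the endpoint and service name are placeholders for your own collector / SigNoz address:

# Placeholder values; point these at your own OTel collector / SigNoz OTLP endpoint.
ENV OTEL_EXPORTER_OTLP_ENDPOINT=http://otel-collector:4317
ENV OTEL_SERVICE_NAME=celery-paint-worker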
code:
from celery import Celery
from celery.signals import worker_process_init

from opentelemetry.instrumentation.celery import CeleryInstrumentor


@worker_process_init.connect(weak=False)
def init_celery_tracing(*args, **kwargs):
    """
    When tracing a Celery worker process, tracing and instrumentation must both be initialized after the
    worker process has started. This is required for any tracing components that use threading, such as
    the BatchSpanProcessor, to work correctly. Celery provides the worker_process_init signal for this.
    """
    CeleryInstrumentor().instrument()


celeryapp = Celery('paints')
celeryapp.conf.task_default_queue = sqsurl  # sqsurl is defined elsewhere in this module
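
Since opentelemetry-instrument already sets up the SDK and the OTLP exporter, the handler above only needs CeleryInstrumentor().instrument(). If you wanted to configure the exporter host and port in code instead of relying on the wrapper, a rough sketch of the same handler might look like this (the endpoint and service name are placeholders, and worker_process_init / CeleryInstrumentor are imported as above):

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter


@worker_process_init.connect(weak=False)
def init_celery_tracing(*args, **kwargs):
    # Create the provider inside the worker process so any background threads
    # (e.g. the BatchSpanProcessor's exporter thread) are started in the worker itself.
    provider = TracerProvider(
        resource=Resource.create({"service.name": "celery-paint-worker"})  # placeholder name
    )
    provider.add_span_processor(
        BatchSpanProcessor(OTLPSpanExporter(endpoint="otel-collector:4317", insecure=True))  # placeholder host:port
    )
    trace.set_tracer_provider(provider)
    CeleryInstrumentor().instrument()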
Not sure what else to do.
s
Aren’t you seeing any traces?
a
Where are you initialising the host and port to send the spans?
a
I am seeing the following traces. Can I not see queue-specific things, like the number of tasks picked up, etc.?
s
> Where are you initialising the host and port to send the spans?
The auto-instrumentation command takes care of this.
> I am seeing the following traces. Can I not see queue-specific things, like the number of tasks picked up, etc.?
It won't show up here in the filters. You can expand the trace details for more information. If you would like to see some aggregation on trace data, you can go to dashboards and write some ClickHouse SQL queries. Let me know if you need any help with it.
a
Yes, I do need help with that ClickHouse SQL.
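
As a starting point, the span rows can be aggregated directly in ClickHouse. The exact database, table, and column names depend on your SigNoz version, so treat the ones below as assumptions and check them with SHOW TABLES / DESCRIBE first; this sketch assumes a signoz_traces.distributed_signoz_index_v2 table with serviceName, name, timestamp, and durationNano columns, and a placeholder service name:

SELECT
    toStartOfInterval(timestamp, INTERVAL 5 MINUTE) AS bucket,
    name AS task_name,
    count() AS tasks_picked_up,
    avg(durationNano) / 1e6 AS avg_duration_ms
FROM signoz_traces.distributed_signoz_index_v2
WHERE serviceName = 'celery-paint-worker'  -- placeholder service name
  AND timestamp > now() - INTERVAL 1 HOUR
GROUP BY bucket, task_name
ORDER BY bucket DESC, tasks_picked_up DESC

This groups the Celery task spans into 5-minute windows so you can chart how many tasks were picked up per task name and their average duration.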