# support
t
Hey guys, I've installed k8s-infra using Helm (chart v0.11.18), and for all my workloads severity_text is coming through as an empty string. Is there a config that I'm missing?
My override-values.yaml for this
otelCollectorEndpoint: "url"
@Srikanth Chekuri can you help me with this? Not sure if I've missed some config, as I've kept everything at the defaults. All my workloads are in the same cluster, and Kubernetes logs show the relevant data, but service logs don't, whereas these same logs look fine in my GCP Logs Explorer.
n
t
@nitya-signoz One more doubt: whenever I try to run an IN filter on body in the logs explorer, I get the issue below. Any ideas on how to resolve it?
Please use pipelines to parse severity_text: https://signoz.io/docs/logs-pipelines/introduction/
https://signoz.io/docs/logs-pipelines/processors/#severity-parser
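For reference, a severity parser configured through a SigNoz pipeline translates, roughly, into a collector operator along these lines. This is only a sketch: the field name `body.level` and the mapped values are assumptions about the log structure, not something from this thread.

```yaml
# Hedged sketch of a severity-parser operator (stanza-style),
# as a SigNoz logs pipeline would roughly render it.
# `body.level` is an assumed location of the level field --
# adjust parse_from to wherever your logs actually carry it.
- type: severity_parser
  parse_from: body.level
  # mapping: severity level -> raw value(s) that should map to it
  mapping:
    error: error
    warn: warning
    info: info
```

The key point is that `parse_from` must point at a field that already exists on the log record at the time this operator runs.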
I did add the parser, but I don't see anything changing for severity_text. Not sure what it's updating.
n
Did you use the pipeline preview feature to test out the changes?
t
Payload:
{
  "start": 1731553526000,
  "end": 1731575126000,
  "step": 60,
  "variables": {},
  "compositeQuery": {
    "queryType": "builder",
    "panelType": "graph",
    "fillGaps": false,
    "builderQueries": {
      "A": {
        "dataSource": "logs",
        "queryName": "A",
        "aggregateOperator": "count",
        "aggregateAttribute": {
          "id": "------false",
          "dataType": "",
          "key": "",
          "isColumn": false,
          "type": "",
          "isJSON": false
        },
        "timeAggregation": "rate",
        "spaceAggregation": "sum",
        "functions": [],
        "filters": {
          "op": "AND",
          "items": [
            {
              "id": "8c230243",
              "key": {
                "key": "body.level",
                "dataType": "",
                "type": "",
                "isColumn": false,
                "isJSON": false
              },
              "op": "in",
              "value": [
                "error",
                "info"
              ]
            }
          ]
        },
        "expression": "A",
        "disabled": false,
        "stepInterval": 60,
        "having": [],
        "limit": null,
        "orderBy": [
          {
            "columnName": "timestamp",
            "order": "desc"
          }
        ],
        "groupBy": [
          {
            "key": "severity_text",
            "dataType": "string",
            "type": "",
            "isColumn": true,
            "isJSON": false,
            "id": "severity_text--string----true"
          }
        ],
        "legend": "",
        "reduceTo": "avg"
      }
    }
  }
}
Response:
{
  "status": "error",
  "data": {},
  "errorType": "internal",
  "error": "error in builder queries"
}
https://signoz-community.slack.com/archives/C01HWQ1R0BC/p1731574955860999?thread_ts=1731391679.001799&cid=C01HWQ1R0BC
Yes, the preview showed no changes between before and after processing. Below are images of my pipeline setup. https://signoz-community.slack.com/archives/C01HWQ1R0BC/p1731574929143349?thread_ts=1731391679.001799&cid=C01HWQ1R0BC
n
Is your log body JSON? If yes, you will have to use a JSON parser as the first processor, and then you can parse the severity afterwards. Check out this doc: https://signoz.io/docs/logs-pipelines/guides/json/
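In collector-operator terms, the ordering described above might look roughly like the sketch below: first parse the JSON body into a temporary attribute, then read the level from the parsed field. The attribute name `temp_parsed_body` and the field name `level` are illustrative assumptions, not from this thread.

```yaml
# Sketch only: JSON parser must run before the severity parser,
# so the severity parser has a real field to read from.
- type: json_parser
  parse_from: body
  # parse into a temporary attribute (name is an assumption)
  parse_to: attributes.temp_parsed_body
- type: severity_parser
  # read the level from the freshly parsed field
  parse_from: attributes.temp_parsed_body.level
  mapping:
    error: error
    info: info
```

The order matters: with the severity parser first, `level` does not exist yet and severity_text stays empty, which matches the symptom described earlier in the thread.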
t
Ahh that might be it
n
Let me know how it goes.
t
And do I need to do that for the query issue above as well, where IN is not working?
n
I'll need more info on what your log structure is. First use the JSON parser and get the parsing done, then we can look into the other issue.
t
Hey @nitya-signoz, thanks! The severity text is working now after JSON parsing. Now the only remaining issue is IN not working on body attributes.
Does that require parsing too? I didn't think of that, since other operators can query inside the body's attributes just fine; I guess it's just the ones with multiple values that are causing issues.
n
Can you create an issue on GitHub? It seems like JSON search with the IN operator is broken.
t
Hey @nitya-signoz, any idea how I can use log processors to add another severity parser that parses from the log.iostream attribute? My plan is to fill in severity_text wherever it's an empty string, based on the log.iostream (stdout, stderr) values. But whenever I try to use log.iostream I get this error:
{
  "status": "error",
  "errorType": "bad_data",
  "error": "could not simulate log pipelines processing.\nCollector errors: could not create logs processing simulator: failed to parse collector config: cannot unmarshal the configuration: decoding failed due to the following error(s):\n\nerror decoding 'processors': error reading configuration for \"signozlogspipeline/pipeline_EmptySeverityFix\": decoding failed due to the following error(s):\n\nerror decoding 'operators[1]': unmarshal to severity_parser: decoding failed due to the following error(s):\n\nerror decoding 'parse_from': unrecognized prefix"
}
Pipeline:
n
So is
log.iostream
something inside the JSON body? If yes, you will need a JSON parser before it; you can't reference a field inside the body directly. Please check this doc on how to parse JSON into a temp attribute and then use it: https://signoz.io/docs/logs-pipelines/guides/json/
t
Nope, it's in attributes.
n
then it will be
attributes["log.iostream"]
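For the stdout/stderr plan, the second severity parser could be sketched roughly as below. This assumes the usual container-log convention (stderr lines treated as errors, stdout as info); note that as written it would run on every log the pipeline matches, so restricting it to logs where severity_text is empty would need to happen in the pipeline's filter.

```yaml
# Sketch: severity from the container stream name.
# Bracket notation is needed because the key contains a dot.
- type: severity_parser
  parse_from: attributes["log.iostream"]
  mapping:
    info: stdout
    error: stderr
```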
t
Something like this
n
yep, try the above
t
Ahh, this works! I was earlier trying attributes.log.iostream; seems like dot notation doesn't work for keys that themselves contain dots.
Thanks