Upgrade to v0.47.0 is failing with signoz-schema-migrator-upgrade pod: `clickhouse migrate failed ...

Al

11 months ago
Upgrade to v0.47.0 is failing with the signoz-schema-migrator-upgrade pod:
`clickhouse migrate failed to run, error: Dirty database version 28. Fix and force version.`
Any guidance appreciated, thanks!
{"level":"info","timestamp":"2024-06-06T13:55:49.546Z","caller":"signozschemamigrator/migrate.go:89","msg":"Setting env var SIGNOZ_CLUSTER","component":"migrate cli","cluster-name":"cluster"}
{"level":"info","timestamp":"2024-06-06T13:55:49.546Z","caller":"signozschemamigrator/migrate.go:106","msg":"Successfully set env var SIGNOZ_CLUSTER ","component":"migrate cli","cluster-name":"cluster"}
{"level":"info","timestamp":"2024-06-06T13:55:49.546Z","caller":"signozschemamigrator/migrate.go:111","msg":"Setting env var SIGNOZ_REPLICATED","component":"migrate cli","replication":false}
{"level":"info","timestamp":"2024-06-06T13:55:49.550Z","caller":"migrationmanager/manager.go:76","msg":"Running migrations for all migrators","component":"migrationmanager"}
{"level":"info","timestamp":"2024-06-06T13:55:49.550Z","caller":"migrationmanager/manager.go:78","msg":"Running migrations for logs","component":"migrationmanager","migrator":"logs"}
{"level":"info","timestamp":"2024-06-06T13:55:49.618Z","caller":"migrationmanager/manager.go:78","msg":"Running migrations for metrics","component":"migrationmanager","migrator":"metrics"}
{"level":"info","timestamp":"2024-06-06T13:55:49.749Z","caller":"migrationmanager/manager.go:78","msg":"Running migrations for traces","component":"migrationmanager","migrator":"traces"}
{"level":"error","timestamp":"2024-06-06T13:55:49.824Z","caller":"migrationmanager/manager.go:81","msg":"Failed to run migrations for migrator","component":"migrationmanager","migrator":"traces","error":"clickhouse migrate failed to run, error: Dirty database version 28. Fix and force version.","stacktrace":"github.com/SigNoz/signoz-otel-collector/migrationmanager.(*MigrationManager).Migrate\n\t/home/runner/work/signoz-otel-collector/signoz-otel-collector/migrationmanager/manager.go:81\nmain.main\n\t/home/runner/work/signoz-otel-collector/signoz-otel-collector/cmd/signozschemamigrator/migrate.go:126\nruntime.main\n\t/opt/hostedtoolcache/go/1.21.10/x64/src/runtime/proc.go:267"}
{"level":"fatal","timestamp":"2024-06-06T13:55:49.825Z","caller":"signozschemamigrator/migrate.go:128","msg":"Failed to run migrations","component":"migrate cli","error":"clickhouse migrate failed to run, error: Dirty database version 28. Fix and force version.","stacktrace":"main.main\n\t/home/runner/work/signoz-otel-collector/signoz-otel-collector/cmd/signozschemamigrator/migrate.go:128\nruntime.main\n\t/opt/hostedtoolcache/go/1.21.10/x64/src/runtime/proc.go:267"}
As a result, the signoz-otel-collector pods are blocked from starting with:
[2024-06-06 14:33:17] Waiting for job signoz-schema-migrator-upgrade...
[2024-06-06 14:33:19] Waiting for job signoz-schema-migrator-upgrade...
[2024-06-06 14:33:21] Waiting for job signoz-schema-migrator-upgrade...
[2024-06-06 14:33:23] Waiting for job signoz-schema-migrator-upgrade...
[2024-06-06 14:33:25] Waiting for job signoz-schema-migrator-upgrade...
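For anyone hitting the same state: the "Dirty database version" message comes from golang-migrate, which the schema migrator builds on. The usual remediation is to inspect the migration-state table, manually verify what migration 28 left behind, and then record a clean version so the migrator can proceed. The sketch below is an assumption, not a verified procedure: the database and table names (`signoz_traces.schema_migrations`) and the `sequence` convention follow golang-migrate's ClickHouse driver and should be confirmed against your deployment before running anything.

```sql
-- Sketch only: inspect the recorded migration state for the traces migrator.
-- Database/table names are assumptions based on golang-migrate conventions.
SELECT * FROM signoz_traces.schema_migrations ORDER BY sequence DESC LIMIT 5;

-- After manually verifying whether migration 28's DDL was fully applied
-- (or rolling it back), append a non-dirty row so the migrator can continue.
INSERT INTO signoz_traces.schema_migrations (version, dirty, sequence)
VALUES (28, 0, toUnixTimestamp64Nano(now64(9)));
```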
Hi! Running into an issue with pipelines. I'm trying to parse a json field `labels` from a log such ...

Mircea Colonescu

8 months ago
Hi! Running into an issue with pipelines. I'm trying to parse a JSON field `labels` from a log such as this one:
{
  "body": "{\"raw_log\":\"{\\\"level\\\":\\\"info\\\",\\\"module\\\":\\\"server\\\",\\\"module\\\":\\\"txindex\\\",\\\"height\\\":28557212,\\\"time\\\":\\\"2024-09-12T16:13:47-04:00\\\",\\\"message\\\":\\\"indexed block events\\\"}\"}",
  "id": "2lz9RKpucUEwudqQjp7LieQ9U4W",
  "timestamp": 1726172028356,
  "attributes": {
    "com.hashicorp.nomad.alloc_id": "71f80e7a-31d8-9a51-d5c5-9ad19783d6a5",
    "container_name": "/chain-binary-71f80e7a-31d8-9a51-d5c5-9ad19783d6a5",
    "labels": "{\"com.hashicorp.nomad.alloc_id\":\"71f80e7a-31d8-9a51-d5c5-9ad19783d6a5\"}",
    "level": "info",
    "message": "indexed block events",
    "module": "txindex",
    "nomad_job_name": "testnet-validator",
    "time": "2024-09-12T16:13:47-04:00"
  },
  "resources": {},
  "severity_text": "",
  "severity_number": 0,
  "trace_id": "",
  "span_id": "",
  "trace_flags": 0
}
The preview in the frontend works as expected. When I save the pipeline, however, it does not work, and I see these errors in the collector logs:
2024-09-12T20:16:29.396Z	error	helper/transformer.go:102	Failed to process entry	{"kind": "processor", "name": "logstransform/pipeline_Test", "pipeline": "logs", "operator_id": "4c9ebbab-d8b1-4ecb-9e07-c42459db68ab", "operator_type": "json_parser", "error": "running if expr: interface conversion: interface {} is map[string]interface {}, not string (1:48)\n | attributes?.labels != nil && attributes.labels matches \"^\\\\s*{.*}\\\\s*$\"\n | ...............................................^", "action": "send"}
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/operator/helper.(*TransformerOperator).HandleEntryError
	/home/runner/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza@v0.102.0/operator/helper/transformer.go:102
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/operator/helper.(*ParserOperator).ProcessWithCallback
	/home/runner/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza@v0.102.0/operator/helper/parser.go:105
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/operator/helper.(*ParserOperator).ProcessWith
	/home/runner/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza@v0.102.0/operator/helper/parser.go:98
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/operator/parser/json.(*Parser).Process
	/home/runner/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza@v0.102.0/operator/parser/json/parser.go:24
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/operator/transformer/router.(*Transformer).Process
	/home/runner/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza@v0.102.0/operator/transformer/router/transformer.go:57
github.com/open-telemetry/opentelemetry-collector-contrib/processor/logstransformprocessor.(*logsTransformProcessor).converterLoop
	/home/runner/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/logstransformprocessor@v0.102.0/processor.go:213
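For what it's worth, the error message hints at the root cause: once an operator has parsed `labels` into a map, the `matches` condition (which expects a string) receives a map on the next pass. A minimal Python sketch of the same type mismatch (illustrative only, not the collector's actual code):

```python
import json

# Simplified stand-in for the log entry's attributes before any parsing.
attributes = {
    "labels": "{\"com.hashicorp.nomad.alloc_id\": \"71f80e7a-31d8-9a51-d5c5-9ad19783d6a5\"}",
}

# A json_parser-style operator replaces the string value with the parsed map.
attributes["labels"] = json.loads(attributes["labels"])

# A condition written for strings (like `attributes.labels matches "^\\s*{.*}\\s*$"`)
# now sees a map, not a string -- the analogue of the collector's
# "interface {} is map[string]interface {}, not string" error.
print(type(attributes["labels"]).__name__)
```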
Any idea why this might be happening? The pipeline does execute the next step after the failed JSON parsing.
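One possible workaround is to make the condition type-safe so the parser only fires while the value is still a string. This is a sketch, assuming the pipeline's `if` expression supports expr-lang's `type()` builtin; verify against your collector version:

```yaml
# Hypothetical json_parser operator config fragment -- guard the condition
# so it skips entries where labels has already been parsed into a map.
if: 'attributes?.labels != nil && type(attributes.labels) == "string" && attributes.labels matches "^\\s*{.*}\\s*$"'
```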