Destination Connector and version: ClickHouse (airbyte/destination-clickhouse) 0.1.2
Normalization and version: airbyte/normalization-clickhouse 0.1.66
Severity: Medium
Step where error happened: Sync job
Current Behavior
Normalization fails. Replication itself completes (7 records synced), but transform-config crashes with KeyError: 'password' and the job ends with "Normalization Failed."
Expected Behavior
The sync, including basic normalization, completes successfully.
Logs
LOG
2022-02-09 12:19:52 �[32mINFO�[m i.a.w.w.WorkerRun(call):49 - Executing worker wrapper. Airbyte version: 0.35.23-alpha
2022-02-09 12:19:52 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/11/2/logs.log
2022-02-09 12:19:52 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.23-alpha
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):103 - start sync worker. job id: 11 attempt id: 2
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):115 - configured sync modes: {default.tbl=full_refresh - overwrite}
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.p.a.DefaultAirbyteDestination(start):69 - Running destination...
2022-02-09 12:19:53 �[32mINFO�[m i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-clickhouse:0.1.2 exists...
2022-02-09 12:19:53 �[32mINFO�[m i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-clickhouse:0.1.2 was found locally.
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.p.DockerProcessFactory(create):157 - Preparing command: docker run --rm --init -i -w /data/11/2 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/destination-clickhouse:0.1.2 write --config destination_config.json --catalog destination_catalog.json
2022-02-09 12:19:53 �[32mINFO�[m i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-clickhouse:0.1.7 exists...
2022-02-09 12:19:53 �[32mINFO�[m i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-clickhouse:0.1.7 was found locally.
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.p.DockerProcessFactory(create):157 - Preparing command: docker run --rm --init -i -w /data/11/2 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/source-clickhouse:0.1.7 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$6):307 - Destination output thread started.
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):268 - Replication thread started.
2022-02-09 12:19:53 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):147 - Waiting for source and destination threads to complete.
2022-02-09 12:19:56 �[43mdestination�[0m > SLF4J: Class path contains multiple SLF4J bindings.
2022-02-09 12:19:56 �[43mdestination�[0m > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.16.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-02-09 12:19:56 �[43mdestination�[0m > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-02-09 12:19:56 �[43mdestination�[0m > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-02-09 12:19:56 �[43mdestination�[0m > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-02-09 12:19:56 �[44msource�[0m > 2022-02-09 12:19:56 �[32mINFO�[m i.a.i.s.c.ClickHouseSource(main):110 - starting source: class io.airbyte.integrations.source.clickhouse.ClickHouseSource
2022-02-09 12:19:56 �[44msource�[0m > 2022-02-09 12:19:56 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):76 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-02-09 12:19:56 �[44msource�[0m > 2022-02-09 12:19:56 �[32mINFO�[m i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {read=null, catalog=source_catalog.json, state=input_state.json, config=source_config.json}
2022-02-09 12:19:56 �[44msource�[0m > 2022-02-09 12:19:56 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):80 - Command: READ
2022-02-09 12:19:56 �[44msource�[0m > 2022-02-09 12:19:56 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):81 - Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='input_state.json'}
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.r.CdcStateManager(<init>):26 - Initialized CDC state with: null
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.r.StateManager(createCursorInfoForStream):118 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='tbl', namespace='default'}, New Cursor Field: id. Resetting cursor value
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m r.y.c.ClickHouseDriver(<clinit>):49 - Driver registered
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.d.c.ClickhouseDestination(main):88 - starting destination: class io.airbyte.integrations.destination.clickhouse.ClickhouseDestination
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table _airbyte_raw_tbl column _airbyte_ab_id (type String[0]) -> Json type STRING
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table _airbyte_raw_tbl column _airbyte_data (type String[0]) -> Json type STRING
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table _airbyte_raw_tbl column _airbyte_emitted_at (type DateTime64(3, 'GMT')[38]) -> Json type STRING
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table _airbyte_raw_tbl column _airbyte_ab_id (type String[0]) -> Json type STRING
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table _airbyte_raw_tbl column _airbyte_data (type String[0]) -> Json type STRING
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table _airbyte_raw_tbl column _airbyte_emitted_at (type DateTime64(3, 'GMT')[38]) -> Json type STRING
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):136 - Table tbl column id (type Int32[11]) -> Json type NUMBER
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.r.AbstractRelationalDbSource(queryTableFullRefresh):35 - Queueing query for table: tbl
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.r.AbstractDbSource(lambda$read$2):123 - Closing database connection pool.
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.r.AbstractDbSource(lambda$read$2):125 - Closed database connection pool.
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):133 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-02-09 12:19:57 �[44msource�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.s.c.ClickHouseSource(main):112 - completed source: class io.airbyte.integrations.source.clickhouse.ClickHouseSource
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):76 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):80 - Command: WRITE
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):81 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-02-09 12:19:57 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):152 - One of source or destination thread complete. Waiting on the other.
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:57 �[43mdestination�[0m > 2022-02-09 12:19:57 �[33mWARN�[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m r.y.c.ClickHouseDriver(<clinit>):49 - Driver registered
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$toWriteConfig$0):96 - Write config: WriteConfig{streamName=tbl, namespace=null, outputSchemaName=test_db, tmpTableName=_airbyte_tmp_enk_tbl, outputTableName=_airbyte_raw_tbl, syncMode=overwrite}
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.b.BufferedStreamConsumer(startTracked):125 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):121 - Preparing tmp tables in destination started for 1 streams
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):125 - Preparing tmp table in destination started for stream tbl. schema: test_db, tmp table name: _airbyte_tmp_enk_tbl
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[33mWARN�[m r.y.c.ClickhouseJdbcUrlParser(parseUriQueryPart):84 - don't know how to handle parameter pair:
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[33mWARN�[m r.y.c.ClickhouseJdbcUrlParser(parseUriQueryPart):84 - don't know how to handle parameter pair:
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):131 - Preparing tables in destination completed.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):60 - Airbyte message consumer: succeeded.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.b.BufferedStreamConsumer(close):201 - executing on success close procedure.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.c.ClickhouseSqlOperations(insertRecordsInternal):68 - actual size of batch: 7
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):160 - Finalizing tables in destination started for 1 streams
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):165 - Finalizing stream tbl. schema test_db, tmp table _airbyte_tmp_enk_tbl, final table _airbyte_raw_tbl
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):178 - Executing finalization of tables.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):180 - Finalizing tables in destination completed.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):183 - Cleaning tmp tables in destination started for 1 streams
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):187 - Cleaning tmp table in destination started for stream tbl. schema test_db, tmp table name: _airbyte_tmp_enk_tbl
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):192 - Cleaning tmp tables in destination completed.
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.b.IntegrationRunner(run):133 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
2022-02-09 12:19:58 �[43mdestination�[0m > 2022-02-09 12:19:58 �[32mINFO�[m i.a.i.d.c.ClickhouseDestination(main):90 - completed destination: class io.airbyte.integrations.destination.clickhouse.ClickhouseDestination
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):154 - Source and destination threads complete.
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):217 - sync summary: io.airbyte.config.ReplicationAttemptSummary@1083e64c[status=completed,recordsSynced=7,bytesSynced=61,startTime=1644409193014,endTime=1644409198666,totalStats=io.airbyte.config.SyncStats@b7f89d6[recordsEmitted=7,bytesEmitted=61,stateMessagesEmitted=0,recordsCommitted=7],streamStats=[io.airbyte.config.StreamSyncStats@61b60ed9[streamName=tbl,stats=io.airbyte.config.SyncStats@15c891c4[recordsEmitted=7,bytesEmitted=61,stateMessagesEmitted=<null>,recordsCommitted=7]]]]
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.DefaultReplicationWorker(run):239 - Source did not output any state messages
2022-02-09 12:19:58 �[33mWARN�[m i.a.w.DefaultReplicationWorker(run):247 - State capture: No new state, falling back on input state: io.airbyte.config.State@1a76a4b7[state={}]
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$1):144 - sync summary: io.airbyte.config.StandardSyncOutput@426b093c[standardSyncSummary=io.airbyte.config.StandardSyncSummary@27b7c578[status=completed,recordsSynced=7,bytesSynced=61,startTime=1644409193014,endTime=1644409198666,totalStats=io.airbyte.config.SyncStats@b7f89d6[recordsEmitted=7,bytesEmitted=61,stateMessagesEmitted=0,recordsCommitted=7],streamStats=[io.airbyte.config.StreamSyncStats@61b60ed9[streamName=tbl,stats=io.airbyte.config.SyncStats@15c891c4[recordsEmitted=7,bytesEmitted=61,stateMessagesEmitted=<null>,recordsCommitted=7]]]],state=io.airbyte.config.State@1a76a4b7[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@3addc53b[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@76c4a880[stream=io.airbyte.protocol.models.AirbyteStream@6cfc7d8d[name=tbl,jsonSchema={"type":"object","properties":{"id":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[id],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]]
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.t.TemporalUtils(withBackgroundHeartbeat):234 - Stopping temporal heartbeating...
2022-02-09 12:19:58 �[32mINFO�[m i.a.c.p.ConfigRepository(updateConnectionState):545 - Updating connection 65d42549-8db0-4295-bf68-7fd73a00ffe7 state: io.airbyte.config.State@7d95393c[state={}]
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/11/2/logs.log
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.23-alpha
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.DefaultNormalizationWorker(run):46 - Running normalization.
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-clickhouse:0.1.66
2022-02-09 12:19:58 �[32mINFO�[m i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-clickhouse:0.1.66 exists...
2022-02-09 12:19:58 �[32mINFO�[m i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-clickhouse:0.1.66 was found locally.
2022-02-09 12:19:58 �[32mINFO�[m i.a.w.p.DockerProcessFactory(create):157 - Preparing command: docker run --rm --init -i -w /data/11/2/normalize --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/normalization-clickhouse:0.1.66 run --integration-type clickhouse --config destination_config.json --catalog destination_catalog.json
2022-02-09 12:20:01 �[42mnormalization�[0m > Running: transform-config --config destination_config.json --integration-type clickhouse --out /data/11/2/normalize
2022-02-09 12:20:01 �[42mnormalization�[0m > Namespace(config='destination_config.json', integration_type=<DestinationType.clickhouse: 'clickhouse'>, out='/data/11/2/normalize')
2022-02-09 12:20:01 �[42mnormalization�[0m > transform_clickhouse
2022-02-09 12:20:01 �[42mnormalization�[0m > Traceback (most recent call last):
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/bin/transform-config", line 8, in <module>
2022-02-09 12:20:01 �[42mnormalization�[0m > sys.exit(main())
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_config/transform.py", line 320, in main
2022-02-09 12:20:01 �[42mnormalization�[0m > TransformConfig().run(args)
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_config/transform.py", line 33, in run
2022-02-09 12:20:01 �[42mnormalization�[0m > transformed_config = self.transform(integration_type, original_config)
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_config/transform.py", line 62, in transform
2022-02-09 12:20:01 �[42mnormalization�[0m > transformed_integration_config = {
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_config/transform.py", line 282, in transform_clickhouse
2022-02-09 12:20:01 �[42mnormalization�[0m > "password": config["password"],
2022-02-09 12:20:01 �[42mnormalization�[0m > KeyError: 'password'
2022-02-09 12:20:01 �[42mnormalization�[0m > Running: transform-catalog --integration-type clickhouse --profile-config-dir /data/11/2/normalize --catalog destination_catalog.json --out /data/11/2/normalize/models/generated/ --json-column _airbyte_data
2022-02-09 12:20:01 �[42mnormalization�[0m > Traceback (most recent call last):
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/bin/transform-catalog", line 8, in <module>
2022-02-09 12:20:01 �[42mnormalization�[0m > sys.exit(main())
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_catalog/transform.py", line 82, in main
2022-02-09 12:20:01 �[42mnormalization�[0m > TransformCatalog().run(args)
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_catalog/transform.py", line 34, in run
2022-02-09 12:20:01 �[42mnormalization�[0m > self.parse(args)
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_catalog/transform.py", line 45, in parse
2022-02-09 12:20:01 �[42mnormalization�[0m > profiles_yml = read_profiles_yml(parsed_args.profile_config_dir)
2022-02-09 12:20:01 �[42mnormalization�[0m > File "/usr/local/lib/python3.8/site-packages/normalization/transform_catalog/transform.py", line 66, in read_profiles_yml
2022-02-09 12:20:01 �[42mnormalization�[0m > with open(os.path.join(profile_dir, "profiles.yml"), "r") as file:
2022-02-09 12:20:01 �[42mnormalization�[0m > FileNotFoundError: [Errno 2] No such file or directory: '/data/11/2/normalize/profiles.yml'
2022-02-09 12:20:01 �[42mnormalization�[0m >
2022-02-09 12:20:01 �[42mnormalization�[0m > Showing destination_catalog.json to diagnose/debug errors (1):
2022-02-09 12:20:01 �[42mnormalization�[0m >
2022-02-09 12:20:01 �[42mnormalization�[0m > {
2022-02-09 12:20:01 �[42mnormalization�[0m > "streams": [
2022-02-09 12:20:01 �[42mnormalization�[0m > {
2022-02-09 12:20:01 �[42mnormalization�[0m > "stream": {
2022-02-09 12:20:01 �[42mnormalization�[0m > "name": "tbl",
2022-02-09 12:20:01 �[42mnormalization�[0m > "json_schema": {
2022-02-09 12:20:01 �[42mnormalization�[0m > "type": "object",
2022-02-09 12:20:01 �[42mnormalization�[0m > "properties": {
2022-02-09 12:20:01 �[42mnormalization�[0m > "id": {
2022-02-09 12:20:01 �[42mnormalization�[0m > "type": "number"
2022-02-09 12:20:01 �[42mnormalization�[0m > }
2022-02-09 12:20:01 �[42mnormalization�[0m > }
2022-02-09 12:20:01 �[42mnormalization�[0m > },
2022-02-09 12:20:01 �[42mnormalization�[0m > "supported_sync_modes": [
2022-02-09 12:20:01 �[42mnormalization�[0m > "full_refresh",
2022-02-09 12:20:01 �[42mnormalization�[0m > "incremental"
2022-02-09 12:20:01 �[42mnormalization�[0m > ],
2022-02-09 12:20:01 �[42mnormalization�[0m > "default_cursor_field": [],
2022-02-09 12:20:01 �[42mnormalization�[0m > "source_defined_primary_key": [
2022-02-09 12:20:01 �[42mnormalization�[0m > [
2022-02-09 12:20:01 �[42mnormalization�[0m > "id"
2022-02-09 12:20:01 �[42mnormalization�[0m > ]
2022-02-09 12:20:01 �[42mnormalization�[0m > ]
2022-02-09 12:20:01 �[42mnormalization�[0m > },
2022-02-09 12:20:01 �[42mnormalization�[0m > "sync_mode": "full_refresh",
2022-02-09 12:20:01 �[42mnormalization�[0m > "cursor_field": [
2022-02-09 12:20:01 �[42mnormalization�[0m > "id"
2022-02-09 12:20:01 �[42mnormalization�[0m > ],
2022-02-09 12:20:01 �[42mnormalization�[0m > "destination_sync_mode": "overwrite",
2022-02-09 12:20:01 �[42mnormalization�[0m > "primary_key": [
2022-02-09 12:20:01 �[42mnormalization�[0m > [
2022-02-09 12:20:01 �[42mnormalization�[0m > "id"
2022-02-09 12:20:01 �[42mnormalization�[0m > ]
2022-02-09 12:20:01 �[42mnormalization�[0m > ]
2022-02-09 12:20:01 �[42mnormalization�[0m > }
2022-02-09 12:20:01 �[42mnormalization�[0m > ]
2022-02-09 12:20:01 �[42mnormalization�[0m > }
2022-02-09 12:20:01 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):158 - Completing future exceptionally...
io.airbyte.workers.WorkerException: Normalization Failed.
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:18) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
... 3 more
Suppressed: io.airbyte.workers.WorkerException: Normalization process wasn't successful
at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:159) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:45) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:18) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
2022-02-09 12:20:01 �[32mINFO�[m i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-02-09 12:20:01 �[32mINFO�[m i.a.w.t.TemporalUtils(withBackgroundHeartbeat):234 - Stopping temporal heartbeating...
2022-02-09 12:20:01 �[33mWARN�[m i.t.i.s.POJOActivityTaskHandler(activityFailureToResult):363 - Activity failure. ActivityId=e120d7a1-4466-3ddd-89c4-a7b603742a1a, activityType=Normalize, attempt=1
java.lang.RuntimeException: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Normalization Failed.
at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:232) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.normalize(NormalizationActivityImpl.java:71) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
at io.temporal.internal.sync.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:286) ~[temporal-sdk-1.6.0.jar:?]
at io.temporal.internal.sync.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:252) ~[temporal-sdk-1.6.0.jar:?]
at io.temporal.internal.sync.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:209) ~[temporal-sdk-1.6.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:193) ~[temporal-sdk-1.6.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:151) ~[temporal-sdk-1.6.0.jar:?]
at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:73) ~[temporal-sdk-1.6.0.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Normalization Failed.
at io.temporal.serviceclient.CheckedExceptionWrapper.wrap(CheckedExceptionWrapper.java:56) ~[temporal-serviceclient-1.6.0.jar:?]
at io.temporal.internal.sync.WorkflowInternal.wrap(WorkflowInternal.java:412) ~[temporal-sdk-1.6.0.jar:?]
at io.temporal.activity.Activity.wrap(Activity.java:51) ~[temporal-sdk-1.6.0.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:135) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$1(NormalizationActivityImpl.java:97) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:227) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
... 14 more
Caused by: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Normalization Failed.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:129) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$1(NormalizationActivityImpl.java:97) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:227) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
... 14 more
Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:18) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
... 1 more
Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:18) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
... 1 more
Suppressed: io.airbyte.workers.WorkerException: Normalization process wasn't successful
at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:159) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:45) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:18) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.23-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Steps to Reproduce
1. Create a ClickHouse source and a ClickHouse destination. I used the same local server with two different databases and a user with no password (see the setup sketch after these steps), and set up a connection between them.
2. In the connection settings, set Namespace Configuration to "Destination connector settings".
3. Enable Basic normalization.
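For reference, a minimal sketch of the passwordless local setup these steps assume, using the clickhouse-driver Python package. The database, table, and row count mirror what appears in the log (default.tbl with an Int32 id column and 7 rows, plus a test_db destination schema), but the helper itself is illustrative and not part of the original report:

```python
# Illustrative reproduction helper (not from the original report): prepares a
# local ClickHouse server whose "default" user has no password, matching the
# configuration that triggers the normalization failure.
from clickhouse_driver import Client  # pip install clickhouse-driver

# Connect over the native protocol; no password supplied on purpose.
client = Client(host="localhost")

# Destination database used as the output schema in the log (test_db).
client.execute("CREATE DATABASE IF NOT EXISTS test_db")

# Source table default.tbl with a single Int32 column, as discovered in the log.
client.execute(
    "CREATE TABLE IF NOT EXISTS default.tbl (id Int32) ENGINE = MergeTree ORDER BY id"
)

# Seed the 7 records that the sync summary reports.
client.execute("INSERT INTO default.tbl (id) VALUES", [(i,) for i in range(1, 8)])
```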
About the fix
The log makes the cause clear: the "password" field is missing from the destination config handed to normalization, so transform-config crashes in transform_clickhouse (transform.py, line 282) with KeyError: 'password'. A ClickHouse destination configured without a password therefore cannot be normalized. A defensive sketch of the kind of change that would avoid the crash follows.
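A minimal sketch of that defensive handling, assuming transform_clickhouse builds a dbt profile dict from the destination config. The surrounding keys are assumptions for illustration; the point is the password lookup, which uses .get() with a default instead of indexing:

```python
# Illustrative sketch, not the actual Airbyte code: tolerate a missing or empty
# "password" in the ClickHouse destination config instead of raising KeyError.
from typing import Any, Dict


def transform_clickhouse(config: Dict[str, Any]) -> Dict[str, Any]:
    # Field names below are assumptions for illustration; the crash in the log
    # comes from an unguarded config["password"] lookup.
    return {
        "type": "clickhouse",
        "host": config["host"],
        "port": config["port"],
        "schema": config["database"],
        "user": config["username"],
        # Defaulting to "" lets passwordless setups (like the one in this
        # issue) produce a valid dbt profile instead of crashing.
        "password": config.get("password", ""),
    }
```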
Environment
airbyte/normalization-clickhouse 0.1.66