
[source-amazon-ads] connector stuck after exception #56957

Open
@Vadym5

Description

Connector Name

source-amazon-ads

Connector Version

7.1.5

What step the error happened?

During the sync

Relevant information

The Amazon Ads source became unresponsive during synchronization: it received an error 425, attempted a retry, and then threw an exception. Since then no further attempts have been made, and the sync has remained stalled for over 10 hours.

Relevant log output

2025-04-01 13:46:59 source WARN Caught exception that stops the processing of the jobs: Max attempt reached for job in partition {'profileId': 3802314508801283, 'parent_slice': {}, 'start_time': '2025-03-25', 'end_time': '2025-04-01'}. Traceback: Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/async_job/job_orchestrator.py", line 432, in create_and_get_completed_partitions
    self._start_jobs()
  File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/async_job/job_orchestrator.py", line 202, in _start_jobs
    self._replace_failed_jobs(partition)
  File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/async_job/job_orchestrator.py", line 186, in _replace_failed_jobs
    partition.replace_job(job, [new_job])
  File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/async_job/job_orchestrator.py", line 66, in replace_job
    raise ValueError(f"Max attempt reached for job in partition {self._stream_slice}")
ValueError: Max attempt reached for job in partition {'profileId': 3802314508801283, 'parent_slice': {}, 'start_time': '2025-03-25', 'end_time': '2025-04-01'}
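For context, the traceback suggests the declarative async job orchestrator retries a failed partition job a fixed number of times and then raises, and that exception stops all job processing rather than failing the sync cleanly. The following is a minimal illustrative sketch of that retry-cap pattern only, not the actual airbyte_cdk implementation (the real logic lives in job_orchestrator.py per the traceback); names such as MAX_ATTEMPTS, PartitionJob, replace_failed_job, and run_partition_job are assumptions made for the example.

# Illustrative sketch of a retry cap that raises once a partition job has
# failed too many times, roughly mirroring the behaviour in the traceback.
# MAX_ATTEMPTS, PartitionJob, and run_partition_job are hypothetical names.

MAX_ATTEMPTS = 3


class PartitionJob:
    def __init__(self, stream_slice: dict):
        self.stream_slice = stream_slice
        self.attempts = 0

    def replace_failed_job(self) -> None:
        # Each retry consumes one attempt; once the budget is spent,
        # raise instead of scheduling another job.
        self.attempts += 1
        if self.attempts >= MAX_ATTEMPTS:
            raise ValueError(
                f"Max attempt reached for job in partition {self.stream_slice}"
            )


def run_partition_job(job: PartitionJob) -> None:
    while True:
        try:
            # Placeholder for the report request that kept returning error 425.
            raise RuntimeError("error 425 from the Amazon Ads reporting API")
        except RuntimeError:
            # On failure the job is replaced with a new attempt, until the cap
            # is hit and ValueError propagates, halting all job processing.
            job.replace_failed_job()


if __name__ == "__main__":
    slice_ = {"profileId": 3802314508801283, "parent_slice": {},
              "start_time": "2025-03-25", "end_time": "2025-04-01"}
    try:
        run_partition_job(PartitionJob(slice_))
    except ValueError as exc:
        print(f"Sync halted: {exc}")

In the sketch the ValueError propagates out of the loop and nothing resumes afterwards, which matches the observed behaviour of the connector staying stalled instead of retrying or marking the sync as failed.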

Contribute

  • Yes, I want to contribute
