Commit bb33e90

destination-aws-datalake: [autopull] base image + poetry + up_to_date (#38413)
1 parent fd3abf8 commit bb33e90

File tree: 4 files changed (+59, -10 lines)


airbyte-integrations/connectors/destination-aws-datalake/README.md

+55 -7
@@ -55,22 +55,70 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
 
 ### Locally running the connector docker image
 
-#### Build
 
-**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**
+
+#### Use `airbyte-ci` to build your connector
+The Airbyte way of building this connector is to use our `airbyte-ci` tool.
+You can follow the install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
+Then running the following command will build your connector:
 
 ```bash
-airbyte-ci connectors --name=destination-aws-datalake build
+airbyte-ci connectors --name destination-aws-datalake build
+```
+Once the command is done, you will find your connector image in your local Docker registry: `airbyte/destination-aws-datalake:dev`.
+
+##### Customizing our build process
+When contributing to our connector you might need to customize the build process to add a system dependency or set an env var.
+You can customize our build process by adding a `build_customization.py` module to your connector.
+This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container, respectively.
+It will be imported at runtime by our build process, and the functions will be called if they exist.
+
+Here is an example of a `build_customization.py` module:
+```python
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+    # Feel free to check the dagger documentation for more information on the Container object and its methods.
+    # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/
+    from dagger import Container
+
+
+async def pre_connector_install(base_image_container: Container) -> Container:
+    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")
+
+async def post_connector_install(connector_container: Container) -> Container:
+    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
 ```
 
-An image will be built with the tag `airbyte/destination-aws-datalake:dev`.
+#### Build your own connector image
+This connector is built using our dynamic build process in `airbyte-ci`.
+The base image used to build it is defined within the metadata.yaml file under `connectorBuildOptions`.
+The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
+It does not rely on a Dockerfile.
+
+If you would like to patch our connector and build your own, a simple approach would be to:
+
+1. Create your own Dockerfile based on the latest version of the connector image.
+```Dockerfile
+FROM airbyte/destination-aws-datalake:latest
 
-**Via `docker build`:**
+COPY . ./airbyte/integration_code
+RUN pip install ./airbyte/integration_code
 
+# The entrypoint and default env vars are already set in the base image
+# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
+# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
+```
+Please use this as an example. This is not optimized.
+
+2. Build your image:
 ```bash
 docker build -t airbyte/destination-aws-datalake:dev .
+# Running the spec command against your patched connector
+docker run airbyte/destination-aws-datalake:dev spec
 ```
-
 #### Run
 
 Then run any of the connector commands as follows:
@@ -114,4 +162,4 @@ You've checked out the repo, implemented a million dollar feature, and you're re
 4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/destinations/aws-datalake.md`).
 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
 6. Pat yourself on the back for being an awesome contributor.
-7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
+7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
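The hook mechanism the README text describes ("it will be imported at runtime by our build process and the functions will be called if they exist") can be sketched as follows. This is an editorial illustration of that contract, not the actual `airbyte-ci` loader; only the module and hook names (`build_customization.py`, `pre_connector_install`, `post_connector_install`) come from the README.

```python
# Hypothetical sketch of the optional-hook contract described in the README:
# import build_customization.py if it exists and call a named async hook.
# This is NOT the real airbyte-ci implementation.
import importlib.util
from pathlib import Path


async def apply_hook(module_path: Path, hook_name: str, container):
    """Return `container` transformed by the hook, or unchanged if absent."""
    if not module_path.exists():
        # No build_customization.py next to the connector: nothing to do.
        return container
    spec = importlib.util.spec_from_file_location("build_customization", module_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    hook = getattr(module, hook_name, None)
    if hook is None:
        # The module exists but this particular hook was not defined.
        return container
    return await hook(container)
```

In the real pipeline the `container` argument would be a Dagger `Container`; here it can be any object the hook knows how to transform.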

airbyte-integrations/connectors/destination-aws-datalake/metadata.yaml

+2 -2
@@ -3,8 +3,8 @@ data:
   connectorType: destination
   definitionId: 99878c90-0fbd-46d3-9d98-ffde879d17fc
   connectorBuildOptions:
-    baseImage: docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c
-  dockerImageTag: 0.1.7
+    baseImage: docker.io/airbyte/python-connector-base:1.2.0@sha256:c22a9d97464b69d6ef01898edf3f8612dc11614f05a84984451dde195f337db9
+  dockerImageTag: 0.1.8
   dockerRepository: airbyte/destination-aws-datalake
   githubIssueLabel: destination-aws-datalake
   icon: awsdatalake.svg
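The `baseImage` entries above pin the base image by content digest (`name:tag@sha256:…`), so the bump from 1.1.0 to 1.2.0 changes both the tag and the digest. As a small illustrative sketch (not part of the commit), such a pinned reference can be split into its parts:

```python
# Editorial sketch: split a digest-pinned image reference such as
#   docker.io/airbyte/python-connector-base:1.2.0@sha256:c22a9d97...
# into name, tag, and digest. Assumes the registry host has no port
# (a port would add another ':' to the name part).
def parse_pinned_image(ref: str) -> dict:
    name_tag, _, digest = ref.partition("@")
    name, _, tag = name_tag.rpartition(":")
    return {"name": name, "tag": tag, "digest": digest}
```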

airbyte-integrations/connectors/destination-aws-datalake/pyproject.toml

+1 -1
@@ -3,7 +3,7 @@ requires = [ "poetry-core>=1.0.0",]
 build-backend = "poetry.core.masonry.api"
 
 [tool.poetry]
-version = "0.1.7"
+version = "0.1.8"
 name = "destination-aws-datalake"
 description = "Destination Implementation for AWS Datalake."
 authors = [ "Airbyte <[email protected]>",]

docs/integrations/destinations/aws-datalake.md

+1
@@ -90,6 +90,7 @@ which will be translated for compatibility with the Glue Data Catalog:
 
 | Version | Date       | Pull Request                                              | Subject                                          |
 | :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------------- |
+| `0.1.8` | 2024-05-20 | [#38413](https://github.com/airbytehq/airbyte/pull/38413) | [autopull] base image + poetry + up_to_date      |
 | `0.1.7` | 2024-04-29 | [#33853](https://github.com/airbytehq/airbyte/pull/33853) | Enable STS Role Credential Refresh for Long Sync |
 | `0.1.6` | 2024-03-22 | [#36386](https://github.com/airbytehq/airbyte/pull/36386) | Support new state message protocol               |
 | `0.1.5` | 2024-01-03 | [#33924](https://github.com/airbytehq/airbyte/pull/33924) | Add new ap-southeast-3 AWS region                |
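The changelog rows above are pipe-delimited markdown table rows, newest version first. As an editorial sketch (not part of the commit), a row can be split into its four cells like this:

```python
# Editorial sketch: split a markdown changelog row such as
#   | 0.1.8 | 2024-05-20 | [#38413](https://...) | [autopull] ... |
# into its (version, date, pull request, subject) cells.
# Assumes no literal '|' characters inside a cell.
def parse_changelog_row(row: str) -> tuple:
    cells = [cell.strip() for cell in row.strip().strip("|").split("|")]
    return tuple(cells)
```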
