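For context, the image referenced below is produced by Airbyte's `airbyte-ci` connector build tool; a minimal sketch of the build invocation, assuming you run it from the root of the `airbyte` monorepo:

```bash
# Build the connector image with the standard airbyte-ci workflow
airbyte-ci connectors --name=destination-aws-datalake build
```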
Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-aws-datalake:dev`.
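To confirm the image is present locally, you can list it with docker:

```bash
docker images airbyte/destination-aws-datalake
```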
##### Customizing our build process
When contributing to our connector, you might need to customize the build process, for example to add a system dependency or to set an environment variable.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain `pre_connector_install` and `post_connector_install` async functions that mutate the base image and the connector container, respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.
Here is an example of a `build_customization.py` module:
```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Feel free to check the dagger documentation for more information on the Container object and its methods.
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Mutate the base image before the connector is installed
    # (e.g. install a system dependency or set an env var).
    return base_image_container


async def post_connector_install(connector_container: Container) -> Container:
    # Mutate the connector container after the connector is installed.
    return connector_container
```
An image will be built with the tag `airbyte/destination-aws-datalake:dev`.
#### Build your own connector image
This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the `metadata.yaml` file, under the `connectorBuildOptions` section.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
It does not rely on a Dockerfile.
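To see which base image your checkout pins, you can inspect the file directly; the path below assumes the standard connector layout in the `airbyte` monorepo:

```bash
# Print the connectorBuildOptions section and the line after it (typically the baseImage pin)
grep -A1 connectorBuildOptions airbyte-integrations/connectors/destination-aws-datalake/metadata.yaml
```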
If you would like to patch our connector and build your own, a simple approach would be to:
1. Create your own Dockerfile based on the latest version of the connector image.
```Dockerfile
FROM airbyte/destination-aws-datalake:latest
COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code
# The entrypoint and default env vars are already set in the base image
```

2. Build your image:

```bash
docker build -t airbyte/destination-aws-datalake:dev .
# Running the spec command against your patched connector
docker run airbyte/destination-aws-datalake:dev spec
```
#### Run
Then run any of the connector commands as follows:
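For example, assuming you keep a valid connector config at `secrets/config.json` and a configured catalog at `integration_tests/configured_catalog.json` (both paths are illustrative):

```bash
docker run --rm airbyte/destination-aws-datalake:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-aws-datalake:dev check --config /secrets/config.json
# Destinations read Airbyte messages on stdin, so pipe a file of records into the write command
cat integration_tests/messages.jsonl | docker run --rm -i -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-aws-datalake:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```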
#### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world with a pull request. Here's how you can do that:

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-aws-datalake test`.
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/destinations/aws-datalake.md`).
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.