Commit cefa020

update doc to reference poetry
1 parent 553c9b0 commit cefa020

15 files changed (+49, -60 lines)

docs/connector-development/config-based/tutorial/0-getting-started.md

Lines changed: 1 addition & 0 deletions
@@ -42,6 +42,7 @@ This can be done by signing up for the Free tier plan on [Exchange Rates Data AP
 
 - An Exchange Rates API key
 - Python >= 3.9
+- [Poetry](https://python-poetry.org/)
 - Docker must be running
 - NodeJS
 - [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1) CLI
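
A quick aside on the new Poetry requirement this commit introduces: Poetry is not installed from the repository itself. The commands below are common ways to install it, taken from Poetry's own documentation rather than from this commit, so treat them as a sketch:

```bash
# Official installer documented at python-poetry.org
curl -sSL https://install.python-poetry.org | python3 -

# Or, if you already manage CLI tools with pipx
pipx install poetry
```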

docs/connector-development/config-based/tutorial/2-install-dependencies.md

Lines changed: 2 additions & 10 deletions
@@ -1,25 +1,17 @@
 # Step 2: Install dependencies
 
-Let's create a python virtual environment for our source.
-You can do this by executing the following commands from the root of the Airbyte repository.
-
-The command below assume that `python` points to a version of python >=3.9.0. On some systems, `python` points to a Python2 installation and `python3` points to Python3.
-If this is the case on your machine, substitute the `python` commands with `python3`.
-The subsequent `python` invocations will use the virtual environment created for the connector.
 
 ```bash
 cd ../../connectors/source-exchange-rates-tutorial
-python -m venv .venv
-source .venv/bin/activate
-pip install -r requirements.txt
+poetry install
 ```
 
 These steps create an initial python environment, and install the dependencies required to run an API Source connector.
 
 Let's verify everything works as expected by running the Airbyte `spec` operation:
 
 ```bash
-python main.py spec
+poetry run source-exchange-rates-tutorial spec
 ```
 
 You should see an output similar to the one below:
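
Putting the hunks above together, the updated install step boils down to the following workflow (a sketch; it assumes Poetry is installed and that the generated connector's `pyproject.toml` exposes a `source-exchange-rates-tutorial` script):

```bash
# Install the connector's dependencies into a Poetry-managed environment
cd airbyte-integrations/connectors/source-exchange-rates-tutorial
poetry install

# Sanity-check the install by running the Airbyte spec operation
poetry run source-exchange-rates-tutorial spec
```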

docs/connector-development/config-based/tutorial/3-connecting-to-the-API-source.md

Lines changed: 1 addition & 1 deletion
@@ -200,7 +200,7 @@ spec:
 We can now run the `check` operation, which verifies the connector can connect to the API source.
 
 ```bash
-python main.py check --config secrets/config.json
+poetry run source-exchange-rates-tutorial check --config secrets/config.json
 ```
 
 which should now succeed with logs similar to:

docs/connector-development/config-based/tutorial/4-reading-data.md

Lines changed: 2 additions & 2 deletions
@@ -44,7 +44,7 @@ As an alternative to storing the stream's data schema to the `schemas/` director
 Reading from the source can be done by running the `read` operation
 
 ```bash
-python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
+poetry run source-exchange-rates-tutorial read --config secrets/config.json --catalog integration_tests/configured_catalog.json
 ```
 
 The logs should show that 1 record was read from the stream.

@@ -57,7 +57,7 @@ The logs should show that 1 record was read from the stream.
 The `--debug` flag can be set to print out debug information, including the outgoing request and its associated response
 
 ```bash
-python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json --debug
+poetry run source-exchange-rates-tutorial read --config secrets/config.json --catalog integration_tests/configured_catalog.json --debug
 ```
 
 ## Next steps

docs/connector-development/config-based/tutorial/5-incremental-reads.md

Lines changed: 3 additions & 3 deletions
@@ -76,7 +76,7 @@ definitions:
 You can test these changes by executing the `read` operation:
 
 ```bash
-python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
+poetry run source-exchange-rates-tutorial read --config secrets/config.json --catalog integration_tests/configured_catalog.json
 ```
 
 By reading the output record, you should see that we read historical data instead of the latest exchange rate.

@@ -240,7 +240,7 @@ spec:
 Running the `read` operation will now read all data for all days between start_date and now:
 
 ```bash
-python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
+poetry run source-exchange-rates-tutorial read --config secrets/config.json --catalog integration_tests/configured_catalog.json
 ```
 
 The operation should now output more than one record:

@@ -295,7 +295,7 @@ We can simulate incremental syncs by creating a state file containing the last s
 Running the `read` operation will now only read data for dates later than the given state:
 
 ```bash
-python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json --state integration_tests/sample_state.json
+poetry run source-exchange-rates-tutorial read --config secrets/config.json --catalog integration_tests/configured_catalog.json --state integration_tests/sample_state.json
 ```
 
 There shouldn't be any data read if the state is today's date:
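
To make the last hunk concrete: simulating an incremental sync means writing a small state file and passing it via `--state`. The sketch below assumes the tutorial's stream is named `rates` with a `date` cursor and uses the legacy state shape; none of that is defined by this commit:

```bash
# Hypothetical state file: pretend the previous sync already read data up to 2022-07-15
echo '{"rates": {"date": "2022-07-15"}}' > integration_tests/sample_state.json

# Read only the dates after the saved state
poetry run source-exchange-rates-tutorial read \
  --config secrets/config.json \
  --catalog integration_tests/configured_catalog.json \
  --state integration_tests/sample_state.json
```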

docs/connector-development/config-based/tutorial/6-testing.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ and `integration_tests/abnormal_state.json` with
 You can run the [acceptance tests](https://github.com/airbytehq/airbyte/blob/master/docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md#L1) with the following commands using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1):
 
 ```bash
-airbyte-ci connectors --use-remote-secrets=false --name source-exchange-rates-tutorial test
+airbyte-ci connectors --use-remote-secrets=false --name source-exchange-rates-tutorial test --only-step=acceptance
 ```
 
 ## Next steps:

docs/connector-development/testing-connectors/README.md

Lines changed: 2 additions & 2 deletions
@@ -15,8 +15,8 @@ Unit and integration tests can be run directly from the connector code.
 
 Using `pytest` for Python connectors:
 ```bash
-python -m pytest unit_tests/
-python -m pytest integration_tests/
+poetry run pytest tests/unit_tests/
+poetry run pytest tests/integration_tests/
 ```
 
 Using `gradle` for Java connectors:
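
Since the tests now run through Poetry, pytest's usual selection flags still apply. A sketch of narrowing a run (the `test_source.py` module and the `-k` expression are hypothetical examples, not files touched by this commit):

```bash
# Run a single unit-test module with stdout capturing disabled
poetry run pytest -s tests/unit_tests/test_source.py

# Run only tests whose names match a keyword expression
poetry run pytest tests/unit_tests -k "check_connection"
```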

docs/connector-development/tutorials/building-a-python-source.md

Lines changed: 14 additions & 15 deletions
@@ -64,18 +64,16 @@ $ ./generate.sh
 
 Select the `python` template and then input the name of your connector. For this walk through we will refer to our source as `example-python`
 
-### Step 2: Build the newly generated source
+### Step 2: Install the newly generated source
 
-Build the source by running:
+Install the source by running:
 
 ```bash
 cd airbyte-integrations/connectors/source-<name>
-python -m venv .venv # Create a virtual environment in the .venv directory
-source .venv/bin/activate # enable the venv
-pip install -r requirements.txt
+poetry install
 ```
 
-This step sets up the initial python environment. **All** subsequent `python` or `pip` commands assume you have activated your virtual environment.
+This step sets up the initial python environment.
 
 ### Step 3: Set up your Airbyte development environment
 

@@ -112,10 +110,10 @@ You'll notice in your source's directory that there is a python file called `mai
 
 ```bash
 # from airbyte-integrations/connectors/source-<source-name>
-python main.py spec
-python main.py check --config secrets/config.json
-python main.py discover --config secrets/config.json
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json
+poetry run source-<source-name> spec
+poetry run source-<source-name> check --config secrets/config.json
+poetry run source-<source-name> discover --config secrets/config.json
+poetry run source-<source-name> read --config secrets/config.json --catalog sample_files/configured_catalog.json
 ```
 
 The nice thing about this approach is that you can iterate completely within python. The downside is that you are not quite running your source as it will actually be run by Airbyte. Specifically, you're not running it from within the docker container that will house it.

@@ -182,7 +180,7 @@ The nice thing about this approach is that you are running your source exactly a
 During development of your connector, you can enable the printing of detailed debug information during a sync by specifying the `--debug` flag. This will allow you to get a better picture of what is happening during each step of your sync.
 
 ```bash
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json --debug
+poetry run source-<source-name> read --config secrets/config.json --catalog sample_files/configured_catalog.json --debug
 ```
 
 In addition to the preset CDK debug statements, you can also emit custom debug information from your connector by introducing your own debug statements:

@@ -233,7 +231,8 @@ As described in the template code, this method takes in the same config object a
 
 The Connector Acceptance Tests are a set of tests that run against all sources. These tests are run in the Airbyte CI to prevent regressions. They also can help you sanity check that your source works as expected. The following [article](../testing-connectors/connector-acceptance-tests-reference.md) explains Connector Acceptance Tests and how to run them.
 
-You can run the tests using `./gradlew :airbyte-integrations:connectors:source-<source-name>:integrationTest`. Make sure to run this command from the Airbyte repository root.
+You can run the tests using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
+`airbyte-ci connectors --name source-<source-name> test --only-step=acceptance`
 
 :::info
 In some rare cases we make exceptions and allow a source to not need to pass all the standard tests. If for some reason you think your source cannot reasonably pass one of the tests cases, reach out to us on github or slack, and we can determine whether there's a change we can make so that the test will pass or if we should skip that test for your source.

@@ -245,15 +244,15 @@ The connector acceptance tests are meant to cover the basic functionality of a s
 
 #### Unit Tests
 
-Add any relevant unit tests to the `unit_tests` directory. Unit tests should _not_ depend on any secrets.
+Add any relevant unit tests to the `tests/unit_tests` directory. Unit tests should _not_ depend on any secrets.
 
-You can run the tests using `python -m pytest -s unit_tests`
+You can run the tests using `poetry run pytest tests/unit_tests`
 
 #### Integration Tests
 
 Place any integration tests in the `integration_tests` directory such that they can be [discovered by pytest](https://docs.pytest.org/en/6.2.x/goodpractices.html#conventions-for-python-test-discovery).
 
-Run integration tests using `python -m pytest -s integration_tests`.
+You can run the tests using `poetry run pytest tests/integration_tests`
 
 ### Step 10: Update the `README.md`
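
For concreteness, with the `example-python` name used earlier in this guide, the `<source-name>` placeholders above would expand roughly as follows (a sketch; it assumes the generator produced a `source-example-python` package whose `pyproject.toml` declares a matching script):

```bash
cd airbyte-integrations/connectors/source-example-python
poetry install

# The four standard Airbyte operations, run through the Poetry-managed entry point
poetry run source-example-python spec
poetry run source-example-python check --config secrets/config.json
poetry run source-example-python discover --config secrets/config.json
poetry run source-example-python read --config secrets/config.json --catalog sample_files/configured_catalog.json
```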

docs/connector-development/tutorials/cdk-speedrun.md

Lines changed: 4 additions & 5 deletions
@@ -11,6 +11,7 @@ If you are a visual learner and want to see a video version of this guide going
 ## Dependencies
 
 1. Python >= 3.9
+2. [Poetry](https://python-poetry.org/)
 2. Docker
 3. NodeJS
 

@@ -30,9 +31,7 @@ Select the `Python HTTP API Source` and name it `python-http-example`.
 
 ```bash
 cd ../../connectors/source-python-http-example
-python -m venv .venv # Create a virtual environment in the .venv directory
-source .venv/bin/activate
-pip install -r requirements.txt
+poetry install
 ```
 
 ### Define Connector Inputs

@@ -169,7 +168,7 @@ This file defines your output schema for every endpoint that you want to impleme
 Test your discover function. You should receive a fairly large JSON object in return.
 
 ```bash
-python main.py discover --config sample_files/config.json
+poetry run source-python-http-example discover --config sample_files/config.json
 ```
 
 Note that our discover function is using the `pokemon_name` config variable passed in from the `Pokemon` stream when we set it in the `__init__` function.

@@ -226,7 +225,7 @@ We now need a catalog that defines all of our streams. We only have one stream:
 Let's read some data.
 
 ```bash
-python main.py read --config sample_files/config.json --catalog sample_files/configured_catalog.json
+poetry run source-python-http-example read --config sample_files/config.json --catalog sample_files/configured_catalog.json
 ```
 
 If all goes well, containerize it so you can use it in the UI:

docs/connector-development/tutorials/cdk-tutorial-python-http/connection-checking.md

Lines changed: 4 additions & 4 deletions
@@ -46,17 +46,17 @@ Let's test out this implementation by creating two objects: a valid and an inval
 mkdir sample_files
 echo '{"start_date": "2022-04-01", "base": "USD", "apikey": <your_apikey>}' > secrets/config.json
 echo '{"start_date": "2022-04-01", "base": "BTC", "apikey": <your_apikey>}' > secrets/invalid_config.json
-python main.py check --config secrets/config.json
-python main.py check --config secrets/invalid_config.json
+poetry run source-python-http-example check --config secrets/config.json
+poetry run source-python-http-example check --config secrets/invalid_config.json
 ```
 
 You should see output like the following:
 
 ```text
-> python main.py check --config secrets/config.json
+> poetry run source-python-http-example check --config secrets/config.json
 {"type": "CONNECTION_STATUS", "connectionStatus": {"status": "SUCCEEDED"}}
 
-> python main.py check --config secrets/invalid_config.json
+> poetry run source-python-http-example check --config secrets/invalid_config.json
 {"type": "CONNECTION_STATUS", "connectionStatus": {"status": "FAILED", "message": "Input currency BTC is invalid. Please input one of the following currencies: {'DKK', 'USD', 'CZK', 'BGN', 'JPY'}"}}
 ```
 
docs/connector-development/tutorials/cdk-tutorial-python-http/declare-schema.md

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -63,7 +63,7 @@ Having created this stream in code, we'll put a file `exchange_rates.json` in th
6363
With `.json` schema file in place, let's see if the connector can now find this schema and produce a valid catalog:
6464

6565
```text
66-
python main.py discover --config secrets/config.json # this is not a mistake, the schema file is found by naming snake_case naming convention as specified above
66+
poetry run source-python-http-example discover --config secrets/config.json # this is not a mistake, the schema file is found by naming snake_case naming convention as specified above
6767
```
6868

6969
you should see some output like:

docs/connector-development/tutorials/cdk-tutorial-python-http/getting-started.md

Lines changed: 1 addition & 0 deletions
@@ -7,6 +7,7 @@ This is a step-by-step guide for how to create an Airbyte source in Python to re
 ## Requirements
 
 * Python >= 3.9
+* [Poetry](https://python-poetry.org/)
 * Docker
 * NodeJS \(only used to generate the connector\). We'll remove the NodeJS dependency soon.
 
docs/connector-development/tutorials/cdk-tutorial-python-http/install-dependencies.md

Lines changed: 6 additions & 9 deletions
@@ -4,17 +4,14 @@ Now that you've generated the module, let's navigate to its directory and instal
 
 ```text
 cd ../../connectors/source-<name>
-python -m venv .venv # Create a virtual environment in the .venv directory
-source .venv/bin/activate # enable the venv
-pip install -r requirements.txt
+poetry install
 ```
 
-This step sets up the initial python environment. **All** subsequent `python` or `pip` commands assume you have activated your virtual environment.
 
 Let's verify everything is working as intended. Run:
 
 ```text
-python main.py spec
+poetry run source-<name> spec
 ```
 
 You should see some output:

@@ -49,10 +46,10 @@ You'll notice in your source's directory that there is a python file called `mai
 
 ```text
 # from airbyte-integrations/connectors/source-<name>
-python main.py spec
-python main.py check --config secrets/config.json
-python main.py discover --config secrets/config.json
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json
+poetry run source-<name> spec
+poetry run source-<name> check --config secrets/config.json
+poetry run source-<name> discover --config secrets/config.json
+poetry run source-<name> read --config secrets/config.json --catalog sample_files/configured_catalog.json
 ```
 
 The nice thing about this approach is that you can iterate completely within python. The downside is that you are not quite running your source as it will actually be run by Airbyte. Specifically, you're not running it from within the docker container that will house it.

docs/connector-development/tutorials/cdk-tutorial-python-http/read-data.md

Lines changed: 4 additions & 4 deletions
@@ -108,7 +108,7 @@ We're now ready to query the API!
 To do this, we'll need a [ConfiguredCatalog](../../../understanding-airbyte/beginners-guide-to-catalog.md). We've prepared one [here](https://github.com/airbytehq/airbyte/blob/master/docs/connector-development/tutorials/cdk-tutorial-python-http/configured_catalog.json) -- download this and place it in `sample_files/configured_catalog.json`. Then run:
 
 ```text
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json
+poetry run source-<name> read --config secrets/config.json --catalog sample_files/configured_catalog.json
 ```
 
 you should see some output lines, one of which is a record from the API:

@@ -240,17 +240,17 @@ We should now have a working implementation of incremental sync!
 Let's try it out:
 
 ```text
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json
+poetry run source-<name> read --config secrets/config.json --catalog sample_files/configured_catalog.json
 ```
 
 You should see a bunch of `RECORD` messages and `STATE` messages. To verify that incremental sync is working, pass the input state back to the connector and run it again:
 
 ```text
 # Save the latest state to sample_files/state.json
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json | grep STATE | tail -n 1 | jq .state.data > sample_files/state.json
+poetry run source-<name> read --config secrets/config.json --catalog sample_files/configured_catalog.json | grep STATE | tail -n 1 | jq .state.data > sample_files/state.json
 
 # Run a read operation with the latest state message
-python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json --state sample_files/state.json
+poetry run source-<name> read --config secrets/config.json --catalog sample_files/configured_catalog.json --state sample_files/state.json
 ```
 
 You should see that only the record from the last date is being synced! This is acceptable behavior, since Airbyte requires at-least-once delivery of records, so repeating the last record twice is OK.
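
As a side note on inspecting output, the same pipeline style used above for `STATE` messages works for records too (a sketch following this tutorial's conventions; the `jq` filter is not part of this commit):

```bash
# Print just the record payloads emitted by the read operation
poetry run source-<name> read --config secrets/config.json --catalog sample_files/configured_catalog.json \
  | grep RECORD \
  | jq .record.data
```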

docs/connector-development/tutorials/cdk-tutorial-python-http/test-your-connector.md

Lines changed: 3 additions & 3 deletions
@@ -2,15 +2,15 @@
 
 ## Unit Tests
 
-Add any relevant unit tests to the `unit_tests` directory. Unit tests should **not** depend on any secrets.
+Add any relevant unit tests to the `tests/unit_tests` directory. Unit tests should **not** depend on any secrets.
 
-You can run the tests using `python -m pytest -s unit_tests`.
+You can run the tests using `poetry run pytest tests/unit_tests`.
 
 ## Integration Tests
 
 Place any integration tests in the `integration_tests` directory such that they can be [discovered by pytest](https://docs.pytest.org/en/6.2.x/goodpractices.html#conventions-for-python-test-discovery).
 
-Run integration tests using `python -m pytest -s integration_tests`.
+You can run the tests using `poetry run pytest tests/integration_tests`.
 
 More information on integration testing can be found on [the Testing Connectors doc](https://docs.airbyte.com/connector-development/testing-connectors/#running-integration-tests).
 