Commit d453842

Merge pull request #2 from artefactory/bigquery

create new bigquery source in python

2 parents 806974c + 4fdb65d

46 files changed: +1990 −1711 lines
airbyte-integrations/connectors/source-bigquery/README.md

Lines changed: 99 additions & 15 deletions

@@ -1,21 +1,105 @@
-# BigQuery Test Configuration
+# Bigquery Source
 
-In order to test the BigQuery source, you need a service account key file.
+This is the repository for the Bigquery source connector, written in Python.
+For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/bigquery).
 
-## Community Contributor
+## Local development
 
-As a community contributor, you will need access to a GCP project and BigQuery to run tests.
+### Prerequisites
 
-1. Go to the `Service Accounts` page on the GCP console
-1. Click on the `+ Create Service Account` button
-1. Fill out a descriptive name/id/description
-1. Click the edit icon next to the service account you created on the `IAM` page
-1. Add the `BigQuery Data Editor` and `BigQuery User` roles
-1. Go back to the `Service Accounts` page and use the actions modal to `Create Key`
-1. Download this key as a JSON file
-1. Move and rename this file to `secrets/credentials.json`
+* Python (`^3.9`)
+* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)
 
-## Airbyte Employee
 
-1. Access the `BigQuery Integration Test User` secret on Rippling under the `Engineering` folder
-1. Create a file with the contents at `secrets/credentials.json`
+### Installing the connector
+
+From this connector directory, run:
+```bash
+poetry install --with dev
+```
+
+### Create credentials
+
+**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/bigquery)
+to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `src/source_bigquery/spec.yaml` file.
+Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
+See `sample_files/sample_config.json` for a sample config file.
+
+### Locally running the connector
+
+```bash
+poetry run source-bigquery spec
+poetry run source-bigquery check --config secrets/config.json
+poetry run source-bigquery discover --config secrets/config.json
+poetry run source-bigquery read --config secrets/config.json --catalog sample_files/configured_catalog.json
+```
+
+### Running tests
+
+To run tests locally, from the connector directory run:
+
+```bash
+poetry run pytest tests
+```
+
+### Building the docker image
+
+1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
+2. Run the following command to build the docker image:
+```bash
+airbyte-ci connectors --name=source-bigquery build
+```
+
+An image will be available on your host with the tag `airbyte/source-bigquery:dev`.
+
+### Running as a docker container
+
+Then run any of the connector commands as follows:
+```bash
+docker run --rm airbyte/source-bigquery:dev spec
+docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bigquery:dev check --config /secrets/config.json
+docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bigquery:dev discover --config /secrets/config.json
+docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-bigquery:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
+```
+
+### Running our CI test suite
+
+You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
+
+```bash
+airbyte-ci connectors --name=source-bigquery test
+```
+
+### Customizing acceptance tests
+
+Customize the `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
+If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
+
+### Dependency Management
+
+All of your dependencies should be managed via Poetry.
+To add a new dependency, run:
+
+```bash
+poetry add <package-name>
+```
+
+Please commit the changes to the `pyproject.toml` and `poetry.lock` files.
+
+## Publishing a new version of the connector
+
+You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
+1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-bigquery test`
+2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
+   - bump the `dockerImageTag` value in `metadata.yaml`
+   - bump the `version` value in `pyproject.toml`
+3. Make sure the `metadata.yaml` content is up to date.
+4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/bigquery.md`).
+5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
+6. Pat yourself on the back for being an awesome contributor.
+7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
+8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
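
The README's "Create credentials" step above says `secrets/config.json` must conform to `src/source_bigquery/spec.yaml`, which is not included in this diff. A minimal sketch of what such a config file might look like — the field names here are assumptions for illustration, not taken from the actual spec:

```json
{
  "project_id": "my-gcp-project",
  "dataset_id": "my_dataset",
  "credentials_json": "<contents of the service account key JSON>"
}
```

Consult `sample_files/sample_config.json` and `spec.yaml` in the repository for the authoritative field names.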
airbyte-integrations/connectors/source-bigquery/acceptance-test-config.yml

Lines changed: 37 additions & 0 deletions

@@ -0,0 +1,37 @@
+# See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference)
+# for more information about how to configure these tests
+connector_image: airbyte/source-bigquery:dev
+acceptance_tests:
+  spec:
+    tests:
+      - spec_path: "source_bigquery/spec.yaml"
+  connection:
+    tests:
+      - config_path: "secrets/config.json"
+        status: "succeed"
+      - config_path: "integration_tests/invalid_config.json"
+        status: "failed"
+  discovery:
+    tests:
+      - config_path: "secrets/config.json"
+  basic_read:
+    tests:
+      - config_path: "secrets/config.json"
+        configured_catalog_path: "integration_tests/configured_catalog.json"
+        empty_streams: []
+        # TODO uncomment this block to specify that the tests should assert the connector outputs the records provided in the input file
+        # expect_records:
+        #   path: "integration_tests/expected_records.jsonl"
+        #   exact_order: no
+  incremental:
+    bypass_reason: "This connector does not implement incremental sync"
+    # TODO uncomment this block if your connector implements incremental sync:
+    # tests:
+    #   - config_path: "secrets/config.json"
+    #     configured_catalog_path: "integration_tests/configured_catalog.json"
+    #     future_state:
+    #       future_state_path: "integration_tests/abnormal_state.json"
+  full_refresh:
+    tests:
+      - config_path: "secrets/config.json"
+        configured_catalog_path: "integration_tests/configured_catalog.json"
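
The README's acceptance-test section says that fixtures which create or destroy test resources belong in `integration_tests/acceptance.py`. A minimal sketch of such a fixture, following the common Airbyte connector template (the `connector_setup` name matches that template; the setup and teardown bodies are placeholders, not the connector's actual logic):

```python
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need,
    then clean them up once the test session ends."""
    # setup: e.g. provision a temporary test dataset (placeholder)
    yield
    # teardown: e.g. drop the temporary test dataset (placeholder)
```

Because it is `autouse` and session-scoped, the acceptance-test runner triggers it once before the first test and runs the teardown after the last one.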

airbyte-integrations/connectors/source-bigquery/build.gradle

Lines changed: 0 additions & 35 deletions
This file was deleted.

airbyte-integrations/connectors/source-bigquery/icon.svg

Lines changed: 0 additions & 1 deletion
This file was deleted.

airbyte-integrations/connectors/source-bigquery/integration_tests/README.md

Lines changed: 0 additions & 3 deletions
This file was deleted.
Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
+#
+# Copyright (c) 2024 Airbyte, Inc., all rights reserved.
+#
airbyte-integrations/connectors/source-bigquery/integration_tests/abnormal_state.json

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+{
+  "todo-stream-name": {
+    "todo-field-name": "todo-abnormal-value"
+  }
+}

airbyte-integrations/connectors/source-bigquery/integration_tests/acceptance.py

Lines changed: 1 addition & 1 deletion

@@ -1,5 +1,5 @@
 #
-# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
+# Copyright (c) 2024 Airbyte, Inc., all rights reserved.
 #