Commit 60e73fe

Authored by tswast, plamut, gcf-owl-bot[bot], release-please[bot], and renovate-bot
chore: sync v3 with master branch (#880)
* chore: protect v3.x.x branch (#816) — in preparation for breaking changes; force the protection pattern to be a string and simplify the branch name
* fix: no longer raise a warning in `to_dataframe` if `max_results` is set (#815) — that warning should only fire when a BQ Storage client is explicitly passed to RowIterator methods while `max_results` is also set
* feat: update proto definitions for bigquery/v2 to support new proto fields for BQML (#817) — Source-Link: googleapis/googleapis@8962c92, googleapis/googleapis-gen@102f1b4
* chore: release 2.23.0 (#819)
* chore(deps): update dependency google-cloud-bigquery to v2.23.0 (#820)
* fix: `insert_rows()` accepts float column values as strings again (#824)
* chore: release 2.23.1 (#825)
* chore: add second protection rule for v3 branch (#828)
* chore(deps): update dependency google-cloud-bigquery to v2.23.1 (#827)
* test: retry getting rows after streaming them in `test_insert_rows_from_dataframe` (#832)
* chore(deps): update dependency pyarrow to v5 (#834)
* chore(deps): update dependency google-cloud-bigquery-storage to v2.6.2 (#795)
* deps: expand pyarrow pins to support 5.x releases (#833)
* chore: release 2.23.2 (#835)
* chore(deps): update dependency google-auth-oauthlib to v0.4.5 (#839)
* chore(deps): update dependency google-cloud-bigquery to v2.23.2 (#838)
* chore(deps): update dependency google-cloud-testutils to v1 (#845)
* chore: require CODEOWNER review and up-to-date branches (#846) — brings the rules on this repo in line with the defaults: https://github.com/googleapis/repo-automation-bots/blob/63c858e539e1f4d9bb8ea66e12f9c0a0de5fef55/packages/sync-repo-settings/src/required-checks.json#L40-L50
* chore: add api-bigquery as a samples owner (#852)
* fix: increase default retry deadline to 10 minutes (#859) — the backend API has a timeout of 4 minutes, so the previous 2-minute default left no room for retries in some cases; fixes #853
* process: add yoshi-python to samples CODEOWNERS (#858) — closes #857
* chore: release 2.23.3 (#860)
* chore(deps): update dependency google-cloud-bigquery to v2.23.3 (#866)
* feat: add support for transaction statistics (#849) — hoists `transaction_info` into the base job class, adds a `versionadded` directive to the new property and class, and includes the new class in the docs reference
* chore(deps): update dependency google-cloud-bigquery-storage to v2.6.3 (#863)
* chore: fix INSTALL_LIBRARY_FROM_SOURCE in noxfile.py (#869) — Source-Link: googleapis/synthtool@6252f2c
* feat: make the same `Table*` instances equal to each other (#867) — equality ignores metadata differences and compares instances through the `tableReference` property; `Table` instances are now hashable, and `Table*` classes that reference the same table are considered equal and interchangeable
* feat: support `ScalarQueryParameterType` for the `type_` argument in the `ScalarQueryParameter` constructor (#850) — follow-up to https://github.com/googleapis/python-bigquery/pull/840/files#r679880582
* feat: retry failed query jobs in `result()` (#837) — fixes #539; previously only failed API requests were retried, now failed jobs are also retried, according to the predicate of the `Retry` object passed to `job.result()`
* fix: make unicode characters work well in `load_table_from_json` (#865)
* chore: release 2.24.0 (#868) — features: transaction statistics (#849), `Table*` equality (#867), job retries in `result()` (#837), `ScalarQueryParameterType` for `type_` (#850); bug fix: unicode handling in `load_table_from_json` (#865)
* chore(deps): update dependency google-cloud-bigquery to v2.24.0 (#873)
* test: refactor `list_rows` tests and add a test for scalars (#829) — fixes JSON formatting and adds a TODO for INTERVAL Arrow support
* chore: drop mention of Python 2.7 from templates (#877) — Source-Link: googleapis/synthtool@facee4c
* fix: remove pytz dependency and require pyarrow>=3.0.0 (#875) — removes the pyarrow version check, requires pyarrow 3.0 in the pandas extra, and removes `_BIGNUMERIC_SUPPORT` references from tests
* Update google/cloud/bigquery/table.py
* tests: avoid INTERVAL columns in pandas tests

Co-authored-by: Peter Lamut, gcf-owl-bot[bot], release-please[bot], WhiteSource Renovate, Bu Sun Kim, Owl Bot, Jim Fulton, Grimmer, Tres Seaver, Dina Graves Portman, and Tim Swast
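The headline features above can be sketched in a few lines of user code. This is a hedged illustration only, assuming the public API described in the changelog entries: the project, dataset, and table names and the query are made up, and the `job_retry` keyword reflects the released API as best understood from #837, not anything shown verbatim in this commit.

import datetime

from google.api_core.retry import Retry
from google.cloud import bigquery

client = bigquery.Client()

# feat (#867): Table* instances that reference the same table compare equal.
table = bigquery.Table("my-project.my_dataset.my_table")
table_ref = bigquery.TableReference.from_string("my-project.my_dataset.my_table")
assert table == table_ref  # equality is decided by the tableReference

# feat (#850): a ScalarQueryParameterType instance is accepted as type_.
ts_type = bigquery.ScalarQueryParameterType("TIMESTAMP")
param = bigquery.ScalarQueryParameter(
    "ts_value",
    ts_type,
    datetime.datetime(2016, 12, 7, 8, 0, tzinfo=datetime.timezone.utc),
)

# feat (#837): failed query *jobs*, not just API requests, are retried in
# result() according to a retry predicate. The job_retry keyword name is an
# assumption based on the released API.
job = client.query("SELECT 1")
rows = job.result(job_retry=Retry(deadline=600.0))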
1 parent dcd78c7 commit 60e73fe

17 files changed · +268 −124 lines changed

docs/snippets.py (+3 −2)

@@ -359,7 +359,6 @@ def test_update_table_expiration(client, to_delete):
 
     # [START bigquery_update_table_expiration]
     import datetime
-    import pytz
 
     # from google.cloud import bigquery
     # client = bigquery.Client()
@@ -371,7 +370,9 @@ def test_update_table_expiration(client, to_delete):
     assert table.expires is None
 
     # set table to expire 5 days from now
-    expiration = datetime.datetime.now(pytz.utc) + datetime.timedelta(days=5)
+    expiration = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(
+        days=5
+    )
     table.expires = expiration
     table = client.update_table(table, ["expires"])  # API request
 
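The snippet change swaps pytz for the standard library's fixed UTC offset. A minimal sketch of the equivalence this relies on (stdlib only; pytz is no longer needed):

import datetime

# datetime.timezone.utc is a drop-in replacement for pytz.utc wherever only a
# fixed UTC offset is needed; named-zone conversions would still need a tz
# database, but none of the touched code does those.
expiration = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(days=5)
assert expiration.utcoffset() == datetime.timedelta(0)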
google/cloud/bigquery/table.py (+1 −3)

@@ -20,7 +20,6 @@
 import datetime
 import functools
 import operator
-import pytz
 import typing
 from typing import Any, Dict, Iterable, Iterator, Optional, Tuple
 import warnings
@@ -1727,7 +1726,6 @@ def to_arrow(
         .. versionadded:: 1.17.0
         """
         self._maybe_warn_max_results(bqstorage_client)
-
         if not self._validate_bqstorage(bqstorage_client, create_bqstorage_client):
             create_bqstorage_client = False
             bqstorage_client = None
@@ -1946,7 +1944,7 @@ def to_dataframe(
         # Pandas, we set the timestamp_as_object parameter to True, if necessary.
         types_to_check = {
             pyarrow.timestamp("us"),
-            pyarrow.timestamp("us", tz=pytz.UTC),
+            pyarrow.timestamp("us", tz=datetime.timezone.utc),
         }
 
         for column in record_batch:
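The `types_to_check` edit assumes pyarrow treats the stdlib UTC tzinfo and the "UTC" string as the same timestamp type, so UTC columns keep matching after the pytz removal. A hedged check of that assumption:

import datetime

import pyarrow

# pyarrow normalizes tzinfo objects to canonical string names, so these two
# spellings denote the same Arrow type.
assert pyarrow.timestamp("us", tz=datetime.timezone.utc) == pyarrow.timestamp("us", tz="UTC")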

samples/client_query_w_timestamp_params.py (+1 −2)

@@ -18,7 +18,6 @@ def client_query_w_timestamp_params():
     # [START bigquery_query_params_timestamps]
     import datetime
 
-    import pytz
     from google.cloud import bigquery
 
     # Construct a BigQuery client object.
@@ -30,7 +29,7 @@ def client_query_w_timestamp_params():
             bigquery.ScalarQueryParameter(
                 "ts_value",
                 "TIMESTAMP",
-                datetime.datetime(2016, 12, 7, 8, 0, tzinfo=pytz.UTC),
+                datetime.datetime(2016, 12, 7, 8, 0, tzinfo=datetime.timezone.utc),
             )
         ]
     )
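For context, here is a hedged sketch of how the sample's parameter is attached to a query; the SQL text is illustrative, not part of this diff:

import datetime

from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "ts_value",
            "TIMESTAMP",
            datetime.datetime(2016, 12, 7, 8, 0, tzinfo=datetime.timezone.utc),
        )
    ]
)
# The named parameter is referenced as @ts_value in the SQL.
query_job = client.query(
    "SELECT TIMESTAMP_ADD(@ts_value, INTERVAL 1 HOUR);", job_config=job_config
)
for row in query_job:
    print(row)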

samples/geography/noxfile.py (+3 −3)

@@ -39,7 +39,7 @@
 
 TEST_CONFIG = {
     # You can opt out from the test for specific Python versions.
-    "ignored_versions": ["2.7"],
+    "ignored_versions": [],
     # Old samples are opted out of enforcing Python type hints
     # All new samples should feature them
     "enforce_type_hints": False,
@@ -86,8 +86,8 @@ def get_pytest_env_vars() -> Dict[str, str]:
 
 
 # DO NOT EDIT - automatically generated.
-# All versions used to tested samples.
-ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8", "3.9"]
+# All versions used to test samples.
+ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]
 
 # Any default versions that should be ignored.
 IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"]

samples/snippets/noxfile.py (+3 −3)

@@ -39,7 +39,7 @@
 
 TEST_CONFIG = {
     # You can opt out from the test for specific Python versions.
-    "ignored_versions": ["2.7"],
+    "ignored_versions": [],
     # Old samples are opted out of enforcing Python type hints
     # All new samples should feature them
     "enforce_type_hints": False,
@@ -86,8 +86,8 @@ def get_pytest_env_vars() -> Dict[str, str]:
 
 
 # DO NOT EDIT - automatically generated.
-# All versions used to tested samples.
-ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8", "3.9"]
+# All versions used to test samples.
+ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]
 
 # Any default versions that should be ignored.
 IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"]

scripts/readme-gen/templates/install_deps.tmpl.rst (+1 −1)

@@ -12,7 +12,7 @@ Install Dependencies
 .. _Python Development Environment Setup Guide:
     https://cloud.google.com/python/setup
 
-#. Create a virtualenv. Samples are compatible with Python 2.7 and 3.4+.
+#. Create a virtualenv. Samples are compatible with Python 3.6+.
 
 .. code-block:: bash
 
tests/data/scalars.jsonl (+2 −2)

@@ -1,2 +1,2 @@
-{"bool_col": true, "bytes_col": "abcd", "date_col": "2021-07-21", "datetime_col": "2021-07-21 11:39:45", "geography_col": "POINT(-122.0838511 37.3860517)", "int64_col": "123456789", "numeric_col": "1.23456789", "bignumeric_col": "10.111213141516171819", "float64_col": "1.25", "string_col": "Hello, World", "time_col": "11:41:43.07616", "timestamp_col": "2021-07-21T17:43:43.945289Z"}
-{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "string_col": null, "time_col": null, "timestamp_col": null}
+{"bool_col": true, "bytes_col": "SGVsbG8sIFdvcmxkIQ==", "date_col": "2021-07-21", "datetime_col": "2021-07-21 11:39:45", "geography_col": "POINT(-122.0838511 37.3860517)", "int64_col": "123456789", "interval_col": "P7Y11M9DT4H15M37.123456S", "numeric_col": "1.23456789", "bignumeric_col": "10.111213141516171819", "float64_col": "1.25", "rowindex": 0, "string_col": "Hello, World!", "time_col": "11:41:43.07616", "timestamp_col": "2021-07-21T17:43:43.945289Z"}
+{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "interval_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "rowindex": 1, "string_col": null, "time_col": null, "timestamp_col": null}

tests/data/scalars_extreme.jsonl (+5 −5)

@@ -1,5 +1,5 @@
-{"bool_col": true, "bytes_col": "DQo=\n", "date_col": "9999-12-31", "datetime_col": "9999-12-31 23:59:59.999999", "geography_col": "POINT(-135.0000 90.0000)", "int64_col": "9223372036854775807", "numeric_col": "9.9999999999999999999999999999999999999E+28", "bignumeric_col": "9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "+inf", "string_col": "Hello, World", "time_col": "23:59:59.99999", "timestamp_col": "9999-12-31T23:59:59.999999Z"}
-{"bool_col": false, "bytes_col": "8J+Zgw==\n", "date_col": "0001-01-01", "datetime_col": "0001-01-01 00:00:00", "geography_col": "POINT(45.0000 -90.0000)", "int64_col": "-9223372036854775808", "numeric_col": "-9.9999999999999999999999999999999999999E+28", "bignumeric_col": "-9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "-inf", "string_col": "Hello, World", "time_col": "00:00:00", "timestamp_col": "0001-01-01T00:00:00.000000Z"}
-{"bool_col": true, "bytes_col": "AA==\n", "date_col": "1900-01-01", "datetime_col": "1900-01-01 00:00:00", "geography_col": "POINT(-180.0000 0.0000)", "int64_col": "-1", "numeric_col": "0.000000001", "bignumeric_col": "-0.00000000000000000000000000000000000001", "float64_col": "nan", "string_col": "こんにちは", "time_col": "00:00:00.000001", "timestamp_col": "1900-01-01T00:00:00.000000Z"}
-{"bool_col": false, "bytes_col": "", "date_col": "1970-01-01", "datetime_col": "1970-01-01 00:00:00", "geography_col": "POINT(0 0)", "int64_col": "0", "numeric_col": "0.0", "bignumeric_col": "0.0", "float64_col": 0.0, "string_col": "", "time_col": "12:00:00", "timestamp_col": "1970-01-01T00:00:00.000000Z"}
-{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "string_col": null, "time_col": null, "timestamp_col": null}
+{"bool_col": true, "bytes_col": "DQo=\n", "date_col": "9999-12-31", "datetime_col": "9999-12-31 23:59:59.999999", "geography_col": "POINT(-135.0000 90.0000)", "int64_col": "9223372036854775807", "interval_col": "P-10000Y0M-3660000DT-87840000H0M0S", "numeric_col": "9.9999999999999999999999999999999999999E+28", "bignumeric_col": "9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "+inf", "rowindex": 0, "string_col": "Hello, World", "time_col": "23:59:59.999999", "timestamp_col": "9999-12-31T23:59:59.999999Z"}
+{"bool_col": false, "bytes_col": "8J+Zgw==\n", "date_col": "0001-01-01", "datetime_col": "0001-01-01 00:00:00", "geography_col": "POINT(45.0000 -90.0000)", "int64_col": "-9223372036854775808", "interval_col": "P10000Y0M3660000DT87840000H0M0S", "numeric_col": "-9.9999999999999999999999999999999999999E+28", "bignumeric_col": "-9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "-inf", "rowindex": 1, "string_col": "Hello, World", "time_col": "00:00:00", "timestamp_col": "0001-01-01T00:00:00.000000Z"}
+{"bool_col": true, "bytes_col": "AA==\n", "date_col": "1900-01-01", "datetime_col": "1900-01-01 00:00:00", "geography_col": "POINT(-180.0000 0.0000)", "int64_col": "-1", "interval_col": "P0Y0M0DT0H0M0.000001S", "numeric_col": "0.000000001", "bignumeric_col": "-0.00000000000000000000000000000000000001", "float64_col": "nan", "rowindex": 2, "string_col": "こんにちは", "time_col": "00:00:00.000001", "timestamp_col": "1900-01-01T00:00:00.000000Z"}
+{"bool_col": false, "bytes_col": "", "date_col": "1970-01-01", "datetime_col": "1970-01-01 00:00:00", "geography_col": "POINT(0 0)", "int64_col": "0", "interval_col": "P0Y0M0DT0H0M0S", "numeric_col": "0.0", "bignumeric_col": "0.0", "float64_col": 0.0, "rowindex": 3, "string_col": "", "time_col": "12:00:00", "timestamp_col": "1970-01-01T00:00:00.000000Z"}
+{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "interval_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "rowindex": 4, "string_col": null, "time_col": null, "timestamp_col": null}

tests/data/scalars_schema.json (+32 −22)

@@ -1,33 +1,33 @@
 [
   {
     "mode": "NULLABLE",
-    "name": "timestamp_col",
-    "type": "TIMESTAMP"
+    "name": "bool_col",
+    "type": "BOOLEAN"
   },
   {
     "mode": "NULLABLE",
-    "name": "time_col",
-    "type": "TIME"
+    "name": "bignumeric_col",
+    "type": "BIGNUMERIC"
   },
   {
     "mode": "NULLABLE",
-    "name": "float64_col",
-    "type": "FLOAT"
+    "name": "bytes_col",
+    "type": "BYTES"
   },
   {
     "mode": "NULLABLE",
-    "name": "datetime_col",
-    "type": "DATETIME"
+    "name": "date_col",
+    "type": "DATE"
   },
   {
     "mode": "NULLABLE",
-    "name": "bignumeric_col",
-    "type": "BIGNUMERIC"
+    "name": "datetime_col",
+    "type": "DATETIME"
   },
   {
     "mode": "NULLABLE",
-    "name": "numeric_col",
-    "type": "NUMERIC"
+    "name": "float64_col",
+    "type": "FLOAT"
   },
   {
     "mode": "NULLABLE",
@@ -36,27 +36,37 @@
   },
   {
     "mode": "NULLABLE",
-    "name": "date_col",
-    "type": "DATE"
+    "name": "int64_col",
+    "type": "INTEGER"
   },
   {
     "mode": "NULLABLE",
-    "name": "string_col",
-    "type": "STRING"
+    "name": "interval_col",
+    "type": "INTERVAL"
   },
   {
     "mode": "NULLABLE",
-    "name": "bool_col",
-    "type": "BOOLEAN"
+    "name": "numeric_col",
+    "type": "NUMERIC"
+  },
+  {
+    "mode": "REQUIRED",
+    "name": "rowindex",
+    "type": "INTEGER"
   },
   {
     "mode": "NULLABLE",
-    "name": "bytes_col",
-    "type": "BYTES"
+    "name": "string_col",
+    "type": "STRING"
   },
   {
     "mode": "NULLABLE",
-    "name": "int64_col",
-    "type": "INTEGER"
+    "name": "time_col",
+    "type": "TIME"
+  },
+  {
+    "mode": "NULLABLE",
+    "name": "timestamp_col",
+    "type": "TIMESTAMP"
   }
 ]
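A hedged sketch of how a schema file like this is typically consumed; the paths and destination table ID are illustrative, and `Client.schema_from_json` is the helper as found in current releases:

from google.cloud import bigquery

client = bigquery.Client()
# Parse the JSON schema file into a list of SchemaField objects.
schema = client.schema_from_json("tests/data/scalars_schema.json")

job_config = bigquery.LoadJobConfig(
    schema=schema,
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
)
with open("tests/data/scalars.jsonl", "rb") as data_file:
    load_job = client.load_table_from_file(
        data_file, "my-project.my_dataset.scalars", job_config=job_config
    )
load_job.result()  # wait for the load job to finish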

tests/system/test_arrow.py (+29 −6)

@@ -14,9 +14,14 @@
 
 """System tests for Arrow connector."""
 
+from typing import Optional
+
 import pyarrow
 import pytest
 
+from google.cloud import bigquery
+from google.cloud.bigquery import enums
+
 
 @pytest.mark.parametrize(
     ("max_results", "scalars_table_name"),
@@ -28,17 +33,35 @@
     ),
 )
 def test_list_rows_nullable_scalars_dtypes(
-    bigquery_client,
-    scalars_table,
-    scalars_extreme_table,
-    max_results,
-    scalars_table_name,
+    bigquery_client: bigquery.Client,
+    scalars_table: str,
+    scalars_extreme_table: str,
+    max_results: Optional[int],
+    scalars_table_name: str,
 ):
     table_id = scalars_table
     if scalars_table_name == "scalars_extreme_table":
         table_id = scalars_extreme_table
+
+    # TODO(GH#836): Avoid INTERVAL columns until they are supported by the
+    # BigQuery Storage API and pyarrow.
+    schema = [
+        bigquery.SchemaField("bool_col", enums.SqlTypeNames.BOOLEAN),
+        bigquery.SchemaField("bignumeric_col", enums.SqlTypeNames.BIGNUMERIC),
+        bigquery.SchemaField("bytes_col", enums.SqlTypeNames.BYTES),
+        bigquery.SchemaField("date_col", enums.SqlTypeNames.DATE),
+        bigquery.SchemaField("datetime_col", enums.SqlTypeNames.DATETIME),
+        bigquery.SchemaField("float64_col", enums.SqlTypeNames.FLOAT64),
+        bigquery.SchemaField("geography_col", enums.SqlTypeNames.GEOGRAPHY),
+        bigquery.SchemaField("int64_col", enums.SqlTypeNames.INT64),
+        bigquery.SchemaField("numeric_col", enums.SqlTypeNames.NUMERIC),
+        bigquery.SchemaField("string_col", enums.SqlTypeNames.STRING),
+        bigquery.SchemaField("time_col", enums.SqlTypeNames.TIME),
+        bigquery.SchemaField("timestamp_col", enums.SqlTypeNames.TIMESTAMP),
+    ]
+
     arrow_table = bigquery_client.list_rows(
-        table_id, max_results=max_results,
+        table_id, max_results=max_results, selected_fields=schema,
    ).to_arrow()
 
     schema = arrow_table.schema
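The important change is `selected_fields`: only the listed columns are requested, which is how the test sidesteps the INTERVAL column until pyarrow and the BigQuery Storage API support it. A standalone hedged sketch (table ID illustrative):

from google.cloud import bigquery

client = bigquery.Client()
# Only the columns named here are read; the INTERVAL column is simply
# never requested.
subset = [
    bigquery.SchemaField("bool_col", "BOOLEAN"),
    bigquery.SchemaField("string_col", "STRING"),
]
arrow_table = client.list_rows(
    "my-project.my_dataset.scalars", selected_fields=subset
).to_arrow()
print(arrow_table.schema)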

tests/system/test_client.py (+5 −48)

@@ -1962,6 +1962,11 @@ def test_query_w_query_params(self):
                 "expected": {"friends": [phred_name, bharney_name]},
                 "query_parameters": [with_friends_param],
             },
+            {
+                "sql": "SELECT @bignum_param",
+                "expected": bignum,
+                "query_parameters": [bignum_param],
+            },
         ]
 
         for example in examples:
@@ -2406,54 +2411,6 @@ def test_nested_table_to_arrow(self):
         self.assertTrue(pyarrow.types.is_list(record_col[1].type))
         self.assertTrue(pyarrow.types.is_int64(record_col[1].type.value_type))
 
-    def test_list_rows_empty_table(self):
-        from google.cloud.bigquery.table import RowIterator
-
-        dataset_id = _make_dataset_id("empty_table")
-        dataset = self.temp_dataset(dataset_id)
-        table_ref = dataset.table("empty_table")
-        table = Config.CLIENT.create_table(bigquery.Table(table_ref))
-
-        # It's a bit silly to list rows for an empty table, but this does
-        # happen as the result of a DDL query from an IPython magic command.
-        rows = Config.CLIENT.list_rows(table)
-        self.assertIsInstance(rows, RowIterator)
-        self.assertEqual(tuple(rows), ())
-
-    def test_list_rows_page_size(self):
-        from google.cloud.bigquery.job import SourceFormat
-        from google.cloud.bigquery.job import WriteDisposition
-
-        num_items = 7
-        page_size = 3
-        num_pages, num_last_page = divmod(num_items, page_size)
-
-        SF = bigquery.SchemaField
-        schema = [SF("string_col", "STRING", mode="NULLABLE")]
-        to_insert = [{"string_col": "item%d" % i} for i in range(num_items)]
-        rows = [json.dumps(row) for row in to_insert]
-        body = io.BytesIO("{}\n".format("\n".join(rows)).encode("ascii"))
-
-        table_id = "test_table"
-        dataset = self.temp_dataset(_make_dataset_id("nested_df"))
-        table = dataset.table(table_id)
-        self.to_delete.insert(0, table)
-        job_config = bigquery.LoadJobConfig()
-        job_config.write_disposition = WriteDisposition.WRITE_TRUNCATE
-        job_config.source_format = SourceFormat.NEWLINE_DELIMITED_JSON
-        job_config.schema = schema
-        # Load a table using a local JSON file from memory.
-        Config.CLIENT.load_table_from_file(body, table, job_config=job_config).result()
-
-        df = Config.CLIENT.list_rows(table, selected_fields=schema, page_size=page_size)
-        pages = df.pages
-
-        for i in range(num_pages):
-            page = next(pages)
-            self.assertEqual(page.num_items, page_size)
-        page = next(pages)
-        self.assertEqual(page.num_items, num_last_page)
-
     def temp_dataset(self, dataset_id, location=None):
         project = Config.CLIENT.project
         dataset_ref = bigquery.DatasetReference(project, dataset_id)
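The added parameter example references `bignum` and `bignum_param`, which are defined earlier in the test and outside this hunk. A hedged reconstruction with hypothetical stand-in values:

import decimal

from google.cloud import bigquery

# Hypothetical stand-ins for the fixtures defined elsewhere in the test; the
# 36-digit scale exceeds NUMERIC's 9-digit limit, so it genuinely needs BIGNUMERIC.
bignum = decimal.Decimal("1.000000000000000000000000000000000001")
bignum_param = bigquery.ScalarQueryParameter("bignum_param", "BIGNUMERIC", bignum)

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(query_parameters=[bignum_param])
row = list(client.query("SELECT @bignum_param", job_config=job_config))[0]
assert row[0] == bignum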
