diff --git a/README.rst b/README.rst
index f81adc4..bf9804e 100644
--- a/README.rst
+++ b/README.rst
@@ -1,5 +1,5 @@
-Python Client for Google BigQuery
-=================================
+IPython Magics for BigQuery
+===========================
 
 |GA| |pypi| |versions|
 
@@ -70,7 +70,7 @@ Mac/Linux
     pip install virtualenv
     virtualenv <your-env>
     source <your-env>/bin/activate
-    <your-env>/bin/pip install google-cloud-bigquery
+    <your-env>/bin/pip install bigquery-magics
 
 
 Windows
@@ -81,61 +81,4 @@ Windows
     pip install virtualenv
     virtualenv <your-env>
     <your-env>\Scripts\activate
-    <your-env>\Scripts\pip.exe install google-cloud-bigquery
-
-Example Usage
--------------
-
-Perform a query
-~~~~~~~~~~~~~~~
-
-.. code:: python
-
-    from google.cloud import bigquery
-
-    client = bigquery.Client()
-
-    # Perform a query.
-    QUERY = (
-        'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` '
-        'WHERE state = "TX" '
-        'LIMIT 100')
-    query_job = client.query(QUERY)  # API request
-    rows = query_job.result()  # Waits for query to finish
-
-    for row in rows:
-        print(row.name)
-
-Instrumenting With OpenTelemetry
---------------------------------
-
-This application uses `OpenTelemetry`_ to output tracing data from
-API calls to BigQuery. To enable OpenTelemetry tracing in
-the BigQuery client the following PyPI packages need to be installed:
-
-.. _OpenTelemetry: https://opentelemetry.io
-
-.. code-block:: console
-
-    pip install google-cloud-bigquery[opentelemetry] opentelemetry-exporter-gcp-trace
-
-After installation, OpenTelemetry can be used in the BigQuery
-client and in BigQuery jobs. First, however, an exporter must be
-specified for where the trace data will be outputted to. An
-example of this can be found here:
-
-.. code-block:: python
-
-    from opentelemetry import trace
-    from opentelemetry.sdk.trace import TracerProvider
-    from opentelemetry.sdk.trace.export import BatchSpanProcessor
-    from opentelemetry.exporter.cloud_trace import CloudTraceSpanExporter
-    tracer_provider = TracerProvider()
-    tracer_provider = BatchSpanProcessor(CloudTraceSpanExporter())
-    trace.set_tracer_provider(TracerProvider())
-
-In this example all tracing data will be published to the Google
-`Cloud Trace`_ console. For more information on OpenTelemetry, please consult the `OpenTelemetry documentation`_.
-
-.. _OpenTelemetry documentation: https://opentelemetry-python.readthedocs.io
-.. _Cloud Trace: https://cloud.google.com/trace
+    <your-env>\Scripts\pip.exe install bigquery-magics
\ No newline at end of file
diff --git a/docs/bigquery/legacy_proto_types.rst b/docs/bigquery/legacy_proto_types.rst
index bc1e937..2012229 100644
--- a/docs/bigquery/legacy_proto_types.rst
+++ b/docs/bigquery/legacy_proto_types.rst
@@ -6,8 +6,6 @@ Legacy proto-based Types for Google Cloud Bigquery v2 API
     anymore. They might also differ from the types supported on the backend.
     It is therefore strongly advised to migrate to the types found in
     :doc:`standard_sql`.
 
-    Also see the :doc:`3.0.0 Migration Guide<../UPGRADING>` for more information.
-
 .. automodule:: google.cloud.bigquery_v2.types
     :members:
     :undoc-members:
diff --git a/docs/index.rst b/docs/index.rst
index d5ebb10..d2d37a7 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -15,8 +15,8 @@ More Examples
 .. toctree::
     :maxdepth: 2
 
-    usage/index
-    Official Google BigQuery How-to Guides
+    magics
+    Official Google BigQuery Magics Tutorials
 
 API Reference
 -------------
diff --git a/docs/magics.rst b/docs/magics.rst
index bb708c9..11fc326 100644
--- a/docs/magics.rst
+++ b/docs/magics.rst
@@ -15,14 +15,14 @@ Code Samples
 
 Running a query:
 
-.. literalinclude:: ./samples/snippets/query.py
+..
literalinclude:: ../samples/snippets/query.py :dedent: 4 :start-after: [START bigquery_jupyter_query] :end-before: [END bigquery_jupyter_query] Running a parameterized query: -.. literalinclude:: ./samples/snippets/query_params_scalars.py +.. literalinclude:: ../samples/snippets/query_params_scalars.py :dedent: 4 :start-after: [START bigquery_jupyter_query_params_scalars] :end-before: [END bigquery_jupyter_query_params_scalars] diff --git a/docs/usage/client.rst b/docs/usage/client.rst deleted file mode 100644 index d631585..0000000 --- a/docs/usage/client.rst +++ /dev/null @@ -1,25 +0,0 @@ -Creating a Client -~~~~~~~~~~~~~~~~~ - -A project is the top-level container in the ``BigQuery`` API: it is tied -closely to billing, and can provide default access control across all its -datasets. If no ``project`` is passed to the client container, the library -attempts to infer a project using the environment (including explicit -environment variables, GAE, and GCE). - -To override the project inferred from the environment, pass an explicit -``project`` to the :class:`~google.cloud.bigquery.client.Client` constructor, -or to either of the alternative ``classmethod`` factories: - -.. code-block:: python - - from google.cloud import bigquery - client = bigquery.Client(project='PROJECT_ID') - - -Project ACLs -^^^^^^^^^^^^ - -Each project has an access control list granting reader / writer / owner -permission to one or more entities. This list cannot be queried or set -via the API; it must be managed using the Google Developer Console. diff --git a/docs/usage/datasets.rst b/docs/usage/datasets.rst deleted file mode 100644 index 2daee77..0000000 --- a/docs/usage/datasets.rst +++ /dev/null @@ -1,131 +0,0 @@ -Managing Datasets -~~~~~~~~~~~~~~~~~ - -A dataset represents a collection of tables, and applies several default -policies to tables as they are created: - -- An access control list (ACL). When created, a dataset has an ACL - which maps to the ACL inherited from its project. 
- -- A default table expiration period. If set, tables created within the - dataset will have the value as their expiration period. - -See BigQuery documentation for more information on -`Datasets `_. - -Listing Datasets -^^^^^^^^^^^^^^^^ - -List datasets for a project with the -:func:`~google.cloud.bigquery.client.Client.list_datasets` method: - -.. literalinclude:: ../samples/list_datasets.py - :language: python - :dedent: 4 - :start-after: [START bigquery_list_datasets] - :end-before: [END bigquery_list_datasets] - -List datasets by label for a project with the -:func:`~google.cloud.bigquery.client.Client.list_datasets` method: - -.. literalinclude:: ../samples/list_datasets_by_label.py - :language: python - :dedent: 4 - :start-after: [START bigquery_list_datasets_by_label] - :end-before: [END bigquery_list_datasets_by_label] - -Getting a Dataset -^^^^^^^^^^^^^^^^^ - -Get a dataset resource (to pick up changes made by another client) with the -:func:`~google.cloud.bigquery.client.Client.get_dataset` method: - -.. literalinclude:: ../samples/get_dataset.py - :language: python - :dedent: 4 - :start-after: [START bigquery_get_dataset] - :end-before: [END bigquery_get_dataset] - -Determine if a dataset exists with the -:func:`~google.cloud.bigquery.client.Client.get_dataset` method: - -.. literalinclude:: ../samples/dataset_exists.py - :language: python - :dedent: 4 - :start-after: [START bigquery_dataset_exists] - :end-before: [END bigquery_dataset_exists] - -Creating a Dataset -^^^^^^^^^^^^^^^^^^ - -Create a new dataset with the -:func:`~google.cloud.bigquery.client.Client.create_dataset` method: - -.. literalinclude:: ../samples/create_dataset.py - :language: python - :dedent: 4 - :start-after: [START bigquery_create_dataset] - :end-before: [END bigquery_create_dataset] - -Updating a Dataset -^^^^^^^^^^^^^^^^^^ - -Update a property in a dataset's metadata with the -:func:`~google.cloud.bigquery.client.Client.update_dataset` method: - -.. 
literalinclude:: ../samples/update_dataset_description.py - :language: python - :dedent: 4 - :start-after: [START bigquery_update_dataset_description] - :end-before: [END bigquery_update_dataset_description] - -Modify user permissions on a dataset with the -:func:`~google.cloud.bigquery.client.Client.update_dataset` method: - -.. literalinclude:: ../samples/update_dataset_access.py - :language: python - :dedent: 4 - :start-after: [START bigquery_update_dataset_access] - :end-before: [END bigquery_update_dataset_access] - -Manage Dataset labels -^^^^^^^^^^^^^^^^^^^^^ - -Add labels to a dataset with the -:func:`~google.cloud.bigquery.client.Client.update_dataset` method: - -.. literalinclude:: ../samples/label_dataset.py - :language: python - :dedent: 4 - :start-after: [START bigquery_label_dataset] - :end-before: [END bigquery_label_dataset] - -Get dataset's labels with the -:func:`~google.cloud.bigquery.client.Client.get_dataset` method: - -.. literalinclude:: ../samples/get_dataset_labels.py - :language: python - :dedent: 4 - :start-after: [START bigquery_get_dataset_labels] - :end-before: [END bigquery_get_dataset_labels] - -Delete dataset's labels with the -:func:`~google.cloud.bigquery.client.Client.update_dataset` method: - -.. literalinclude:: ../samples/delete_dataset_labels.py - :language: python - :dedent: 4 - :start-after: [START bigquery_delete_label_dataset] - :end-before: [END bigquery_delete_label_dataset] - -Deleting a Dataset -^^^^^^^^^^^^^^^^^^ - -Delete a dataset with the -:func:`~google.cloud.bigquery.client.Client.delete_dataset` method: - -.. 
literalinclude:: ../samples/delete_dataset.py - :language: python - :dedent: 4 - :start-after: [START bigquery_delete_dataset] - :end-before: [END bigquery_delete_dataset] diff --git a/docs/usage/encryption.rst b/docs/usage/encryption.rst deleted file mode 100644 index 3e6d5aa..0000000 --- a/docs/usage/encryption.rst +++ /dev/null @@ -1,52 +0,0 @@ -Using Customer Managed Encryption Keys -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Table data is always encrypted at rest, but BigQuery also provides a way for -you to control what keys it uses to encrypt they data. See `Protecting data -with Cloud KMS keys -`_ -in the BigQuery documentation for more details. - -Create a new table, using a customer-managed encryption key from -Cloud KMS to encrypt it. - -.. literalinclude:: ../samples/snippets/create_table_cmek.py - :language: python - :dedent: 4 - :start-after: [START bigquery_create_table_cmek] - :end-before: [END bigquery_create_table_cmek] - -Change the key used to encrypt a table. - -.. literalinclude:: ../snippets.py - :language: python - :dedent: 4 - :start-after: [START bigquery_update_table_cmek] - :end-before: [END bigquery_update_table_cmek] - -Load a file from Cloud Storage, using a customer-managed encryption key from -Cloud KMS for the destination table. - -.. literalinclude:: ../samples/load_table_uri_cmek.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_json_cmek] - :end-before: [END bigquery_load_table_gcs_json_cmek] - -Copy a table, using a customer-managed encryption key from Cloud KMS for the -destination table. - -.. literalinclude:: ../samples/copy_table_cmek.py - :language: python - :dedent: 4 - :start-after: [START bigquery_copy_table_cmek] - :end-before: [END bigquery_copy_table_cmek] - -Write query results to a table, using a customer-managed encryption key from -Cloud KMS for the destination table. - -.. 
literalinclude:: ../samples/client_query_destination_table_cmek.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_destination_table_cmek] - :end-before: [END bigquery_query_destination_table_cmek] diff --git a/docs/usage/index.rst b/docs/usage/index.rst deleted file mode 100644 index 1d3cc9f..0000000 --- a/docs/usage/index.rst +++ /dev/null @@ -1,35 +0,0 @@ -Usage Guides -~~~~~~~~~~~~ - -BigQuery Basics -^^^^^^^^^^^^^^^ - -.. toctree:: - :maxdepth: 1 - - client - queries - -Working with BigQuery Resources -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -.. toctree:: - :maxdepth: 1 - - datasets - tables - encryption - jobs - -Integrations with Other Libraries -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -.. toctree:: - :maxdepth: 1 - - pandas - -See also, the :mod:`google.cloud.bigquery.magics.magics` module for -integrations with Jupyter. - - diff --git a/docs/usage/jobs.rst b/docs/usage/jobs.rst deleted file mode 100644 index c3dd710..0000000 --- a/docs/usage/jobs.rst +++ /dev/null @@ -1,21 +0,0 @@ -Managing Jobs -~~~~~~~~~~~~~ - -Jobs describe actions performed on data in BigQuery tables: - -- Load data into a table -- Run a query against data in one or more tables -- Extract data from a table -- Copy a table - -Listing jobs -^^^^^^^^^^^^ - -List jobs for a project with the -:func:`~google.cloud.bigquery.client.Client.list_jobs` method: - -.. 
literalinclude:: ../samples/client_list_jobs.py - :language: python - :dedent: 4 - :start-after: [START bigquery_list_jobs] - :end-before: [END bigquery_list_jobs] diff --git a/docs/usage/pandas.rst b/docs/usage/pandas.rst deleted file mode 100644 index 550a677..0000000 --- a/docs/usage/pandas.rst +++ /dev/null @@ -1,109 +0,0 @@ -Using BigQuery with Pandas -~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Retrieve BigQuery data as a Pandas DataFrame -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -As of version 0.29.0, you can use the -:func:`~google.cloud.bigquery.table.RowIterator.to_dataframe` function to -retrieve query results or table rows as a :class:`pandas.DataFrame`. - -First, ensure that the :mod:`pandas` library is installed by running: - -.. code-block:: bash - - pip install --upgrade pandas - -Alternatively, you can install the BigQuery Python client library with -:mod:`pandas` by running: - -.. code-block:: bash - - pip install --upgrade 'google-cloud-bigquery[pandas]' - -To retrieve query results as a :class:`pandas.DataFrame`: - -.. literalinclude:: ../snippets.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_results_dataframe] - :end-before: [END bigquery_query_results_dataframe] - -To retrieve table rows as a :class:`pandas.DataFrame`: - -.. literalinclude:: ../snippets.py - :language: python - :dedent: 4 - :start-after: [START bigquery_list_rows_dataframe] - :end-before: [END bigquery_list_rows_dataframe] - -The following data types are used when creating a pandas DataFrame. - -.. list-table:: Pandas Data Type Mapping - :header-rows: 1 - - * - BigQuery - - pandas - - Notes - * - BOOL - - boolean - - - * - DATETIME - - datetime64[ns], object - - The object dtype is used when there are values not representable in a - pandas nanosecond-precision timestamp. - * - DATE - - dbdate, object - - The object dtype is used when there are values not representable in a - pandas nanosecond-precision timestamp. - - Requires the ``db-dtypes`` package. 
See the `db-dtypes usage guide - `_ - * - FLOAT64 - - float64 - - - * - INT64 - - Int64 - - - * - TIME - - dbtime - - Requires the ``db-dtypes`` package. See the `db-dtypes usage guide - `_ - -Retrieve BigQuery GEOGRAPHY data as a GeoPandas GeoDataFrame ------------------------------------------------------------- - -`GeoPandas `_ adds geospatial analytics -capabilities to Pandas. To retrieve query results containing -GEOGRAPHY data as a :class:`geopandas.GeoDataFrame`: - -.. literalinclude:: ../samples/geography/to_geodataframe.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_results_geodataframe] - :end-before: [END bigquery_query_results_geodataframe] - - -Load a Pandas DataFrame to a BigQuery Table -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -As of version 1.3.0, you can use the -:func:`~google.cloud.bigquery.client.Client.load_table_from_dataframe` function -to load data from a :class:`pandas.DataFrame` to a -:class:`~google.cloud.bigquery.table.Table`. To use this function, in addition -to :mod:`pandas`, you will need to install the :mod:`pyarrow` library. You can -install the BigQuery Python client library with :mod:`pandas` and -:mod:`pyarrow` by running: - -.. code-block:: bash - - pip install --upgrade google-cloud-bigquery[pandas,pyarrow] - -The following example demonstrates how to create a :class:`pandas.DataFrame` -and load it into a new table: - -.. literalinclude:: ../samples/load_table_dataframe.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_dataframe] - :end-before: [END bigquery_load_table_dataframe] diff --git a/docs/usage/queries.rst b/docs/usage/queries.rst deleted file mode 100644 index fc57e54..0000000 --- a/docs/usage/queries.rst +++ /dev/null @@ -1,63 +0,0 @@ -Running Queries -~~~~~~~~~~~~~~~ - -Querying data -^^^^^^^^^^^^^ - -Run a query and wait for it to finish with the -:func:`~google.cloud.bigquery.client.Client.query` method: - -.. 
literalinclude:: ../samples/client_query.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query] - :end-before: [END bigquery_query] - - -Run a dry run query -^^^^^^^^^^^^^^^^^^^ - -.. literalinclude:: ../samples/client_query_dry_run.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_dry_run] - :end-before: [END bigquery_query_dry_run] - - -Writing query results to a destination table -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -See BigQuery documentation for more information on -`writing query results `_. - -.. literalinclude:: ../samples/client_query_destination_table.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_destination_table] - :end-before: [END bigquery_query_destination_table] - - -Run a query using a named query parameter -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -See BigQuery documentation for more information on -`parameterized queries `_. - -.. literalinclude:: ../samples/client_query_w_named_params.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_params_named] - :end-before: [END bigquery_query_params_named] - -Run a script -^^^^^^^^^^^^ - -See BigQuery documentation for more information on `scripting in BigQuery -standard SQL -`_. - -.. literalinclude:: ../samples/query_script.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_script] - :end-before: [END bigquery_query_script] diff --git a/docs/usage/tables.rst b/docs/usage/tables.rst deleted file mode 100644 index a4f42b1..0000000 --- a/docs/usage/tables.rst +++ /dev/null @@ -1,316 +0,0 @@ -Managing Tables -~~~~~~~~~~~~~~~ - -Tables exist within datasets. See BigQuery documentation for more information -on `Tables `_. - -Listing Tables -^^^^^^^^^^^^^^ - -List the tables belonging to a dataset with the -:func:`~google.cloud.bigquery.client.Client.list_tables` method: - -.. 
literalinclude:: ../samples/list_tables.py - :language: python - :dedent: 4 - :start-after: [START bigquery_list_tables] - :end-before: [END bigquery_list_tables] - -Getting a Table -^^^^^^^^^^^^^^^ - -Get a table resource with the -:func:`~google.cloud.bigquery.client.Client.get_table` method: - -.. literalinclude:: ../samples/get_table.py - :language: python - :dedent: 4 - :start-after: [START bigquery_get_table] - :end-before: [END bigquery_get_table] - -Determine if a table exists with the -:func:`~google.cloud.bigquery.client.Client.get_table` method: - -.. literalinclude:: ../samples/table_exists.py - :language: python - :dedent: 4 - :start-after: [START bigquery_table_exists] - :end-before: [END bigquery_table_exists] - -Browse data rows in a table with the -:func:`~google.cloud.bigquery.client.Client.list_rows` method: - -.. literalinclude:: ../samples/browse_table_data.py - :language: python - :dedent: 4 - :start-after: [START bigquery_browse_table] - :end-before: [END bigquery_browse_table] - -Creating a Table -^^^^^^^^^^^^^^^^ - -Create an empty table with the -:func:`~google.cloud.bigquery.client.Client.create_table` method: - -.. literalinclude:: ../samples/create_table.py - :language: python - :dedent: 4 - :start-after: [START bigquery_create_table] - :end-before: [END bigquery_create_table] - -Create a table using an external data source with the -:func:`~google.cloud.bigquery.client.Client.create_table` method: - -.. literalinclude:: ../samples/snippets/create_table_external_data_configuration.py - :language: python - :dedent: 4 - :start-after: [START bigquery_create_table_external_data_configuration] - :end-before: [END bigquery_create_table_external_data_configuration] - -Create a clustered table with the -:func:`~google.cloud.bigquery.client.Client.create_table` method: - -.. 
literalinclude:: ../samples/create_table_clustered.py - :language: python - :dedent: 4 - :start-after: [START bigquery_create_table_clustered] - :end-before: [END bigquery_create_table_clustered] - -Create an integer range partitioned table with the -:func:`~google.cloud.bigquery.client.Client.create_table` method: - -.. literalinclude:: ../samples/create_table_range_partitioned.py - :language: python - :dedent: 4 - :start-after: [START bigquery_create_table_range_partitioned] - :end-before: [END bigquery_create_table_range_partitioned] - -Load table data from a file with the -:func:`~google.cloud.bigquery.client.Client.load_table_from_file` method: - -.. literalinclude:: ../samples/load_table_file.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_from_file] - :end-before: [END bigquery_load_from_file] - -Creating a clustered table from a query result: - -.. literalinclude:: ../samples/client_query_destination_table_clustered.py - :language: python - :dedent: 4 - :start-after: [START bigquery_query_clustered_table] - :end-before: [END bigquery_query_clustered_table] - -Creating a clustered table when you load data with the -:func:`~google.cloud.bigquery.client.Client.load_table_from_uri` method: - -.. literalinclude:: ../samples/load_table_clustered.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_clustered] - :end-before: [END bigquery_load_table_clustered] - -Load a CSV file from Cloud Storage with the -:func:`~google.cloud.bigquery.client.Client.load_table_from_uri` method: - -.. literalinclude:: ../samples/load_table_uri_csv.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_csv] - :end-before: [END bigquery_load_table_gcs_csv] - -See also: `Loading CSV data from Cloud Storage -`_. - -Load a JSON file from Cloud Storage: - -.. 
literalinclude:: ../samples/load_table_uri_json.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_json] - :end-before: [END bigquery_load_table_gcs_json] - -See also: `Loading JSON data from Cloud Storage -`_. - -Load a Parquet file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_parquet.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_parquet] - :end-before: [END bigquery_load_table_gcs_parquet] - -See also: `Loading Parquet data from Cloud Storage -`_. - -Load an Avro file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_avro.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_avro] - :end-before: [END bigquery_load_table_gcs_avro] - -See also: `Loading Avro data from Cloud Storage -`_. - -Load an ORC file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_orc.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_orc] - :end-before: [END bigquery_load_table_gcs_orc] - -See also: `Loading ORC data from Cloud Storage -`_. - -Load a CSV file from Cloud Storage and auto-detect schema: - -.. literalinclude:: ../samples/load_table_uri_autodetect_csv.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_csv_autodetect] - :end-before: [END bigquery_load_table_gcs_csv_autodetect] - -Load a JSON file from Cloud Storage and auto-detect schema: - -.. literalinclude:: ../samples/load_table_uri_autodetect_json.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_json_autodetect] - :end-before: [END bigquery_load_table_gcs_json_autodetect] - -Updating a Table -^^^^^^^^^^^^^^^^ - -Update a property in a table's metadata with the -:func:`~google.cloud.bigquery.client.Client.update_table` method: - -.. 
literalinclude:: ../snippets.py - :language: python - :dedent: 4 - :start-after: [START bigquery_update_table_description] - :end-before: [END bigquery_update_table_description] - -Insert rows into a table's data with the -:func:`~google.cloud.bigquery.client.Client.insert_rows` method: - -.. literalinclude:: ../samples/table_insert_rows.py - :language: python - :dedent: 4 - :start-after: [START bigquery_table_insert_rows] - :end-before: [END bigquery_table_insert_rows] - -Insert rows into a table's data with the -:func:`~google.cloud.bigquery.client.Client.insert_rows` method, achieving -higher write limit: - -.. literalinclude:: ../samples/table_insert_rows_explicit_none_insert_ids.py - :language: python - :dedent: 4 - :start-after: [START bigquery_table_insert_rows_explicit_none_insert_ids] - :end-before: [END bigquery_table_insert_rows_explicit_none_insert_ids] - -Mind that inserting data with ``None`` row insert IDs can come at the expense of -more duplicate inserts. See also: -`Streaming inserts `_. - -Add an empty column to the existing table with the -:func:`~google.cloud.bigquery.update_table` method: - -.. literalinclude:: ../samples/add_empty_column.py - :language: python - :dedent: 4 - :start-after: [START bigquery_add_empty_column] - :end-before: [END bigquery_add_empty_column] - -Copying a Table -^^^^^^^^^^^^^^^ - -Copy a table with the -:func:`~google.cloud.bigquery.client.Client.copy_table` method: - -.. literalinclude:: ../samples/copy_table.py - :language: python - :dedent: 4 - :start-after: [START bigquery_copy_table] - :end-before: [END bigquery_copy_table] - -Copy table data to Google Cloud Storage with the -:func:`~google.cloud.bigquery.client.Client.extract_table` method: - -.. 
literalinclude:: ../snippets.py - :language: python - :dedent: 4 - :start-after: [START bigquery_extract_table] - :end-before: [END bigquery_extract_table] - -Deleting a Table -^^^^^^^^^^^^^^^^ - -Delete a table with the -:func:`~google.cloud.bigquery.client.Client.delete_table` method: - -.. literalinclude:: ../samples/delete_table.py - :language: python - :dedent: 4 - :start-after: [START bigquery_delete_table] - :end-before: [END bigquery_delete_table] - -Restoring a Deleted Table -^^^^^^^^^^^^^^^^^^^^^^^^^ - -Restore a deleted table from a snapshot by using the -:func:`~google.cloud.bigquery.client.Client.copy_table` method: - -.. literalinclude:: ../samples/undelete_table.py - :language: python - :dedent: 4 - :start-after: [START bigquery_undelete_table] - :end-before: [END bigquery_undelete_table] - -Overwrite a Table -^^^^^^^^^^^^^^^^^ - -Replace the table data with an Avro file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_truncate_avro.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_avro_truncate] - :end-before: [END bigquery_load_table_gcs_avro_truncate] - -Replace the table data with a CSV file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_truncate_csv.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_csv_truncate] - :end-before: [END bigquery_load_table_gcs_csv_truncate] - -Replace the table data with a JSON file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_truncate_json.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_json_truncate] - :end-before: [END bigquery_load_table_gcs_json_truncate] - -Replace the table data with an ORC file from Cloud Storage: - -.. 
literalinclude:: ../samples/load_table_uri_truncate_orc.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_orc_truncate] - :end-before: [END bigquery_load_table_gcs_orc_truncate] - -Replace the table data with a Parquet file from Cloud Storage: - -.. literalinclude:: ../samples/load_table_uri_truncate_parquet.py - :language: python - :dedent: 4 - :start-after: [START bigquery_load_table_gcs_parquet_truncate] - :end-before: [END bigquery_load_table_gcs_parquet_truncate]
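
In a Jupyter or IPython session, the magics documented in ``docs/magics.rst``
are used roughly as follows. This is a sketch, assuming the
``bigquery-magics`` package is installed, that the extension loads under the
module name ``bigquery_magics``, and that Google Cloud application default
credentials are configured; the query itself is the one from the former
README example.

.. code-block:: python

    # First notebook cell: load the magics extension
    # (module name assumed from the bigquery-magics package).
    %load_ext bigquery_magics

.. code-block:: python

    %%bigquery df
    SELECT name
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = "TX"
    LIMIT 100

Because ``%%bigquery`` is a cell magic it must start its own cell, so the two
cells above cannot be combined; the argument after ``%%bigquery`` (here
``df``) names the variable that receives the query results as a DataFrame.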