
Commit 82bb938

vertex-sdk-bot authored and copybara-github committed
docs: Update the documentation for the tabular_dataset class
PiperOrigin-RevId: 690910108
1 parent fe53922 commit 82bb938

File tree

1 file changed: +5 −15 lines


google/cloud/aiplatform/datasets/tabular_dataset.py

+5 −15
@@ -52,19 +52,8 @@ class TabularDataset(datasets._ColumnNamesDataset):
         my_dataset = aiplatform.TabularDataset.create(
             display_name="my-dataset", gcs_source=['gs://path/to/my/dataset.csv'])
         ```
-
-    The following code shows you how to create and import a tabular
-    dataset in two distinct steps.
-
-    ```py
-    my_dataset = aiplatform.TextDataset.create(
-        display_name="my-dataset")
-
-    my_dataset.import(
-        gcs_source=['gs://path/to/my/dataset.csv']
-        import_schema_uri=aiplatform.schema.dataset.ioformat.text.multi_label_classification
-    )
-    ```
+    Contrary to unstructured datasets, creating and importing a tabular dataset
+    can only be done in a single step.

     If you create a tabular dataset with a pandas
     [`DataFrame`](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html),
@@ -108,10 +97,11 @@ def create(
                 Optional. The URI to one or more Google Cloud Storage buckets that contain
                 your datasets. For example, `str: "gs://bucket/file.csv"` or
                 `Sequence[str]: ["gs://bucket/file1.csv",
-                "gs://bucket/file2.csv"]`.
+                "gs://bucket/file2.csv"]`. Either `gcs_source` or `bq_source` must be specified.
             bq_source (str):
                 Optional. The URI to a BigQuery table that's used as an input source. For
-                example, `bq://project.dataset.table_name`.
+                example, `bq://project.dataset.table_name`. Either `gcs_source`
+                or `bq_source` must be specified.
             project (str):
                 Optional. The name of the Google Cloud project to which this
                 `TabularDataset` is uploaded. This overrides the project that
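The updated docstring states that either `gcs_source` or `bq_source` must be specified when creating a tabular dataset. The following sketch mirrors that rule as a small local helper so it can be checked without calling the SDK; `pick_tabular_source` is hypothetical and not part of the Vertex AI SDK, and treating "both sources given" as an error is an assumption, not something the commit states.

```python
from typing import Dict, Optional, Sequence, Union


def pick_tabular_source(
    gcs_source: Optional[Union[str, Sequence[str]]] = None,
    bq_source: Optional[str] = None,
) -> Dict[str, object]:
    """Return the create() keyword for whichever source was given.

    Mirrors the documented requirement that either `gcs_source` or
    `bq_source` must be specified. Rejecting the case where both are
    given is an assumption made for this sketch.
    """
    if (gcs_source is None) == (bq_source is None):
        raise ValueError("Specify exactly one of gcs_source or bq_source.")
    if gcs_source is not None:
        # Normalize a single URI string to a list; the docstring shows
        # both `str` and `Sequence[str]` forms are accepted.
        uris = [gcs_source] if isinstance(gcs_source, str) else list(gcs_source)
        return {"gcs_source": uris}
    return {"bq_source": bq_source}
```

A call such as `aiplatform.TabularDataset.create(display_name="my-dataset", **pick_tabular_source(gcs_source="gs://bucket/file.csv"))` would then perform the single-step create-and-import the commit describes.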
