Commit c5782d4

Merge pull request #74 from masoncusack/master

Documentation updates for correctness and demonstration

2 parents: 9753cdd + 75bb693

4 files changed: +23 −5 lines

databricks/resource_databricks_cluster.go (+2)

```diff
@@ -109,6 +109,7 @@ func resourceCluster() *schema.Resource {
 				Type:          schema.TypeString,
 				Optional:      true,
 				ConflictsWith: []string{"instance_pool_id"},
+				AtLeastOneOf:  []string{"instance_pool_id"},
 			},
 			"ssh_public_keys": &schema.Schema{
 				Type: schema.TypeSet,
@@ -300,6 +301,7 @@ func resourceCluster() *schema.Resource {
 				Type:          schema.TypeString,
 				Optional:      true,
 				ConflictsWith: []string{"node_type_id", "driver_node_type_id", "aws_attributes"},
+				AtLeastOneOf:  []string{"node_type_id"},
 			},
 			"idempotency_token": &schema.Schema{
 				Type: schema.TypeInt,
```

databricks/resource_databricks_notebook.go (+1)

```diff
@@ -77,6 +77,7 @@ func resourceNotebook() *schema.Resource {
 				ForceNew: true,
 				ValidateFunc: validation.StringInSlice([]string{
 					string(model.DBC),
+					string(model.Jupyter),
 					string(model.Source),
 					string(model.HTML),
 				}, false),
```

website/content/Resources/cluster.md (+5 −3)

````diff
@@ -27,6 +27,8 @@ resource "databricks_cluster" "my-cluster" {
 }
 ```
 
+>Note: For Azure, valid node_type_ids are the "Sizes" of the virtual machines to use as workers, e.g. Standard_DS3_v2.
+
 ## Argument Reference
 
 The following arguments are supported:
@@ -64,7 +66,7 @@ overloaded. max_workers must be strictly greater than min_workers.
 If not specified at creation, the cluster name will be an empty string.
 
 #### - `spark_version`:
-> **(Optional)** The Spark version of the cluster. A list of available
+> **(Required)** The Spark version of the cluster. A list of available
 Spark versions can be retrieved by using the Runtime Versions API call. This field is required.
 
 #### - `spark_conf`:
@@ -144,7 +146,7 @@ value must be within the range 500 - 4096. Custom EBS volumes cannot be specifie
 is optional; if unset, the driver node type will be set as the same value as node_type_id defined above.
 
 #### - `node_type_id`:
-> **(Optional)** This field encodes, through a single value, the resources
+> **(Optional - required if instance_pool_id is not given)** This field encodes, through a single value, the resources
 available to each of the Spark nodes in this cluster. For example, the Spark nodes can be provisioned and optimized for
 memory or compute intensive workloads A list of available node types can be retrieved by using the List Node Types API
 call. This field is required.
@@ -322,7 +324,7 @@ this cluster dynamically acquires additional disk space when its Spark workers a
 feature requires specific AWS permissions to function correctly - refer to Autoscaling local storage for details.
 
 #### - `instance_pool_id`:
-> **(Optional)** The optional ID of the instance pool to which the
+> **(Optional - required if node_type_id is not given)** The optional ID of the instance pool to which the
 cluster belongs. Refer to Instance Pools API for details.
 
 #### - `single_user_name`:
````
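The docs change above makes node_type_id and instance_pool_id mutually exclusive while requiring at least one of them. As an illustration only (argument names taken from the docs in this diff, all values hypothetical), a pool-backed cluster would omit node_type_id entirely:

```hcl
# Hypothetical sketch: a cluster drawing its nodes from an instance pool.
# node_type_id is omitted; instance_pool_id alone satisfies the
# at-least-one-of constraint introduced in this commit.
resource "databricks_cluster" "pooled" {
  cluster_name  = "pooled-cluster"
  spark_version = "6.4.x-scala2.11" # value assumed for illustration
  num_workers   = 2

  instance_pool_id = "0101-120000-pool-example" # hypothetical pool ID
}
```

A configuration that sets neither node_type_id nor instance_pool_id should now fail at plan time via the schema's AtLeastOneOf validation, rather than erroring at cluster-creation time.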

website/content/Resources/notebook.md (+15 −2)

````diff
@@ -23,6 +23,19 @@ resource "databricks_notebook" "my_databricks_notebook" {
   format = "DBC"
 }
 ```
+
+For deployment of an empty Python notebook, the following example might be useful:
+
+```hcl
+resource "databricks_notebook" "notebook" {
+  content   = base64encode("# Welcome to your Python notebook")
+  path      = "/mynotebook"
+  overwrite = false
+  mkdirs    = true
+  language  = "PYTHON"
+  format    = "SOURCE"
+}
+```
 
 ## Argument Reference
 
@@ -33,7 +46,7 @@ The following arguments are supported:
 exception with error code MAX_NOTEBOOK_SIZE_EXCEEDED will be thrown.
 
 #### - `path`:
-> **(Required)** The absolute path of the notebook or directory.
+> **(Required)** The absolute path of the notebook or directory, beginning with "/", e.g. "/mynotebook"
 Exporting a directory is supported only for DBC. This field is **required**.
 
 #### - `language`:
@@ -51,7 +64,7 @@ returns an error RESOURCE_ALREADY_EXISTS. If this operation fails it may have su
 
 #### - `format`:
 > **(Required)** This specifies the format of the file to be imported.
-By default, this is SOURCE. However it may be one of: SOURCE, HTML, JUPYTER, DBC. The value is case sensitive.
+By default, this is SOURCE. However it may be one of: SOURCE, HTML, JUPYTER, DBC. The value is case sensitive. SOURCE is suitable for .scala, .py, .r, .sql extension based files, HTML for .html files, JUPYTER for .ipynb files, and DBC for .dbc files.
 
 ## Attribute Reference
 
````
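The provider change in this commit also accepts JUPYTER as an import format. A sketch (argument names taken from the existing notebook example; file name and path hypothetical) of importing a local .ipynb file:

```hcl
# Hypothetical sketch: import an existing Jupyter notebook.
# format = "JUPYTER" corresponds to the model.Jupyter value added
# to the ValidateFunc in this commit.
resource "databricks_notebook" "from_ipynb" {
  content   = filebase64("${path.module}/analysis.ipynb") # hypothetical local file
  path      = "/analysis"
  overwrite = false
  mkdirs    = true
  language  = "PYTHON"
  format    = "JUPYTER"
}
```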
