
Commit 5c8d1ec

Merge branch 'main' into doc/reorganize-notes
2 parents 778c4d7 + 4720882

30 files changed: +1074 −722 lines

.github/ISSUE_TEMPLATE/provider-issue.md (+7 −1)

@@ -7,7 +7,13 @@ title: "[ISSUE] Issue with `databricks_XXX` resource"
 <!--
 Hi there,
 
-Thank you for opening an issue. Please note that we try to keep the Databricks Provider issue tracker reserved for bug reports and feature requests. For general usage questions, please see: <https://www.terraform.io/community.html>.
+Please make sure that you have checked the Troubleshooting Guide first:
+https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/troubleshooting
+
+Thank you for opening an issue.
+Please note that we try to keep the Databricks Provider issue tracker reserved for bug reports and feature requests.
+For general Terraform usage questions, please see: <https://www.terraform.io/community.html>.
+For questions about the Databricks Provider, please post on https://community.databricks.com
 -->
 
 ### Configuration

.release_metadata.json (+1 −1)

@@ -1,3 +1,3 @@
 {
-  "timestamp": "2025-04-16 17:50:45+0000"
+  "timestamp": "2025-04-24 12:15:47+0000"
 }

CHANGELOG.md (+27)

@@ -1,5 +1,32 @@
 # Version changelog
 
+## Release v1.75.0
+
+### New Features and Improvements
+
+* Add support for `power_bi_task` in jobs ([#4647](https://github.com/databricks/terraform-provider-databricks/pull/4647))
+* Add support for `dashboard_task` in jobs ([#4646](https://github.com/databricks/terraform-provider-databricks/pull/4646))
+* Add `compute_mode` to `databricks_mws_workspaces` to support creating serverless workspaces ([#4670](https://github.com/databricks/terraform-provider-databricks/pull/4670))
+* Make `spark_version` optional in the context of jobs so that a cluster policy can provide a default value ([#4643](https://github.com/databricks/terraform-provider-databricks/pull/4643))
+
+
+### Documentation
+
+* Document `performance_target` in `databricks_job` ([#4651](https://github.com/databricks/terraform-provider-databricks/pull/4651))
+* Add more examples for `databricks_model_serving` ([#4658](https://github.com/databricks/terraform-provider-databricks/pull/4658))
+* Document `on_streaming_backlog_exceeded` in email/webhook notifications in `databricks_job` ([#4660](https://github.com/databricks/terraform-provider-databricks/pull/4660))
+* Refresh `spark_python_task` option in `databricks_job` ([#4666](https://github.com/databricks/terraform-provider-databricks/pull/4666))
+
+### Exporter
+
+* Emit files installed with `%pip install` in Python notebooks ([#4664](https://github.com/databricks/terraform-provider-databricks/pull/4664))
+* Correctly handle account-level identities when generating the code ([#4650](https://github.com/databricks/terraform-provider-databricks/pull/4650))
+* Add export of dashboard tasks in `databricks_job` ([#4665](https://github.com/databricks/terraform-provider-databricks/pull/4665))
+* Add export of PowerBI tasks in `databricks_job` ([#4668](https://github.com/databricks/terraform-provider-databricks/pull/4668))
+* Add `Ignore` implementation for `databricks_grants` to fix issue with wrongly generated dependencies ([#4650](https://github.com/databricks/terraform-provider-databricks/pull/4650))
+* Improve handling of `owner` for UC resources ([#4669](https://github.com/databricks/terraform-provider-databricks/pull/4669))
+
+
 ## Release v1.74.0
 
 ### Bug Fixes

NEXT_CHANGELOG.md (+3 −12)

@@ -1,26 +1,17 @@
 # NEXT CHANGELOG
 
-## Release v1.75.0
+## Release v1.76.0
 
 ### New Features and Improvements
 
-* Add support for `power_bi_task` in jobs ([#4647](https://github.com/databricks/terraform-provider-databricks/pull/4647))
-* Add support for `dashboard_task` in jobs ([#4646](https://github.com/databricks/terraform-provider-databricks/pull/4646))
-
 ### Bug Fixes
 
+* Fix automatic cluster creation for `databricks_sql_permissions` ([#4141](https://github.com/databricks/terraform-provider-databricks/pull/4141))
+
 ### Documentation
 
-* Document `performance_target` in `databricks_job` ([#4651](https://github.com/databricks/terraform-provider-databricks/pull/4651))
-* Add more examples for `databricks_model_serving` ([#4658](https://github.com/databricks/terraform-provider-databricks/pull/4658))
-* Document `on_streaming_backlog_exceeded` in email/webhook notifications in `databricks_job` ([#4660](https://github.com/databricks/terraform-provider-databricks/pull/4660))
-* Refresh `spark_python_task` option in `databricks_job` ([#4666](https://github.com/databricks/terraform-provider-databricks/pull/4666))
 * Unify/reorganize notes in docs for resources/data sources ([#4657](https://github.com/databricks/terraform-provider-databricks/pull/4657))
 
 ### Exporter
 
-* Correctly handle account-level identities when generating the code ([#4650](https://github.com/databricks/terraform-provider-databricks/pull/4650))
-* Add export of dashboard tasks in `databricks_job` ([#4665](https://github.com/databricks/terraform-provider-databricks/pull/4665))
-* Add `Ignore` implementation for `databricks_grants` to fix issue with wrongly generated dependencies ([#4661](https://github.com/databricks/terraform-provider-databricks/pull/4650))
-
 ### Internal Changes

access/resource_sql_permissions.go (+11 −8)

@@ -277,7 +277,8 @@ func (ta *SqlPermissions) initCluster(ctx context.Context, d *schema.ResourceDat
 
 func (ta *SqlPermissions) getOrCreateCluster(clustersAPI clusters.ClustersAPI) (string, error) {
     sparkVersion := clusters.LatestSparkVersionOrDefault(clustersAPI.Context(), clustersAPI.WorkspaceClient(), compute.SparkVersionRequest{
-        Latest: true,
+        Latest:          true,
+        LongTermSupport: true,
     })
     nodeType := clustersAPI.GetSmallestNodeType(compute.NodeTypeRequest{LocalDisk: true})
     aclCluster, err := clustersAPI.GetOrCreateRunningCluster(
@@ -287,13 +288,15 @@ func (ta *SqlPermissions) getOrCreateCluster(clustersAPI clusters.ClustersAPI) (
             NodeTypeID:             nodeType,
             AutoterminationMinutes: 10,
             DataSecurityMode:       "LEGACY_TABLE_ACL",
-            SparkConf: map[string]string{
-                "spark.databricks.cluster.profile": "singleNode",
-                "spark.master":                     "local[*]",
-            },
-            CustomTags: map[string]string{
-                "ResourceClass": "SingleNode",
-            },
+            // TODO: revert to a single-node cluster once the backend fix is rolled out
+            NumWorkers: 1,
+            // SparkConf: map[string]string{
+            //     "spark.databricks.cluster.profile": "singleNode",
+            //     "spark.master":                     "local[*]",
+            // },
+            // CustomTags: map[string]string{
+            //     "ResourceClass": "SingleNode",
+            // },
         })
     if err != nil {
         return "", err

access/resource_sql_permissions_test.go (+27 −25)

@@ -188,8 +188,8 @@ var createHighConcurrencyCluster = []qa.HTTPFixture{
         Response: compute.GetSparkVersionsResponse{
             Versions: []compute.SparkVersion{
                 {
-                    Key:  "7.1.x-cpu-ml-scala2.12",
-                    Name: "7.1 ML (includes Apache Spark 3.0.0, Scala 2.12)",
+                    Key:  "15.4.x-scala2.12",
+                    Name: "15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12)",
                 },
             },
         },
@@ -222,15 +222,16 @@ var createHighConcurrencyCluster = []qa.HTTPFixture{
             AutoterminationMinutes: 10,
             ClusterName:            "terraform-table-acl",
             NodeTypeID:             "Standard_F4s",
-            SparkVersion:           "11.3.x-scala2.12",
-            CustomTags: map[string]string{
-                "ResourceClass": "SingleNode",
-            },
-            SparkConf: map[string]string{
-                "spark.databricks.cluster.profile": "singleNode",
-                "spark.master":                     "local[*]",
-            },
-            DataSecurityMode: "LEGACY_TABLE_ACL",
+            SparkVersion:     "15.4.x-scala2.12",
+            DataSecurityMode: "LEGACY_TABLE_ACL",
+            NumWorkers:       1,
+            // CustomTags: map[string]string{
+            //     "ResourceClass": "SingleNode",
+            // },
+            // SparkConf: map[string]string{
+            //     "spark.databricks.cluster.profile": "singleNode",
+            //     "spark.master":                     "local[*]",
+            // },
         },
         Response: clusters.ClusterID{
             ClusterID: "bcd",
@@ -244,9 +245,9 @@ var createHighConcurrencyCluster = []qa.HTTPFixture{
             ClusterID:        "bcd",
             State:            "RUNNING",
             DataSecurityMode: "LEGACY_TABLE_ACL",
-            SparkConf: map[string]string{
-                "spark.databricks.cluster.profile": "singleNode",
-            },
+            // SparkConf: map[string]string{
+            //     "spark.databricks.cluster.profile": "singleNode",
+            // },
         },
     },
 }
@@ -265,8 +266,8 @@ var createSharedCluster = []qa.HTTPFixture{
         Response: compute.GetSparkVersionsResponse{
             Versions: []compute.SparkVersion{
                 {
-                    Key:  "7.1.x-cpu-ml-scala2.12",
-                    Name: "7.1 ML (includes Apache Spark 3.0.0, Scala 2.12)",
+                    Key:  "15.4.x-scala2.12",
+                    Name: "15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12)",
                 },
             },
         },
@@ -299,15 +300,16 @@ var createSharedCluster = []qa.HTTPFixture{
             AutoterminationMinutes: 10,
             ClusterName:            "terraform-table-acl",
             NodeTypeID:             "Standard_F4s",
-            SparkVersion:           "11.3.x-scala2.12",
-            CustomTags: map[string]string{
-                "ResourceClass": "SingleNode",
-            },
-            DataSecurityMode: "LEGACY_TABLE_ACL",
-            SparkConf: map[string]string{
-                "spark.databricks.cluster.profile": "singleNode",
-                "spark.master":                     "local[*]",
-            },
+            SparkVersion:     "15.4.x-scala2.12",
+            DataSecurityMode: "LEGACY_TABLE_ACL",
+            NumWorkers:       1,
+            // CustomTags: map[string]string{
+            //     "ResourceClass": "SingleNode",
+            // },
+            // SparkConf: map[string]string{
+            //     "spark.databricks.cluster.profile": "singleNode",
+            //     "spark.master":                     "local[*]",
+            // },
         },
         Response: clusters.ClusterID{
             ClusterID: "bcd",
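
To read the updated fixtures at a glance: the request body the provider is now expected to send when auto-creating the table-ACL cluster describes a plain one-worker LTS cluster, with the single-node SparkConf/CustomTags profile temporarily commented out. Below is a sketch of that payload as a Go struct; the `clusters.Cluster` type is an assumption based on how this provider models legacy cluster requests, and the `qa.HTTPFixture` wrapping is elided:

// Expected cluster-create request after this change (sketch): a regular
// 1-worker cluster on the 15.4 LTS runtime instead of a single-node cluster.
expected := clusters.Cluster{
    ClusterName:            "terraform-table-acl",
    NodeTypeID:             "Standard_F4s",
    SparkVersion:           "15.4.x-scala2.12",
    DataSecurityMode:       "LEGACY_TABLE_ACL",
    AutoterminationMinutes: 10,
    NumWorkers:             1, // temporary stand-in for the single-node profile
}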

catalog/data_current_metastore_test.go (+17 −9)

@@ -3,23 +3,24 @@ package catalog
 import (
     "testing"
 
+    "github.com/databricks/databricks-sdk-go/apierr"
+    "github.com/databricks/databricks-sdk-go/experimental/mocks"
     "github.com/databricks/databricks-sdk-go/service/catalog"
     "github.com/databricks/terraform-provider-databricks/qa"
+    "github.com/stretchr/testify/mock"
 )
 
 func TestCurrentMetastoreDataVerify(t *testing.T) {
     qa.ResourceFixture{
-        Fixtures: []qa.HTTPFixture{
-            {
-                Method:   "GET",
-                Resource: "/api/2.1/unity-catalog/metastore_summary",
-                Response: catalog.GetMetastoreSummaryResponse{
+        MockWorkspaceClientFunc: func(w *mocks.MockWorkspaceClient) {
+            w.GetMockMetastoresAPI().EXPECT().
+                Summary(mock.Anything).
+                Return(&catalog.GetMetastoreSummaryResponse{
                     Name:        "xyz",
                     MetastoreId: "abc",
                     Owner:       "pqr",
                     Cloud:       "aws",
-                },
-            },
+                }, nil)
         },
         Resource: DataSourceCurrentMetastore(),
         Read:     true,
@@ -35,10 +36,17 @@ func TestCurrentMetastoreDataVerify(t *testing.T) {
 
 func TestCurrentMetastoreDataError(t *testing.T) {
     qa.ResourceFixture{
-        Fixtures: qa.HTTPFailures,
+        MockWorkspaceClientFunc: func(w *mocks.MockWorkspaceClient) {
+            w.GetMockMetastoresAPI().EXPECT().
+                Summary(mock.Anything).
+                Return(nil, &apierr.APIError{
+                    ErrorCode: "BAD_REQUEST",
+                    Message:   "Bad request: unable to get metastore summary",
+                })
+        },
         Resource:    DataSourceCurrentMetastore(),
         Read:        true,
         NonWritable: true,
         ID:          "_",
-    }.ExpectError(t, "i'm a teapot")
+    }.ExpectError(t, "Bad request: unable to get metastore summary")
 }
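
The rewrite above is part of a migration from raw `qa.HTTPFixture` stubs to the SDK's generated mocks: expected SDK calls are declared on a `mocks.MockWorkspaceClient`, and the error path now returns whatever `apierr.APIError` the mock is programmed with, instead of the canned `qa.HTTPFailures` response (hence the old "i'm a teapot" expectation disappearing). A condensed sketch of the pattern, assembled from the hunks above (the test name is illustrative):

package catalog

import (
    "testing"

    "github.com/databricks/databricks-sdk-go/apierr"
    "github.com/databricks/databricks-sdk-go/experimental/mocks"
    "github.com/databricks/terraform-provider-databricks/qa"
    "github.com/stretchr/testify/mock"
)

// Error-path test with a typed mock: the fixture fails unless the data
// source surfaces exactly the APIError the mock was programmed to return.
func TestCurrentMetastoreDataError_MockSketch(t *testing.T) {
    qa.ResourceFixture{
        MockWorkspaceClientFunc: func(w *mocks.MockWorkspaceClient) {
            w.GetMockMetastoresAPI().EXPECT().
                Summary(mock.Anything). // any context is accepted
                Return(nil, &apierr.APIError{
                    ErrorCode: "BAD_REQUEST",
                    Message:   "Bad request: unable to get metastore summary",
                })
        },
        Resource:    DataSourceCurrentMetastore(),
        Read:        true,
        NonWritable: true,
        ID:          "_",
    }.ExpectError(t, "Bad request: unable to get metastore summary")
}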

catalog/data_metastores_test.go (+30 −24)

@@ -3,35 +3,34 @@ package catalog
 import (
     "testing"
 
+    "github.com/databricks/databricks-sdk-go/apierr"
+    "github.com/databricks/databricks-sdk-go/experimental/mocks"
     "github.com/databricks/databricks-sdk-go/service/catalog"
     "github.com/databricks/terraform-provider-databricks/qa"
+    "github.com/stretchr/testify/mock"
 )
 
 func TestMetastoresDataContainsName(t *testing.T) {
     qa.ResourceFixture{
-        Fixtures: []qa.HTTPFixture{
-            {
-                Method:   "GET",
-                Resource: "/api/2.0/accounts/testaccount/metastores",
-                Response: catalog.ListMetastoresResponse{
-                    Metastores: []catalog.MetastoreInfo{
-                        {
-                            Name:                      "a",
-                            StorageRoot:               "abc",
-                            DefaultDataAccessConfigId: "sth",
-
-                            MetastoreId: "abc",
-                        },
-                        {
-                            Name:                      "b",
-                            StorageRoot:               "dcw",
-                            DefaultDataAccessConfigId: "sth",
-
-                            MetastoreId: "ded",
-                        },
+        MockAccountClientFunc: func(a *mocks.MockAccountClient) {
+            a.GetMockAccountMetastoresAPI().EXPECT().
+                ListAll(mock.Anything).
+                Return([]catalog.MetastoreInfo{
+                    {
+                        Name:                      "a",
+                        StorageRoot:               "abc",
+                        DefaultDataAccessConfigId: "sth",
+
+                        MetastoreId: "abc",
                     },
-                },
-            },
+                    {
+                        Name:                      "b",
+                        StorageRoot:               "dcw",
+                        DefaultDataAccessConfigId: "sth",
+
+                        MetastoreId: "ded",
+                    },
+                }, nil)
         },
         Resource: DataSourceMetastores(),
         Read:     true,
@@ -45,11 +44,18 @@ func TestMetastoresDataContainsName(t *testing.T) {
 
 func TestMetastoresData_Error(t *testing.T) {
     qa.ResourceFixture{
-        Fixtures: qa.HTTPFailures,
+        MockAccountClientFunc: func(a *mocks.MockAccountClient) {
+            a.GetMockAccountMetastoresAPI().EXPECT().
+                ListAll(mock.Anything).
+                Return(nil, &apierr.APIError{
+                    ErrorCode: "BAD_REQUEST",
+                    Message:   "Bad request: unable to list metastores",
+                })
+        },
         Resource:    DataSourceMetastores(),
         Read:        true,
         NonWritable: true,
         ID:          "_",
         AccountID:   "_",
-    }.ExpectError(t, "i'm a teapot")
+    }.ExpectError(t, "Bad request: unable to list metastores")
 }
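
The account-level data source follows the same recipe as the workspace-level one, with two differences worth noting: the mock is installed through `MockAccountClientFunc` on a `mocks.MockAccountClient`, and the fixture must set `AccountID`. A fragment illustrating the happy path (imports as in the sketch above; `ApplyNoError` is assumed to be one of the fixture's runner helpers):

// Account-level variant: note the MockAccountClient entry point and the
// AccountID field on the fixture.
qa.ResourceFixture{
    MockAccountClientFunc: func(a *mocks.MockAccountClient) {
        a.GetMockAccountMetastoresAPI().EXPECT().
            ListAll(mock.Anything).
            Return([]catalog.MetastoreInfo{{Name: "a", MetastoreId: "abc"}}, nil)
    },
    Resource:    DataSourceMetastores(),
    Read:        true,
    NonWritable: true,
    ID:          "_",
    AccountID:   "_",
}.ApplyNoError(t)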
