[Internal] Rewrite DLT pipelines using SDK #3839
Merged
Conversation
…ricks/terraform-provider-databricks into divyansh_dlt_pipelines
@@ -76,13 +76,26 @@ The following arguments are supported:
 * `library` blocks - Specifies pipeline code and required artifacts. Syntax resembles the [library](cluster.md#library-configuration-block) configuration block, with the addition of the special `notebook` & `file` library types that should have the `path` attribute. *Right now only the `notebook` & `file` types are supported.*
 * `cluster` blocks - [Clusters](cluster.md) to run the pipeline. If none is specified, pipelines will automatically select a default cluster configuration for the pipeline. *Please note that DLT pipeline clusters support only a subset of attributes, as described in the [documentation](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-api-guide.html#pipelinesnewcluster).* Also note that the `autoscale` block is extended with the `mode` parameter, which controls the autoscaling algorithm (possible values are `ENHANCED` for the new, enhanced autoscaling algorithm, or `LEGACY` for the old algorithm).
 * `continuous` - A flag indicating whether to run the pipeline continuously. The default value is `false`.
-* `development` - A flag indicating whether to run the pipeline in development mode. The default value is `true`.
+* `development` - A flag indicating whether to run the pipeline in development mode. The default value is `false`.
Note for the reader of this PR: this was incorrect.
https://openapi.dev.databricks.com/api/workspace/pipelines/create
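For reference, here is a minimal sketch of a `databricks_pipeline` configuration that exercises the attributes described in the hunk above; the resource name, storage location, and notebook path are placeholders.

```hcl
resource "databricks_pipeline" "example" {
  name    = "Example DLT pipeline" # placeholder name
  storage = "/pipelines/example"   # placeholder storage location

  # DLT pipeline clusters support only a subset of cluster attributes;
  # the `autoscale` block additionally accepts the `mode` parameter.
  cluster {
    label = "default"
    autoscale {
      min_workers = 1
      max_workers = 5
      mode        = "ENHANCED"
    }
  }

  # Only the `notebook` and `file` library types are supported.
  library {
    notebook {
      path = "/Shared/dlt/example" # placeholder notebook path
    }
  }

  continuous  = false # default
  development = false # default per the pipelines API, as corrected above
}
```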
hectorcast-db approved these changes on Jul 31, 2024
edwardfeng-db approved these changes on Jul 31, 2024
github-merge-queue bot pushed a commit that referenced this pull request on Aug 14, 2024
## Changes
After #3839, the provider's behavior of DLT pipelines regressed. In particular, the `Read` method stopped populating certain fields from the GetPipelineResponse into Terraform state. This PR addresses this by additionally writing all top-level fields into the state as part of the read operation. Resolves #3855.

## Tests
Unit tests cover the case specified in the issue.
- [x] `make test` run locally
- [ ] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK
mgyucht added a commit that referenced this pull request on Aug 14, 2024
### New Features and Improvements
* Added support for `cloudflare_api_token` in `databricks_storage_credential` resource ([#3835](#3835)).
* Add `active` attribute to `databricks_user` data source ([#3733](#3733)).
* Add `workspace_path` attribute to `databricks_notebook` resource and data source ([#3885](#3885)).
* Mark attributes as sensitive in `databricks_mlflow_webhook` ([#3825](#3825)).
* Added notification destination resource ([#3820](#3820)).

### Bug Fixes
* Automatically assign `IS_OWNER` permission to sql warehouse if not specified ([#3829](#3829)).
* Corrected kms arn format in `data_aws_unity_catalog_policy` ([#3823](#3823)).
* Fix crash when destroying `databricks_compliance_security_profile_workspace_setting` ([#3883](#3883)).
* Fixed read method of `databricks_entitlements` resource ([#3858](#3858)).
* Retry cluster update on "INVALID_STATE" ([#3890](#3890)).
* Save Pipeline resource to state in addition to spec ([#3869](#3869)).
* Tolerate `databricks_workspace_conf` deletion failures ([#3737](#3737)).
* Update Go SDK ([#3826](#3826)).
* cluster key update for `databricks_sql_table` should not force new ([#3824](#3824)).
* reading `databricks_metastore_assignment` when importing resource ([#3827](#3827)).

### Documentation
* Add troubleshooting instructions for `databricks OAuth is not supported for this host` error ([#3815](#3815)).
* Clarify setting of permissions for workspace objects ([#3884](#3884)).
* Document missing task attributes in `databricks_job` resource ([#3817](#3817)).
* Fixed documentation for `databricks_schemas` data source and `databricks_metastore_assignment` resource ([#3851](#3851)).
* clarified `spot_bid_max_price` option for `databricks_cluster` ([#3830](#3830)).
* marked `databricks_sql_dashboard` as legacy ([#3836](#3836)).

### Internal Changes
* Refactor exporter: split huge files into smaller ones ([#3870](#3870)).
* Refactored `client.ClientForHost` to use Go SDK method ([#3735](#3735)).
* Revert "Rewriting DLT pipelines using SDK" ([#3838](#3838)).
* Rewrite DLT pipelines using SDK ([#3839](#3839)).
* Rewriting DLT pipelines using SDK ([#3792](#3792)).
* Update Go SDK ([#3808](#3808)).
* refactored `databricks_mws_permission_assignment` to Go SDK ([#3831](#3831)).

### Dependency Updates
* Bump databricks-sdk-go to 0.44.0 ([#3896](#3896)).
* Bump github.com/zclconf/go-cty from 1.14.4 to 1.15.0 ([#3775](#3775)).

### Exporter
* Add retry on "Operation timed out" error ([#3897](#3897)).
* Add support for Vector Search assets ([#3828](#3828)).
* Add support for `databricks_notification_destination` ([#3861](#3861)).
* Add support for `databricks_online_table` ([#3816](#3816)).
* Don't export model serving endpoints with foundational models ([#3845](#3845)).
* Fix generation of `autotermination_minutes = 0` ([#3881](#3881)).
* Generate `databricks_workspace_binding` instead of legacy `databricks_catalog_workspace_binding` ([#3812](#3812)).
* Ignore DLT pipelines deployed via DABs ([#3857](#3857)).
* Improve exporting of `databricks_model_serving` ([#3821](#3821)).
* Refactoring: remove legacy code ([#3864](#3864)).
mgyucht added a commit that referenced this pull request on Aug 14, 2024
### New Features and Improvements
* Added `databricks_notification_destination` resource ([#3820](#3820)).
* Added support for `cloudflare_api_token` in `databricks_storage_credential` resource ([#3835](#3835)).
* Add `active` attribute to `databricks_user` data source ([#3733](#3733)).
* Add `workspace_path` attribute to `databricks_notebook` resource and data source ([#3885](#3885)).
* Mark attributes as sensitive in `databricks_mlflow_webhook` ([#3825](#3825)).

### Bug Fixes
* Automatically assign `IS_OWNER` permission to sql warehouse if not specified ([#3829](#3829)).
* Corrected kms arn format in `data_aws_unity_catalog_policy` ([#3823](#3823)).
* Fix crash when destroying `databricks_compliance_security_profile_workspace_setting` ([#3883](#3883)).
* Fixed read method of `databricks_entitlements` resource ([#3858](#3858)).
* Retry cluster update on "INVALID_STATE" ([#3890](#3890)).
* Save Pipeline resource to state in addition to spec ([#3869](#3869)).
* Tolerate `databricks_workspace_conf` deletion failures ([#3737](#3737)).
* Update Go SDK ([#3826](#3826)).
* cluster key update for `databricks_sql_table` should not force new ([#3824](#3824)).
* reading `databricks_metastore_assignment` when importing resource ([#3827](#3827)).

### Documentation
* Add troubleshooting instructions for `databricks OAuth is not supported for this host` error ([#3815](#3815)).
* Clarify setting of permissions for workspace objects ([#3884](#3884)).
* Document missing task attributes in `databricks_job` resource ([#3817](#3817)).
* Fixed documentation for `databricks_schemas` data source and `databricks_metastore_assignment` resource ([#3851](#3851)).
* clarified `spot_bid_max_price` option for `databricks_cluster` ([#3830](#3830)).
* marked `databricks_sql_dashboard` as legacy ([#3836](#3836)).

### Internal Changes
* Refactor exporter: split huge files into smaller ones ([#3870](#3870)).
* Refactored `client.ClientForHost` to use Go SDK method ([#3735](#3735)).
* Revert "Rewriting DLT pipelines using SDK" ([#3838](#3838)).
* Rewrite DLT pipelines using SDK ([#3839](#3839)).
* Rewriting DLT pipelines using SDK ([#3792](#3792)).
* Update Go SDK ([#3808](#3808)).
* refactored `databricks_mws_permission_assignment` to Go SDK ([#3831](#3831)).

### Dependency Updates
* Bump databricks-sdk-go to 0.44.0 ([#3896](#3896)).
* Bump github.com/zclconf/go-cty from 1.14.4 to 1.15.0 ([#3775](#3775)).

### Exporter
* Add retry on "Operation timed out" error ([#3897](#3897)).
* Add support for Vector Search assets ([#3828](#3828)).
* Add support for `databricks_notification_destination` ([#3861](#3861)).
* Add support for `databricks_online_table` ([#3816](#3816)).
* Don't export model serving endpoints with foundational models ([#3845](#3845)).
* Fix generation of `autotermination_minutes = 0` ([#3881](#3881)).
* Generate `databricks_workspace_binding` instead of legacy `databricks_catalog_workspace_binding` ([#3812](#3812)).
* Ignore DLT pipelines deployed via DABs ([#3857](#3857)).
* Improve exporting of `databricks_model_serving` ([#3821](#3821)).
* Refactoring: remove legacy code ([#3864](#3864)).
github-merge-queue bot pushed a commit that referenced this pull request on Aug 15, 2024
## Changes
Rewrite the DLT pipelines resource using the SDK.

Two new fields are now part of the DLT pipeline schema:
* `gateway_definition` - The definition of a gateway pipeline to support CDC.
* `ingestion_definition` - The configuration for a managed ingestion pipeline. These settings cannot be used with the `library`, `target`, or `catalog` settings.

## Tests
All existing unit tests and integration tests are passing.
* relevant change in `docs/` folder
* covered with integration tests in `internal/acceptance`
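For illustration only, here is a rough sketch of how the new `ingestion_definition` block might be used from Terraform. The nested attribute names (`connection_name`, `objects`, `schema`, and their fields) are assumptions rather than something confirmed by this PR, and per the constraint above the configuration omits `library`, `target`, and `catalog`.

```hcl
# Hypothetical sketch: nested attribute names are assumed, not confirmed by this PR.
# `gateway_definition` is the other new block (gateway pipeline for CDC); its
# attributes are not shown here.
resource "databricks_pipeline" "managed_ingestion" {
  name = "example-managed-ingestion" # placeholder name

  # New block added by this PR; cannot be combined with `library`,
  # `target`, or `catalog`.
  ingestion_definition {
    connection_name = "example_connection" # assumed attribute name

    objects { # assumed nesting
      schema {
        source_schema       = "sales"
        destination_catalog = "main"
        destination_schema  = "sales_raw"
      }
    }
  }
}
```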