
Why does Terraform try to destroy a resource that has been imported? #11043

@niraj06

Description


We have a Terraform module that creates Google BigQuery tables. Using that module, we imported an existing table, which worked fine, but when we run "terraform apply" it plans to destroy the imported table and re-create it.
This is the message in the console for the resource that will be destroyed:

(because resource uses count or for_each)

Yes, the module does use for_each, but I don't understand why that would force Terraform to drop the resource.

resource "google_bigquery_table" "default" {
  for_each        = local.tables
  project         = each.value.project_id
  dataset_id      = each.value.dataset_id
  table_id        = each.key
  friendly_name   = each.value.friendly_name
  description     = each.value.description
  clustering      = try(each.value.options.clustering, null)
  expiration_time = try(each.value.options.expiration_time, null)
  labels          = each.value.labels
  schema          = file(each.value.schema)

  dynamic "encryption_configuration" {
    for_each = try(each.value.options.encryption_key, null) != null ? [""] : []
    content {
      kms_key_name = each.value.options.encryption_key
    }
  }

  dynamic "range_partitioning" {
    for_each = try(each.value.range_partitioning, null) != null ? [""] : []
    content {
      field = each.value.range_partitioning.field
      range {
        start    = each.value.range_partitioning.start
        end      = each.value.range_partitioning.end
        interval = each.value.range_partitioning.interval
      }
    }
  }

  dynamic "time_partitioning" {
    for_each = try(each.value.time_partitioning, null) != null ? [""] : []
    content {
      expiration_ms = each.value.time_partitioning.expiration_ms
      field         = each.value.time_partitioning.field
      type          = each.value.time_partitioning.type
    }
  }

  deletion_protection = "true"

  lifecycle {
    ignore_changes = [
      encryption_configuration
    ]
  }
}
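The "(because resource uses count or for_each)" note usually means the address in state does not match the address the configuration generates. With for_each, every instance lives at a keyed address such as google_bigquery_table.default["<table_id>"], so an import that targeted a different key (or an unkeyed address) leaves the configured instance "missing", and the plan destroys the imported object while creating a new one. A sketch of importing to the keyed address, assuming a hypothetical key "events" and placeholder project/dataset/table IDs (adjust all of these to your own values and to a key that actually exists in local.tables):

```shell
# The instance key must match the map key in local.tables exactly.
# Quote the address so the shell does not interpret the brackets.
terraform import 'google_bigquery_table.default["events"]' \
  'projects/my-project/datasets/my_dataset/tables/events'

# Then check that the address in state matches what the plan expects:
terraform state list | grep google_bigquery_table
```

If `terraform state list` shows the table at a different key than the one the plan wants to create, that mismatch is what drives the destroy/create pair.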
