

Don't fail on sensitive output during terraform test #36544

Open
dvdvorle opened this issue Feb 20, 2025 · 6 comments
Labels: enhancement, new (new issue not yet triaged), terraform test

Comments

@dvdvorle

Terraform Version

Terraform v1.10.5
on windows_amd64

Terraform Configuration Files

Inside a (non-root) module: module-x/output.tf

output "password" {
  value     = data.azurerm_key_vault_secret.password.value
}

In module-x/tests/main.tftest.hcl

run "existing" {
  command = plan
}

Debug Output

n/a

Expected Behavior

When running terraform test, I'd expect this to not result in a failure.

Actual Behavior

  run "existing"... fail
╷
│ Error: Output refers to sensitive values
│
│   on output.tf line 95:
│   95: output "password" {
│
│ To reduce the risk of accidentally exporting sensitive data that was intended to be only internal, Terraform requires
│ that any root module output containing sensitive data be explicitly marked as sensitive, to confirm your intent.
│
│ If you do intend to export this data, annotate the output value as sensitive by adding the following argument:
│     sensitive = true
╵
tests\main.tftest.hcl... tearing down
tests\main.tftest.hcl... fail

Failure! 0 passed, 1 failed.

Steps to Reproduce

Create the resources as described, run terraform test

Additional Context

This is running interactively, but should also work in a CI system. Since this is terraform test I'd expect to be able to always see all output, even if it's sensitive, since now I'm sometimes trying to fix a failed test in the dark.

But for this specific issue I'd be happy if I were able to provide a flag like terraform test -module and have it not fail (and not warn!) on this.

References

No response

Generative AI / LLM assisted development?

No response

@dvdvorle added the bug and new (new issue not yet triaged) labels on Feb 20, 2025
@dvdvorle
Author

Hi @liamcervante , I see you changed this issue from a bug to an enhancement. I think I can understand why, but I'd like to make a case for this to be regarded as a bug.

The error message states "Terraform requires that any root module [...]". Terraform incorrectly assumes it to be a root module, which it is not. Though not a crash, I'd say this is an unexpected error and incorrect behaviour, which is categorized as a bug.

I'd like to propose that to fix this bug the default behaviour should be that terraform test assumes the module under test is not a root module.

A separate enhancement (separate issue) could be to add a flag to specify that the module under test should be treated as a root module.

@liamcervante
Member

Hi @dvdvorle, thanks for filing the issue and following up! A bug requires us to have said "this is how thing x works", and then thing x must be doing something different from whatever we published. I think we haven't actually documented anywhere what the correct behaviour of Terraform should be in regards to the testing framework executing the module under test as a root module.

The error message states "Terraform requires that any root module [...]". Terraform incorrectly assumes it to be a root module, which it is not.

In your case this is an incorrect assumption being made by Terraform, but it is not always an incorrect assumption. Another configuration author could have written tests that are executing against the root module. If we were to then "fix" this behaviour, the tests written by these other authors might break as a result of that.

Given that we haven't documented anywhere the "correct" behaviour in this case, and the current behaviour has been in production since the testing framework launched, I would lean towards saying that the current behaviour should become the "correct" behaviour as that leads to the smallest impact for all users.

Our processes do also differentiate between a bug and an enhancement, as we can engage our product team for discovery of an enhancement which would give us greater insight into how users more generally are treating this assumption.

I would also note that, in the example you have given, it doesn't seem unreasonable for you to add a sensitive = true attribute to the troublesome output, which would resolve the issue for you. Marking the output with sensitive = true doesn't change the sensitivity of the underlying value, which is already sensitive, so there would be no behaviour change for your module in actual production use from marking it explicitly. However, I am aware the example you gave here may not be truly representative of your broader use case, and you might not actually be able to do this for whatever reason.
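For reference, the suggested workaround applied to the module-x/output.tf configuration from the issue description would be a one-line change:

```hcl
output "password" {
  value     = data.azurerm_key_vault_secret.password.value
  sensitive = true
}
```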

Hopefully that all makes sense! Thanks again!

@dvdvorle
Author

That makes sense, thanks for getting back to me! Fwiw I also haven't been able to find any documentation on what the 'correct' behaviour should be in this case. So I agree marking it as an enhancement would be more appropriate here.

As a workaround we can mark the output as sensitive. We currently do this, but we also output more complex objects where only a part is sensitive. With this workaround the whole output is marked sensitive, which hides more information than needed. During terraform test this has been a pain, since failing tests don't say why they're failing.
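To illustrate the limitation with a hedged sketch (the resource and attribute names here are hypothetical, not from the issue): when only one attribute of an object output is sensitive, sensitive = true must still be applied to the whole output, hiding the non-sensitive attributes as well.

```hcl
# Hypothetical output: only "password" is sensitive, but the whole
# object must be marked sensitive, so "hostname" and "port" are
# also hidden in plan and test output.
output "connection" {
  value = {
    hostname = azurerm_mssql_server.example.fully_qualified_domain_name
    port     = 1433
    password = data.azurerm_key_vault_secret.password.value
  }
  sensitive = true
}
```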

Another potential workaround we thought of, but decided against, is actually consuming the module as a child module in the test. That only allows testing outputs, though, not 'internal' resources within the module.
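That rejected workaround would roughly amount to a wrapper root module that calls the module under test and re-exports its outputs (a sketch with hypothetical paths and names, not code from the issue):

```hcl
# Hypothetical wrapper root module, e.g. tests/wrapper/main.tf
module "x" {
  source = "../.."
}

# Only the module's declared outputs can be re-exported and asserted
# on from here; resources internal to module-x are not reachable.
output "password" {
  value     = module.x.password
  sensitive = true
}
```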

Anyway, thanks again for all the hard work! 🙂

@coderveehan

Maybe a dumb question, but is there a reason this is part of terraform test? I would expect something like this to be part of a linter. However, since that isn't natively present in Terraform, perhaps this could be part of terraform validate instead?

@liamcervante
Member

Hi @coderveehan. Just to clarify, this isn't something that terraform test is doing itself. The error is produced by the internal plan / apply cycles that the testing framework carries out. You could replicate this error message within regular Terraform executions by passing a sensitive value into a root module output without marking that output as sensitive.
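A minimal way to reproduce the same error outside terraform test might look like this (a hypothetical standalone root module, not taken from the issue):

```hcl
# main.tf in a root module
variable "secret" {
  type      = string
  sensitive = true
}

# terraform plan fails here with "Output refers to sensitive values"
# because this root module output is not marked sensitive.
output "leaked" {
  value = var.secret
}
```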

This has been raised as a problem within terraform test because the module being tested here isn't actually the root module of the configuration it is used in, but the testing framework is not aware of that. As a result, the error is raised during testing but not during regular execution for this particular use case.

The reason this isn't within terraform validate is because the error requires known values to deduce that a sensitive value is being exposed via an unmarked output value. The validate functionality doesn't have access to any concrete information about the values it is processing, only type information, and so it cannot deduce this error message.

@coderveehan

@liamcervante that makes sense! Thank you for the explanation.
