connectors-ci: unique secret name at session level #28656
Merged
1 commit merged on Jul 25, 2023

Conversation

@alafanechere (Contributor) commented on Jul 25, 2023

What

Relates to #27867 and #28617.
According to the nightly test results, it appears that connector spec validation tests often fail transiently, on different connectors.
My hypothesis is that the wrong config/secret is mounted to the CAT container here.
After investigation, it's probably related to a namespacing issue when we instantiate Secret objects after the GSM download:
Dagger derives secret IDs from their names, not their content. So far we've named secrets after the config file name, which is not unique and is often config.json.
This means that when we call context.dagger_client.set_secret(secret_filename, secret_plaintext) in a concurrent context, we can overwrite a connector's secret with the wrong value, since dagger_client is a pipeline-wide object, not a context-specific instance.
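To make the failure mode concrete, here is a minimal sketch of the collision. The connector contexts and plaintext variables are made up for illustration; the only API used is the set_secret(name, plaintext) call quoted above.

# Both connector contexts hold a reference to the same pipeline-wide dagger_client,
# and both connectors' config files are named "config.json".
faker_secret = faker_context.dagger_client.set_secret("config.json", faker_plaintext)
pokeapi_secret = pokeapi_context.dagger_client.set_secret("config.json", pokeapi_plaintext)
# Dagger derives the secret ID from the name, not the content, so the second call
# shadows the first: a CAT container mounting faker_secret can end up reading
# pokeapi_plaintext.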

How

Make the secret name unique by concatenating the connector name and the file name:

unique_secret_name = f"{context.connector.technical_name}_{secret_file}"
connector_secrets[secret_file] = context.dagger_client.set_secret(unique_secret_name, secret_plaintext)
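
For context, a hedged sketch of how this could sit in the secrets-loading loop; the loop shape and the downloaded_secrets mapping are assumptions, only the two lines above come from the diff.

connector_secrets = {}
# downloaded_secrets is an assumed {file_name: plaintext} mapping produced by the GSM download step.
for secret_file, secret_plaintext in downloaded_secrets.items():
    # Namespace the Dagger secret name with the connector's technical name so that
    # concurrent connectors in the same session can no longer collide on "config.json".
    unique_secret_name = f"{context.connector.technical_name}_{secret_file}"
    connector_secrets[secret_file] = context.dagger_client.set_secret(unique_secret_name, secret_plaintext)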

Tests

Running a nightly build from this branch

@alafanechere requested a review from a team on Jul 25, 2023 at 07:52
Comment on lines +87 to +89
ci_credentials.with_directory(secrets_path, context.updated_secrets_dir).with_exec(
["ci_credentials", context.connector.technical_name, "update-secrets"]
)

@alafanechere (Contributor, Author) commented:

formatting change

@erohmensing (Contributor) commented:

Is there still the chance that 2 concurrent runs for the same connector (one which modified the spec) could have this issue?

E.g. the new 1s1t BigQuery branch and the regular BigQuery branch?

@alafanechere (Contributor, Author) commented on Jul 25, 2023

Is there still the chance that 2 concurrent runs for the same connector (one which modified the spec) could have this issue?

@erohmensing By concurrent I mean concurrent within the same test pipeline execution. In our current paradigm a connector can't be tested twice in the same pipeline, so I think we're safe.

Two concurrent connector tests running on different branches use different Dagger sessions/clients; the Dagger team told me that secret IDs are session-specific.

@erohmensing (Contributor) commented:

Gotcha, so this is only a problem for changes with dependencies which kick off multiple test runs in the same session. Thanks for the clarification!

@alafanechere (Contributor, Author) commented:

Gotcha, so this is only a problem for changes with dependencies which kick off multiple test runs in the same session. Thanks for the clarification!

Or, more specifically, nightly builds, where we trigger tons of concurrent test executions in a single pipeline.

@erohmensing (Contributor) left a comment

so long as the connector_secrets dict is per connector, looks good! otherwise i'd use the namespaced name as the key instead of just secret_file

@alafanechere (Contributor, Author) commented:

so long as the connector_secrets dict is per connector, looks good! otherwise i'd use the namespaced name as the key instead of just secret_file

Yep, I could have done it, but it's not useful: this dict instance is per connector, and using the namespaced key would mean "de-namespacing" it downstream when we need to name the secret file as CAT expects.
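
A small sketch of what that means in practice; the mount path and the with_mounted_secret call are illustrative assumptions about the downstream CAT setup, not code from this PR.

# connector_secrets is per connector: keys stay as the plain file names CAT expects,
# while only the Dagger-side secret name carries the connector namespace.
for secret_file, secret in connector_secrets.items():
    cat_container = cat_container.with_mounted_secret(f"/test_input/secrets/{secret_file}", secret)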

@alafanechere (Contributor, Author) commented:

Merging, as the nightly build workflow from this branch, even if not finished yet, looks healthy so far, and I'd like to see the result on tomorrow's real nightly build.
