🐛 Destination snowflake: Create final tables with uppercase naming #30056
Conversation
Force-pushed from b226c19 to befa893.
There's a lot of `toUpperCase()` in the implementation code. I thought we wanted to let the DB evaluate these to UPPERCASE lazily?
```java
unescapeIdentifier(databaseName).toUpperCase(),
unescapeIdentifier(schema).toUpperCase(),
unescapeIdentifier(table).toUpperCase()).stream()
```
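The two approaches under discussion can be contrasted in a minimal sketch. The class and method names here are hypothetical illustrations, not code from the connector:

```java
// Sketch (illustrative names, not from the PR) contrasting eager uppercasing
// in Java with letting Snowflake fold unquoted identifiers to uppercase.
public class IdentifierCasing {

    // Eager approach: the connector uppercases and quotes the identifier
    // itself, so Snowflake sees exactly this string.
    public static String quotedUppercase(String identifier) {
        return '"' + identifier.toUpperCase() + '"';
    }

    // Lazy approach: emit the identifier unquoted; Snowflake folds unquoted
    // identifiers to uppercase during name resolution. Only safe for names
    // containing no special characters or reserved words.
    public static String unquoted(String identifier) {
        return identifier;
    }

    public static void main(String[] args) {
        // Both forms resolve to the same table for a plain name like "users":
        System.out.println(quotedUppercase("users")); // prints "USERS" (with quotes)
        System.out.println(unquoted("users"));        // prints users (Snowflake resolves it as USERS)
    }
}
```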
This is a test util, so maybe it doesn't matter, but I thought we wanted to let the DB evaluate the upper-case-ness of table names
I overruled that decision https://airbytehq-team.slack.com/archives/C03C4AVJWG4/p1693495091006359 (but open to continued discussion!)
Yes!
with snowflake uppercasing via not-quoting, we’re losing the ability to use special characters / reserved words in namespace/table/column names. That opens up a whole can of worms around escaping things, which I’d rather not deal with right now.
That makes a lot of sense. 👍
this is still technically a user-facing decision (e.g. would they rather get `"FOO$BAR"` or `FOO_BAR`). But that's a much smaller blast radius
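That user-facing tradeoff can be sketched as follows. This is an illustration under assumed rules (the class name and the replace-with-underscore sanitization are hypothetical, not from the PR):

```java
// Sketch of the naming tradeoff for a source name like foo$bar:
// quoting preserves the original characters, while unquoted identifiers
// need special characters replaced before Snowflake's uppercase folding.
public class NameSanitizer {

    // Quoted form: keeps special characters verbatim, uppercased -> "FOO$BAR"
    public static String quoted(String name) {
        return '"' + name.toUpperCase() + '"';
    }

    // Unquoted form: replace anything outside [A-Za-z0-9_] so the bare
    // identifier is legal SQL -> FOO_BAR (hypothetical sanitization rule)
    public static String unquotedSafe(String name) {
        return name.replaceAll("[^A-Za-z0-9_]", "_").toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(quoted("foo$bar"));       // prints "FOO$BAR" (with quotes)
        System.out.println(unquotedSafe("foo$bar")); // prints FOO_BAR
    }
}
```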
```diff
@@ -175,12 +175,6 @@ In addition to the changes which apply for all destinations described above, the
 1. [Object and array properties](https://docs.airbyte.com/understanding-airbyte/supported-data-types/#the-types) are properly stored as JSON columns. Previously, we had used TEXT, which made querying sub-properties more difficult.
     - In certain cases, numbers within sub-properties with long decimal values will need to be converted to float representations due to a _quirk_ of Bigquery. Learn more [here](https://github.com/airbytehq/airbyte/issues/29594).
-
-### Snowflake
-
-1. `destination-snowflake` is now case-sensitive, and was not previously. This means that if you have a source stream "users", `destination-snowflake` would have previously created a "USERS" table in your data warehouse. We now correctly create a "users" table.
```
@Hesperide fyi - I'm deleting the case-sensitivity thing from snowflake's upgrade guide.
| Step | Result |
|---|---|
| Java Connector Unit Tests | ✅ |
| Build connector tar | ✅ |
| Build destination-bigquery docker image for platform linux/x86_64 | ✅ |
| Java Connector Integration Tests | ✅ |
| Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml | ✅ |
| Connector version semver check | ✅ |
| Connector version increment check | ✅ |
| QA checks | ✅ |
☁️ View runs for commit in Dagger Cloud
Please note that tests are only run on PRs that are ready for review. Set your PR to draft mode to avoid flooding the CI engine and upstream services on subsequent commits.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:
`airbyte-ci connectors --name=destination-bigquery test`
| Step | Result |
|---|---|
| Java Connector Unit Tests | ✅ |
| Build connector tar | ✅ |
| Build destination-snowflake docker image for platform linux/x86_64 | ✅ |
| Java Connector Integration Tests | ✅ |
| Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml | ✅ |
| Connector version semver check | ✅ |
| Connector version increment check | ✅ |
| QA checks | ✅ |
☁️ View runs for commit in Dagger Cloud
Please note that tests are only run on PRs that are ready for review. Set your PR to draft mode to avoid flooding the CI engine and upstream services on subsequent commits.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:
`airbyte-ci connectors --name=destination-snowflake test`
source-hubplanner is an unrelated autoformat change. Will approve-and-merge this PR.
/approve-and-merge reason="failing test is due to unrelated connector; test only ran because of autoformat. bigquery+snowflake have green CI"
closes #30010
The interesting changes are in SnowflakeDestinationHandler / SnowflakeSqlGenerator; everything else is just updating tests. (T+D tests are passing locally, at least)
See also #30068, which implements the migration for existing tables. I'll merge that PR into this one, then release this PR to master.
todo:
- add migration logic. I'll stack a PR on top of this one. (Destination snowflake: table casing migration #30068)
- run some manual tests
- update tests to have uppercase `_airbyte_*` in the expected records. I'll submit an issue to do that - the columns are uppercased in the destination, just the expected records look weird. (TypingAndDedupingTests: RecordDiffer should accept airbyte metadata column names instead of hardcoding them #30112)