[Fix] type issue in databricks_sql_table
#4422
base: main
Conversation
…ore than one word in a string
databricks_sql_table
You also need to think about how to handle cases like a decimal type declared as …
I used regex instead. I looked at this document and it seems all the types use parentheses to signify arguments.
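A minimal sketch of what such a regex-based check could look like — the function and regex here are illustrative assumptions, not the provider's actual code; the idea is just that the type name always precedes any parenthesized arguments:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// baseTypeRe captures the type name before any parenthesized arguments,
// e.g. "decimal(12,2)" -> "decimal", "timestamp" -> "timestamp".
var baseTypeRe = regexp.MustCompile(`^\s*([a-zA-Z_]+)`)

// baseType returns the lower-cased type name with any arguments stripped.
func baseType(columnType string) string {
	m := baseTypeRe.FindStringSubmatch(columnType)
	if m == nil {
		return strings.ToLower(strings.TrimSpace(columnType))
	}
	return strings.ToLower(m[1])
}

func main() {
	fmt.Println(baseType("DECIMAL(12, 2)")) // decimal
	fmt.Println(baseType("timestamp"))      // timestamp
}
```

Note that comparing only the base type deliberately ignores precision/scale, which is exactly the trade-off being debated in this thread.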
Integration tests have failed with the following error:
I updated the template generation to include the needed delta flag.
@alexott Anything else I can do?
It looks like there is a syntax error - integration tests are failing with:
🤦‍♂️ just updated.
in general looks good, but we need to think about how the change of the type may affect apply/read (I'm not sure it won't lead to configuration drift).
integration test is passing, but we need to fix our build to make it overall green
return caseInsensitiveColumnType
return normalizedColumnType
I'm not sure about this change; we need to think about how changes like decimal(12, 2) -> decimal(12,2) will affect the plan/apply and subsequent read operations.
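One way to sidestep that drift concern is to normalize both spellings before comparing, rather than rewriting either stored value. This is a rough sketch under that assumption — the helper names are hypothetical, not the provider's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeColumnType lower-cases a type string and removes all spaces,
// so "DECIMAL(12, 2)" and "decimal(12,2)" compare equal.
func normalizeColumnType(t string) string {
	return strings.ReplaceAll(strings.ToLower(t), " ", "")
}

// typesEquivalent reports whether two type spellings denote the same type.
// Neither input is mutated, so plan/apply/read each see their own spelling.
func typesEquivalent(a, b string) bool {
	return normalizeColumnType(a) == normalizeColumnType(b)
}

func main() {
	fmt.Println(typesEquivalent("decimal(12, 2)", "DECIMAL(12,2)")) // true
	fmt.Println(typesEquivalent("decimal(12,2)", "decimal(10,2)"))  // false
}
```

Because the comparison is symmetric and non-destructive, whichever side the API returns can differ in case and whitespace without producing a diff.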
Yeah, let me know what you think the best approach is and I'm happy to implement.
If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:
Checks will be approved automatically on success.
@alexott Any thoughts on how we can proceed?
Hi @ian-norris-ncino, thank you for this contribution. I've gone over this change carefully with @alexott. While I recognize it solves your immediate issue with being able to specify a timestamp column's default value, I am concerned about merging the change as it is. We need to be able to accurately compare the type of a column as returned by the Tables API with what is specified in a user's config. I have two concerns about this:
The current implementation is far from ideal, and I'll be the first to admit that, but I don't think this is the right direction to start addressing this. To better support this, I would propose the following pathway.
I don't know how easy this will be, as I don't know if there is a complete reference for …
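The pathway proposed above — parsing both the user's type string and the Tables API response into one canonical structure and comparing those — might be sketched like this. The struct shape and parsing rules here are assumptions for illustration, not what the API guarantees:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// columnType is a hypothetical canonical form for a column type:
// a base name plus optional numeric arguments such as precision/scale.
type columnType struct {
	Name string
	Args []int
}

// parseColumnType splits "decimal(12, 2)" into {Name: "decimal", Args: [12, 2]}.
// It is deliberately forgiving about case and whitespace.
func parseColumnType(s string) (columnType, error) {
	s = strings.TrimSpace(strings.ToLower(s))
	open := strings.Index(s, "(")
	if open == -1 {
		return columnType{Name: s}, nil
	}
	if !strings.HasSuffix(s, ")") {
		return columnType{}, fmt.Errorf("unbalanced parentheses in %q", s)
	}
	ct := columnType{Name: strings.TrimSpace(s[:open])}
	for _, arg := range strings.Split(s[open+1:len(s)-1], ",") {
		n, err := strconv.Atoi(strings.TrimSpace(arg))
		if err != nil {
			return columnType{}, fmt.Errorf("bad type argument %q: %w", arg, err)
		}
		ct.Args = append(ct.Args, n)
	}
	return ct, nil
}

func main() {
	a, _ := parseColumnType("DECIMAL(12, 2)")
	b, _ := parseColumnType("decimal(12,2)")
	fmt.Printf("%+v equal to %+v\n", a, b)
}
```

A real implementation would also need to cover complex types such as array, map, and struct, which is where the "complete reference" question above becomes important.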
Thanks @mgyucht and @alexott for thinking deeply about this. I've started digging deeper into an implementation and it comes with many complexities, mainly around parsing the user-provided type string into a structure shared with the get table API. I wonder if it makes sense to add additional keys to the resource for default values, precision, scale, and interval types, so as to match the get API? An example would then look like:

```hcl
column {
  name     = "updated_at"
  type     = "timestamp"
  default  = "current_timestamp()"
  comment  = ""
  nullable = false
}

column {
  name           = "value"
  type           = "decimal"
  type_precision = 10
  type_scale     = 0
  comment        = ""
  nullable       = false
}
```

There would be some considerations on parsing the type_json metadata, but this would make handling the user input much easier. What do you think?
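If the resource grew separate precision/scale attributes as suggested, the provider could rebuild the full type string from them when comparing against the API's response. A rough sketch under that assumption — the field names follow the example config, everything else is hypothetical:

```go
package main

import "fmt"

// column mirrors the proposed resource schema, where precision and scale
// are separate attributes instead of being embedded in the type string.
type column struct {
	Name          string
	Type          string
	TypePrecision int
	TypeScale     int
}

// fullType rebuilds an API-style type string from the split attributes.
// Only decimal is parameterized here; other types pass through unchanged.
func fullType(c column) string {
	if c.Type == "decimal" {
		return fmt.Sprintf("decimal(%d,%d)", c.TypePrecision, c.TypeScale)
	}
	return c.Type
}

func main() {
	fmt.Println(fullType(column{Name: "value", Type: "decimal", TypePrecision: 10, TypeScale: 0})) // decimal(10,0)
	fmt.Println(fullType(column{Name: "updated_at", Type: "timestamp"}))                           // timestamp
}
```

The attraction of this direction is that comparison becomes field-by-field on structured data, so formatting of the combined string never matters.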
Changes
Updated the type check to only compare the first word in the type string.
Resolves #4421
Tests
- [ ] `make test` run locally
- [ ] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`