Added Json type to the sqlalchemy parser #436
base: main
Conversation
@@ -128,6 +128,7 @@ def test_numeric_renders_as_decimal_with_precision_and_scale(self):
 sqlalchemy.types.INT: DatabricksDataType.INT,
 sqlalchemy.types.SMALLINT: DatabricksDataType.SMALLINT,
 sqlalchemy.types.TIMESTAMP: DatabricksDataType.TIMESTAMP,
+sqlalchemy.types.JSON: DatabricksDataType.STRING,
Others with more knowledge can chime in, but I wonder if you need to do something similar to TinyIntegerTest to see if the type mapping will actually work. I think this addition makes sure the type will be recognized and translated, but I don't actually know how you would verify that the value is rendered as a literal string properly.
@jprakash-db Generally the idea you implemented looks correct - in Databricks, JSON values are stored as strings, so we have to give SQLAlchemy a hint about it. However, it would be nice to have tests to check that data are actually serialized correctly (as a JSON-encoded string in this case). Currently we only have tests that verify that the type is mapped properly, but IMHO that's not enough. Can you please check whether we have any tests that verify all our customized data types are actually serialized properly, and add JSON values to them? If we don't have such a test suite, can you please create it, and at least check that JSON, Enum, and UUID types are serialized properly?
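For context, a round-trip test along the following lines could cover this. This is only a sketch: the `db_engine` fixture, the table name, and the model are illustrative assumptions, not part of this PR or the existing test suite.

```python
import enum
import uuid

import sqlalchemy
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, Session


class Base(DeclarativeBase):
    pass


class Color(enum.Enum):
    RED = "red"
    BLUE = "blue"


class SerializationRoundTrip(Base):
    # Hypothetical scratch table used only for this sketch.
    __tablename__ = "serialization_round_trip"

    id: Mapped[int] = mapped_column(primary_key=True)
    payload: Mapped[dict] = mapped_column(sqlalchemy.types.JSON)
    color: Mapped[Color] = mapped_column(sqlalchemy.types.Enum(Color))
    token: Mapped[uuid.UUID] = mapped_column(sqlalchemy.types.Uuid)


def test_customized_types_round_trip(db_engine):
    # `db_engine` is assumed to be a fixture connected to a Databricks workspace.
    Base.metadata.create_all(db_engine)

    with Session(db_engine) as session:
        session.add(
            SerializationRoundTrip(
                id=1,
                payload={"a": [1, 2, 3], "nested": {"b": True}},
                color=Color.RED,
                token=uuid.uuid4(),
            )
        )
        session.commit()

    with Session(db_engine) as session:
        row = session.get(SerializationRoundTrip, 1)
        # JSON should come back as the original Python structure,
        # even though it is stored as a STRING column in Databricks.
        assert row.payload == {"a": [1, 2, 3], "nested": {"b": True}}
        assert row.color is Color.RED
        assert isinstance(row.token, uuid.UUID)
```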
@@ -33,6 +33,7 @@ def process_literal_param_hack(value: Any):
 @compiles(sqlalchemy.types.Unicode, "databricks")
 @compiles(sqlalchemy.types.UnicodeText, "databricks")
 @compiles(sqlalchemy.types.Uuid, "databricks")
+@compiles(sqlalchemy.types.JSON, "databricks")
 def compile_string_databricks(type_, compiler, **kw):
     """
     We override the default compilation for Enum(), String(), Text(), and Time() because SQLAlchemy
Please update this comment as well, so it matches the decorator list for this function.
ok, adding this
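For reference, the updated docstring could look roughly like the sketch below. The decorator stack is abbreviated, and the function body is assumed here to simply emit Databricks' STRING type, which is what the dialect maps these types to; the exact wording in the merged change may differ.

```python
# Sketch only - not the exact wording of the final change.
@compiles(sqlalchemy.types.Uuid, "databricks")
@compiles(sqlalchemy.types.JSON, "databricks")  # new in this PR
def compile_string_databricks(type_, compiler, **kw):
    """
    We override the default compilation for Enum(), String(), Text(), Time(),
    Unicode(), UnicodeText(), Uuid(), and JSON() because Databricks stores
    all of these as STRING.
    """
    return "STRING"
```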
Description
Added the JSON type to the SQLAlchemy types file so that JSON columns compile to STRING.
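As a usage sketch (connection parameters below are placeholders, and the exact rendered DDL may differ), declaring a JSON column against the Databricks dialect should now compile to a STRING column:

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine
from sqlalchemy.schema import CreateTable
from sqlalchemy.types import JSON

# Placeholder connection details - replace with real workspace values.
engine = create_engine(
    "databricks://token:dapi-REDACTED@example.cloud.databricks.com"
    "?http_path=/sql/1.0/warehouses/abc123&catalog=main&schema=default"
)

metadata = MetaData()
events = Table(
    "events",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("attributes", JSON),  # expected to render as STRING for Databricks
)

# Compile the DDL without executing it, just to inspect the emitted column types.
print(CreateTable(events).compile(engine))
```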