expect_column_values_to_be_of_type does not work #9626
Hey @sfan0704 -- I'm so far unable to reproduce this. Are you able to share your configuration for this expectation as well?
Hey @austiezr thanks for responding to this. My expectation suite looks like this. Not sure what else I can provide to make this clearer.

```json
{
  "expectation_type": "expect_column_values_to_be_of_type",
  "kwargs": {
    "column": "reference_currency_price",
    "type_": "FLOAT"
  },
  "meta": {}
}
```
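For reference, the same suite entry can be built programmatically. A minimal sketch (the helper name `make_type_expectation` is hypothetical, not part of the GX API) that constructs the equivalent dictionary:

```python
def make_type_expectation(column: str, type_: str) -> dict:
    """Build a suite entry equivalent to the JSON configuration above.

    Illustrative helper only; it simply mirrors the dictionary structure
    that great_expectations serializes for this expectation.
    """
    return {
        "expectation_type": "expect_column_values_to_be_of_type",
        "kwargs": {"column": column, "type_": type_},
        "meta": {},
    }

entry = make_type_expectation("reference_currency_price", "FLOAT")
```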
How did you initially create this datasource? Is this error arising after running a checkpoint?
I create the datasource like this: `self.data_context.sources.add_snowflake(name="source_name", connection_string="STRING")`. And correct, the exception was thrown after we run the checkpoint.
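For context, the end-to-end workflow being described looks roughly like the sketch below. It follows the GX 0.18-era fluent API; the asset, suite, and checkpoint names are placeholders, and the calls are untested here since they require a live Snowflake connection (the import is kept inside the function for that reason):

```python
def run_float_type_check(connection_string: str):
    """Sketch of the reported workflow: Snowflake datasource -> expectation -> checkpoint.

    Assumes the fluent GX 0.18-style API; names other than the column are
    hypothetical placeholders.
    """
    import great_expectations as gx  # requires great_expectations installed

    context = gx.get_context()
    datasource = context.sources.add_snowflake(
        name="source_name", connection_string=connection_string
    )
    asset = datasource.add_table_asset(name="my_asset", table_name="my_table")
    validator = context.get_validator(
        batch_request=asset.build_batch_request(),
        create_expectation_suite_with_name="type_suite",
    )
    validator.expect_column_values_to_be_of_type(
        column="reference_currency_price", type_="FLOAT"
    )
    validator.save_expectation_suite()
    checkpoint = context.add_or_update_checkpoint(
        name="type_checkpoint", validator=validator
    )
    # The reported exception is thrown at this point:
    return checkpoint.run()
```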
Hi Team, any update on this issue? I had worked with this expectation and it was working until March 14, 2024. Today when I ran the same validation with the same arguments, it gave me the error below:

```
Calculating Metrics: 100%|████████████████████████████████████████████| 1/1 [00:00<00:00, 1.16it/s]
```
Can you share your Python dependency versions?
I am using |
Hi Team, any update on this issue? Any ETA for the fix?
Hey @NehaNRane @sfan0704, Austin handed this issue over to me - I'm going to investigate and try to reproduce the error you're getting.
@sfan0704 @NehaNRane Ok, I've tested in two different virtual environments matching your dependencies and have not been able to replicate the error. As a next troubleshooting step, could you please reinstall in a fresh virtual environment? If you still get the same error after a fresh install, we'll take a closer look at your code and Snowflake data next.
@rachhouse, I tried again by creating a fresh virtual environment, but I am getting the same error. Error:
Hi @NehaNRane, thanks for retrying with a fresh install and including the stacktrace. Can you please also share your GX workflow code?
I've tested this Expectation (`expect_column_values_to_be_of_type`) and still cannot reproduce the failure. At this point, I am suspicious that this behavior might be a Windows issue. I see you are using Windows from your stacktrace, @NehaNRane, and when this same behavior was reported in #5565, it also occurred for a Windows user. @sfan0704, are you running your code on Windows or WSL?
I'm running on a Mac |
Hi @rachhouse
|
@NehaNRane, looks like the issue is that you're using outdated code - the use of
@sfan0704 Thanks for confirming your environment. Can you share your full GX workflow code that generates the error?
Hi @rachhouse, I observed that this issue occurs only when the data asset is created as a query asset. We have a table_name in mixed case, and to handle this, we are using a query asset. In our use case, table_name and column_name can be in any case. How should we handle such a scenario?
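For readers following along, reaching a mixed-case Snowflake table via a query asset typically means quoting the identifier in the SQL. A minimal sketch (the asset name and table name are hypothetical, and the call is untested against a live Snowflake account):

```python
def add_mixed_case_query_asset(datasource, quoted_table: str = '"MixedCase_Table"'):
    """Sketch: use a GX Query Asset to reach a mixed-case Snowflake table.

    Double-quoting the table identifier in the SQL preserves its case in
    Snowflake; `datasource` is assumed to be a fluent GX Snowflake datasource.
    """
    return datasource.add_query_asset(
        name="mixed_case_asset",
        query=f"SELECT * FROM {quoted_table}",
    )
```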
Thanks @NehaNRane, those details are very helpful - I'm now able to consistently generate an error when using a Snowflake table with a mixed-case name and a query asset.
Sorry for the delayed response, we're also doing exactly what @rachhouse is doing. Looking forward to the potential solution here. Thanks. |
Hi @NehaNRane @sfan0704, Engineering and I dug further into this error, and the problem is caused by Snowflake tables that have a mixed-case name. The current workaround is to either rename your table or create a view, and use a GX Table Asset with your renamed table or created view. From there, you should be able to run `expect_column_values_to_be_of_type` successfully. In testing, I created a view, used the view to create a Table Asset, and was able to run the Expectation successfully.
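The root cause ties back to how Snowflake resolves identifiers: unquoted identifiers are folded to UPPERCASE, while double-quoted identifiers keep their exact case, so an unquoted lookup of a mixed-case table name fails to match. A small illustrative helper (not part of GX or the Snowflake client) mimicking that rule:

```python
def resolve_snowflake_identifier(identifier: str) -> str:
    """Mimic Snowflake's identifier resolution rule.

    Unquoted identifiers fold to UPPERCASE; double-quoted identifiers
    preserve their exact case. Illustrative helper only.
    """
    if len(identifier) >= 2 and identifier.startswith('"') and identifier.endswith('"'):
        return identifier[1:-1]  # quoted: case preserved
    return identifier.upper()    # unquoted: folded to uppercase

# An unquoted mixed-case name no longer matches the stored table,
# which is why a view or renamed table works around the error.
```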
Opening an issue similar to this #5565
I'm also running into this problem while using a Snowflake datasource to check the type of the columns.
Here are the specs,
I was debugging through this and found that the metrics object only has the key `name` in the dict. Stack trace:
Originally posted by @sfan0704 in #5565 (comment)