Coerce a `None` value to the default #675
Comments
One thing you could do now is to have a "private" field with an optional list and a property that isn't optional.
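A minimal sketch of that suggestion (class and field names are hypothetical, using plain dataclasses rather than any particular deserialization library):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Record:
    # "Private" field: may legitimately be None right after deserialization.
    _data: Optional[List[int]] = None

    @property
    def data(self) -> List[int]:
        # Coerce a missing/null value to the default, so callers
        # (and mypy) always see a concrete list.
        return self._data if self._data is not None else []


r = Record(_data=None)
print(r.data)  # -> []
```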
That's actually the direction I went, but the setters and …
No prob. I am a big "immutables" fan, so I wasn't thinking about the need for mutable fields. Another idea for you, then: you can have phantom types, just for deserialization, which have optional fields and post_init logic. Then you just convert from them to the final type used in the code. This can easily be automated, either via code generation (the explicit approach), or you could even play around with generating the phantom types from the "true" types on the fly.

On the topic, I think all the implicit logic, like treating nulls as a missing value, is more headache than gain in the long run. pydantic was (or maybe still is) so confusing with None/Optional handling. As PEP 20 says: explicit is better than implicit.
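A rough illustration of the phantom-type idea (all names here are hypothetical, not part of any library's API): a deserialization-only class with optional fields, converted to the strict immutable type used everywhere else:

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Optional


@dataclass(frozen=True)
class Record:
    # The "true" immutable type used throughout the code: no Optional.
    data: List[int]


@dataclass
class RawRecord:
    # Phantom type used only for deserialization: fields mirror the JSON,
    # so None/missing values are allowed here.
    data: Optional[List[int]] = None

    def to_record(self) -> Record:
        # Treat null/missing as the default, then build the strict type.
        return Record(data=self.data if self.data is not None else [])


payload: Dict[str, Any] = {"data": None}
record = RawRecord(**payload).to_record()
print(record.data)  # -> []
```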
I don't disagree: I'm a big fan of immutable data structures, and the explicit > implicit point is true for sure. However, that's the JSON I got!
Question
Hi, I was hoping for a flag or some other method of interpreting a `None` value in JSON as the default value, such that it would continue to conform to the type of the field (and avoid mypy continuing to complain about my field being optional). For example:
results in the deserialized field holding `None`, which mypy flags since the annotation promises a list. Which is true! But it would be cool to detect that null value and force it to be an empty list. The alternative is to do something like:
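Again reconstructing the lost snippet (names hypothetical), the alternative presumably declares the field `Optional` and coerces the null in `__post_init__`:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Payload:
    # Has to be declared Optional so a null in the JSON is accepted...
    data: Optional[List[int]] = None

    def __post_init__(self) -> None:
        # ...and the null is coerced to the default afterwards.
        if self.data is None:
            self.data = []


p = Payload(data=None)
print(p.data)  # -> []
```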
But then my typing for the `data` field is an optional list, and mypy tells me that I need to sprinkle assertions everywhere to be sure it's actually a list before I read it. It would also be great to then not serialize that back to JSON, but that would be a bonus.

Thanks so much for the library! You would not believe how much time it is saving me for very large JSON files (upwards of 200 MB).