
fix(cli-generate-config): ensure configuration types are always toml parsable #785

Conversation

codejedi365
Contributor

@codejedi365 codejedi365 commented Dec 29, 2023

Purpose

Ensure that when converting pydantic configuration objects into TOML, the resulting TOML is always parsable.

Rationale

While working on #782, I ran into a problem where the generated TOML was unparsable due to the use of a subclass of IntEnum. The LevelBump(IntEnum) class caused the string produced by its __str__() method to be written without surrounding quotation marks in the resulting TOML.

This led to the discovery that the tomlkit maintainers have declined to address serialization of enums, because the Python implementation of enums has shifted across Python versions (python-poetry/tomlkit#237). Since tomlkit will not support enums, we must prevent enum objects from being passed to tomlkit.dumps(). Pydantic saves the day with the mode parameter of the model_dump() function: when set to "json", it forces the returned dictionary to contain only JSON-serializable types, rather than Python classes which may or may not be serializable when passed to tomlkit. I also looked for a custom TOML serialization function that could be overridden, but tomlkit does not provide one.

For IntEnums such as LevelBump, the effect is that the dumped value is the numeric value of the enum rather than the string. The numeric value is written out via tomlkit.dumps() and, through pydantic coercion, is parsed back into the proper enum value.
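As an illustration, here is a minimal sketch of the difference the mode parameter makes. The LevelBump class below is a simplified stand-in for the real one, and the exact unquoted rendering depends on the tomlkit and Python versions in use:

from enum import IntEnum

import tomlkit
from pydantic import BaseModel


class LevelBump(IntEnum):
    # simplified stand-in for the project's LevelBump enum
    NO_RELEASE = 0

    def __str__(self) -> str:
        return self.name.lower()


class ParserOptions(BaseModel):
    default_level_bump: LevelBump = LevelBump.NO_RELEASE


opts = ParserOptions()

# mode="python" (the default) keeps the IntEnum instance; tomlkit renders it
# through str(), producing an unquoted bare word, which is invalid TOML:
print(tomlkit.dumps(opts.model_dump()))
# default_level_bump = no_release

# mode="json" coerces every value to a JSON-serializable primitive, so the
# IntEnum becomes its numeric value and the output is valid TOML:
print(tomlkit.dumps(opts.model_dump(mode="json")))
# default_level_bump = 0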

How I tested

While developing #782, when the default AngularParserOptions included default_level_bump: LevelBump, it would not generate parsable TOML. You can prove this by adding default_level_bump: LevelBump.NO_RELEASE to the default commit_parser_options of RawConfig and then attempting to run the dump command. You will see default_level_bump = no_release, without quotes! Try to parse that with tomlkit again and it will fail on the n of no_release.
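Round-tripping that broken output reproduces the failure (the exact exception class may vary by tomlkit version):

import tomlkit

# The bare word is not a valid TOML value, so parsing fails at the "n":
tomlkit.parse("default_level_bump = no_release\n")
# raises a tomlkit parse error (UnexpectedCharError in current releases)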

Outside of the manual discovery above using default_level_bump: LevelBump as a commit parser option, I updated the tests for the cli generate-config command. That testing does not make the problem immediately apparent, since none of the current default values are non-JSON-serializable, but with #782 there will be!

How to verify

git fetch origin pull/785/head:pr-785

# checkout the PR as a detached HEAD state with only the unit tests
git checkout pr-785~1 --detach

# execute tests
pytest tests/command_line/test_generate_config.py

# update to the HEAD of the PR (with fixes)
git merge --ff-only pr-785

# run pytest again to see success
pytest tests/command_line/test_generate_config.py

And of course, all tests pass, as shown in the CI results below.

@codejedi365
Contributor Author

@bernardcooke53, ready for review

@codejedi365 codejedi365 force-pushed the fix/model-dump-to-toml-parsable branch from b65b69d to c77353c Compare January 2, 2024 20:31
Contributor

@bernardcooke53 bernardcooke53 left a comment


The differing output in this comment on the tomlkit issue you mentioned is gross 🤮

This does seem sensible to do now, especially as there's currently nothing in the default config which isn't JSON-serializable. Thank you!

@bernardcooke53 bernardcooke53 merged commit 758e649 into python-semantic-release:master Jan 3, 2024
8 checks passed
@codejedi365 codejedi365 deleted the fix/model-dump-to-toml-parsable branch January 3, 2024 18:57