
Is it possible to checkpoint scalar value? #2738

Open
Wongboo opened this issue May 3, 2024 · 2 comments

Comments

@Wongboo

Wongboo commented May 3, 2024

From accelerate document:
"By using register_for_checkpointing(), you can register custom objects to be automatically stored or loaded from the two prior functions, so long as the object has a state_dict and a load_state_dict functionality. This could include objects such as a learning rate scheduler."
Is it possible to include scalar values, such as the epoch and step, in a checkpoint just like in plain PyTorch? If so, the redundant lines needed to recover the epoch could be avoided.
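A minimal sketch of what this could look like, assuming a small container class (hypothetical, not part of Accelerate) that exposes the `state_dict` / `load_state_dict` contract the docs describe. The `register_for_checkpointing` / `save_state` / `load_state` calls referenced in the comments follow the documented Accelerate API; the standalone round-trip below only demonstrates the contract itself:

```python
class TrainingState:
    """Hypothetical holder for scalar training state (epoch, step) so it
    can be registered with Accelerate's checkpointing machinery."""

    def __init__(self, epoch: int = 0, step: int = 0):
        self.epoch = epoch
        self.step = step

    def state_dict(self):
        # Called when the checkpoint is saved.
        return {"epoch": self.epoch, "step": self.step}

    def load_state_dict(self, state):
        # Called when the checkpoint is loaded.
        self.epoch = state["epoch"]
        self.step = state["step"]


state = TrainingState()
# With Accelerate, this object would be registered like a scheduler:
#   accelerator.register_for_checkpointing(state)
#   accelerator.save_state("ckpt_dir")   # stores state.state_dict()
#   accelerator.load_state("ckpt_dir")   # calls state.load_state_dict()

# Standalone round-trip showing the state_dict contract:
state.epoch, state.step = 3, 1200
restored = TrainingState()
restored.load_state_dict(state.state_dict())
print(restored.epoch, restored.step)  # → 3 1200
```

Because `register_for_checkpointing` only requires the `state_dict` / `load_state_dict` pair, any scalars wrapped this way ride along with the model and optimizer state in the same checkpoint.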


github-actions bot commented Jun 2, 2024

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@muellerzr
Collaborator

Sure, we absolutely can. If you'd like to expand our checkpointing example here in accelerate to implement that, we can look at upstreaming it further 🤗
