timm bits #804

Open

wants to merge 101 commits into main

Conversation

rwightman
Collaborator

The branch associated with this PR is for development of timm bits -- a significant update and reorganization of timm's training scripts and associated modules. It works w/ TPUs (PyTorch XLA) and GPUs (PyTorch), and possibly DeepSpeed.

This PR is still a long way from being merged. Keeping the branch open as a PR helps w/ visibility of the ongoing work.
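For context on the TPU/GPU support mentioned above, device selection can be handled roughly along these lines. This is a minimal sketch assuming a `torch_xla` install for TPU runs; the `resolve_device` helper is hypothetical and not the actual timm bits API.

```python
# Minimal sketch (hypothetical helper, not the timm bits API): prefer an XLA
# device when torch_xla is importable, otherwise fall back to CUDA or CPU, so
# the same training loop body can run on a TPU-VM or a GPU box.
import torch

def resolve_device(prefer_xla: bool = True) -> torch.device:
    if prefer_xla:
        try:
            import torch_xla.core.xla_model as xm  # only present on XLA installs
            return xm.xla_device()
        except ImportError:
            pass
    return torch.device('cuda' if torch.cuda.is_available() else 'cpu')
```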

rwightman and others added 30 commits April 20, 2021 17:15
…step closure used, metrics base impl w/ distributed reduce, many tweaks/fixes.
…h XLA usage on TPU-VM. Add some FIXMEs and fold train_cfg into train_state by default.
… XLA (pushing into transforms), revamp of transform/preproc config, etc ongoing...
rwightman and others added 30 commits February 28, 2022 16:28
…ers more similar. Fix workers=0 compatibility. Add ImageNet22k/12k synset defs.
Add support for different TFDS `BuilderConfig`s
Fix issue with `torchvision`'s `ImageNet`