
Fix pretrained weights checkpoint_path and inference weight double download #75

Merged Nov 22, 2024

Conversation

tlarcher
Collaborator

@tlarcher tlarcher commented Nov 22, 2024

📝 Changelog

Main changes

  • Updated all examples to fix the inference path issue. Previously, PyTorch Lightning overwrote checkpoint_path with the value stored in the saved checkpoint file; however, the checkpoints provided by the Malpolon team for pure inference contained absolute paths incompatible with other people's machines. Now, only relative paths are stored.
  • Updated all examples to prevent downloading the model weights twice when running models in inference mode
  • Updated URL and md5 checksum signature to download glc24_pre_extracted pre-trained weights
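The two main fixes above can be sketched as follows. This is a minimal, hypothetical illustration: the helper names (`relativize_hparams`, `download_weights_once`) and the dict-based hyper-parameter handling are assumptions for the example, not Malpolon's actual API; in practice the hyper-parameters would come from a checkpoint loaded with `torch.load`.

```python
# Hypothetical sketch of the two fixes; function names are illustrative,
# not Malpolon's actual API.
import hashlib
import urllib.request
from pathlib import Path


def relativize_hparams(hparams: dict) -> dict:
    """Replace absolute path strings stored in a checkpoint's
    hyper-parameters with machine-independent relative ones."""
    fixed = {}
    for key, value in hparams.items():
        if isinstance(value, str) and Path(value).is_absolute():
            fixed[key] = Path(value).name  # keep only the file name
        else:
            fixed[key] = value
    return fixed


def download_weights_once(url: str, dest: str, md5: str) -> str:
    """Fetch pre-trained weights only when the local copy is missing or
    fails the md5 check, avoiding a second download in inference mode."""
    dest_path = Path(dest)
    if dest_path.exists():
        digest = hashlib.md5(dest_path.read_bytes()).hexdigest()
        if digest == md5:
            return str(dest_path)  # verified copy already on disk
    urllib.request.urlretrieve(url, dest_path)  # assumed direct-download URL
    return str(dest_path)
```

Checking the md5 signature before downloading is what makes the skip safe: a truncated or stale file fails the digest comparison and is re-fetched, while a valid one is reused as-is.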

Other changes

  • Added Malpolon QR code in project resources

🔗 Links

✅ Checklist

  • Lint and tests pass locally with my changes
  • I've added necessary documentation

…wnload (#74)

* Updated glc24_pre_extracted pretrained checkpoint with fixed weight_dir path

* Updated all examples to prevent double downloading of model weights when using inference mode.

* Added malpolon qr code
@tlarcher tlarcher self-assigned this Nov 22, 2024
@tlarcher tlarcher added bug Something isn't working enhancement New feature or request labels Nov 22, 2024
@tlarcher tlarcher merged commit 80e1084 into main Nov 22, 2024
1 check passed