PyTorch Lightning doesn't like _metadata in BaseModel.state_dict()
#577

arturtoshev started this conversation in Ideas

Replies: 1 comment 1 reply
-
Hi @arturtoshev! We just need to add tests and update the existing API!
-
PyTorch Lightning does not use your BaseModel.load_state_dict(), so I had to emulate it in my LitModule.

Did you consider moving what you currently do in state_dict() (i.e., state_dict['_metadata'] = self._init_kwargs) to save_checkpoint()? As long as you don't override state_dict(), Lightning should be fine. Obviously, this way Lightning will not be able to check the version, but if that must be the case, then one has to use my solution above.