Don't use PretrainedModelInitializer when loading a model #4711
Conversation
This seems fine, but wouldn't it be easier to just not run initializers at all when loading the model?
If we make the assumption that for every module the initializer is under "initializer" in the config, then yes.
I went hunting for a more elegant way of doing this than grepping for the word "initializer" in a …
Fantastic, thank you!
When loading a model, pretrained embeddings are not loaded, since they are not needed: the archived state dict overwrites them anyway.
This change makes the same thing happen with pretrained weights from PretrainedModelInitializer.
Goes with allenai/allennlp-models#141
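A minimal sketch of the idea discussed above, assuming the config is a plain nested dict (AllenNLP's Params behaves much like one) and that every initializer block sits under an "initializer" key. The helper name and the example config shape are hypothetical, not the actual code from this PR:

```python
from typing import Any, Dict


def remove_pretrained_initializers(params: Dict[str, Any]) -> None:
    """Recursively drop any "initializer" blocks from a model config.

    Hypothetical helper: when a model is restored from an archive, the
    saved state dict overwrites whatever the initializers produce, so
    running them (and loading their pretrained weight files) is wasted work.
    """
    for key in list(params):
        value = params[key]
        if key == "initializer":
            del params[key]
        elif isinstance(value, dict):
            remove_pretrained_initializers(value)
        elif isinstance(value, list):
            for item in value:
                if isinstance(item, dict):
                    remove_pretrained_initializers(item)


# Usage: strip initializer blocks before constructing the model from an archive.
config = {
    "model": {
        "type": "my_classifier",
        "initializer": {
            "regexes": [["weight", {"type": "pretrained", "weights_file_path": "best.th"}]],
        },
        "encoder": {"type": "lstm", "input_size": 100, "hidden_size": 200},
    }
}
remove_pretrained_initializers(config["model"])
assert "initializer" not in config["model"]
```

This is the blunt "grep for the key" approach the conversation mentions: it only works if initializer configuration always lives under an "initializer" key, which is exactly the assumption called out above.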