This repository was archived by the owner on Dec 16, 2022. It is now read-only.

Docs update for PytorchTransformerWrapper #5295

Merged: dirkgr merged 5 commits into main from DocsUpdate on Jul 7, 2021

Conversation

@dirkgr (Member) commented Jun 30, 2021

Fixes #5285.

@dirkgr dirkgr requested a review from epwalsh June 30, 2021 23:40
@dirkgr dirkgr self-assigned this Jun 30, 2021
@david-waterworth

I think positional_embedding_size is still missing. It is only used when positional_embedding='embedding' and defaults to 512.

self._positional_embedding = nn.Embedding(positional_embedding_size, input_dim)
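
For reference, a minimal sketch of how a learned positional embedding like this is typically added to the input sequence; the input_dim value and tensor shapes below are illustrative assumptions, not taken from the AllenNLP source.

    import torch
    from torch import nn

    # Illustrative values; only positional_embedding_size's default of 512 comes from the discussion above.
    input_dim = 64
    positional_embedding_size = 512  # maximum sequence length the embedding table covers

    positional_embedding = nn.Embedding(positional_embedding_size, input_dim)

    tokens = torch.randn(2, 20, input_dim)             # (batch_size, seq_len, input_dim)
    positions = torch.arange(tokens.size(1))           # (seq_len,)
    tokens = tokens + positional_embedding(positions)  # broadcasts over the batch dimension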

@dirkgr (Member, Author) commented Jul 6, 2021

Oops. I added the missing parameter.

@epwalsh (Member) left a comment

LGTM!

@dirkgr dirkgr merged commit 436c52d into main Jul 7, 2021
@dirkgr dirkgr deleted the DocsUpdate branch July 7, 2021 01:34
Labels: None yet
Projects: None yet
Development: Successfully merging this pull request may close these issues: Stale docstring in constructor of PytorchTransformer
3 participants