
Would be nice to add an input mask, so we can use arbitrary-length input during training. #6

Open
shamanez opened this issue Jul 13, 2022 · 0 comments

Comments

@shamanez

The forward function of TransformerEncoderLayer accepts a src_key_padding_mask argument. Maybe we can update it too.
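As a rough sketch of what this could look like (assuming PyTorch's nn.TransformerEncoderLayer; the sizes and tensors here are illustrative, not from this repo's code): a boolean src_key_padding_mask of shape (batch, seq_len), with True at padded positions, lets attention ignore the padding so batches can mix arbitrary sequence lengths.

```python
import torch
import torch.nn as nn

d_model, nhead, max_len = 16, 4, 5
layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)

# Hypothetical batch of 2 sequences with true lengths 5 and 3,
# right-padded to max_len
lengths = torch.tensor([5, 3])
src = torch.randn(2, max_len, d_model)  # (batch, seq_len, d_model)

# True marks padded positions, which attention will then skip
pad_mask = torch.arange(max_len)[None, :] >= lengths[:, None]  # (2, 5)

out = layer(src, src_key_padding_mask=pad_mask)
```

The output keeps the padded shape (2, 5, 16); downstream code would still mask or slice off the padded positions before pooling or loss computation.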
