
A question about the nn.Embedding #30

Open
zhang-qiang-github opened this issue May 1, 2021 · 0 comments

Comments

@zhang-qiang-github

Thank you for sharing this project's code. I have a question about nn.Embedding.

In this project, the shape of src and trg is (maxLen, batch size). The forward method of the Encoder is:

    def forward(self, src, hidden=None):
        embedded = self.embed(src)
        outputs, hidden = self.gru(embedded, hidden)
        # sum bidirectional outputs
        outputs = (outputs[:, :, :self.hidden_size] +
                   outputs[:, :, self.hidden_size:])
        return outputs, hidden

When I debug it, the shape of src is (37, 32), where 32 is the batch size.
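To trace the shapes, here is a small standalone sketch of the encoder (the sizes are arbitrary, and I am assuming a single-layer bidirectional GRU, which the summing of the two output halves suggests):

    import torch
    import torch.nn as nn

    # Arbitrary sizes for illustration; the bidirectional GRU is assumed
    # from the summing of the two output halves in the Encoder above.
    vocab_size, embed_size, hidden_size = 100, 8, 16
    seq_len, batch_size = 37, 32

    embed = nn.Embedding(vocab_size, embed_size)
    gru = nn.GRU(embed_size, hidden_size, bidirectional=True)

    src = torch.randint(0, vocab_size, (seq_len, batch_size))  # (maxLen, batch size)
    embedded = embed(src)                                      # (37, 32, 8)
    outputs, hidden = gru(embedded)                            # (37, 32, 32), (2, 32, 16)
    outputs = outputs[:, :, :hidden_size] + outputs[:, :, hidden_size:]  # (37, 32, 16)
    print(embedded.shape, outputs.shape, hidden.shape)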
However, when I read the documentation of nn.Embedding, the example code shows:

>>> # an Embedding module containing 10 tensors of size 3
>>> embedding = nn.Embedding(10, 3)
>>> # a batch of 2 samples of 4 indices each
>>> input = torch.LongTensor([[1,2,4,5],[4,3,2,9]])
>>> embedding(input)

Thus, it seems the input of nn.Embedding should be (batch size, maxLen).
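For concreteness, here is a minimal sketch comparing the two layouts (the vocabulary size 10 and embedding dimension 3 are arbitrary values taken from the documentation example):

    import torch
    import torch.nn as nn

    embedding = nn.Embedding(10, 3)  # 10-word vocabulary, 3-dim vectors

    # Layout from the documentation example: (batch size, maxLen) = (2, 4)
    batch_first = torch.LongTensor([[1, 2, 4, 5], [4, 3, 2, 9]])
    print(embedding(batch_first).shape)  # torch.Size([2, 4, 3])

    # Layout used in this project: (maxLen, batch size) = (4, 2)
    seq_first = batch_first.t()
    print(embedding(seq_first).shape)    # torch.Size([4, 2, 3])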

This makes me very confused.

Any suggestions are appreciated!
