Commit

add documentation
JakobEliasWagner committed Jun 12, 2024
1 parent 3018d71 commit 4e9af62
Showing 1 changed file with 8 additions and 0 deletions.
8 changes: 8 additions & 0 deletions src/continuiti/networks/attention/scaled_dot_product.py
@@ -10,6 +10,14 @@


class ScaledDotProduct(Attention):
    """Scaled dot-product attention module.

    This module wraps the PyTorch implementation of the scaled dot-product attention mechanism described in
    "Attention Is All You Need" (Vaswani et al., 2017). The attention weights are computed from the dot product of the
    query and key matrices, scaled by the square root of the key dimension; the weights are then applied to the value
    vectors to produce the final output.
    """

    def __init__(self, dropout_p: float = 0.0):
        super().__init__(dropout_p)

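The mechanism the new docstring describes, `softmax(QKᵀ / √d_k) V`, can be sketched in plain Python. This is a minimal illustration of the math only; the function name and list-of-lists layout are made up for this sketch and are not the continuiti or PyTorch API (in practice the class above delegates to PyTorch's implementation):

```python
import math

def scaled_dot_product_attention(q, k, v):
    """Illustrative scaled dot-product attention over lists of vectors.

    q: queries  (n_q rows of dimension d_k)
    k: keys     (n_k rows of dimension d_k)
    v: values   (n_k rows of dimension d_v)
    """
    d_k = len(k[0])
    # Raw scores: dot product of each query with each key, scaled by sqrt(d_k).
    scores = [
        [sum(qi * ki for qi, ki in zip(qrow, krow)) / math.sqrt(d_k) for krow in k]
        for qrow in q
    ]
    # Row-wise softmax turns scores into attention weights that sum to 1.
    weights = []
    for row in scores:
        m = max(row)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights.append([e / z for e in exps])
    # Output: weighted sum of the value vectors for each query.
    return [
        [sum(w * vrow[j] for w, vrow in zip(wrow, v)) for j in range(len(v[0]))]
        for wrow in weights
    ]

# One query attending over two key/value pairs.
q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
out = scaled_dot_product_attention(q, k, v)
```

Here the query aligns more with the first key, so the output leans toward the first value vector while still blending in the second.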
