
Add transformer counter to profile #149

Open
HaoKang-Timmy wants to merge 19 commits into master

Conversation

@HaoKang-Timmy (Contributor) commented Oct 7, 2021

  1. Add a transformer counter to rnn_hooks.py.
  2. Provide evaluate_transformer.py as an example (a usage sketch follows below).
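
A minimal sketch of how the example might be driven, assuming thop's profile() API with this PR's hook registered; the actual evaluate_transformer.py and its hyperparameters may differ:

import torch
import torch.nn as nn
from thop import profile

# Illustrative model; d_model must be divisible by nhead.
model = nn.Transformer(d_model=10, nhead=2,
                       num_encoder_layers=1, num_decoder_layers=1)
src = torch.rand((1, 1, 10))  # (S, N, E): sequence, batch, embedding
tgt = torch.rand((1, 1, 10))
# With the transformer hook registered, profile() counts nn.Transformer too.
macs, params = profile(model, inputs=(src, tgt))
print(macs, params)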

evaluate_transformer.py

src = torch.rand((1, 1, 10)) # S,N,x

class Model_transformer(nn.Module):
Owner:

Class name should be CamelCased.

Contributor Author:

fixed
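
For illustration, the requested fix is a PEP 8 CapWords rename; the exact new name used in the PR is not shown here:

# Hypothetical rename; the class body stays unchanged.
class TransformerModel(nn.Module):  # formerly Model_transformer
    ...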

thop/profile.py Outdated
@@ -67,7 +67,7 @@ def prYellow(skk): fprint("\033[93m{}\033[00m".format(skk))
nn.RNN: count_rnn,
nn.GRU: count_gru,
nn.LSTM: count_lstm,

nn.Transformer: count_Transformer,
Owner:

Function name should be lowercase.

Contributor Author:

fixed
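
For context, profile() dispatches on module type through this register_hooks dict. thop also accepts a per-call custom_ops mapping, so a transformer counter can be tried without patching profile.py; a hedged sketch, with a placeholder counting body rather than this PR's logic:

import torch
import torch.nn as nn
from thop import profile

def count_transformer(m: nn.Transformer, x, y):
    # Placeholder: a real hook would estimate attention/FFN ops here.
    m.total_ops += torch.DoubleTensor([0])

model = nn.Transformer(d_model=10, nhead=2)
src = torch.rand((2, 1, 10))
tgt = torch.rand((2, 1, 10))
macs, params = profile(model, inputs=(src, tgt),
                       custom_ops={nn.Transformer: count_transformer})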

thop/rnn_hooks.py
@@ -196,3 +196,84 @@ def count_lstm(m: nn.LSTM, x, y):
total_ops *= batch_size

m.total_ops += torch.DoubleTensor([int(total_ops)])


def count_Transformer(m: nn.Transformer, x, y):
Owner:

Same issue here.

Contributor Author:

fixed, also changed its subfunction; sorry for forgetting to change it after learning about CamelCase
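
For reference, the arithmetic a hook like this typically performs for one self-attention layer (a hedged sketch of the standard estimate, not this PR's exact implementation): the Q/K/V and output projections cost about 4·S·d² multiply-accumulates, and building and applying the S×S attention matrix adds about 2·S²·d.

def attention_macs(seq_len: int, d_model: int) -> int:
    # Q, K, V and output projections: four (S, d) @ (d, d) matmuls.
    proj = 4 * seq_len * d_model * d_model
    # QK^T scores plus the attention-weighted sum over V.
    attn = 2 * seq_len * seq_len * d_model
    return proj + attn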

@liminn commented Jan 5, 2022

Why not merge this pull request? It is very necessary to calculate the FLOPs and params of a transformer.

@quancs commented Mar 16, 2023

Is there any update now?
