MM Eval tests #1887 (Closed)
Commits (25)
All 25 commits are by SalmanMohammadi:

- 5cb9140 mm eval tests
- 63ba175 mm eval tests
- 0331778 Merge branch 'main' into mm_tests
- 578aa48 adding test values
- f0a94d7 reverting changes
- df3402c Merge branch 'main' into mm_tests
- 60bccc6 whoops
- 6681749 whoops 2
- d214f52 tidy tidy tidy tidy fresh clean
- e3155a1 what is this rounding nonsense?
- 7add9af fixing values
- c3246c0 fixing parameterize
- e3f8178 just put it on the gpu?
- acd6763 Merge branch 'mm_tests' of github.com:SalmanMohammadi/torchtune into …
- ed3f02e what a silly billy I am oh boy
- 8de3350 is it a python version thing?
- 3424c32 it is NOT. BACK TO THE CPU
- abca4d1 back to gpu.. it's a max_seq_len thing??
- 5ab8f83 that didn't work...
- 19c029e this is a terrible experience for me
- a691a08 stg if this doesn't work
- e7018fa Merge branch 'main' into mm_tests
- 3bb57fa I don't even know at this point
- 76ff0fd OKAY this should work right?
- 24e24b5 ????
Diff (docstring of `llama3_2_vision_decoder`):

```diff
@@ -170,6 +170,7 @@ def llama3_2_vision_decoder(
             by :func:`~torchtune.modules.KVCache`.
         encoder_max_seq_len (int): maximum sequence length the encoder will be run with, as used
             by :func:`~torchtune.modules.KVCache`.
+        rope_base (int): base for the rotary positional embeddings. Default: 500_000
         intermediate_dim (Optional[int]): intermediate dimension for MLP. If not specified,
             this is computed using :func:`~torchtune.modules.scale_hidden_dim_for_mlp`.
```

Reviewer comment on the added `rope_base` line:

> How did our linter not pick this up...
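As a side note on what `rope_base` controls (a sketch, not code from this PR): rotary positional embeddings derive their per-dimension frequencies from this base, and a larger base such as the documented 500_000 default (versus the classic 10_000) stretches the low-frequency wavelengths, which suits longer context lengths. The helper name below is hypothetical.

```python
import torch

def rope_frequencies(head_dim: int, rope_base: int = 500_000) -> torch.Tensor:
    # Standard RoPE frequencies: theta_i = rope_base ** (-2i / head_dim)
    # for i = 0, 1, ..., head_dim // 2 - 1.
    idx = torch.arange(0, head_dim, 2, dtype=torch.float32)
    return 1.0 / (rope_base ** (idx / head_dim))

# A larger base yields slower-rotating (longer-wavelength) final dimensions:
print(rope_frequencies(head_dim=128, rope_base=500_000)[-1])
print(rope_frequencies(head_dim=128, rope_base=10_000)[-1])
```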
Review discussion (on the test fixture):

Reviewer:
> we have this in tests/torchtune/modules/test_common_utils.py, opportunity to unify?

SalmanMohammadi:
> It made the most sense to me to define the fixture where it was being used - do you have strong opinions here?
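For context on the trade-off being discussed: pytest fixtures can live in the test module that uses them (self-contained, as in this PR) or in a shared module or conftest.py (reusable across test files, as the reviewer suggests). A minimal sketch with a hypothetical fixture name, not code from the PR:

```python
import pytest
import torch

# Option A (this PR): define the fixture next to the tests that use it.
@pytest.fixture
def dummy_batch():  # hypothetical fixture, not from the PR
    return torch.randn(2, 8, 16)

def test_batch_shape(dummy_batch):
    assert dummy_batch.shape == (2, 8, 16)

# Option B (reviewer's suggestion): move the fixture into a shared location
# such as tests/torchtune/modules/test_common_utils.py or a conftest.py so
# other test modules can reuse it without duplicating the definition.
```

Keeping a single-use fixture local avoids indirection; promoting it to a shared module pays off once a second test file needs it.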