
Fix tests ORTModel #1517

Merged
merged 3 commits into from
Nov 6, 2023

Conversation

fxmarty
Contributor

@fxmarty fxmarty commented Nov 6, 2023

As per title.

@fxmarty fxmarty requested a review from echarlaix November 6, 2023 09:46
Collaborator

@echarlaix echarlaix left a comment

thanks!

-        self.use_past = len(self.key_value_input_names) > 0
+        self.use_past = len(self.key_value_output_names) > 0
Collaborator

Why this change?

Contributor Author

This was updated in a more recent commit. Basically, self.use_past = len(self.key_value_input_names) > 0 combined with

    if self.use_past is False or use_merged_no_cache:
        out_past_key_values = tuple(
            out_past_key_values[i : i + self.num_pkv]
            for i in range(0, len(out_past_key_values), self.num_pkv)
        )

does not really make sense and results in failing tests.

We don't want an empty tuple as output for the KV cache, we want None.
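A minimal sketch of the behaviour being discussed, assuming a simplified regrouping helper (the names mirror the snippet above, but this is not the actual optimum implementation):

```python
# Hypothetical sketch: when the model exports no past-key-value outputs,
# regrouping an empty flat tuple yields () rather than None.
NUM_PKV = 2  # e.g. one (key, value) pair per layer

def regroup_kv(out_past_key_values, use_past):
    """Group a flat tuple of KV tensors per layer, or return None when
    there is no KV cache (the fix discussed in this PR)."""
    if not use_past:
        # Old behaviour would still run the tuple() regrouping here and
        # produce an empty tuple (); the fix returns None instead.
        return None
    return tuple(
        out_past_key_values[i : i + NUM_PKV]
        for i in range(0, len(out_past_key_values), NUM_PKV)
    )

flat = ("k0", "v0", "k1", "v1")
print(regroup_kv(flat, use_past=True))   # (('k0', 'v0'), ('k1', 'v1'))
print(regroup_kv((), use_past=False))    # None
```

With use_past derived from the input names as in the reverted line, a decoder without cache inputs would fall into the regrouping branch and emit an empty tuple, which downstream code does not expect.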

optimum/onnxruntime/modeling_decoder.py
@echarlaix echarlaix merged commit 72a402e into huggingface:main Nov 6, 2023
45 of 52 checks passed