
Bug in prefix_recognize for TA #18

Open
qzfnihao opened this issue Feb 3, 2021 · 0 comments

qzfnihao commented Feb 3, 2021

I trained a TA model with a transformer on AISHELL-1, using an encoder left window of 15, encoder right window of 15, decoder left window of 15, and decoder right window of 2. I got better accuracy on the training data, but when decoding with prefix_recognize the WER on the test set is 9.3, which is worse than chunk32 at 6.3.
Comparing the chunk and TA training logs, the accuracy of TA was better than chunk, so I suspect the decoding algorithm is wrong. After removing hat_att, which acts like a cache, TA reached a WER of 6.5 with ctc_weight 0.5, but with a terrible RTF, maybe 8-10.

Could you modify the algorithm so TA decoding gets both better WER and better RTF?
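For context, here is a minimal, hypothetical sketch (not the repo's actual prefix_recognize code) of how a per-prefix attention-score cache like hat_att interacts with joint CTC/attention scoring in prefix beam search: a cached score is cheap to reuse but goes stale as new encoder frames arrive, while recomputing it every step keeps the scores consistent at a large RTF cost. All function and variable names below are illustrative assumptions.

```python
from typing import Dict, List, Optional, Tuple

def attention_score(prefix: Tuple[int, ...], enc_frames_seen: int) -> float:
    """Stand-in for running the attention decoder on `prefix` given the
    encoder output seen so far. A real decoder depends on enc_frames_seen,
    which is exactly why a score cached at an earlier frame can go stale."""
    return -0.1 * len(prefix) - 0.001 * enc_frames_seen  # dummy log-prob

def rescore(prefixes: List[Tuple[int, ...]],
            ctc_scores: Dict[Tuple[int, ...], float],
            enc_frames_seen: int,
            ctc_weight: float,
            cache: Optional[Dict[Tuple[int, ...], float]] = None,
            ) -> Dict[Tuple[int, ...], float]:
    """Combine CTC and attention scores for each live prefix.

    With `cache` (hat_att-style), the attention score is computed once per
    prefix and reused at later frames: fast, but stale with respect to newly
    arrived encoder frames. Without it, the score is recomputed on every
    call: always consistent, but the decoder runs far more often."""
    joint = {}
    for p in prefixes:
        if cache is not None:
            if p not in cache:
                cache[p] = attention_score(p, enc_frames_seen)
            att = cache[p]
        else:
            att = attention_score(p, enc_frames_seen)
        joint[p] = ctc_weight * ctc_scores[p] + (1.0 - ctc_weight) * att
    return joint

# Toy usage: rescoring the same prefixes at a later frame diverges once the
# cached attention scores are stale.
prefixes = [(5,), (5, 9)]
ctc = {(5,): -1.2, (5, 9): -2.0}
cache: Dict[Tuple[int, ...], float] = {}
print(rescore(prefixes, ctc, enc_frames_seen=10, ctc_weight=0.5, cache=cache))
print(rescore(prefixes, ctc, enc_frames_seen=20, ctc_weight=0.5, cache=cache))  # stale cache
print(rescore(prefixes, ctc, enc_frames_seen=20, ctc_weight=0.5, cache=None))   # fresh, slower
```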
