ThunderKittens Hedgehog doesn't support A100 (cannot use lolcats_llama_window_tk_gen)
Hi there, thanks for this open-source codebase, which is detailed and mostly works well for training and evaluation.
But as mentioned in your Blog Part 2, we're also fighting for A100s, let alone a single H100. Since ThunderKittens only supports the H100, we cannot use the lolcats_llama_window_tk_gen attention type, which results in much slower inference and higher GPU memory usage than the original LLM.
As mentioned in Appendix A.1 of your paper, the results in Figure 9 and Table 13 were produced on A100s. However, evaluating those benchmarks (especially MMLU) on an A100 without the ThunderKittens kernel is extremely slow and memory-consuming.
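For context on the memory blowup we're seeing, here is a minimal sketch (not the repo's actual code) of an unfused, pure-PyTorch causal linear attention with a hedgehog-style feature map; `feature_map` below is a hypothetical stand-in for the learned map. Without a fused kernel, the per-position cumulative KV state is materialized explicitly, so memory grows with sequence length:

```python
import torch

def feature_map(x):
    # Hypothetical stand-in for the learned hedgehog feature map.
    return torch.cat([torch.exp(x), torch.exp(-x)], dim=-1)

def linear_attention(q, k, v):
    # q, k, v: (batch, heads, seq, head_dim)
    q, k = feature_map(q), feature_map(k)
    # Unfused causal linear attention: the cumulative
    # (feature_dim x head_dim) KV state is materialized for every
    # position, giving a (b, h, s, f, d) intermediate tensor.
    kv = torch.einsum('bhsf,bhsd->bhsfd', k, v).cumsum(dim=2)
    z = k.cumsum(dim=2)
    num = torch.einsum('bhsf,bhsfd->bhsd', q, kv)
    den = torch.einsum('bhsf,bhsf->bhs', q, z).clamp(min=1e-6)
    return num / den.unsqueeze(-1)
```

A fused kernel can instead keep the running KV state on-chip and update it in place, which (as we understand it) is why the H100-only ThunderKittens path is so much faster and lighter on memory.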
The two questions we're curious about are:
1. How long does it take to complete an MMLU evaluation in your experiments?
2. Is there any way to run Lolcats inference efficiently on an A100, as we'd expect? Or do you plan to release a new kernel supporting Lolcats on A100s?