Pinned
- transformers-in-supercomputers (Public)
  Transformers training in a supercomputer with the 🤗 Stack and Slurm
  Python · 14 stars
- MN5-Distributed-PyTorch (Public)
  Entry point for developing distributed applications with PyTorch, Slurm and Singularity on the MareNostrum5 supercomputer
  Shell · 1 star
- Megatron-LM (Public, forked from NVIDIA/Megatron-LM)
  Debugging Megatron: 3D parallelism, models, training, and more!
  Python · 2 stars
- nanotron (Public, forked from swiss-ai/nanotron)
  Minimalistic large language model 3D-parallelism training
  Python