Data processing with ML, LLMs, and vision LLMs
🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
Chinese NLP solutions (large models, data, models, training, inference)
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
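A minimal usage sketch for the Chronos entry above, assuming the `ChronosPipeline` interface documented for the `chronos-forecasting` package; the checkpoint id and the toy series are illustrative placeholders:

```python
# Probabilistic forecasting sketch with a pretrained Chronos checkpoint.
# Assumes the chronos-forecasting package's documented ChronosPipeline API;
# model id and input series are illustrative, not a recommendation.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",  # smallest pretrained checkpoint
    device_map="cpu",
    torch_dtype=torch.bfloat16,
)

context = torch.tensor([112.0, 118, 132, 129, 121, 135, 148, 148, 136, 119])
samples = pipeline.predict(context=context, prediction_length=4)

# Summarize the sampled future paths into quantile forecasts.
low, median, high = torch.quantile(
    samples[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
)
print(median)  # point forecast; low/high give an 80% interval
```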
Social networking platform with automated content moderation and context-based authentication system
Label, clean and enrich text datasets with LLMs.
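A generic sketch of the labeling idea behind the entry above, using a zero-shot classification pipeline from HuggingFace Transformers; the model id, labels, and texts are assumptions for illustration, not that repository's own API:

```python
# Label raw texts with a zero-shot classifier instead of hand annotation.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",  # a common zero-shot checkpoint
)

texts = ["The battery died after two days.", "Shipping was incredibly fast!"]
labels = ["complaint", "praise", "question"]

for text in texts:
    result = classifier(text, candidate_labels=labels)
    print(result["labels"][0], "<-", text)  # top-scoring label per text
```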
Simple UI for LLM finetuning
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts, images, and 🔜 video, up to 5x faster than OpenAI CLIP and LLaVA 🖼️ & 🖋️
Trained models & code to predict toxic comments on all 3 Jigsaw Toxic Comment Challenges. Built using ⚡ PyTorch Lightning and 🤗 Transformers. For access to our API, please email us at [email protected].
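A minimal sketch of scoring comments for toxicity with a pretrained checkpoint via HuggingFace Transformers; the `unitary/toxic-bert` model id is an assumption for illustration:

```python
# Score comments against the Jigsaw toxicity labels.
from transformers import pipeline

toxicity = pipeline(
    "text-classification",
    model="unitary/toxic-bert",  # assumed checkpoint id
    top_k=None,                  # return scores for every label
)

for comment in ["You are brilliant!", "You are an idiot."]:
    scores = toxicity(comment)[0]
    print(comment, "->", {s["label"]: round(s["score"], 3) for s in scores})
```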
Official PyTorch Implementation of MambaVision: A Hybrid Mamba-Transformer Vision Backbone
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt engineering examples. A bonus section with ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E, including jump-starting GPT-4, speech-to-text, text-to-speech, text-to-image generation with DALL-E, Google Cloud AI, HuggingGPT, and more.
Learn Cloud Applied Generative AI Engineering (GenEng) using OpenAI, Gemini, Streamlit, Containers, Serverless, Postgres, LangChain, Pinecone, and Next.js
Multimodal model for text and tabular data, with HuggingFace Transformers as the building block for the text features.
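A from-scratch sketch of the general pattern the entry above describes: encode the text with a HuggingFace transformer, concatenate the pooled embedding with numeric tabular features, and classify the combined vector. The model choice, dimensions, and toy inputs are assumptions, not that repository's own API:

```python
# Combine a transformer text embedding with tabular features.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TextTabularClassifier(nn.Module):
    def __init__(self, num_tabular_features: int, num_classes: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained("distilbert-base-uncased")
        hidden = self.encoder.config.hidden_size  # 768 for DistilBERT
        self.head = nn.Linear(hidden + num_tabular_features, num_classes)

    def forward(self, input_ids, attention_mask, tabular):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # first-token (CLS-position) embedding
        return self.head(torch.cat([cls, tabular], dim=-1))

tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
batch = tok(["mid-century lamp, minor scratches"], return_tensors="pt")
tabular = torch.tensor([[42.0, 3.0]])  # e.g. price, condition score
model = TextTabularClassifier(num_tabular_features=2, num_classes=2)
logits = model(batch["input_ids"], batch["attention_mask"], tabular)
print(logits.shape)  # torch.Size([1, 2])
```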
Fast Inference Solutions for BLOOM
[EMNLP 2022] Unifying and multi-tasking structured knowledge grounding with language models
Extract exact structure from any language model completion.
Guide: Finetune GPT-2 XL (1.5 billion parameters) and GPT-Neo (2.7 billion parameters) on a single GPU with HuggingFace Transformers using DeepSpeed.
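A minimal sketch of the single-GPU recipe that guide covers: ZeRO stage 2 with CPU optimizer offload via the HuggingFace `Trainer`. The config values and hyperparameters below are illustrative assumptions, not the guide's exact settings; run under the `deepspeed` launcher (e.g. `deepspeed train.py`):

```python
# Finetune a large causal LM on one GPU by offloading optimizer state to CPU.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

ds_config = {
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 2,
        "offload_optimizer": {"device": "cpu"},  # optimizer states live in RAM
    },
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
}

model = AutoModelForCausalLM.from_pretrained("gpt2-xl")
tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")

args = TrainingArguments(
    output_dir="gpt2-xl-finetuned",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    fp16=True,
    deepspeed=ds_config,  # accepts an in-memory dict or a JSON config path
)

# trainer = Trainer(model=model, args=args, train_dataset=your_dataset)
# trainer.train()  # your_dataset: placeholder for a tokenized dataset
```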
Low latency JSON generation using LLMs ⚡️
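A toy sketch of the idea behind low-latency structured generation: the braces and keys come from a fixed template, so the model only spends decoding steps on the value tokens. The model choice (gpt2), the stop-at-quote heuristic, and the schema are assumptions for illustration, not the repository's actual algorithm:

```python
# Fill only the JSON values; keys and punctuation are never generated.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def fill_value(prompt: str, max_new_tokens: int = 8) -> str:
    """Greedily generate a short string value for one JSON field."""
    ids = tok(prompt, return_tensors="pt").input_ids
    out = model.generate(
        ids, max_new_tokens=max_new_tokens,
        do_sample=False, pad_token_id=tok.eos_token_id,
    )
    completion = tok.decode(out[0, ids.shape[1]:])
    return completion.split('"')[0].strip()  # cut at the closing quote

context = "Alice moved from Paris to Berlin last year."
prompt = context + '\nJSON: {'
record = {}
for key in ["name", "city"]:  # schema keys are fixed, never generated
    prompt += f'"{key}": "'
    record[key] = fill_value(prompt)
    prompt += record[key] + '", '
print(record)
```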