
    Hugging Face H4

    Browse models from Hugging Face H4

    2 models

    Tokens processed on OpenRouter: not enough data to display yet.

    • Zephyr 141B-A35B

      Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters, fine-tuned on a mix of publicly available synthetic datasets. It is an instruct fine-tune of Mixtral 8x22B. #moe

      by huggingfaceh4 · 66K context
    • Hugging Face: Zephyr 7B

      Zephyr is a series of language models that are trained to act as helpful assistants. Zephyr-7B-β is the second model in the series, and is a fine-tuned version of mistralai/Mistral-7B-v0.1 that was trained on a mix of publicly available, synthetic datasets using Direct Preference Optimization (DPO).

      by huggingfaceh4 · 4K context
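
    Models listed here can be queried through OpenRouter's OpenAI-compatible chat completions API. Below is a minimal sketch in Python, assuming the model slug huggingfaceh4/zephyr-7b-beta corresponds to the Zephyr 7B listing above and that an OPENROUTER_API_KEY environment variable is set; check the model page for the exact ID.

    ```python
    import os
    import requests

    # Minimal sketch: send one chat message to a listed model via
    # OpenRouter's OpenAI-compatible endpoint. The model slug below
    # is an assumption based on the Zephyr 7B listing above.
    response = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": "huggingfaceh4/zephyr-7b-beta",  # assumed slug
            "messages": [{"role": "user", "content": "Hello!"}],
        },
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])
    ```

    Swapping in the Zephyr 141B-A35B slug should work the same way, since both models sit behind the same chat completions interface.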