
    Nous: Hermes 2 Mixtral 8x7B SFT

    nousresearch/nous-hermes-2-mixtral-8x7b-sft

Created Jan 16, 2024 · 32,768 context

Nous Hermes 2 Mixtral 8x7B SFT is the supervised fine-tune-only version of the Nous Research model trained over Mixtral 8x7B, a mixture-of-experts (MoE) LLM.

The model was trained on over 1,000,000 entries, primarily of GPT-4-generated data, along with other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks.

    #moe
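The slug above is how the model is addressed through OpenRouter's OpenAI-compatible chat completions endpoint. A minimal sketch of building a request body for it (the prompt text and `max_tokens` value are illustrative; supply your own OpenRouter API key when actually sending the request):

```python
import json

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

# The model slug shown on this page.
MODEL_SLUG = "nousresearch/nous-hermes-2-mixtral-8x7b-sft"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build the JSON body for a chat completion request to this model."""
    return {
        "model": MODEL_SLUG,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

body = build_request("Explain mixture-of-experts routing in two sentences.")
print(json.dumps(body, indent=2))

# Send with any HTTP client, e.g.:
#   POST {OPENROUTER_URL}
#   headers: {"Authorization": "Bearer <YOUR_OPENROUTER_API_KEY>",
#             "Content-Type": "application/json"}
```

The body follows the standard OpenAI chat format, so existing OpenAI-compatible clients can target this model by swapping in the base URL and slug.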

Recent activity on Hermes 2 Mixtral 8x7B SFT

    [Chart: total usage per day on OpenRouter — not enough data to display yet.]