AI Models Helix Uses

Text Models

We use model aliases for different use cases, and we update the underlying model behind each alias to the most performant and efficient model available for that role.

You can specify the model by its alias, or by the underlying Ollama tag listed below.
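For illustration, here is a minimal sketch of selecting a model by alias over an OpenAI-compatible chat completions request. The base URL, API key, and exact alias spelling are placeholders and assumptions, not values taken from this page; check your deployment's API reference for the real ones.

```python
import requests

# Minimal sketch, assuming an OpenAI-compatible /v1/chat/completions endpoint
# that accepts a model alias as the model name. URL, key, and alias spelling
# are placeholders.
HELIX_URL = "https://your-helix-server.example.com/v1/chat/completions"  # placeholder
API_KEY = "YOUR_HELIX_API_KEY"                                           # placeholder

resp = requests.post(
    HELIX_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "helix-3.5",  # assumed alias spelling
        "messages": [{"role": "user", "content": "Write a haiku about GPUs."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```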

Helix 3.5

  • Utilizes Llama3-8B for fast and efficient performance, ideal for everyday tasks.
  • Ollama tag llama3:instruct

Helix 4

  • Powered by Llama3-70B, this model is a bit slower but offers deeper reasoning, making it better suited to complex queries.
  • Ollama tag llama3:70b

Helix Mixtral

  • Features Mixtral 8x7B from Mistral AI, a sparse mixture-of-experts model that performs well on programming and other technical tasks.
  • Ollama tag mixtral:instruct

Helix JSON

  • Operates on Nous Hermes 2 Theta Llama3 8B, specialized for function calling and generating JSON outputs, enhancing automation and integration tasks.
  • Ollama tag adrienbrault/nous-hermes2theta-llama3-8b:q8_0

Helix Small

  • A smaller model based on Phi-3 Mini (3.8B), fast and memory efficient.
  • Ollama tag phi3:instruct
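
Since each entry above lists an Ollama tag, here is a minimal sketch of running one of those tags against a local Ollama server (default port 11434) via its chat API. It assumes you have already pulled the tag locally; the prompt and choice of tag are just examples.

```python
import requests

# Minimal sketch: send a chat request to a local Ollama server for one of the
# tags listed above. Assumes the tag has been pulled (e.g. `ollama pull phi3:instruct`).
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "phi3:instruct",  # any Ollama tag from the list above
        "messages": [
            {"role": "user", "content": "Explain mixture-of-experts in one sentence."}
        ],
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```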

Helix Text Fine Tuning

Text models use Ollama for inference and axolotl for fine-tuning (and for inference on fine-tuned models).

See this for the fine-tuning configuration we use.

Image Models

Image models use the excellent cog (in particular cog-sdxl).

See this for the settings we use.
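
As a rough illustration, the sketch below calls a locally running cog-sdxl container over cog's HTTP prediction API (for example, a container started with the port 5000 prediction server exposed). The input field names follow cog-sdxl's predictor but are assumptions here; check the model's schema for the authoritative parameters.

```python
import requests

# Minimal sketch of a prediction request to a local cog-sdxl container.
# Field names like "prompt" and "num_inference_steps" are assumed from
# cog-sdxl's predictor; verify against the running model's schema.
resp = requests.post(
    "http://localhost:5000/predictions",
    json={
        "input": {
            "prompt": "a watercolour painting of a data centre",
            "num_inference_steps": 25,
        }
    },
    timeout=300,
)
resp.raise_for_status()
result = resp.json()
print(result["status"])  # "succeeded" when the prediction completes
print(result["output"])  # generated image(s), typically as data URIs or file URLs
```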

Video Models

Coming soon.

What else would you like to see? Let us know on Discord!
