@Michael-A-Kuykendall

Adds shimmy to the Interface & Pipeline & AutoML section.

Project Details:

Key ML Features:

  • OpenAI-compatible API for seamless integration (see the client sketch after this list)
  • GGUF + SafeTensors format support
  • Hot model swapping for dynamic serving
  • Auto-discovery of available models
  • Single binary deployment (no Python dependencies)
  • GPU acceleration support
  • Support for LLaMA, ChatML, and other common model families and templates

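To make the OpenAI-compatibility point concrete, here is a minimal Rust client sketch against a locally running shimmy instance. The base URL, port, and model name are placeholders for illustration (not taken from shimmy's documentation); the request and response shapes simply follow the standard OpenAI chat-completions format, and the sketch assumes `reqwest` (with the `blocking` and `json` features) and `serde_json` as dependencies.

```rust
// Sketch of calling an OpenAI-compatible /v1/chat/completions endpoint,
// as advertised by shimmy. The address, port, and model name below are
// placeholders, not values taken from shimmy's docs.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // Any OpenAI-style client or SDK can be pointed at this base URL
    // instead of api.openai.com.
    let resp: serde_json::Value = client
        .post("http://localhost:11435/v1/chat/completions") // assumed local address
        .json(&json!({
            "model": "llama-3.2-1b-instruct", // placeholder model name
            "messages": [
                { "role": "user", "content": "Say hello from shimmy." }
            ]
        }))
        .send()?
        .json()?;

    // The response follows the standard OpenAI chat-completion shape.
    println!("{}", resp["choices"][0]["message"]["content"]);
    Ok(())
}
```

Because the API surface mirrors OpenAI's, existing SDKs and tooling should only need their base URL changed, which is the main integration benefit the bullet above refers to.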
Why this belongs in Awesome Rust ML:

  • Pure Rust implementation for ML inference serving
  • Production-ready inference server with significant adoption
  • Fills an important gap: an OpenAI-compatible inference solution in Rust
  • Zero-dependency deployment suitable for ML production environments
  • Active development with regular feature additions

Placement: Added alphabetically to the Interface & Pipeline & AutoML section alongside other inference servers such as orkhon, wonnx, and tract.

Following the contribution guidelines: "Please just update the README.md"
