User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLM that supports function calling. ✨
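For context, "function calling" here means the OpenAI-style tools interface that such hosts rely on to let a model invoke external capabilities. The sketch below is a hypothetical illustration of that request shape, not Dive's actual code; the endpoint URL, model name, and `get_weather` tool are placeholders.

```ts
// Minimal sketch of an OpenAI-style function-calling (tools) request.
// The base URL, model, and tool definition are illustrative placeholders.
const response = await fetch("http://localhost:8000/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "my-local-model",
    messages: [{ role: "user", content: "What's the weather in Berlin?" }],
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "Look up the current weather for a city",
          parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      },
    ],
  }),
});

const data = await response.json();
// If the model chose to call the tool, the call shows up here:
console.log(data.choices?.[0]?.message?.tool_calls);
```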
LLMX: the easiest third-party local LLM UI for the web!
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
Fully-featured, beautiful web interface for vLLM - built with NextJS.
c4 GenAI Suite
🔬 Experiment is an experiment is an experiment is an experiment is an experiment.
A NextJS "Local First" AI Interface
Intentflow is a YAML-based UX flow engine that lets you define, trigger, and optimize user journeys in your frontend. It supports dynamic flags, conditional components (modals, tooltips, banners), and optional LLM logic for adaptive rendering.
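As a rough illustration of the flag-driven conditional-component idea (a generic sketch only, not Intentflow's actual schema or API), deciding which UI elements to show from a set of flags might look like this; the flag and component names are hypothetical.

```ts
// Hypothetical flag set; in a flow engine these would come from the flow
// definition and user state rather than being hard-coded.
type Flags = { firstVisit: boolean; completedTour: boolean };

// Decide which UI elements (modal, tooltip, banner) to render for a user.
// Generic illustration of flag-driven conditional rendering.
function componentsToShow(flags: Flags): string[] {
  const show: string[] = [];
  if (flags.firstVisit) show.push("welcome-modal");
  if (!flags.completedTour) show.push("tour-tooltip");
  return show;
}

console.log(componentsToShow({ firstVisit: true, completedTour: false }));
// -> ["welcome-modal", "tour-tooltip"]
```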
Dive-APP is a Flutter-based mobile application that brings your own powerful AI agents to your pocket by connecting your own MCP client.
RotinaPy: Simplify your daily life and maximize productivity with an integrated app for task management, study tracking, flashcards, and more. Built with Streamlit and Python.
A modern, feature-rich web interface built with Next.js and shadcn/ui for interacting with local Ollama large language models.
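Under the hood, interfaces like this typically talk to Ollama's local REST API. The snippet below is a minimal sketch of such a call, assuming a default Ollama install listening on localhost:11434 and a model named "llama3" already pulled.

```ts
// Minimal chat request against a locally running Ollama server.
// Assumes the default port (11434) and that the "llama3" model is pulled.
const res = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Hello!" }],
    stream: false, // ask for a single JSON response instead of a stream
  }),
});

const data = await res.json();
console.log(data.message?.content);
```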