A cross-platform, offline-first LLM interface built for developers who love control, speed, and privacy.
What is PyllamaUI?
PyllamaUI is a lightweight, fully offline interface to interact with local large language models (LLMs) via Ollama.
It uses Python for the logic and customtkinter for the GUI, wrapped in a developer-friendly experience across multiple platforms.
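Under the hood the flow is simple: the GUI collects a prompt and sends it to the Ollama server running on your machine. As a rough illustration (a minimal sketch, not PyllamaUI's actual source), querying Ollama's default local endpoint from Python looks something like this:

```python
# Minimal sketch (not PyllamaUI's actual source): send a prompt to a local
# Ollama server and read the reply. Assumes Ollama is running on its default
# port 11434 and that the "tinyllama" model has already been pulled.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "tinyllama") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

print(ask_ollama("Say hello in one short sentence."))
```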
Platform Support
Platform | Status
---|---
Python GUI (via PyPI) | Available: `pip install pyllamaui`
VS Code Extension | Published on the VS Code Marketplace
Windows `.exe` | In testing
Linux `.deb` package | In testing
All versions support models like TinyLLaMA, DeepSeek-Coder, Gemma and more. Just run them through your local Ollama server.
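To see which models your local server already has (the same list a model picker can draw from), you can ask Ollama's tags endpoint. This is an illustrative snippet, not part of PyllamaUI itself:

```python
# Illustrative only: list the models installed on the local Ollama server.
# Assumes Ollama's default endpoint at http://localhost:11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as response:
    installed = [m["name"] for m in json.loads(response.read())["models"]]

print(installed)  # e.g. ['tinyllama:latest', 'gemma:2b', 'deepseek-coder:1.3b']
```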
Teaser Preview
Here's a sneak peek of PyllamaUI running fully offline inside VS Code:
A full demo, tutorials, and walkthroughs are coming soon; follow for updates!
Key Features (so far)
- GUI-based LLM interface (built with `customtkinter`)
- Runs on any Ollama-supported model
- Theme switching (light/dark)
- Model selector (TinyLLaMA, DeepSeek, Gemma, etc.); see the sketch after this list
- No internet required after setup
- Future updates: output copy buttons, smoother UX
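To give a feel for how these GUI pieces fit together, here is a small customtkinter sketch (an illustration, not PyllamaUI's real layout) with a dark/light toggle and a model dropdown:

```python
# Illustrative sketch, not PyllamaUI's actual code: a customtkinter window
# with a theme toggle and a model selector, mirroring the features above.
import customtkinter as ctk

ctk.set_appearance_mode("dark")  # theme switching: "light", "dark", or "system"

app = ctk.CTk()
app.title("Mini LLM UI")

# Model selector: the values here are just example Ollama model names.
model_menu = ctk.CTkOptionMenu(app, values=["tinyllama", "deepseek-coder", "gemma"])
model_menu.pack(padx=20, pady=10)

# Light/dark switch that flips the appearance mode at runtime.
theme_switch = ctk.CTkSwitch(
    app,
    text="Dark mode",
    command=lambda: ctk.set_appearance_mode("dark" if theme_switch.get() else "light"),
)
theme_switch.select()  # start with dark mode enabled
theme_switch.pack(padx=20, pady=10)

app.mainloop()
```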
Open Source & Contributing
The project is open source and actively maintained.
If you're interested in contributing:
- GitHub: github.com/bhuvanesh-m-dev/pyllamaui
- Raise issues, submit PRs, or suggest features
Project Links
- Website: https://bhuvaneshm.in/pyllamaui
- GitHub: github.com/bhuvanesh-m-dev/pyllamaui
- VS Code Extension: marketplace.visualstudio.com/items?itemName=bhuvanesh-m-dev.pyllamaui
Final Thoughts
PyllamaUI is just getting started. If you're passionate about LLMs, privacy, and offline tools, this is for you.
Try it. Fork it. Shape it.
Drop a comment or connect with me on LinkedIn.
Built with ❤️ by Bhuvanesh M