A beautiful, modern AI chat interface built with FastAPI and vanilla JavaScript, featuring Leo as your friendly AI assistant.
- 🎨 Beautiful gradient UI with smooth animations
- 🌓 Dark/Light Mode Theming - Toggle between themes with full CSS variable support
- 💬 Session-based chat with persistent history
- 🔍 Message Search - Search across all conversations with real-time highlighting
- 🔊 Sound Notifications - Toggle audio notifications on/off with persistent preference
- 🎉 Interactive visual effects (balloons, stars, hearts, confetti, fireworks)
- 📱 Fully responsive design with advanced hover effects
- 🦁 Meet Leo - your AI companion with personality
- 📜 Command history with arrow key navigation
- 💾 SQLite database for conversation storage
- 🎯 Click-to-navigate search results
- 🗂️ Enhanced session management with sidebar navigation
- ✨ Smooth transitions and polished UI interactions
- 📊 Database Analysis - Ask Leo about your chat statistics and patterns via MCP integration
- Python 3.8 or higher
- Ollama installed and running locally
- Git (optional, for cloning)
- Node.js and npm (optional, for enhanced MCP database analysis)
```bash
# Clone the repository
git clone <your-repo-url>
cd chat_bot

# Create virtual environment
python -m venv .venv

# Activate virtual environment
# On macOS/Linux:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

Make sure Ollama is installed and running:
```bash
# Install Ollama (if not already installed)
# Visit https://ollama.ai/ for installation instructions

# Pull the required model
ollama pull llama3.1

# Verify Ollama is running
ollama list
```

Then start the server:

```bash
python main.py
```

The application will:
- Automatically create an empty SQLite database (`chat_history.db`)
- Start the web server on `http://localhost:8000`
- Initialize all necessary database tables
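The table-initialization step above can be sketched with Python's built-in `sqlite3` module. The table and column names below are illustrative, not the exact schema from `database.py`:

```python
import sqlite3

def init_db(path="chat_history.db"):
    """Create the conversation tables if they don't already exist."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS sessions (
               id TEXT PRIMARY KEY,
               created_at TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               session_id TEXT REFERENCES sessions(id),
               role TEXT,
               content TEXT,
               created_at TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    conn.commit()
    return conn
```

Because both statements use `CREATE TABLE IF NOT EXISTS`, running it on every startup is safe: an existing database is left untouched.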
Navigate to http://localhost:8000 and start chatting with Leo!
For advanced database analysis features, install the MCP SQLite server:
```bash
# Install Node.js (if not already installed)
# Visit https://nodejs.org/ for installation instructions

# Install the MCP SQLite server globally
npm install -g mcp-sqlite
```

Leo will automatically detect and use the MCP server if available, providing enhanced database analysis capabilities.
- Type your messages and press Enter or click Send
- Leo will respond using the Ollama AI model
- Each conversation is automatically saved
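The round trip behind each message can be sketched as follows, assuming Ollama's standard `/api/chat` endpoint with streaming disabled; the project's real integration lives in `ollama_utils.py`:

```python
import json
import urllib.request

OLLAMA_API_URL = "http://localhost:11434/api/chat"
MODEL_NAME = "llama3.1"

def build_payload(messages):
    """Shape the request body the Ollama /api/chat endpoint expects."""
    return {"model": MODEL_NAME, "messages": messages, "stream": False}

def ask_ollama(messages):
    """POST the conversation so far and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_API_URL,
        data=json.dumps(build_payload(messages)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Passing the full message history (not just the latest message) is what lets Leo keep conversational context within a session.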
- Click the 🌓 Theme button in the top-right corner to toggle between dark and light modes
- Theme preference is automatically saved and restored
- All UI components seamlessly adapt to the selected theme
- Click the 🔊 Sound button in the top-right corner to toggle audio notifications
- When enabled, Leo plays a pleasant C major chord when responding to messages
- Sound preference is automatically saved and restored across sessions
- Visual feedback shows 🔊 when enabled or 🔇 when muted (with red highlighting)
- Use the 🔍 Search input at the top of the sidebar to find messages
- Search works across all conversations and sessions
- Matching text is highlighted in real-time
- Click on search results to jump directly to that conversation
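A minimal sketch of how such a cross-session search can be implemented with SQLite's `LIKE` operator (the schema here is illustrative, not the exact one in `database.py`):

```python
import sqlite3

def search_messages(conn, term):
    """Case-insensitive substring search across every session."""
    cur = conn.execute(
        "SELECT session_id, content FROM messages "
        "WHERE content LIKE ? ORDER BY id DESC",
        (f"%{term}%",),
    )
    return cur.fetchall()

# Quick demo against an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (id INTEGER PRIMARY KEY, session_id TEXT, content TEXT)"
)
conn.executemany(
    "INSERT INTO messages (session_id, content) VALUES (?, ?)",
    [("s1", "show me balloons"), ("s2", "hello Leo")],
)
hits = search_messages(conn, "leo")  # LIKE matches "Leo" despite the lowercase query
```

SQLite's `LIKE` is case-insensitive for ASCII by default, which is why the lowercase query still finds "Leo". The frontend adds debouncing and highlighting on top of results like these.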
Try these special commands:
- "Show me balloons" 🎈
- "Show me stars" ⭐
- "Show me hearts" ❤️
- "Show me confetti" 🎊
- "Show me fireworks" 🎆
Or use the "✨ Show me" dropdown button in the top-right corner!
Leo can analyze your chat history and provide insights! Try asking:
- "Show me my chat statistics"
- "How many conversations have I had?"
- "What's my recent activity?"
- "Analyze my database"
- "Show me session stats"
Leo will automatically detect database-related questions and provide:
- 📊 Total session and message counts
- 📈 Recent activity trends (last 7 days)
- 🔥 Most active conversations
- 📅 Usage patterns and statistics
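The counts above map naturally onto a few aggregate queries. A sketch with illustrative table names (the actual queries live in the project's database layer):

```python
import sqlite3

def chat_stats(conn):
    """Collect the headline numbers Leo reports for statistics questions."""
    sessions = conn.execute("SELECT COUNT(*) FROM sessions").fetchone()[0]
    messages = conn.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
    # Recent activity: messages from the last 7 days
    recent = conn.execute(
        "SELECT COUNT(*) FROM messages "
        "WHERE created_at >= datetime('now', '-7 days')"
    ).fetchone()[0]
    return {"sessions": sessions, "messages": messages, "messages_last_7_days": recent}
```

The "most active conversations" insight is the same idea with a `GROUP BY session_id` and an `ORDER BY COUNT(*) DESC`.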
Note: For enhanced database analysis, Leo attempts to start an MCP SQLite server. If Node.js is not installed, basic analysis mode is used.
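A sketch of the detect-and-fall-back logic, assuming the npm package installs an `mcp-sqlite` executable on the `PATH`; the actual invocation in `mcp_sqllite.py` may differ:

```python
import shutil
import subprocess

def start_mcp_server(db_path="chat_history.db"):
    """Launch the Node-based MCP SQLite server if it is installed.

    Returns the child process, or None when the executable is missing,
    in which case Leo falls back to basic analysis mode.
    """
    exe = shutil.which("mcp-sqlite")  # assumed command name from the npm package
    if exe is None:
        return None
    return subprocess.Popen([exe, db_path])
```

Probing with `shutil.which` before spawning keeps the fallback graceful: no exception handling is needed for the common "Node.js not installed" case.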
- Arrow Keys: Navigate through your command history (Up/Down arrows in input field)
- Enter Key: Send messages quickly without clicking the send button
- New Chat: Start a fresh conversation
- Sidebar: View and switch between previous chat sessions
- Search Results: Click any search result to navigate to that conversation
- Session Switching: Seamlessly switch between conversations with full history
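Session creation can be sketched like this; a random UUID keys each conversation, so the identifier itself carries no user-identifying information (schema illustrative):

```python
import sqlite3
import uuid

def new_session(conn):
    """Create a fresh chat session keyed by a random UUID."""
    conn.execute("CREATE TABLE IF NOT EXISTS sessions (id TEXT PRIMARY KEY)")
    session_id = str(uuid.uuid4())
    conn.execute("INSERT INTO sessions (id) VALUES (?)", (session_id,))
    conn.commit()
    return session_id
```

Switching sessions is then just a matter of loading all messages whose `session_id` matches the selected UUID.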
- Typing Indicators: Animated dots show when Leo is processing your message
- Auto-scroll: Messages automatically scroll to the latest response
- Error Handling: Graceful error messages and fallback behaviors
- Input Validation: Message length limits and input sanitization
- Persistent State: All preferences and session data saved automatically
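The input-validation step might look like the following sketch; the 4000-character limit is an assumption for illustration, not the project's actual cap:

```python
MAX_MESSAGE_LENGTH = 4000  # illustrative limit, not the project's actual value

def validate_message(text: str) -> str:
    """Trim whitespace and reject empty or oversized messages before sending."""
    text = text.strip()
    if not text:
        raise ValueError("Message is empty")
    if len(text) > MAX_MESSAGE_LENGTH:
        raise ValueError(f"Message exceeds {MAX_MESSAGE_LENGTH} characters")
    return text
```

Validating on the server (not just in the browser) matters because the `/chat` endpoint can be called directly, bypassing any frontend checks.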
Edit `main.py` and modify these variables:

```python
OLLAMA_API_URL = "http://localhost:11434/api/chat"
MODEL_NAME = "llama3.1"  # Change to your preferred model
```

Popular Model Options:
- `llama3.1` (8B) - Good balance of speed and quality (recommended)
- `llama3.1:70b` - Higher quality, requires more resources
- `codellama` - Specialized for programming tasks
- `mistral` - Fast and efficient alternative
To switch models:

```bash
ollama pull <model-name>
# Then update MODEL_NAME in main.py
```

The SQLite database is automatically created as `chat_history.db` in the project directory. This file is excluded from git via `.gitignore` to keep your conversations private.
```
chat_bot/
├── main.py           # FastAPI backend server
├── index.html        # Frontend interface with theming & search
├── app.js            # JavaScript functionality (modular organization)
├── database.py       # Database operations and schema
├── models.py         # Pydantic models and data structures
├── ollama_utils.py   # Ollama API integration utilities
├── mcp_sqllite.py    # MCP SQLite server management
├── requirements.txt  # Python dependencies
├── .gitignore        # Git ignore rules (protects database)
├── chat_history.db   # SQLite database (auto-created, excluded from git)
└── chat_bot.log      # Application logs
```

- Theming System: CSS custom properties (variables) for comprehensive dark/light mode support
- Search Implementation: Real-time search with debouncing and result highlighting
- Session Management: UUID-based sessions with full conversation history
- Visual Effects: Canvas-based animations triggered by keywords or manual selection
- Responsive Design: Mobile-first approach with smooth transitions
- Database Security: SQLite database automatically excluded from version control
- `GET /` - Serve the main chat interface
- `GET /app.js` - Serve the JavaScript module file
- `POST /chat` - Send message and get AI response
- `GET /sessions` - List recent chat sessions
- `GET /sessions/{id}` - Get specific session history
- `POST /sessions` - Create new session
- Theme Toggle: Persistent dark/light mode switching
- Sound Notifications: Audio feedback with Web Audio API and localStorage persistence
- Search Functionality: Cross-session message search with highlighting
- Visual Effects: Keyword-triggered canvas animations
- Session Navigation: Smooth switching between conversation histories
- Responsive Layout: Adaptive design for all screen sizes
- Modular JavaScript: Clean separation of concerns with organized app.js file
- The SQLite database containing your chat history is automatically excluded from git
- Each fresh clone starts with an empty database
- Chat sessions are identified by UUIDs for privacy
- No data is sent to external services except your local Ollama instance
- Ensure Ollama is running: `ollama list`
- Check if the model is available: `ollama pull llama3.1`
- Verify Ollama is accessible at `http://localhost:11434`
- The database is automatically created on first run
- If you encounter issues, delete `chat_history.db` and restart the application
If port 8000 is in use, modify the uvicorn command in `main.py`:

```python
if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8080)  # Change port here
```

Feel free to submit issues and enhancement requests!
This project is open source and available under the MIT License.
Enjoy chatting with Leo! 🦁