Mattin AI is a comprehensive AI toolbox that provides a wide range of artificial intelligence capabilities. The project offers functionality including:
- Large Language Models (LLMs) integration and management
- Retrieval-Augmented Generation (RAG) systems
- Semantic search capabilities
- Vector database management
- AI agents and automation
- And more...
The project aims to simplify the integration and use of AI technologies, providing a unified platform for AI-powered solutions. Its main features are:
- LLM Integration: Easy access and management of various Large Language Models
- RAG Systems: Implementation of Retrieval-Augmented Generation for enhanced AI responses
- Semantic Search: Advanced search capabilities using semantic understanding
- Vector Databases: Efficient storage and retrieval of vector embeddings
- AI Agents: Framework for building and deploying AI agents
- Modular Architecture: Easy to extend and customize for specific needs
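To make the RAG, semantic search, and vector database features concrete, here is a minimal sketch of the flow they build on: embed a query, retrieve the nearest chunks from a pgvector table, and pass them to an LLM. This is illustrative only, not Mattin AI's internal API; the table and column names (documents, content, embedding), the models, and the DATABASE_URL variable are assumptions.

```python
# Minimal RAG sketch (illustrative, not Mattin AI's internal API).
# Assumes: openai and psycopg2 installed, a pgvector-enabled database,
# and a hypothetical table documents(content TEXT, embedding VECTOR(1536)).
import os

import psycopg2
from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
conn = psycopg2.connect(os.getenv("DATABASE_URL"))  # assumption: connection string

question = "What does the billing module do?"

# 1. Embed the query
query_vec = client.embeddings.create(
    model="text-embedding-3-small", input=question
).data[0].embedding
vec_literal = "[" + ",".join(str(x) for x in query_vec) + "]"

# 2. Retrieve the most similar chunks with pgvector's distance operator
with conn.cursor() as cur:
    cur.execute(
        "SELECT content FROM documents ORDER BY embedding <-> %s::vector LIMIT 3",
        (vec_literal,),
    )
    context = "\n\n".join(row[0] for row in cur.fetchall())

# 3. Generate an answer grounded in the retrieved context
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```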
Before installing, make sure you have:

- Python 3.11 or higher
- Node.js 18 or higher
- PostgreSQL with pgvector extension
- Docker and Docker Compose (optional)
- Clone the repository:

  ```bash
  git clone https://github.com/your-username/ia-core-tools.git
  cd ia-core-tools
  ```

- Set up the backend:

  ```bash
  cd backend
  cp env.example .env
  # Edit .env with your configuration
  pip install -r requirements.txt
  ```

- Set up the frontend:

  ```bash
  cd frontend
  npm install
  ```

- Set up the database:

  ```bash
  # Using Docker Compose (recommended)
  docker-compose up -d postgres
  # Or install PostgreSQL with pgvector manually
  ```

- Run the application:

  ```bash
  # Backend
  cd backend
  python main.py

  # Frontend (in another terminal)
  cd frontend
  npm run dev
  ```
Alternatively, the easiest way to get started is with Docker Compose:

```bash
# Copy environment file
cp backend/env.example .env
# Edit .env with your configuration

# Start all services
docker-compose up -d
```

Key environment variables you need to configure:
- DATABASE_*: PostgreSQL database configuration
- OPENAI_API_KEY: Your OpenAI API key
- ANTHROPIC_API_KEY: Your Anthropic API key
- SECRET_KEY: Secret key for session management
- GOOGLE_CLIENT_ID / GOOGLE_CLIENT_SECRET: Google OAuth credentials
See backend/env.example for a complete list of configuration options.
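As a minimal sketch of how these variables might be consumed in Python (using python-dotenv and run from the repository root), something like the following works; the DATABASE_URL name is an assumption standing in for the DATABASE_* settings in backend/env.example:

```python
# Minimal sketch: loading the configuration described above.
# Assumes python-dotenv is installed (pip install python-dotenv).
# DATABASE_URL is a placeholder for one of the DATABASE_* variables.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

DATABASE_URL = os.getenv("DATABASE_URL")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
ANTHROPIC_API_KEY = os.getenv("ANTHROPIC_API_KEY")
SECRET_KEY = os.getenv("SECRET_KEY")

# Fail early if required settings are missing
missing = [name for name, value in {
    "OPENAI_API_KEY": OPENAI_API_KEY,
    "SECRET_KEY": SECRET_KEY,
}.items() if not value]
if missing:
    raise RuntimeError(f"Missing required environment variables: {missing}")
```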
The platform supports multiple AI providers:
- OpenAI (GPT models)
- Anthropic (Claude models)
- Azure OpenAI
- Mistral AI
- Ollama (local models)
Configure these through the web interface or environment variables.
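As a hedged sketch of environment-based provider selection: because Ollama serves an OpenAI-compatible API, a single OpenAI-style client can target either OpenAI or a local Ollama server depending on configuration. The LLM_PROVIDER and LLM_MODEL variables and the model names below are illustrative assumptions, not the project's actual settings.

```python
# Illustrative only: switch between OpenAI and a local Ollama server
# using the OpenAI Python SDK. LLM_PROVIDER, LLM_MODEL, and the model
# names are assumptions, not Mattin AI configuration keys.
import os

from openai import OpenAI

provider = os.getenv("LLM_PROVIDER", "openai")

if provider == "ollama":
    # Ollama exposes an OpenAI-compatible endpoint on localhost:11434
    client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
    model = os.getenv("LLM_MODEL", "llama3")
else:
    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
    model = os.getenv("LLM_MODEL", "gpt-4o-mini")

response = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```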
The project consists of several main components:
- Backend: FastAPI-based REST API with Python
- Frontend: React-based web interface with TypeScript
- Database: PostgreSQL with pgvector for vector storage
- AI Services: Modular integration with various LLM providers
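As a rough sketch of how these components meet at the API layer, a FastAPI backend might expose endpoints like the ones below. The route paths and response shapes are hypothetical, not the project's actual API.

```python
# Hypothetical FastAPI endpoint sketch; routes and models are illustrative
# of the architecture, not Mattin AI's actual API surface.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Mattin AI backend (sketch)")


class SearchRequest(BaseModel):
    query: str
    top_k: int = 5


@app.get("/health")
def health() -> dict:
    """Simple liveness probe."""
    return {"status": "ok"}


@app.post("/search")
def semantic_search(request: SearchRequest) -> dict:
    """Placeholder for a semantic search endpoint backed by pgvector."""
    # A real implementation would embed request.query and run a pgvector
    # similarity query, as outlined in the RAG sketch earlier.
    return {"query": request.query, "results": []}
```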
We welcome contributions! Please see our Contributing Guidelines for details.
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
This project is available under a dual licensing model:
- Open Source: GNU Affero General Public License v3.0 (AGPL 3.0)
- Commercial: Proprietary license with enhanced rights and features
Under the open source license (AGPL 3.0):

- Free to use for development and personal use
- Community contributions welcome
- Source code disclosure required for network use
- Copyleft obligations for modifications

Under the commercial license:

- Full AICT functionality without restrictions
- Commercial use rights without copyleft obligations
- Client modification rights for specific projects
- Enterprise features and support
- No source code disclosure requirements
For more information, see:
- LICENSING.md - Detailed licensing information
- COMMERCIAL_LICENSE.md - Commercial license terms
- CLIENT_LICENSE_AGREEMENT.md - Client agreement template
Contact LKS Next for commercial licensing inquiries.
For questions and support:

- Create an issue for bug reports or feature requests
- Check the documentation for common questions
- Join our community discussions