
🤖 AI Classifier Sample

Python 3.13+ | Poetry | License: MIT | Code style: black

A clean, provider-agnostic message classification service using AI models. Built with modern Python practices, featuring robust configuration management and optimized performance through intelligent caching.

✨ Features

  • 🤖 Dual Classification Modes: Both single-turn and multi-turn conversational classification
  • 💬 Context-Aware: Tracks conversation state and intent transitions across multiple turns
  • 📊 Rich Output: Detailed responses including intent, confidence, reasoning, and transitions
  • ☁️ Provider Agnostic: Currently supports AWS Bedrock Claude with an extensible architecture
  • ⚙️ Smart Configuration: Pydantic Settings with environment variable support
  • 🚀 High Performance: LRU-cached singleton settings for optimal performance
  • 🔐 Secure Authentication: Cloud provider profile support for secure access
  • 📦 Modern Python: Built with Python 3.13+ and Poetry dependency management

🚀 Quick Start

Installation

```bash
# Clone the repository
git clone https://github.com/theakashrai/ai-classifier-sample.git
cd ai-classifier-sample

# Install with Poetry
poetry install

# Or install with pip
pip install ai-classifier-sample
```

Basic Usage

```python
from ai_classifier_sample.service.classifier import MessageClassifier

# Initialize classifier (uses cached settings)
classifier = MessageClassifier()

# Classify a message
message = "Hello, how are you?"
category = classifier.classify(message)
print(f"Message: {message}\nCategory: {category}")
```

βš™οΈ Configuration

The application uses environment variables for configuration. You can set these in your environment or create a .env file:

Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `CLOUD_REGION` | Cloud provider region for the AI service | `us-east-1` |
| `CLOUD_PROFILE` | Cloud provider profile used for authentication | `None` |
| `MODEL_ARN` | AI model ARN or identifier | `arn:aws:bedrock:us-east-1:123456789:inference-profile/us.anthropic.claude-sonnet-4-20250514-v1:0` |
| `MAX_TOKENS` | Maximum tokens for AI model responses | `5000` |
| `PROVIDER` | AI service provider type | `aws` |

Example .env file

```bash
# Cloud Provider Configuration
CLOUD_REGION=us-west-1
CLOUD_PROFILE=my-cloud-profile

# AI Model Configuration
MODEL_ARN=arn:aws:bedrock:us-east-1:123456789:inference-profile/us.anthropic.claude-sonnet-4-20250514-v1:0
MAX_TOKENS=5000
PROVIDER=aws
```

🎯 Usage Examples

Basic Classification

```python
import json

from ai_classifier_sample.service.classifier import MessageClassifier

# Initialize classifier (uses cached settings)
classifier = MessageClassifier()

# Single-turn classification
message = "Hello, how are you?"
result = classifier.classify(message)

# Parse the JSON response
response_data = json.loads(result)
print(f"Message: {response_data['message']}")
print(f"Category: {response_data['category']}")
```

Conversational Classification

```python
from ai_classifier_sample.service.classifier import ConversationState, MessageClassifier

# Initialize classifier and conversation state
classifier = MessageClassifier()
conversation_state = ConversationState()

# Multi-turn conversation
messages = [
    "Hi, I need help with my order",
    "I placed it last week but haven't received tracking info",
    "The order number is #12345",
]

for message in messages:
    response = classifier.classify_conversational(message, conversation_state)
    print(f"Intent: {response.intent}")
    print(f"Transition: {response.intent_transition}")
    print(f"Confidence: {response.confidence}")
```

Settings Management

```python
from ai_classifier_sample.config.settings import get_settings

# Get the cached settings instance
settings = get_settings()
print(f"☁️ Cloud Region: {settings.cloud_region}")
print(f"🤖 Model ARN: {settings.model_arn}")
print(f"🔧 Max Tokens: {settings.max_tokens}")
```

🧪 Testing

Run the tests using pytest:

```bash
# Run all tests
poetry run pytest

# Run tests with verbose output
poetry run pytest -v

# Run a specific test file
poetry run pytest tests/test_settings.py

# Run with coverage
poetry run pytest --cov=ai_classifier_sample --cov-report=html

# Run the legacy test script directly
poetry run python tests/test_settings.py
```

πŸ—οΈ Architecture

  • βš™οΈ Settings: Pydantic Settings with LRU cache for singleton pattern
  • πŸ€– Classifier: Message classification service using AI models
  • 🌍 Environment Variables: Full support for configuration via environment variables
  • πŸ” Cloud Provider Profile: Automatic cloud provider profile setting when specified
  • πŸ“¦ Modern Dependencies: Poetry for dependency management and virtual environments
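The LRU-cached singleton mentioned above can be sketched as follows. `Settings` here is a stand-in for the project's pydantic-settings model, so treat the names as assumptions rather than the actual API:

```python
from functools import lru_cache


class Settings:
    """Stand-in for the project's pydantic-settings model."""

    def __init__(self) -> None:
        # In the real project these values are read from environment variables.
        self.cloud_region = "us-east-1"
        self.max_tokens = 5000


@lru_cache(maxsize=1)
def get_settings() -> Settings:
    # lru_cache memoizes the zero-argument call, so Settings() is constructed
    # exactly once and every caller shares the same instance.
    return Settings()


assert get_settings() is get_settings()
```

Because the cache key is the (empty) argument tuple, repeated calls return the same object, avoiding re-reading and re-validating the environment on every access; `get_settings.cache_clear()` resets it when tests need a fresh instance.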

🔧 Development

Setting up for Development

```bash
# Clone the repository
git clone https://github.com/theakashrai/ai-classifier-sample.git
cd ai-classifier-sample

# Install development dependencies
poetry install --with dev

# Install pre-commit hooks
poetry run pre-commit install
```

Code Quality

```bash
# Format code
poetry run black .
poetry run isort .

# Remove unused imports
poetry run autoflake --remove-all-unused-imports --recursive --in-place .

# Run linting
poetry run flake8 .

# Type checking (if mypy is added)
# poetry run mypy src/
```

Running Tests

```bash
# Run all tests
poetry run pytest

# Run tests with coverage
poetry run pytest --cov=ai_classifier_sample --cov-report=html

# Run specific test categories
poetry run pytest -m unit
poetry run pytest -m integration
```

πŸ› οΈ Supported AI Providers

AWS Bedrock

Currently supported with Claude models:

  • Claude 4: Next-generation model with advanced capabilities
  • Claude 3.5 Sonnet: Highly capable model with enhanced reasoning
  • Claude 3.5 Haiku: Fast, cost-effective option with improved performance
  • Claude 3 Sonnet: Balanced performance and speed
  • Claude 3 Haiku: Ultra-fast and lightweight for simple tasks
  • Claude 3 Opus: Maximum performance for complex reasoning tasks

Adding New Providers

The architecture is designed to be extensible. To add a new provider:

  1. Create a new provider class in src/ai_classifier_sample/providers/
  2. Implement the required interface methods
  3. Update the configuration settings
  4. Add provider-specific tests
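As a rough sketch of steps 1 and 2 (the names below are hypothetical, not the repository's actual interface), a new provider might subclass a small abstract base class:

```python
from abc import ABC, abstractmethod


class ClassifierProvider(ABC):
    """Hypothetical provider interface; the real one lives in the repository."""

    @abstractmethod
    def classify(self, message: str) -> str:
        """Return a category label for the given message."""


class KeywordProvider(ClassifierProvider):
    """Toy provider showing only the shape of a concrete implementation."""

    def classify(self, message: str) -> str:
        return "greeting" if "hello" in message.lower() else "other"


provider = KeywordProvider()
print(provider.classify("Hello, how are you?"))  # greeting
```

Keeping the interface this narrow is what lets the configuration layer swap providers via the `PROVIDER` setting without touching calling code.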

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

Development Workflow

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Make your changes and add tests
  4. Run the test suite (poetry run pytest)
  5. Run code quality checks (poetry run black . && poetry run flake8 .)
  6. Commit your changes (git commit -m 'Add some AmazingFeature')
  7. Push to the branch (git push origin feature/AmazingFeature)
  8. Open a Pull Request

Code Standards

  • Follow PEP 8 style guidelines
  • Use type hints for all function signatures
  • Write comprehensive tests for new features
  • Document public APIs with docstrings
  • Keep commits atomic and well-described

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🔧 Dependencies

Core Dependencies

  • boto3: AWS SDK for Python
  • langchain-aws: LangChain AWS integration
  • langgraph: Graph-based AI workflows
  • langchain: Framework for developing LLM applications
  • pydantic-settings: Settings management with validation

Development Dependencies

  • black: Code formatting
  • flake8: Linting and style checking
  • pytest: Testing framework
  • pre-commit: Git hooks for code quality
  • isort: Import sorting
  • autoflake: Unused import removal

🎯 Why This Tool?

Clean Architecture

  • ✅ Provider-agnostic design for future extensibility
  • ✅ Separation of concerns with clear module boundaries
  • ✅ Configuration management with validation
  • ✅ Performance optimization through intelligent caching

Developer Experience

  • ✅ Modern Python 3.13+ with type hints
  • ✅ Poetry for reliable dependency management
  • ✅ Comprehensive testing with pytest
  • ✅ Code quality tools integrated
  • ✅ Clear documentation and examples

Production Ready

  • ✅ Environment-based configuration
  • ✅ Secure cloud provider authentication
  • ✅ Error handling and logging
  • ✅ Extensible architecture for scaling

Built with ❀️ using modern Python practices
