🧠 Elasticsearch Memory MCP


A powerful Model Context Protocol (MCP) server that provides persistent, intelligent memory using Elasticsearch with hierarchical categorization and semantic search capabilities.

✨ Features

🎯 V6.2 - Latest Release

  • 🏷️ Hierarchical Memory Categorization

    • 5 category types: identity, active_context, active_project, technical_knowledge, archived
    • Automatic category detection with confidence scoring
    • Manual reclassification support
  • 🤖 Intelligent Auto-Detection

    • Accumulative scoring system (0.7-0.95 confidence range)
    • 23+ specialized keyword patterns
    • Context-aware categorization
  • 📦 Batch Review System

    • Review uncategorized memories in batches
    • Approve/reject/reclassify workflows
    • 10x faster than individual categorization
  • 🔄 Backward Compatible Fallback

    • Seamlessly loads v5 uncategorized memories
    • No data loss during upgrades
    • Graceful degradation
  • 🚀 Optimized Context Loading

    • Hierarchical priority loading (~30-40 memories vs 117)
    • 60-70% token reduction
    • Smart relevance ranking
  • 💾 Persistent Memory

    • Vector embeddings for semantic search
    • Session management with checkpoints
    • Conversation snapshots

🛠️ Installation

Quick Start (Recommended)

Install directly from PyPI:

pip install elasticsearch-memory-mcp

Prerequisites

  • Python 3.8+
  • Elasticsearch 8.0+

Step 1: Start Elasticsearch

# Using Docker (recommended)
docker run -d -p 9200:9200 -e "discovery.type=single-node" elasticsearch:8.0.0

# Or install locally
# https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html
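
Before configuring the MCP server, it can help to confirm Elasticsearch is actually reachable at the URL you intend to use. A minimal check (illustrative only, not part of this package) using just the Python standard library:

# Quick sanity check: does Elasticsearch answer on the configured URL?
import json
import urllib.request

ELASTICSEARCH_URL = "http://localhost:9200"

with urllib.request.urlopen(ELASTICSEARCH_URL) as response:
    info = json.load(response)

# Should print an 8.x version to match the prerequisite above
print(info["version"]["number"])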

Step 2: Configure MCP

For Claude Desktop

Add to ~/.config/Claude/claude_desktop_config.json:

{ "mcpServers": { "elasticsearch-memory": { "command": "uvx", "args": ["elasticsearch-memory-mcp"], "env": { "ELASTICSEARCH_URL": "http://localhost:9200" } } } }

Note: If you don't have uvx, install it with pip install uv (uvx ships with uv), or use python -m elasticsearch_memory_mcp instead.

For Claude Code CLI

claude mcp add elasticsearch-memory uvx elasticsearch-memory-mcp \
  -e ELASTICSEARCH_URL=http://localhost:9200

Alternative: Install from Source

If you want to contribute or modify the code:

# Clone repository
git clone https://github.com/fredac100/elasticsearch-memory-mcp.git
cd elasticsearch-memory-mcp

# Create virtual environment
python3 -m venv venv
source venv/bin/activate

# Install in development mode
pip install -e .

Then configure MCP pointing to your local installation:

{ "mcpServers": { "elasticsearch-memory": { "command": "/path/to/venv/bin/python", "args": ["-m", "mcp_server"], "env": { "ELASTICSEARCH_URL": "http://localhost:9200" } } } }

📚 Usage

Available Tools

1. save_memory

Save a new memory with automatic categorization.

{ "content": "Fred prefers direct, brutal communication style", "type": "user_profile", "importance": 9, "tags": ["communication", "preference"] }

2. load_initial_context (Resource)

Loads hierarchical context with:

  • Identity memories (who you are)
  • Active context (current work)
  • Active projects (ongoing)
  • Technical knowledge (relevant facts)
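
To illustrate the idea behind hierarchical priority loading, the sketch below pulls a capped number of memories per category instead of loading everything. The index name, field names, per-category caps, and sort key are assumptions for the example, not this package's internal API:

# Sketch only: capped, category-by-category loading of context memories.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

CATEGORY_CAPS = {          # assumed caps, tune to taste
    "identity": 10,
    "active_context": 10,
    "active_project": 10,
    "technical_knowledge": 10,
}

def load_initial_context() -> dict:
    context = {}
    for category, cap in CATEGORY_CAPS.items():
        hits = es.search(
            index="memories",                          # assumed index name
            query={"term": {"memory_category": category}},
            sort=[{"importance": {"order": "desc"}}],  # most important first
            size=cap,
        )["hits"]["hits"]
        context[category] = [hit["_source"]["content"] for hit in hits]
    return context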

3. review_uncategorized_batch 🆕 V6.2

Review uncategorized memories in batches.

{ "batch_size": 10, "min_confidence": 0.6 }

Returns suggestions with auto-detected categories and confidence scores.

4. apply_batch_categorization 🆕 V6.2

Apply categorizations in batch after review.

{ "approve": ["id1", "id2"], // Auto-categorize "reject": ["id3"], // Skip "reclassify": {"id4": "archived"} // Force category }

5. search_memory

Semantic search with filters.

{ "query": "SAE project details", "limit": 5, "category": "active_project" }

6. auto_categorize_memories

Batch auto-categorize uncategorized memories.

{ "max_to_process": 50, "min_confidence": 0.75 }

πŸ—οΈ Architecture

┌─────────────────┐
│  Claude (MCP)   │
└────────┬────────┘
         │
         ▼
┌─────────────────────────────┐
│      MCP Server (v6.2)      │
│  ┌───────────────────────┐  │
│  │ Auto-Detection        │  │
│  │ - Keyword matching    │  │
│  │ - Confidence score    │  │
│  └───────────────────────┘  │
│  ┌───────────────────────┐  │
│  │ Batch Review          │  │
│  │ - Review workflow     │  │
│  │ - Bulk operations     │  │
│  └───────────────────────┘  │
└──────────────┬──────────────┘
               │
               ▼
┌─────────────────────────────┐
│        Elasticsearch        │
│  ┌───────────────────────┐  │
│  │ memories (index)      │  │
│  │ - embeddings (vector) │  │
│  │ - memory_category     │  │
│  │ - category_confidence │  │
│  └───────────────────────┘  │
└─────────────────────────────┘
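
For orientation, an index mapping covering the fields in the diagram might look like the sketch below; the vector dimension and field types are assumptions for illustration, not the package's actual schema:

# Sketch only: a plausible "memories" index mapping for the fields above.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

es.indices.create(
    index="memories",
    mappings={
        "properties": {
            "content": {"type": "text"},
            "embeddings": {
                "type": "dense_vector",
                "dims": 384,               # depends on the embedding model
                "index": True,
                "similarity": "cosine",
            },
            "memory_category": {"type": "keyword"},
            "category_confidence": {"type": "float"},
            "importance": {"type": "integer"},
            "tags": {"type": "keyword"},
        }
    },
)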

📊 Category System

Category             | Description                            | Examples
identity             | Core identity, values, preferences     | "Fred prefers brutal honesty"
active_context       | Current work, recent conversations     | "Working on SAE implementation"
active_project       | Ongoing projects                       | "Mirror architecture design"
technical_knowledge  | Facts, configs, tools                  | "Elasticsearch index settings"
archived             | Completed, deprecated, old migrations  | "Refactored old auth system"

🎯 Auto-Detection Examples

High Confidence (0.8-0.95)

"Fred prefere comunicaΓ§Γ£o brutal" β†’ identity (0.9) "RefatoraΓ§Γ£o do sistema SAE concluΓ­da" β†’ archived (0.85) "PrΓ³ximos passos: implementar dashboard" β†’ active_context (0.8) 

Multiple Keywords (Accumulative Scoring)

"Fred prefere comunicaΓ§Γ£o brutal. Primeira vez usando este estilo." β†’ Match 1: "Fred prefere" (+0.9) β†’ Match 2: "primeira vez" (+0.8) β†’ Total: 0.95 (normalized) 

🔄 Migration from V5

The v6.2 system includes automatic fallback for v5 memories:

  1. Uncategorized memories → Loaded via type/tags fallback
  2. Visual separation → Categorized vs. fallback sections
  3. Batch review → Categorize old memories efficiently

# Review and categorize v5 memories
review_uncategorized_batch(batch_size=20)
apply_batch_categorization(approve=[...])

🚀 Performance

  • Load initial context: ~10-15s (includes embedding model load)
  • Save memory: <1s
  • Search: <500ms
  • Batch review (10 items): ~2s
  • Auto-categorize (50 items): ~5s

🧪 Testing

# Run quick test
python test_quick.py

# Expected output:
# ✅ Elasticsearch connected
# ✅ Context loaded
# ✅ Identity memories found
# ✅ Projects separated from fallback

📝 Changelog

V6.2 (Latest)

  • ✅ Improved auto-detection (0.4 → 0.9 confidence)
  • ✅ 23 new specialized keywords
  • ✅ Batch review tools (review_uncategorized_batch, apply_batch_categorization)
  • ✅ Visual separation (categorized vs fallback)
  • ✅ Accumulative confidence scoring

V6.1

  • ✅ Fallback mechanism for uncategorized memories
  • ✅ Backward compatibility with v5

V6.0

  • ✅ Memory categorization system
  • ✅ Hierarchical context loading
  • ✅ Auto-detection with confidence

🤝 Contributing

Contributions are welcome! Please:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

📞 Support


Made with ❤️ for the Claude ecosystem
