sheawinkler/ContextLattice
Apache-2.0 · [API key required](https://glama.ai/mcp/servers/sheawinkler/context-lattice) 🏠 🍎 🪟 🐧 🦀 🏎️ - Private-by-default memory and context layer for agents, with a Go/Rust runtime and staged retrieval across a fused data backend
Install
npx -y sheawinkler-contextlattice

Required environment variables (set each in your MCP config):

- ORCH_KEY
- CONTEXTLATTICE_ORCHESTRATOR_API_KEY
- CONTEXTLATTICE_WORKER_API_KEY
- CONTEXT_EXPANSION_L0_BUDGET_TOKEN
- CONTEXT_EXPANSION_L1_BUDGET_TOKEN
- CONTEXT_EXPANSION_L2_BUDGET_TOKEN

claude_desktop_config.json:
{
"mcpServers": {
"sheawinkler-contextlattice": {
"command": "npx",
"args": [
"-y",
"sheawinkler-contextlattice"
],
"env": {
"ORCH_KEY": "<YOUR_ORCH_KEY>",
"CONTEXTLATTICE_ORCHESTRATOR_API_KEY": "<YOUR_CONTEXTLATTICE_ORCHESTRATOR_API_KEY>",
"CONTEXTLATTICE_WORKER_API_KEY": "<YOUR_CONTEXTLATTICE_WORKER_API_KEY>",
"CONTEXT_EXPANSION_L0_BUDGET_TOKEN": "<YOUR_CONTEXT_EXPANSION_L0_BUDGET_TOKEN>",
"CONTEXT_EXPANSION_L1_BUDGET_TOKEN": "<YOUR_CONTEXT_EXPANSION_L1_BUDGET_TOKEN>",
"CONTEXT_EXPANSION_L2_BUDGET_TOKEN": "<YOUR_CONTEXT_EXPANSION_L2_BUDGET_TOKEN>"
}
}
}
}

Add this to your Claude Desktop config file. Find it at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS.
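Before restarting Claude Desktop, it can help to sanity-check that the config actually defines all six variables and that none were left as placeholder values. A minimal sketch (the `missing_env` helper is hypothetical, not part of ContextLattice; the variable names come from the table above):

```python
import json

# The six variables the server expects, per the table above.
REQUIRED_ENV = [
    "ORCH_KEY",
    "CONTEXTLATTICE_ORCHESTRATOR_API_KEY",
    "CONTEXTLATTICE_WORKER_API_KEY",
    "CONTEXT_EXPANSION_L0_BUDGET_TOKEN",
    "CONTEXT_EXPANSION_L1_BUDGET_TOKEN",
    "CONTEXT_EXPANSION_L2_BUDGET_TOKEN",
]

def missing_env(config_text: str,
                server: str = "sheawinkler-contextlattice") -> list[str]:
    """Return required variables that are absent from the server's env
    block, or still left as <PLACEHOLDER> values."""
    env = (json.loads(config_text)
           .get("mcpServers", {})
           .get(server, {})
           .get("env", {}))
    return [k for k in REQUIRED_ENV
            if k not in env or env[k].startswith("<")]
```

Usage: `missing_env(open(path_to_config).read())` returns an empty list when the config is complete, e.g. with `path_to_config` pointing at the macOS location above.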
More Python MCP servers
microsoft/markitdown
🎖️ 🐍 🏠 - MCP tool access to MarkItDown -- a library that converts many file formats (local or remote) to Markdown for LLM consumption.
netdata/netdata
🎖️ 🏠 ☁️ 📟 🍎 🪟 🐧 - Discovery, exploration, reporting and root cause analysis using all observability data, including metrics, logs, systems, containers, processes, and network connections
upstash/context7
📇 ☁️ - Up-to-date code documentation for LLMs and AI code editors.
mindsdb/mindsdb
Connect and unify data across various platforms and databases.