
cognee-mcp - Run cognee's memory engine as a Model Context Protocol server
Demo · Learn more · Join Discord · Join r/AIMemory
Build memory for Agents and query from any client that speaks MCP – in your terminal or IDE.
✨ Features
- Multiple transports – choose Streamable HTTP --transport http (recommended for web deployments), SSE --transport sse (real-time streaming), or stdio (classic pipe, default)
- Integrated logging – all actions written to a rotating file (see get_log_file_location()) and mirrored to console in dev
- Local file ingestion – feed .md, source files, Cursor rule-sets, etc. straight from disk
- Background pipelines – long-running cognify & codify jobs spawn off-thread; check progress with status tools
- Developer rules bootstrap – one call indexes .cursorrules, .cursor/rules, AGENT.md, and friends into the developer_rules nodeset
- Prune & reset – wipe memory clean with a single prune call when you want to start fresh
Please refer to our documentation here for further information.
🚀 Quick Start
- Clone cognee repo
git clone https://github.com/topoteretes/cognee.git
- Navigate to cognee-mcp subdirectory
cd cognee/cognee-mcp
- Install uv if you don't have it
pip install uv
- Install all the dependencies you need for cognee mcp server with uv
uv sync --dev --all-extras --reinstall
- Activate the virtual environment in cognee mcp directory
source .venv/bin/activate
- Set up your OpenAI API key in .env for a quick setup with the default cognee configurations
LLM_API_KEY="YOUR_OPENAI_API_KEY"
- Run cognee mcp server with stdio (default)
python src/server.py
or stream responses over SSE
python src/server.py --transport sse
or run with Streamable HTTP transport (recommended for web deployments)
python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
You can set up more advanced configurations by creating a .env file using our template. To use different LLM providers or database configurations, and for more info, check out our documentation.
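For reference, a minimal .env for the default OpenAI setup only needs the API key; the optional embedding variables shown here mirror the ones used in the Cursor run script later in this README, and are only one possible configuration:

```
# Required for the default OpenAI configuration
LLM_API_KEY="YOUR_OPENAI_API_KEY"

# Optional: use local fastembed embeddings instead of the default
EMBEDDING_PROVIDER="fastembed"
EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
EMBEDDING_DIMENSIONS=384
EMBEDDING_MAX_TOKENS=256
```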
🐳 Docker Usage
If you'd rather run cognee-mcp in a container, you have two options:
- Build locally
  - Make sure you are in the /cognee root directory and have a fresh .env containing only your LLM_API_KEY (and your chosen settings).
  - Remove any old image and rebuild:
    docker rmi cognee/cognee-mcp:main || true
    docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .
  - Run it:
    # For HTTP transport (recommended for web deployments)
    docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http
    # For SSE transport
    docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse
    # For stdio transport (default)
    docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
- Pull from Docker Hub (no build required):
  # With HTTP transport (recommended for web deployments)
  docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http
  # With SSE transport
  docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse
  # With stdio transport (default)
  docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
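If your MCP client should launch the container itself over stdio, a configuration along these lines can work. This is a sketch, assuming your client uses the same mcpServers schema shown in the Cursor section of this README; note -i rather than -it, since the client owns the pipe and no TTY is needed, and that the .env path should be absolute:

```json
{
  "mcpServers": {
    "cognee": {
      "command": "docker",
      "args": ["run", "--env-file", "/absolute/path/to/.env", "--rm", "-i", "cognee/cognee-mcp:main"]
    }
  }
}
```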
💻 Basic Usage
The MCP server exposes its functionality through tools. Call them from any MCP client (Cursor, Claude Desktop, Cline, Roo and more).
Available Tools
- cognify: Turns your data into a structured knowledge graph and stores it in memory
- codify: Analyses a code repository, builds a code graph, and stores it in memory
- search: Query memory – supports GRAPH_COMPLETION, RAG_COMPLETION, CODE, CHUNKS, INSIGHTS
- list_data: Lists all datasets and their data items with IDs for deletion operations
- delete: Deletes specific data from a dataset (supports soft/hard deletion modes)
- prune: Resets cognee for a fresh start (removes all data)
- cognify_status / codify_status: Track pipeline progress
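For instance, building memory and resetting it can look like this. Exact parameter names may differ between cognee versions; treat this as a sketch:

```
# Build a knowledge graph from text (parameter name assumed)
cognify(data="Cognee turns documents into a knowledge graph.")
# Check on the background pipeline
cognify_status()
# Wipe everything and start fresh
prune()
```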
Data Management Examples:
# List all available datasets and data items
list_data()
# List data items in a specific dataset
list_data(dataset_id="your-dataset-id-here")
# Delete specific data (soft deletion - safer, preserves shared entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="soft")
# Delete specific data (hard deletion - removes orphaned entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="hard")
Remember – use the CODE search type to query your code graph. For huge repos, run codify on modules incrementally and cache results.
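The search tool is called the same way as the data-management examples above; the parameter names below are illustrative, not authoritative:

```
# Ask the knowledge graph a question and get an LLM-composed answer
search(search_query="How does the ingestion pipeline work?", search_type="GRAPH_COMPLETION")
# Query the code graph built by codify
search(search_query="Where is the retry logic implemented?", search_type="CODE")
```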
IDE Example: Cursor
- After you run the server as described in the Quick Start, create a run script for cognee. Here is a simple example:
  #!/bin/bash
  export ENV=local
  export TOKENIZERS_PARALLELISM=false
  export EMBEDDING_PROVIDER="fastembed"
  export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
  export EMBEDDING_DIMENSIONS=384
  export EMBEDDING_MAX_TOKENS=256
  export LLM_API_KEY=your-OpenAI-API-key
  uv --directory /{cognee_root_path}/cognee-mcp run cognee
Remember to replace your-OpenAI-API-key and {cognee_root_path} with correct values.
- Install Cursor and navigate to Settings → MCP Tools → New MCP Server
- Cursor will open the mcp.json file in a new tab. Configure your cognee MCP server by copy-pasting the following:
  {
    "mcpServers": {
      "cognee": {
        "command": "sh",
        "args": [
          "/{path-to-your-script}/run-cognee.sh"
        ]
      }
    }
  }
Remember to replace {path-to-your-script} with the path to the script you created in the first step.
That's it! You can refresh the server from the toggle next to your new cognee server. Check the green dot and the available tools to verify your server is running.
Now you can open your Cursor Agent and start using cognee tools from it via prompting.
Development and Debugging
Debugging
To use the debugger, run:
mcp dev src/server.py
Open the inspector with a timeout passed as a query parameter:
http://localhost:5173?timeout=120000
To apply new changes while developing cognee, you need to run:
- poetry lock (in the cognee folder)
- uv sync --dev --all-extras --reinstall
- mcp dev src/server.py
Development
In order to use local cognee:
- Uncomment the following line in the cognee-mcp pyproject.toml file and set the cognee root path:
  #"cognee[postgres,codegraph,gemini,huggingface,docs,neo4j] @ file:/Users/<username>/Desktop/cognee"
  Remember to replace file:/Users/<username>/Desktop/cognee with your actual cognee root path.
- Install dependencies with uv in the mcp folder:
  uv sync --reinstall
Code of Conduct
We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.