The Mem0 REST API server exposes every OSS memory operation over HTTP. Run it alongside your stack to add, search, update, and delete memories from any language that speaks REST.
You’ll use this when…
  • Your services already talk to REST APIs and you want Mem0 to match that style.
  • Teams on languages without the Mem0 SDK still need access to memories.
  • You plan to explore or debug endpoints through the built-in OpenAPI page at /docs.
Add your own authentication and HTTPS before exposing the server to anything beyond your internal network. The default image does not include auth.

Features

  • CRUD endpoints: Create, retrieve, search, update, delete, and reset memories by user_id, agent_id, or run_id.
  • Status health check: Access base routes to confirm the server is online.
  • OpenAPI explorer: Visit /docs for interactive testing and schema reference.

Configure it

Run with Docker Compose (development)

  1. Create server/.env with your keys:
OPENAI_API_KEY=your-openai-api-key
  2. Start the stack:
cd server
docker compose up
  3. Reach the API at http://localhost:8888. Edits to the server or library auto-reload.

Run with Docker

  1. Pull the image (or build it locally from the repo):
docker pull mem0/mem0-api-server
  2. Create a .env file with OPENAI_API_KEY.
  3. Run the container:
docker run -p 8000:8000 --env-file .env mem0/mem0-api-server
  4. Visit http://localhost:8000.

Run directly (no Docker)

From the server directory, install dependencies and start the app:
pip install -r requirements.txt
uvicorn main:app --reload
Use a process manager such as systemd, Supervisor, or PM2 when deploying the FastAPI server for production resilience.
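As a sketch, a minimal systemd unit for this could look like the following; all paths, the service user, and the port are placeholders to adapt to your environment:

```ini
# /etc/systemd/system/mem0-api.service — hypothetical paths; adjust to your install
[Unit]
Description=Mem0 REST API server
After=network.target

[Service]
WorkingDirectory=/opt/mem0/server
EnvironmentFile=/opt/mem0/server/.env
ExecStart=/opt/mem0/venv/bin/uvicorn main:app --host 0.0.0.0 --port 8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now mem0-api` so the server restarts with the host.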
The REST server reads the same configuration you use locally, so you can point it at your preferred LLM, vector store, graph backend, and reranker without changing code.
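For illustration, a configuration of the kind described above might look like the following Python dict. The provider names and fields here are assumptions for the sketch; check the Mem0 configuration reference for the exact schema your version expects:

```python
# Hypothetical Mem0 OSS-style config: swap providers without code changes.
# Provider names and fields below are illustrative assumptions, not a guaranteed schema.
config = {
    "llm": {
        "provider": "openai",
        "config": {"model": "gpt-4o-mini"},
    },
    "vector_store": {
        "provider": "qdrant",
        "config": {"host": "localhost", "port": 6333},
    },
}

# The REST server loads an equivalent configuration at startup, e.g.:
# from mem0 import Memory
# memory = Memory.from_config(config)
```

Because the server and the local SDK share this shape, you can validate a config locally before deploying it behind the REST API.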

See it in action

Create and search memories via HTTP

curl -X POST http://localhost:8000/memories \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "I love fresh vegetable pizza."}
    ],
    "user_id": "alice"
  }'
Expect a JSON response containing the new memory IDs and events (ADD, etc.).
curl "http://localhost:8000/memories/search?user_id=alice&query=vegetable"
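The same create flow can be driven from Python with only the standard library; this is a sketch against the endpoint paths shown above, with the base URL as an assumption about your deployment:

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # adjust to where your server runs

def add_memory_payload(messages, user_id):
    """Build the JSON body for POST /memories, matching the curl example."""
    return json.dumps({"messages": messages, "user_id": user_id}).encode()

def post_memory(messages, user_id):
    """Send the create request and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{BASE}/memories",
        data=add_memory_payload(messages, user_id),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    result = post_memory(
        [{"role": "user", "content": "I love fresh vegetable pizza."}],
        "alice",
    )
    print(result)
```

The search endpoint can be called the same way with a GET request to /memories/search and query parameters for user_id and query.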

Explore with OpenAPI docs

  1. Navigate to http://localhost:8000/docs.
  2. Pick an endpoint (e.g., POST /memories/search).
  3. Fill in parameters and click Execute to try requests in-browser.
Export the generated curl snippets from the OpenAPI UI to bootstrap integration tests.

Verify the feature is working

  • Hit the root route and /docs to confirm the server is reachable.
  • Run a full cycle: POST /memories → GET /memories/{id} → DELETE /memories/{id}.
  • Watch server logs for import errors or provider misconfigurations during startup.
  • Confirm environment variables (API keys, vector store credentials) load correctly when containers restart.
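The checklist above can be scripted as a small smoke test; this sketch uses only the standard library, and the base URL is an assumption about your deployment:

```python
import urllib.error
import urllib.request

BASE = "http://localhost:8000"  # adjust to your deployment

def cycle_requests(memory_id):
    """The full verification cycle as (method, path) pairs: create, fetch, delete."""
    return [
        ("POST", "/memories"),
        ("GET", f"/memories/{memory_id}"),
        ("DELETE", f"/memories/{memory_id}"),
    ]

def check_reachable():
    """Hit the root route; any HTTP response (even a 404) means the server is up."""
    try:
        urllib.request.urlopen(BASE, timeout=5)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, just not with 2xx
    except OSError:
        return False  # connection refused or host unreachable
```

Running check_reachable() plus one pass through cycle_requests() on every deploy catches most startup misconfigurations before users do.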

Best practices

  1. Add authentication: Protect endpoints with API gateways, proxies, or custom FastAPI middleware.
  2. Use HTTPS: Terminate TLS at your load balancer or reverse proxy.
  3. Monitor uptime: Track request rates, latency, and error codes per endpoint.
  4. Version configs: Keep environment files and Docker Compose definitions in source control.
  5. Limit exposure: Bind to private networks unless you explicitly need public access.