Semantic Memory for Claude AI

Remember Everything with Semantic Search

A custom MCP memory server deployed on Vercel Edge Functions. OpenAI embeddings combined with PostgreSQL pgvector enable fast, semantic memory retrieval.

Available 24/7
Zero Infrastructure
Open Source MIT
Unlimited Memories
<100ms Search Latency
1536 Vector Dimensions
24/7 Always Available

Everything You Need

Production-ready features for building intelligent memory systems

Semantic Search
Find memories by meaning, not just keywords. Powered by OpenAI embeddings.
Secure API
Optional API key authentication. Input validation with Zod schemas.
Cloud-Hosted
Deployed on Vercel Edge Functions. Zero infrastructure management.
pgvector Storage
PostgreSQL with pgvector extension for efficient vector similarity search.
Edge Runtime
Fast response times with Vercel Edge Functions and optimized queries.
Cross-Device Sync
Access your memories from any device running Claude Desktop.
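Semantic search works by comparing a query's embedding against each stored memory's 1536-dimension OpenAI embedding and ranking by cosine similarity (pgvector's `<=>` operator returns the cosine distance, i.e. 1 minus the similarity). A minimal TypeScript sketch of the scoring math, for illustration only (this is not the server's actual code):

```typescript
// Cosine similarity between two embedding vectors.
// pgvector computes this in-database; shown here in plain TypeScript
// to illustrate how "search by meaning" ranks results.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical directions score 1; orthogonal (unrelated) directions score 0.
const same = cosineSimilarity([1, 0, 0], [1, 0, 0]); // 1
const unrelated = cosineSimilarity([1, 0, 0], [0, 1, 0]); // 0
```

A search then returns the memories whose similarity to the query exceeds the `threshold` parameter, ordered from most to least similar.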

Get Started in Minutes

Deploy your own memory server with these simple steps

1
Clone and Install
Clone the repository and install dependencies
git clone https://github.com/evgenygurin/vercel-mcp-memory.git
cd vercel-mcp-memory
npm install
2
Configure Environment
Set up your environment variables
OPENAI_API_KEY=sk-...
POSTGRES_URL=postgresql://...
MCP_API_KEYS=your-secret-key
3
Deploy to Vercel
Deploy with a single command
vercel --prod
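After deployment, point Claude Desktop at your server. A hypothetical `claude_desktop_config.json` entry, assuming the MCP endpoint is served at `/api/mcp` and using the `mcp-remote` stdio-to-HTTP proxy (adjust the URL, endpoint path, and API key to your deployment):

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://your-deployment.vercel.app/api/mcp",
        "--header",
        "Authorization: Bearer your-secret-key"
      ]
    }
  }
}
```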

MCP Tools API

Five powerful tools for memory management

add_memory
Store new memories with semantic embeddings
content: string (required)
category: string (optional)
metadata: object (optional)
search_memory
Semantic search across all memories
query: string (required)
limit: number (default: 10)
threshold: number (default: 0.5)
list_memories
List memories with pagination
category: string (optional)
limit: number (default: 50)
offset: number (default: 0)
delete_memory
Delete memory by UUID
id: string (UUID required)
memory_stats
Get statistics about stored memories
Returns total count, categories, users, and date ranges
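These tools are invoked over MCP's standard JSON-RPC 2.0 `tools/call` method. A hypothetical request body for `search_memory` (the endpoint path, query text, and auth header are examples, not fixed values):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": {
      "query": "deployment preferences",
      "limit": 5,
      "threshold": 0.5
    }
  }
}
```

The response contains the matching memories ranked by similarity; results below the `threshold` are filtered out.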