MongoDB Integration for AI Knowledge Persistence

MongoDB stores your app data as flexible documents. REM Labs turns those documents into searchable AI memory with semantic retrieval, entity graphs, and temporal awareness. This guide shows how to connect them using Change Streams for real-time sync.

Why Not Just Use Atlas Vector Search?

MongoDB Atlas Vector Search handles embedding similarity, but AI memory needs more than nearest-neighbor lookup. REM Labs combines vector search with full-text retrieval, entity graph traversal, and neural reranking: the same multi-signal fusion that scores 90% on LongMemEval. You also get temporal decay, namespace isolation, and memory consolidation out of the box.
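To make "multi-signal fusion" concrete, here is an illustrative sketch of the idea: several independent relevance scores are combined into one ranking. This is not REM Labs' actual implementation; the signal names and weights are invented for illustration.

```typescript
// Illustrative only: the real fusion logic lives inside the REM service.
interface Candidate {
  id: string;
  vectorScore: number; // e.g. cosine similarity, normalized to 0..1
  textScore: number;   // e.g. normalized full-text score, 0..1
}

// Hypothetical weights; a production system would tune these.
function fuse(c: Candidate, wVector = 0.6, wText = 0.4): number {
  return wVector * c.vectorScore + wText * c.textScore;
}

// Rank candidates by their fused score, highest first.
function rank(candidates: Candidate[]): Candidate[] {
  return [...candidates].sort((a, b) => fuse(b) - fuse(a));
}
```

A document that is only moderately similar in embedding space can still win if it matches the query text strongly, which is the point of fusing signals instead of relying on vectors alone.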

Step 1: Install the SDK

```sh
npm install @remlabs/sdk mongodb
```

Step 2: Bulk Import Existing Documents

Start by importing your existing MongoDB collection into REM Labs:

```typescript
import { RemClient } from "@remlabs/sdk";
import { MongoClient } from "mongodb";

const mongo = new MongoClient(process.env.MONGODB_URI);
const rem = new RemClient({ apiKey: process.env.REMLABS_API_KEY });

await mongo.connect();
const db = mongo.db("myapp");
const notes = db.collection("notes");

const cursor = notes.find({ archived: false });
let count = 0;
for await (const doc of cursor) {
  await rem.remember({
    content: doc.title + "\n\n" + doc.body,
    namespace: "mongodb:notes",
    tags: doc.tags || [],
    metadata: {
      mongo_id: doc._id.toString(),
      collection: "notes",
      author: doc.author
    }
  });
  count++;
}
console.log(`Imported ${count} documents`);
```
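For large collections, one serial remember call per document can make the import slow. A simple option, sketched below, is to chunk the documents and send each chunk concurrently. The chunk size here is arbitrary, and the SDK may offer its own batch API worth checking first.

```typescript
// Split an array into fixed-size chunks for bounded-concurrency imports.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Usage sketch inside the import from Step 2 (rem.remember as above):
// const docs = await notes.find({ archived: false }).toArray();
// for (const batch of chunk(docs, 10)) {
//   await Promise.all(batch.map((doc) => rem.remember({ /* ... */ })));
// }
```

Bounding concurrency per chunk avoids firing thousands of simultaneous API calls, which could hit rate limits.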

Step 3: Real-Time Sync with Change Streams

Use MongoDB Change Streams to push new and updated documents to REM as they arrive:

```typescript
const pipeline = [
  { $match: { operationType: { $in: ["insert", "update", "replace"] } } }
];

// "updateLookup" is required for update events to carry fullDocument;
// by default only inserts and replaces include it.
const changeStream = notes.watch(pipeline, { fullDocument: "updateLookup" });

changeStream.on("change", async (change) => {
  const doc = change.fullDocument;
  if (!doc) return;
  await rem.remember({
    content: doc.title + "\n\n" + doc.body,
    namespace: "mongodb:notes",
    tags: doc.tags || [],
    metadata: { mongo_id: doc._id.toString(), collection: "notes" }
  });
  console.log(`Synced ${doc._id} to REM`);
});
```
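A change stream stops delivering events when your process restarts. MongoDB attaches a resume token to every event, which you can persist and pass back as resumeAfter on the next watch() call. The file-based store below is a minimal sketch: the file path is arbitrary, and production systems often persist the token in MongoDB itself instead.

```typescript
import * as fs from "fs";

// Persist the latest change stream resume token so sync can
// continue from where it left off after a restart.
function saveResumeToken(file: string, token: unknown): void {
  fs.writeFileSync(file, JSON.stringify(token));
}

function loadResumeToken(file: string): unknown | undefined {
  if (!fs.existsSync(file)) return undefined;
  return JSON.parse(fs.readFileSync(file, "utf8"));
}

// Usage sketch with the stream from Step 3:
// const changeStream = notes.watch(pipeline, {
//   fullDocument: "updateLookup",
//   resumeAfter: loadResumeToken("./resume-token.json"),
// });
// changeStream.on("change", async (change) => {
//   // ...sync to REM as above, then checkpoint the token:
//   saveResumeToken("./resume-token.json", change._id);
// });
```

Checkpoint the token only after the REM write succeeds, so a crash replays the event rather than dropping it.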

Step 4: Query AI Memory

Your AI features can now recall context from MongoDB documents using natural language:

```typescript
// In your AI chat endpoint or agent
const memories = await rem.recall({
  query: "What were the key decisions from the Q3 planning session?",
  namespace: "mongodb:notes",
  limit: 5
});

// Use memories as context for your LLM prompt
const context = memories.map(m => m.content).join("\n---\n");
const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "system", content: `Use this context:\n${context}` },
    { role: "user", content: userQuery }
  ]
});
```
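Recalled memories go straight into the prompt, so it is worth capping how much context you include. The helper below is a rough sketch that trims by character count; real code would count tokens with the model's tokenizer instead, and the budget value is arbitrary.

```typescript
// Keep the highest-ranked memories that fit in a rough character budget.
// Assumes the list arrives already sorted by relevance, as recall returns it.
function trimToBudget(contents: string[], maxChars: number): string[] {
  const kept: string[] = [];
  let used = 0;
  for (const text of contents) {
    if (used + text.length > maxChars) break;
    kept.push(text);
    used += text.length;
  }
  return kept;
}

// Usage sketch with the recall result above:
// const context = trimToBudget(memories.map(m => m.content), 8000).join("\n---\n");
```

Stopping at the first memory that overflows, rather than skipping it and continuing, keeps the included context contiguous with the ranking.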

Handling Deletes

When documents are removed from MongoDB, clean up the corresponding memory:

```typescript
const deleteStream = notes.watch([
  { $match: { operationType: "delete" } }
]);

deleteStream.on("change", async (change) => {
  const mongoId = change.documentKey._id.toString();
  // Search for the memory by its stored mongo_id metadata
  const results = await rem.recall({
    query: mongoId,
    namespace: "mongodb:notes",
    limit: 1
  });
  if (results.length > 0) {
    await rem.forget({ id: results[0].id });
  }
});
```
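Recalling by the raw ObjectId string relies on search matching an opaque hex value, which can be unreliable. A sturdier pattern, sketched below, is to record the REM memory id at sync time, keyed by mongo_id, and delete by that. This assumes remember resolves with the new memory's id (check the SDK's return type), and a real deployment would persist the mapping rather than hold it in memory.

```typescript
// In-memory mapping from MongoDB _id strings to REM memory ids.
// A real deployment would persist this (e.g. in a rem_sync collection).
class MemoryIdIndex {
  private byMongoId = new Map<string, string>();

  record(mongoId: string, memoryId: string): void {
    this.byMongoId.set(mongoId, memoryId);
  }

  // Returns the memory id to forget, removing the mapping.
  take(mongoId: string): string | undefined {
    const id = this.byMongoId.get(mongoId);
    this.byMongoId.delete(mongoId);
    return id;
  }
}

// Usage sketch (hypothetical return shape for rem.remember):
// const memory = await rem.remember({ /* ... */ });
// index.record(doc._id.toString(), memory.id);
// ...and in the delete handler:
// const id = index.take(mongoId);
// if (id) await rem.forget({ id });
```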

Replica set required: MongoDB Change Streams require a replica set or sharded cluster. On MongoDB Atlas this is already enabled. For local development, start mongod with --replSet (for example, `mongod --replSet rs0`), then run `rs.initiate()` once in mongosh.
