PostgreSQL + REM Labs: Structured Memory for AI

PostgreSQL is already the backbone of most production applications. This guide shows how to connect your existing Postgres tables to REM Labs, giving your AI features structured memory with semantic search, full-text retrieval, and entity graph queries -- without migrating data.

The Architecture

Your app writes to Postgres as normal. A lightweight sync layer pushes relevant rows to REM Labs, where they are indexed for vector search, full-text search, and entity extraction. When your AI needs context, it queries REM instead of running expensive LLM calls over raw SQL results.
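The flow above can be sketched as a toy in-memory model. Everything here is an illustrative stand-in, not the real SDK or service: the point is only to show the shape of the pipeline (app table, sync layer, index, recall).

```typescript
// Toy model of the architecture: the app writes rows, a sync layer pushes
// them into an index, and the AI queries the index instead of raw SQL.
// `ToyIndex` is a stand-in for REM Labs, not its actual behavior.
type Row = { id: number; body: string };

class ToyIndex {
  private docs: { content: string; pg_id: number }[] = [];
  remember(content: string, pg_id: number) {
    this.docs.push({ content, pg_id });
  }
  // Naive keyword recall; the real service ranks by fused signals.
  recall(query: string) {
    const terms = query.toLowerCase().split(/\s+/);
    return this.docs.filter((d) =>
      terms.some((t) => d.content.toLowerCase().includes(t))
    );
  }
}

const index = new ToyIndex();
const appTable: Row[] = [
  { id: 1, body: "We chose JWT auth for the API" },
  { id: 2, body: "Retrospective notes from sprint 12" },
];

// "Sync layer": push each row into the index.
for (const row of appTable) index.remember(row.body, row.id);

// "AI query": recall context instead of re-reading SQL results.
console.log(index.recall("auth decisions"));
```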

Step 1: Install Dependencies

npm install @remlabs/sdk pg

Step 2: Create a Sync Script

This script reads from a Postgres table and pushes each row to REM Labs:

import { RemClient } from "@remlabs/sdk";
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const rem = new RemClient({ apiKey: process.env.REMLABS_API_KEY });

// Pull rows changed in the last hour and push them to REM Labs.
// Note: `table` and `contentCol` are interpolated into the SQL, so only
// pass trusted identifiers here, never user input.
async function syncTable(table: string, contentCol: string) {
  const { rows } = await pool.query(
    `SELECT id, ${contentCol}, updated_at FROM ${table}
     WHERE updated_at > NOW() - INTERVAL '1 hour'`
  );
  for (const row of rows) {
    await rem.remember({
      content: row[contentCol],
      namespace: `postgres:${table}`,
      metadata: { pg_id: row.id, table },
      tags: [table],
    });
  }
  console.log(`Synced ${rows.length} rows from ${table}`);
}

await syncTable("meeting_notes", "body");
await syncTable("project_decisions", "summary");
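The sliding one-hour window above can re-send rows or miss them if the job stalls. A more robust variant tracks a high-water mark: after each batch, persist the newest updated_at seen and query strictly beyond it on the next run. The `nextCheckpoint` helper below is illustrative, not part of the REM Labs SDK:

```typescript
// Compute the next high-water mark after a sync batch. The next run would
// then query: WHERE updated_at > $1, passing the stored checkpoint.
// Illustrative helper; persistence of the checkpoint is up to your app.
type SyncedRow = { id: number; updated_at: Date };

function nextCheckpoint(rows: SyncedRow[], current: Date): Date {
  let max = current;
  for (const row of rows) {
    if (row.updated_at > max) max = row.updated_at;
  }
  return max;
}

const checkpoint = nextCheckpoint(
  [
    { id: 1, updated_at: new Date("2024-05-01T10:00:00Z") },
    { id: 2, updated_at: new Date("2024-05-01T10:05:00Z") },
  ],
  new Date("2024-05-01T09:00:00Z")
);
console.log(checkpoint.toISOString()); // 2024-05-01T10:05:00.000Z
```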

Step 3: Trigger Sync with LISTEN/NOTIFY

For real-time sync, use Postgres LISTEN/NOTIFY to push changes as they happen:

-- In Postgres: create a trigger that notifies on changes.
-- Note: NOTIFY payloads are capped at 8000 bytes by default; for large
-- rows, send only the id and fetch the content in the listener instead.
CREATE OR REPLACE FUNCTION notify_rem_change() RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('rem_sync', json_build_object(
    'table', TG_TABLE_NAME,
    'id', NEW.id,
    'content', NEW.body,
    'op', TG_OP
  )::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER notes_rem_trigger
  AFTER INSERT OR UPDATE ON notes
  FOR EACH ROW EXECUTE FUNCTION notify_rem_change();
// Node.js listener
import { Client } from "pg";
import { RemClient } from "@remlabs/sdk";

const pg = new Client({ connectionString: process.env.DATABASE_URL });
const rem = new RemClient({ apiKey: process.env.REMLABS_API_KEY });

await pg.connect();
await pg.query("LISTEN rem_sync");

pg.on("notification", async (msg) => {
  if (!msg.payload) return; // payload can be undefined in pg's types
  const data = JSON.parse(msg.payload);
  await rem.remember({
    content: data.content,
    namespace: `postgres:${data.table}`,
    metadata: { pg_id: data.id },
  });
});
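Rapid updates to a single row fire one notification each, so the listener can end up re-indexing the same content many times. One common mitigation is to buffer a short window of events and keep only the latest payload per (table, id) before calling rem.remember. The helper below is an illustrative sketch, not SDK API:

```typescript
// Collapse a burst of NOTIFY payloads down to the latest event per row.
// A Map keyed by "table:id" keeps insertion order and the last write wins.
type SyncEvent = { table: string; id: number; content: string };

function latestPerRow(events: SyncEvent[]): SyncEvent[] {
  const latest = new Map<string, SyncEvent>();
  for (const e of events) latest.set(`${e.table}:${e.id}`, e);
  return [...latest.values()];
}

const burst: SyncEvent[] = [
  { table: "notes", id: 7, content: "draft" },
  { table: "notes", id: 7, content: "draft v2" },
  { table: "notes", id: 8, content: "other note" },
];
console.log(latestPerRow(burst)); // two events; id 7 keeps "draft v2"
```

In the listener, you would push each parsed payload into a buffer and flush it through latestPerRow on a short interval instead of calling rem.remember per notification.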

Step 4: Query AI Memory

// Your AI endpoint can now recall structured context
const memories = await rem.recall({
  query: "What architecture decisions were made about auth?",
  namespace: "postgres:project_decisions",
  limit: 10,
});

// Returns scored results with the original Postgres metadata attached
memories.forEach((m) => {
  console.log(`[${m.score}] ${m.content} (pg_id: ${m.metadata.pg_id})`);
});
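Recalled memories typically end up as context in an LLM prompt. A simple formatter sorts by score, caps the count, and cites the Postgres row id so answers stay traceable to source rows. `buildContext` and the result shape below are illustrative assumptions based on the recall example above:

```typescript
// Format scored memories into a context block for an LLM prompt.
// `Memory` mirrors the fields used in the recall example (score, content,
// metadata.pg_id); the exact SDK result type may differ.
type Memory = { score: number; content: string; metadata: { pg_id: number } };

function buildContext(memories: Memory[], maxItems = 5): string {
  return [...memories]
    .sort((a, b) => b.score - a.score)  // highest-scoring first
    .slice(0, maxItems)                 // keep the prompt bounded
    .map((m) => `- ${m.content} [source: pg_id ${m.metadata.pg_id}]`)
    .join("\n");
}

const context = buildContext([
  { score: 0.64, content: "Auth service owns tokens", metadata: { pg_id: 57 } },
  { score: 0.91, content: "Chose JWT over sessions", metadata: { pg_id: 42 } },
]);
console.log(context);
```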

Performance note: REM Labs uses multi-signal fusion (vector + FTS5 + entity graph) to rerank results. In practice this can outperform raw pgvector similarity queries, because it combines semantic similarity with exact keyword matching and relationship awareness rather than relying on embeddings alone.
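To build intuition for why fusion helps, here is a toy weighted blend of the three signals. The weights and the linear formula are illustrative only, not REM Labs' actual reranking algorithm:

```typescript
// Toy multi-signal fusion: blend a vector-similarity score, a keyword
// (full-text) score, and an entity-overlap score with fixed weights.
// Illustrative weights; real rerankers tune or learn these.
type Signals = { vector: number; keyword: number; entity: number };

function fuse(s: Signals): number {
  return 0.5 * s.vector + 0.3 * s.keyword + 0.2 * s.entity;
}

// A doc with strong keyword and entity signals can outrank a pure semantic
// match; that is the point of fusing signals before reranking.
const semanticOnly = fuse({ vector: 0.9, keyword: 0.1, entity: 0.0 }); // 0.48
const balanced = fuse({ vector: 0.7, keyword: 0.8, entity: 0.6 });     // 0.71
console.log(balanced > semanticOnly); // true
```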

When to Use This Pattern

This setup fits when you want to add AI memory to an existing Postgres app without a data migration: your tables stay where they are, the sync layer pushes only what you choose, and REM Labs layers multi-signal search on top. The free tier is enough to get started.